Filip Nowacki - 2 years ago
PHP Question

How to import bigdata with symfony

I have one problem with importing data from csv to database.

For now, my code looks like this:

public function run()
{
    foreach ($this->elements as $element) {
        $this->insertCity($element);
    }
}

/**
 * @param array $item
 */
private function insertCity(array $item = [])
{
    // One SELECT per row to resolve the parent commune
    $repository = $this->getDoctrine()->getRepository(Commune::class);
    $commune = $repository->findOneByTerc($this->getTerc($item));

    $district = $item['uid'] == $item['district_uid'] ? null : $item['district_uid'];

    $city = new City();
    // ... set fields on $city, then persist and flush
}

For every row I run one SELECT and one INSERT. My CSV file has 100k rows, and in one hour this code imports only 10k of them :(

Any ideas on how I can optimize it?


Answer Source

Use SQL.

From the Doctrine documentation, in the section on batch processing:

An ORM tool is not primarily well-suited for mass inserts, updates or deletions.
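As a sketch of that advice: skip the ORM for the import and send batched multi-row INSERTs through the DBAL connection inside a single transaction. The table name `city`, the columns `uid`, `name` and `district_uid`, and the injected `Doctrine\DBAL\Connection $connection` are assumptions for illustration (adjust them to your actual mapping); `executeStatement()` is the DBAL 3 API.

```php
// A minimal sketch, assuming $rows holds the parsed CSV rows and
// $connection is a Doctrine\DBAL\Connection. Batching ~1000 rows per
// INSERT statement inside one transaction avoids per-row ORM overhead.
$batchSize = 1000;

$connection->beginTransaction();
try {
    foreach (array_chunk($rows, $batchSize) as $chunk) {
        $placeholders = [];
        $values = [];
        foreach ($chunk as $row) {
            // One "(?, ?, ?)" group per row, values flattened in order
            $placeholders[] = '(?, ?, ?)';
            $values[] = $row['uid'];
            $values[] = $row['name'];
            $values[] = $row['uid'] == $row['district_uid'] ? null : $row['district_uid'];
        }
        $sql = 'INSERT INTO city (uid, name, district_uid) VALUES '
             . implode(', ', $placeholders);
        $connection->executeStatement($sql, $values);
    }
    $connection->commit();
} catch (\Throwable $e) {
    $connection->rollBack();
    throw $e;
}
```

With 100 values per statement or more and a single commit, a 100k-row import typically finishes in seconds to minutes instead of hours. If the database is MySQL and you control the server, `LOAD DATA INFILE` is usually faster still.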
