Filip Nowacki - 2 years ago
PHP Question

How to import big data with Symfony

I have a problem importing data from a CSV file into the database.

For now, my code looks like this:

public function run()
{
    $this->startProgressBar();
    foreach ($this->elements as $element) {
        $this->insertCity($element);
        $this->advanceProgressBar();
    }
    $this->finishProgressBar();
}

/**
 * @param array $item
 */
private function insertCity(array $item = [])
{
    $repository = $this->getDoctrine()->getRepository(Commune::class);
    $commune = $repository->findOneByTerc($this->getTerc($item));

    $district = $item['uid'] == $item['district_uid'] ? null : $item['district_uid'];

    $city = new City();
    $city->setName($item['name']);
    $city->setCommuneId($commune->getId());
    $city->setDistrictUid($district);
    $city->setType($item['city_type']);
    $city->setUid($item['uid']);

    $this->getDoctrine()->getManager()->persist($city);
    $this->getDoctrine()->getManager()->flush();
}


For every single row I run one SELECT and one INSERT. My CSV file has 100k rows, and in one hour this code imports only 10k rows :(

Any ideas on how I can optimize it?

Filip.

Answer

Use raw SQL instead of the ORM.
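If you go through the DBAL connection, you can combine many rows into a single multi-row INSERT and skip the UnitOfWork entirely. Here is a minimal sketch under some assumptions: the city table and column names are guessed from your setters, insertBatch is a hypothetical helper, and commune_id is already resolved per row (load all Commune terc => id pairs with one query up front instead of calling findOneByTerc() 100k times):

private function insertBatch(array $rows)
{
    // Raw DBAL connection taken from the entity manager.
    $connection = $this->getDoctrine()->getManager()->getConnection();

    $placeholders = [];
    $params = [];
    foreach ($rows as $row) {
        $placeholders[] = '(?, ?, ?, ?, ?)';
        $params[] = $row['name'];
        $params[] = $row['commune_id'];  // resolved beforehand from an in-memory lookup table
        $params[] = $row['district_uid'];
        $params[] = $row['city_type'];
        $params[] = $row['uid'];
    }

    // One multi-row INSERT per batch instead of one flush per row.
    $sql = 'INSERT INTO city (name, commune_id, district_uid, type, uid) VALUES '
         . implode(', ', $placeholders);

    $connection->executeStatement($sql, $params); // executeUpdate() on older DBAL versions
}

Feed it chunks, e.g. foreach (array_chunk($this->elements, 1000) as $chunk) { $this->insertBatch($chunk); }, so each statement stays under your database's placeholder limit.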

From the Doctrine documentation, in the chapter on batch processing:

An ORM tool is not primarily well-suited for mass inserts, updates or deletions.
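If you would rather stay inside the ORM, that same chapter recommends batching the flushes rather than flushing after every persist. A rough sketch of your run() loop under that assumption ($batchSize is illustrative, $this->elements is assumed to be a plain numerically indexed array, and insertCity() should only persist, with its flush() removed):

public function run()
{
    $batchSize = 500;
    $em = $this->getDoctrine()->getManager();

    $this->startProgressBar();
    foreach ($this->elements as $i => $element) {
        $this->insertCity($element); // persist only; the flush() moves out here
        if (($i + 1) % $batchSize === 0) {
            $em->flush();  // one set of INSERTs per batch
            $em->clear();  // detach managed entities so memory stays flat
        }
        $this->advanceProgressBar();
    }
    $em->flush(); // write the final partial batch
    $this->finishProgressBar();
}

Note that clear() detaches everything, including any Commune entities you loaded, which is another reason to cache just the terc => id pairs rather than the entities themselves.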
