olleh olleh - 2 months ago 39
SQL Question

Laravel: seeding a large SQL file

I get a memory-exhaustion error when I run my DB seed script in production.

Below is my seed script:

class MembershipTableSeeder extends Seeder {

    public function run()
    {
        DB::table('members')->delete();

        foreach (range(1, 99) as $days) {
            Members::create(array('membership_code' => 'test'.$days));
        }

        DB::unprepared(file_get_contents(app_path()."/database/seeds/members.sql"));
    }
}


So what I did was remove the memory limit in my seed script:

ini_set('memory_limit', '-1');


The problem now is that when I run the script, it prints the content of the SQL file (which is very big) to the terminal.

Is there a good way of running a SQL dump inside my DB seeds that doesn't consume much memory? For now I run "mysql -uuser -p db < script.sql" manually instead.
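One low-memory alternative is to stream the dump and execute it one statement at a time, rather than pulling the whole file in with file_get_contents. A minimal sketch, assuming the dump consists of ";"-terminated statements with no stored procedures containing embedded semicolons (the sqlStatements helper below is hypothetical, not part of Laravel):

```php
<?php

// Yield one ";"-terminated SQL statement at a time, so only a single
// statement is ever held in memory instead of the entire dump.
function sqlStatements($handle)
{
    $buffer = '';
    while (($line = fgets($handle)) !== false) {
        $buffer .= $line;
        if (substr(rtrim($buffer), -1) === ';') {
            yield trim($buffer);
            $buffer = '';
        }
    }
    if (trim($buffer) !== '') {
        yield trim($buffer); // trailing statement without a final ";"
    }
}
```

Inside the seeder you would then loop over the statements and run each one individually, e.g. foreach (sqlStatements(fopen($path, 'r')) as $sql) { DB::unprepared($sql); }.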

Answer

The problem happens because DB::unprepared also logs the query to the laravel.log file, so far more work is going on in the background than you might think; that is where the memory exhaustion comes from. If you are not running in safe mode, I would stick to executing the console command like this:
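Alternatively, you can stay inside the seeder and simply switch the query log off before running the dump. A sketch assuming Laravel 4, where the connection's query log is enabled by default:

```php
class MembershipTableSeeder extends Seeder {

    public function run()
    {
        // Stop Laravel from buffering every executed query in memory;
        // without this, DB::unprepared on a huge dump exhausts memory.
        DB::connection()->disableQueryLog();

        DB::table('members')->delete();
        DB::unprepared(file_get_contents(app_path()."/database/seeds/members.sql"));
    }
}
```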

exec("mysql -u ".\Config::get('database.connections.mysql.username')." -p".\Config::get('database.connections.mysql.password')." ".\Config::get('database.connections.mysql.database')." < script.sql");
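If you do shell out like this, it is worth escaping the credentials: a password containing shell metacharacters would otherwise break the command, or worse, inject into it. A hedged sketch; buildImportCommand is a hypothetical helper, and the database.connections.mysql.* keys follow Laravel's standard config layout:

```php
<?php

// Build the mysql import command with every user-supplied piece passed
// through escapeshellarg, so special characters cannot break the shell.
function buildImportCommand($user, $password, $database, $sqlFile)
{
    return 'mysql -u '.escapeshellarg($user)
        .' -p'.escapeshellarg($password)
        .' '.escapeshellarg($database)
        .' < '.escapeshellarg($sqlFile);
}

// In the seeder (Laravel config keys assumed):
// exec(buildImportCommand(
//     \Config::get('database.connections.mysql.username'),
//     \Config::get('database.connections.mysql.password'),
//     \Config::get('database.connections.mysql.database'),
//     app_path().'/database/seeds/members.sql'
// ));
```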