rram rram - 2 months ago 6
MySQL Question

Insert large data in single time or every loop?

I have data with 3000+ rows that I need to import into another table. Previously I was doing it like this:

<?php
// Run one INSERT per row
foreach ($datas as $data) {
    // INSERT INTO table_name with $data
    $con->query('insert query');
}
?>


Then I realized that running an INSERT for every row costs some time, even though the connection is created only once. So I tried building the whole query first and then inserting once:

This way MySQL is executed only one time, which avoids the per-row round trip and insert.

<?php
foreach ($datas as $data) {
    // Concatenate each row into a single INSERT query
    // with comma-separated value groups
}

// Finally run the single INSERT from PHP,
// so we call the MySQL server only once
$con->query('insert query');
?>


I like the second method, but the problem is that if any one row's data is invalid, the whole query fails.

With the foreach insert, only the current row's insert fails and the other rows are still inserted.

Is there a way to insert in bulk so that if a single row's data is invalid, only that row is skipped and the other rows are still inserted, instead of the whole query failing?

NOTE: I use `,` to separate the value groups in the whole query. The 3000+ rows of data are small, but I will be doing this often; that's why I'm looking at performance. The code above is just a template, not the real code.

Answer

If you want performance, minimize PHP interaction as much as possible. That means LOAD DATA INFILE for loading from files, or SQL-only movement (such as INSERT ... SELECT) for pre-existing data going from table to table. In other words, no PHP looping.
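A minimal sketch of SQL-only table-to-table movement; the table and column names here are placeholders, not from the question:

```sql
-- Copy rows entirely inside MySQL, with no PHP loop:
-- the server reads from source_table and writes to
-- target_table in a single statement.
INSERT INTO target_table (col_a, col_b)
SELECT col_a, col_b
FROM source_table;
```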

Concerning your comments about allowing some rows to fail:

For data loads from files, use LOAD DATA INFILE IGNORE. Note the word IGNORE.
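A sketch of what that looks like, assuming a CSV file at `/tmp/rows.csv` and a table `table_name` with matching columns (file path, table, and column names are hypothetical):

```sql
-- IGNORE makes rows that would violate a unique key (and rows
-- with certain bad values) produce warnings instead of aborting
-- the whole load, so the remaining rows still get inserted.
LOAD DATA INFILE '/tmp/rows.csv'
IGNORE INTO TABLE table_name
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(col_a, col_b, col_c);
```

After the load, `SHOW WARNINGS` reports which rows were skipped and why.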

For pre-existing data movement, use INSERT IGNORE. Same as a plain INSERT, just add IGNORE.
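For data that starts out in PHP, as in the question, the same idea works with a single multi-row INSERT IGNORE. A sketch, assuming an existing mysqli connection `$con` and hypothetical table/column names:

```php
<?php
// Build one multi-row INSERT IGNORE from $datas.
// table_name, col_a, col_b and the $data keys are placeholders.
$values = [];
foreach ($datas as $data) {
    // Escape each value to keep the concatenated SQL valid
    $a = $con->real_escape_string($data['a']);
    $b = $con->real_escape_string($data['b']);
    $values[] = "('$a', '$b')";
}

// IGNORE turns duplicate-key (and some invalid-value) errors
// into warnings, so one bad row is skipped instead of
// failing the entire statement.
$sql = 'INSERT IGNORE INTO table_name (col_a, col_b) VALUES '
     . implode(',', $values);
$con->query($sql);
?>
```

This keeps the single-round-trip advantage of your second method while letting the other rows survive a bad one.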

You will not beat this for performance with a PHP-driven loop.
