Meady U. - 1 year ago
SQL Question

How to import a huge CSV file with 200,000 rows into MySQL (asynchronous and fast)?

I have to write a PHP script that will import data from a given CSV file into a MySQL database. The given CSV file can contain up to 200,000 rows.
I tried the following, but problems arose:

  1. LOAD DATA LOCAL INFILE: I cannot use the LOAD DATA LOCAL INFILE statement because I want to run some validations BEFORE inserting the rows; besides, our DB admin doesn't want me to use that statement, and I don't know why.

  2. FOR LOOP: Inserting line by line inside a FOR loop takes too much time, resulting in a connection timeout.

Now I am thinking of a solution: split the CSV file into smaller chunks, then insert them asynchronously. I am already done with splitting the CSV, but I have no idea how to insert into my database asynchronously in a quick and safe way. I heard that I will be using Ajax here.
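The splitting step can be sketched as follows (the function name, the chunk_ file prefix, and the batch size are all placeholders; fgetcsv streams the file, so all 200,000 rows never sit in memory at once):

```php
<?php
// Sketch of the splitting step: stream the source CSV with fgetcsv
// and write every $batchsize rows into its own chunk file.
function split_csv(string $path, int $batchsize, string $prefix = 'chunk_'): array
{
    $chunks = [];   // names of the chunk files created so far
    $out = null;
    $row = 0;
    $in = fopen($path, 'r');
    while (($fields = fgetcsv($in)) !== false) {
        if ($row % $batchsize === 0) {      // time to start a new chunk
            if ($out !== null) {
                fclose($out);
            }
            $name = $prefix . count($chunks) . '.csv';
            $chunks[] = $name;
            $out = fopen($name, 'w');
        }
        fputcsv($out, $fields);             // copy the row into the chunk
        $row++;
    }
    if ($out !== null) {
        fclose($out);
    }
    fclose($in);
    return $chunks;                         // e.g. ['chunk_0.csv', 'chunk_1.csv']
}
```

Each returned file can then be posted to the server one request at a time.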

Any solution you can recommend? Thanks a lot in advance!

Answer Source

Thanks to everyone who gave answers to this question. I have discovered a solution! Just wanted to share it, in case someone needs to create a PHP script that imports a huge CSV file into a MySQL database (asynchronously and fast!). I tested my code with 400,000 rows and the import finishes in seconds. I believe it would work with larger files; you just have to raise the maximum upload file size.
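On the "maximum upload file size" point: those limits live in php.ini, and both of the following must exceed the CSV's size (the values here are only illustrative):

```ini
; php.ini - illustrative values for large CSV uploads
upload_max_filesize = 64M
post_max_size = 64M      ; must be >= upload_max_filesize
max_execution_time = 300 ; seconds, in case the split step runs long
```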

In this example, I will be importing a CSV file that contains two columns (name, contact_number) into a MySQL table that has the same columns.

Your CSV file should look like this :

Ana, 0906123489

John, 0908989199

Peter, 0908298392



So, here's the solution.

First, create your table:

CREATE TABLE `testdb`.`table_test` (
  `name` VARCHAR(100) NOT NULL,
  `contact_number` VARCHAR(100) NOT NULL
);
Second, I have 4 PHP files. All you have to do is place them in a single folder. The PHP files are as follows:


The upload form (any file name works, e.g. index.php); it posts the CSV to upload.php:

<form action="upload.php" method="post" enctype="multipart/form-data">
  <input type="file" name="csv" value="" />
  <input type="submit" name="submit" value="Save" />
</form>


The connection settings (e.g. connect.php), included by the other scripts:

<?php
//modify your connection settings here
$servername = "localhost";
$username = "root";
$password = "";
$dbname = "testDB";
$conn = new mysqli($servername, $username, $password, $dbname);
if ($conn->connect_error) {
    die("Connection failed: " . $conn->connect_error);
}
?>

senddata.php - receives a batch file name and inserts its rows with a single multi-row INSERT:

<?php
include('connect.php'); //adjust to your connection script's name

$data = $_POST['file'];
$handle = fopen($data, "r");
if ($handle) {
    //instead of executing queries one by one,
    //let us prepare 1 SQL query that will insert all values from the batch
    $sql = "INSERT INTO table_test(name,contact_number) VALUES ";
    while (($line = fgets($handle)) !== false) {
        $sql .= "(" . trim($line) . "),";
    }
    $sql = substr($sql, 0, strlen($sql) - 1); //trim the trailing comma
    if ($conn->query($sql) === TRUE) {
        //unlink the CSV file once it is imported, to clear the directory
        unlink($data);
    } else {
        echo "Error: " . $conn->error;
    }
    fclose($handle);
}
?>
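One caution: splicing each CSV line straight into the SQL string means a name containing a quote would break the query (or allow injection). A hedged sketch of building the same multi-row INSERT through an escaping callback (build_insert and its signature are my own; with a live mysqli connection you would pass [$conn, 'real_escape_string'] instead of the rough addslashes stand-in):

```php
<?php
// Build one multi-row INSERT for the (name, contact_number) table,
// escaping every value through the supplied callback first.
// build_insert is a hypothetical helper, not part of the original answer.
function build_insert(array $rows, callable $escape): string
{
    $values = array_map(function (array $row) use ($escape) {
        return "('" . $escape($row[0]) . "', '" . $escape($row[1]) . "')";
    }, $rows);
    return "INSERT INTO table_test(name,contact_number) VALUES " . implode(',', $values);
}
```

Usage: `$conn->query(build_insert($batch, [$conn, 'real_escape_string']));`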


upload.php opens with the jQuery include and the Ajax function that posts each batch file to senddata.php (the script src is left empty here; point it at any jQuery build, since $.ajax depends on it):

<script src=""></script>
<script>
//declaration of the function that will send a batch file for insertion
function senddata(filename){
    var file = filename;
    $.ajax({
        type: "POST",
        url: "senddata.php",
        data: {file: file},
        async: true,
        success: function(html){
            //optionally show the server's response here
        }
    });
}
</script>
Then the PHP part of upload.php validates the upload, splits the CSV into batches, and fires senddata for each batch:

<?php
$csv = array();
$batchsize = 1000; //split the huge CSV file by 1,000 rows; you can modify this based on your needs

if ($_FILES['csv']['error'] == 0) {
    $name = $_FILES['csv']['name'];
    $ext = strtolower(pathinfo($name, PATHINFO_EXTENSION));
    $tmpName = $_FILES['csv']['tmp_name'];
    if ($ext === 'csv') { //check if the uploaded file is of CSV format
        if (($handle = fopen($tmpName, 'r')) !== FALSE) {
            $row = 0;
            while (($data = fgetcsv($handle)) !== FALSE) {
                //splitting of the CSV file: start a new batch file every $batchsize rows
                if ($row % $batchsize == 0) {
                    if (isset($file)) {
                        fclose($file); //flush the previous batch file
                    }
                    $file = fopen("minpoints$row.csv", "w");
                }
                $csv[$row]['col1'] = $data[0];
                $csv[$row]['col2'] = $data[1];
                $min = $data[0];
                $points = $data[1];
                //quote the values so senddata.php can splice them into its INSERT
                $json = "'$min', '$points'";
                fwrite($file, $json . "\n");
                //sending the split CSV files, batch by batch...
                if ($row % $batchsize == 0) {
                    echo "<script> senddata('minpoints$row.csv'); </script>";
                }
                $row++;
            }
            if (isset($file)) {
                fclose($file);
            }
            fclose($handle);
        }
    } else {
        echo "Only CSV files are allowed.";
    }
    //alert once done
    echo "<script> alert('CSV imported!') </script>";
}
?>

That's it! You now have a pure PHP script that can import hundreds of thousands of rows in seconds! :) (Thanks to my partner who taught me and gave me the idea of how to use Ajax.)
