Nicholas Nicholas - 15 days ago
MySQL Question

Mysql foreach into single row with json

I need some help with looping through a table and converting the rows into a single row in a new table.

Table A contains 100k+ rows. I want to condense this down in table B by adding the values to a json array.

Table A has the following structure:

+---------------+------------+------+-----+---------+----------------+
| Field         | Type       | Null | Key | Default | Extra          |
+---------------+------------+------+-----+---------+----------------+
| id            | int(11)    | NO   | PRI | NULL    | auto_increment |
| result_id     | int(11)    | NO   |     | 0       |                |
| question_id   | int(11)    | NO   |     | 0       |                |
| question_type | tinyint(4) | NO   |     | 0       |                |
| answer        | tinytext   | NO   |     | NULL    |                |
| user_answer   | tinytext   | NO   |     | NULL    |                |
| exam_id       | int(11)    | NO   |     | NULL    |                |
+---------------+------------+------+-----+---------+----------------+


Table B has the following structure:

+------------+-----------+------+-----+-------------------+-----------------------------+
| Field      | Type      | Null | Key | Default           | Extra                       |
+------------+-----------+------+-----+-------------------+-----------------------------+
| id         | int(11)   | NO   | PRI | NULL              | auto_increment              |
| exam_id    | int(11)   | NO   |     | NULL              |                             |
| score      | int(11)   | NO   |     | NULL              |                             |
| exam_data  | text      | NO   |     | NULL              |                             |
| created_at | timestamp | NO   |     | CURRENT_TIMESTAMP | on update CURRENT_TIMESTAMP |
| updated_at | timestamp | YES  |     | NULL              |                             |
| deleted_at | timestamp | YES  |     | NULL              |                             |
+------------+-----------+------+-----+-------------------+-----------------------------+


Essentially, I want to loop through all the rows in Table A, collect the rows that share the same result_id into an array, and then insert that array as a new row in Table B.

Here's what I have so far, but it does not work as intended.

$array = array();
$id = null;

TableA::chunk(100, function($answers)
{
    foreach($answers as $old)
    {
        $id = $old->result_id;

        if($id == $old->result_id)
        {
            $array[] = array("type" => $old->question_type, "answer" => $old->user_answer);
        }
    }

    $new = new TableB;
    $new->exam_id = $id;
    $new->exam_data = json_encode($array);
    $new->save();
});


I want to use Laravel's chunk function so the SQL server does not time out.

Can someone point me in the right direction?

Answer

I hate ORMs for actions that differ from simple CRUD operations, so I will describe the idea in plain SQL/pseudocode.

  1. SELECT DISTINCT result_id FROM TableA; // select the unique result_ids
  2. Loop over the result_ids in PHP, and for each result_id do:
    1. SELECT * FROM TableA WHERE result_id = $row['result_id']; // select all rows with that result_id
    2. Loop over the results, build a new TableB object, and save it to the db.
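The steps above can be sketched in plain PHP. The buildExamData helper and the $pdo handle are names I am introducing for illustration, and since TableB.score is NOT NULL with no default, the sketch inserts a placeholder of 0 for it:

```php
// Builds the exam_data JSON for one result_id from its TableA rows.
// The "type"/"answer" keys match the asker's original json_encode call.
function buildExamData(array $rows)
{
    $answers = array();
    foreach ($rows as $row) {
        $answers[] = array(
            "type"   => $row["question_type"],
            "answer" => $row["user_answer"],
        );
    }
    return json_encode($answers);
}

// Assuming $pdo is a connected PDO handle, the loop itself would look like:
//
// $resultIds = $pdo->query('SELECT DISTINCT result_id FROM TableA')
//                  ->fetchAll(PDO::FETCH_COLUMN);
// $select = $pdo->prepare('SELECT * FROM TableA WHERE result_id = ?');
// $insert = $pdo->prepare('INSERT INTO TableB (exam_id, score, exam_data) VALUES (?, ?, ?)');
// foreach ($resultIds as $resultId) {
//     $select->execute(array($resultId));
//     $rows = $select->fetchAll(PDO::FETCH_ASSOC);
//     $insert->execute(array($rows[0]['exam_id'], 0, buildExamData($rows)));
// }
```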

Additionally, to save memory, you can create the TableB object once outside the loop and then change the needed fields inside the loop. That way you save the memory and time needed to create a new object on every iteration.

P.S. For a more efficient way, I would use plain SQL queries. Furthermore, it's better to build the queries dynamically, so you can insert data into the db in batches instead of doing an INSERT for each row.

I mean, first build a query that contains multiple inserts, and then run it as a single query:

INSERT INTO Table ( Column1, Column2 ) VALUES
( Value1, Value2 ), ( Value1, Value2 )
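A multi-insert like that can be built dynamically in PHP. buildMultiInsert is a hypothetical helper (not from the answer); it uses ? placeholders rather than interpolated values so the data can still be bound safely via a prepared statement:

```php
// Returns array($sql, $values): one "(?, ?, ...)" group per row,
// with all bound values flattened in column order.
function buildMultiInsert($table, array $columns, array $rows)
{
    $group = '(' . implode(', ', array_fill(0, count($columns), '?')) . ')';
    $sql = sprintf(
        'INSERT INTO %s (%s) VALUES %s',
        $table,
        implode(', ', $columns),
        implode(', ', array_fill(0, count($rows), $group))
    );

    $values = array();
    foreach ($rows as $row) {
        foreach ($columns as $col) {
            $values[] = $row[$col];
        }
    }
    return array($sql, $values);
}
```

With PDO you would then run it as `$pdo->prepare($sql)->execute($values);`, inserting a whole batch in one round trip.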