ravi akasapu - 4 months ago
MySQL Question

MySQL sequential read in a For Loop

I have a database with around 100 tables, with records stored by category. Each category is a table, and each table holds multiple items. Each item has multiple transaction records, one row per transaction per day, with multiple fields per row.

I need to fetch the records for each item based on a condition, perform some operations on them (e.g. an aggregation) in an app (a PHP or Python program), and store the results in another database table.

At present I run the operations manually for each item, executing the program once per item and passing the item as a parameter. But new categories and new items arrive every day, which makes this manual execution very difficult to keep up with.

Below are the ways I have tried to automate this, but none of them works.


  1. Run the MySQL queries in a for loop over the items, but the execution either fails or runs for only one item.
    This is the controller I used for pulling data for each item, but it does not fetch all items; it works only for the first item or the last.
    I also cannot make the loop wait until each database pull has finished.




for ($i = 0; $i < $total_items; $i++)
{
    $item = $items[$i]; // Advance to the next item; without this the loop queries the same item every pass.
    $data['results'] = $this->scripts_model->run_daily_stats($item, $Parameter1, $Parameter2);
    // Use the results in some operations, then proceed with the next result set.
}
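The same sequential per-item pattern can be sketched in Python, the other language mentioned above. This is a minimal illustration only: it uses SQLite so the example is self-contained (with MySQL you would swap in a driver such as PDO or mysql-connector), and the table, column, and item names are invented. The key points are that the item list is read from the database itself, so new items need no manual editing, and that each query call blocks until it finishes, so the loop naturally waits for one pull before starting the next.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Invented demo schema and data standing in for the real transaction tables.
cur.execute("CREATE TABLE transactions (item TEXT, amount REAL)")
cur.executemany("INSERT INTO transactions VALUES (?, ?)",
                [("item1", 10.0), ("item1", 5.0), ("item2", 7.5)])

# Fetch the item list from the database, so newly added items are
# picked up automatically on the next run.
items = [row[0] for row in
         cur.execute("SELECT DISTINCT item FROM transactions ORDER BY item")]

daily_stats = {}
for item in items:
    # execute() blocks until the query completes, so items are
    # processed strictly one after another.
    (total,) = cur.execute(
        "SELECT SUM(amount) FROM transactions WHERE item = ?", (item,)
    ).fetchone()
    daily_stats[item] = total

print(daily_stats)  # {'item1': 15.0, 'item2': 7.5}
```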



  2. Create a flat file for each item and pull the records from it. This has worked to some extent, but pulling records from a flat file based on a condition is equally difficult, and re-creating each file every day is not working.

  3. Put all the items in a cron job, adding a new line per item and staggering them 30 seconds apart. This takes a long time to complete all the items, and I still have to update the cron entries every day.



Here is a sample crontab I am using. It currently has 320 rows and runs for around 2 hours, and since I add multiple rows each day, the total execution time keeps growing.

15 12 * * * wget 127.0.0.1/~home/scripts/update_daily/item1 >/dev/null 2>&1
15 12 * * * sleep 30; wget 127.0.0.1/~home/scripts/update_daily/item2 >/dev/null 2>&1
16 12 * * * wget 127.0.0.1/~home/scripts/update_daily/item3 >/dev/null 2>&1
16 12 * * * sleep 30; wget 127.0.0.1/~home/scripts/update_daily/item4 >/dev/null 2>&1

.
.
55 12 * * * wget 127.0.0.1/~home/scripts/update_daily/item234 >/dev/null 2>&1
.
.
.

  4. Group multiple items into one batch job, but then I am unable to run the program for each individual item.
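One way to collapse the 320 cron lines above into a single entry is a small driver script that loops over the items itself, e.g. scheduled once as `15 12 * * * /usr/bin/python3 /home/scripts/update_daily_all.py`. The sketch below assumes the same per-item URL endpoint as the crontab; the script path and function names are hypothetical. Because `urlopen` blocks until each request completes, the items run strictly one after another with no `sleep 30` guesswork.

```python
import urllib.request

# Assumed endpoint, matching the wget lines in the crontab above.
BASE = "http://127.0.0.1/~home/scripts/update_daily/"

def item_url(item: str) -> str:
    """Build the per-item URL that the old cron lines called via wget."""
    return BASE + item

def run_all(items):
    for item in items:
        # urlopen blocks until the request finishes, so each item's
        # update completes before the next one starts.
        with urllib.request.urlopen(item_url(item)) as resp:
            resp.read()

# In practice the item list would be read from the database (or a file)
# so new items are picked up without editing the crontab:
# run_all(["item1", "item2", "item3"])
```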


Is there a way to automate the execution without breaking the MySQL connection? Please suggest any technology or programming approach that would help me resolve this.

Thanks

Ravi

Answer

Before even considering this question, I would urge you to re-examine the structure of your database. If you have 100 tables with the same structure (?), each one representing a different category, it would be MUCH simpler to have a single table with an additional category column that you could then use to query the relevant rows.
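To make the suggested restructuring concrete, here is a minimal sketch of a single consolidated table with a `category` column (SQLite in place of MySQL so the example is self-contained; the column names and sample data are invented). One parameterized query then replaces a separate query per category table, and a new category is just new rows, with no schema or code changes:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# One table for everything, with a category column, instead of
# ~100 structurally identical per-category tables.
cur.execute("""CREATE TABLE transactions (
                   category TEXT, item TEXT, txn_date TEXT, amount REAL)""")
cur.executemany("INSERT INTO transactions VALUES (?, ?, ?, ?)", [
    ("books", "item1", "2016-01-01", 10.0),
    ("books", "item1", "2016-01-02", 5.0),
    ("music", "item9", "2016-01-01", 7.5),
])

# A single parameterized query serves every category.
rows = cur.execute(
    "SELECT item, SUM(amount) FROM transactions "
    "WHERE category = ? GROUP BY item", ("books",)
).fetchall()
print(rows)  # [('item1', 15.0)]
```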

Since you don't show the code you described, it's difficult for anyone to say what you might have been doing wrong; all we know is that the things you have tried "don't work".
