Hirad Roshandel - 2 years ago
SQL Question

Updating a large volume of data quickly

I have made an application in Node.js that calls an endpoint every minute and gets a JSON array with about 100,000 elements. I need to upsert these elements into my database such that if an element doesn't exist, I insert it with the "point" column set to 0.

So far I have a cron job and a simple upsert query, but it's very slow:

var q = async.queue(function (data, done) {
  // Note the VALUES keyword, which the original query was missing.
  db.query('INSERT INTO stat (`user`, `user2`, `point`) VALUES ' + data.values +
           ' ON DUPLICATE KEY UPDATE point = point + 10',
    function (err, result) {
      if (err) throw err;
      done(); // release this queue slot
    });
});

// Cron job: every 1 minute execute the lines below
for (var v = 0; v < stats.length; v++) {
  var values = '("JACK","' + stats[v] + '", 0)';
  q.push({values: values});
}
How can I do such a task in a very short amount of time? Is using MySQL the wrong decision? I'm open to any other architecture or solution. Note that I have to do this every minute.

Answer Source

I fixed this problem by using a bulk upsert (from the documentation)! I managed to upsert over 24k rows in less than 3 seconds. Basically, I built the full query string first and then ran it once:

 INSERT INTO table (a,b,c) VALUES (1,2,3),(4,5,6)