The basic idea is that every row has a unique item id, which I use to decide which row to update with the new quantity. I'm importing inventory from a company's API into my own database, and there are about 80k items. They have separate API endpoints for item details vs. item quantities, so I need to get all the inventory from the first endpoint, then call the second and update all the items' quantities.
My initial solution was to just loop through the results I get back and update one row per query. This is taking about an hour to an hour and a half to update all 80k items. Is this just how long it takes to update so many items, or is there some SQL magic I could do?
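For reference, the per-row approach looks roughly like this sketch (table and column names such as `items`, `item_id`, and `quantity` are illustrative assumptions, and SQLite stands in for the real database):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (item_id INTEGER PRIMARY KEY, quantity INTEGER)")
conn.executemany("INSERT INTO items VALUES (?, ?)", [(i, 0) for i in range(5)])

# Stand-in for the quantity endpoint's response: (item_id, quantity) pairs.
api_results = [(i, i * 10) for i in range(5)]

# One UPDATE statement per item — with 80k items, that's 80k round trips.
for item_id, qty in api_results:
    conn.execute("UPDATE items SET quantity = ? WHERE item_id = ?", (qty, item_id))
conn.commit()
```

Against a real database server, each of those statements pays its own network round trip and (depending on settings) its own commit, which is where most of the time goes.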
The first thing I would do is to store the update data in a separate table. Then run a query such as:
update t set . . . from t join results r on t.?? = r.??;
Assuming the column used for the join is indexed, this should be much faster than an hour and a half.
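Here is a minimal sketch of that staging-table approach, again with illustrative names (`items`, `results`, `item_id`, `quantity`) and SQLite standing in for the real database — SQLite uses a correlated subquery here, while SQL Server and Postgres also support the `UPDATE ... FROM` join form shown above:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (item_id INTEGER PRIMARY KEY, quantity INTEGER)")
conn.executemany("INSERT INTO items VALUES (?, 0)", [(i,) for i in range(5)])

# 1. Bulk-load the API's quantities into a staging table in one batch.
#    The PRIMARY KEY on item_id gives us the index the join needs.
conn.execute("CREATE TABLE results (item_id INTEGER PRIMARY KEY, quantity INTEGER)")
api_results = [(i, i * 10) for i in range(5)]  # stand-in for the API response
conn.executemany("INSERT INTO results VALUES (?, ?)", api_results)

# 2. One set-based UPDATE instead of 80k single-row statements.
conn.execute("""
    UPDATE items
    SET quantity = (SELECT r.quantity FROM results r
                    WHERE r.item_id = items.item_id)
    WHERE item_id IN (SELECT item_id FROM results)
""")
conn.commit()
```

The WHERE clause at the end keeps rows with no match in the staging table from being set to NULL, which the plain correlated-subquery form would otherwise do.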