I am using Python pandas to load data from a MySQL database, modify it, and then update another table. There are 100,000+ rows, so the UPDATE queries take some time.
Is there a more efficient way to update the data in the database than to issue an UPDATE query for each row?
The problem here is not pandas; it is the UPDATE operations. Each row fires its own UPDATE query, which means a lot of overhead for the database connector to handle.
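For illustration, the slow pattern looks like this. This is a sketch using an in-memory SQLite engine as a stand-in for your MySQL connection, with a made-up `prices` table; the table and column names are assumptions:

```python
import pandas as pd
from sqlalchemy import create_engine, text

# SQLite stand-in; for MySQL you would pass a DSN such as
# "mysql+pymysql://user:pass@host/dbname" (hypothetical).
engine = create_engine("sqlite://")

# Example source table.
df = pd.DataFrame({"id": [1, 2, 3], "value": [10, 20, 30]})
df.to_sql("prices", engine, index=False)

df["value"] *= 2  # some transformation in pandas

# The slow pattern: each iteration sends its own UPDATE statement,
# so 100,000 rows mean 100,000 round trips to the server.
with engine.begin() as conn:
    for row in df.itertuples(index=False):
        conn.execute(
            text("UPDATE prices SET value = :v WHERE id = :i"),
            {"v": row.value, "i": row.id},
        )
```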
Instead, load the transformed data into a new table, then DROP the old one and RENAME the new table to the old one's name.
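A minimal sketch of that swap, again using an in-memory SQLite engine in place of MySQL so it is self-contained; the `prices` table name is an assumption. On MySQL proper you would use `RENAME TABLE prices_new TO prices` (and can even swap old and new atomically in one `RENAME TABLE` statement); SQLite's syntax is `ALTER TABLE ... RENAME TO ...`:

```python
import pandas as pd
from sqlalchemy import create_engine, text

# SQLite stand-in; replace with your MySQL DSN in practice.
engine = create_engine("sqlite://")

# Example source table.
df = pd.DataFrame({"id": [1, 2, 3], "value": [10, 20, 30]})
df.to_sql("prices", engine, index=False)

df["value"] *= 2  # some transformation in pandas

# 1. Bulk-load the transformed frame into a staging table in one shot,
#    instead of firing one UPDATE per row.
df.to_sql("prices_new", engine, index=False, if_exists="replace")

# 2. Swap the tables: drop the old one, rename the staging table.
with engine.begin() as conn:
    conn.execute(text("DROP TABLE prices"))
    conn.execute(text("ALTER TABLE prices_new RENAME TO prices"))
```

One caveat with this approach: indexes, foreign keys, and grants on the old table do not carry over to the table created by `to_sql`, so you may need to recreate them after the swap.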