I recently had my website moved to a new server. I have some basic Python scripts that access data in a MySQL database. On the old server we had no problems. On the new server:
- MySQLWorkbench can connect no trouble and perform all queries
- The same (SELECT) queries run from Python work about 5% of the time; the other 95% of the time they time out or the connection is lost during the query
- for example, the table has 100,000 rows; selecting the entire thing in MySQLWorkbench works fine and returns in 3 seconds
- in Python the same query never completes; with LIMIT 2999 the query works, but LIMIT 3010 already causes it to time out
- the same effect is observed whether the script is run locally or remotely
I've been digging around for a few days now trying to figure out whether there are settings in the database, the database server, or the server itself that prevent Python (but not MySQLWorkbench) from doing its job correctly.
The query and code, in case they are relevant:
from mysql.connector import MySQLConnection

query = "SELECT * FROM wordpress.table;"
conn = MySQLConnection(**mysqlconfig)
cursor = conn.cursor()
cursor.execute(query)
rows = cursor.fetchall()
I don't have the details of the server, but it has enough power for MySQLWorkbench to work fine; I just can't seem to get Python to work.
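For context, `mysqlconfig` above is just a dict of connection options. The values below are placeholders, not the real credentials; `connection_timeout` is one Connector/Python option that might be relevant to timeout behaviour:

```python
# Hypothetical shape of mysqlconfig; every value is a placeholder,
# not the real credentials.
mysqlconfig = {
    "host": "db.example.com",   # placeholder
    "user": "dbuser",           # placeholder
    "password": "secret",       # placeholder
    "database": "wordpress",
    "connection_timeout": 60,   # seconds; mysql-connector-python option
}
```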
To see whether this problem was caused by queries returning too much data for Python to handle, I tried using OFFSET and LIMIT to loop through the bigger query in pieces, say 10 rows per query.
total_rows = 100000
interval = 10
data = []
for n in range(total_rows // interval):
    q = "SELECT * FROM wordpress.table LIMIT %s OFFSET %s" % (interval, n * interval)
    cursor.execute(q)
    returned = cursor.fetchall()
    rows = [[i for i in j] for j in returned]
    for row in rows:
        data.append(row)
    print(n, len(data))
Expected: this would quickly work through the bigger query in smaller pieces.
What happens: it gets further than the 3000 rows it got stuck on before, but still hits a wall after some iterations, and not consistently: running the script 10 times, n reaches a different point each time.
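To rule out a bug in the chunking itself, here is the same LIMIT/OFFSET loop run against an in-memory SQLite table; sqlite3 stands in for MySQL purely for illustration, and the table name and row count are made up:

```python
import sqlite3

# sqlite3 used only to demonstrate the LIMIT/OFFSET chunking logic;
# the real script uses mysql-connector-python against MySQL.
conn = sqlite3.connect(":memory:")
cursor = conn.cursor()
cursor.execute("CREATE TABLE t (id INTEGER, val TEXT)")
cursor.executemany("INSERT INTO t VALUES (?, ?)",
                   [(i, "row%d" % i) for i in range(100)])

total_rows = 100
interval = 10
data = []
for n in range(total_rows // interval):
    q = "SELECT * FROM t LIMIT %d OFFSET %d" % (interval, n * interval)
    cursor.execute(q)
    data.extend(cursor.fetchall())
```

The loop collects all 100 rows in 10-row chunks, so the logic is sound; the failures above only appear against the MySQL server.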