Python Question

basic pyodbc bulk insert

In a Python script, I need to run a query on one data source and insert each row from that query into a table on a different data source. I'd normally do this with a single INSERT/SELECT statement using a T-SQL linked server join, but I don't have a linked server connection to this particular data source.

I'm having trouble finding a simple pyodbc example of this. Here's how I'd do it, but I'm guessing that executing an insert statement inside a loop is pretty slow.

result = ds1Cursor.execute(selectSql)

for row in result:
    insertSql = "insert into TableName (Col1, Col2, Col3) values (?, ?, ?)"
    ds2Cursor.execute(insertSql, row[0], row[1], row[2])
    ds2Cursor.commit()


Is there a better bulk way to insert records with pyodbc? Or is this a relatively efficient way to do it anyway? I'm using SQL Server 2012 and the latest pyodbc and Python versions.

Answer

The best way to handle this is to use the pyodbc cursor method executemany.

ds1Cursor.execute(selectSql)
result = ds1Cursor.fetchall()

ds2Cursor.executemany('INSERT INTO [TableName] (Col1, Col2, Col3) VALUES (?, ?, ?)', result)
ds2Cursor.commit()
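
For reference, here is a minimal end-to-end sketch of the whole pattern. The connection strings, driver name, table, and column names are placeholders, not taken from the question; adjust them for your environment. If you are on pyodbc 4.0.19 or later against SQL Server, you can also set cursor.fast_executemany = True so that executemany sends the rows as a bulk parameter batch instead of one round trip per row.

import pyodbc

# Hypothetical connection strings -- replace with your own servers/credentials.
ds1Conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};SERVER=source;DATABASE=SourceDb;Trusted_Connection=yes;")
ds2Conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};SERVER=dest;DATABASE=DestDb;Trusted_Connection=yes;")

ds1Cursor = ds1Conn.cursor()
ds2Cursor = ds2Conn.cursor()

# Pull all rows from the source query into memory.
ds1Cursor.execute("SELECT Col1, Col2, Col3 FROM SourceTable")
rows = ds1Cursor.fetchall()

# Optional (pyodbc >= 4.0.19): batch the parameters into a single bulk call.
ds2Cursor.fast_executemany = True

ds2Cursor.executemany(
    "INSERT INTO [TableName] (Col1, Col2, Col3) VALUES (?, ?, ?)",
    rows,
)
ds2Conn.commit()

Note that fetchall() holds the entire source result set in memory. If the result set is large, you may prefer to read it in chunks with fetchmany(n) and call executemany once per chunk.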