Medya Gh - 6 months ago
Python Question

Python/Postgres: can I fetchall() 1 million rows?

I am using the psycopg2 module in Python to read from a Postgres database, and I need to do some operation on every row of a column that has more than 1 million rows.

I would like to know: would

cur.fetchall()

fail or cause my server to go down? (My RAM might not be big enough to hold all that data.)

q = "SELECT names FROM myTable;"
cur.execute(q)
rows = cur.fetchall()
for row in rows:
    # do some operation on each row

What is the smarter way to do this?


fetchall() fetches every remaining row of the result set at once (it is fetchmany() that respects the arraysize limit), so with over a million rows it will try to hold all of them in memory. To avoid that massive hit, you can either fetch rows in manageable batches with fetchmany(), or simply step through the cursor one row at a time until it is exhausted:

row = cur.fetchone()
while row is not None:  # fetchone() returns None once the result set is exhausted
    # do something with row
    row = cur.fetchone()
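
For the batched option, a minimal sketch using psycopg2's fetchmany() could look like the following. The connection string, cursor name, table, and batch size are placeholders; the cursor is given a name so psycopg2 creates a server-side cursor, which keeps the result set on the server instead of transferring all of it to the client when execute() runs.

import psycopg2

conn = psycopg2.connect("dbname=mydb user=myuser")  # placeholder connection string
cur = conn.cursor(name="names_cursor")  # named => server-side cursor
cur.execute("SELECT names FROM myTable;")

BATCH_SIZE = 10000  # tune to how much fits comfortably in RAM
while True:
    rows = cur.fetchmany(BATCH_SIZE)  # pull one batch from the server
    if not rows:
        break
    for row in rows:
        pass  # do something with row[0]

cur.close()
conn.close()

Note that with a plain (unnamed) psycopg2 cursor the whole result set is downloaded to the client at execute() time, so the named cursor is what actually keeps client memory bounded here.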