I am using the psycopg2 module in Python to read from a Postgres database. I need to perform some operation on every row of a column in a table that has more than 1 million rows.

I would like to know whether fetching them all at once, like this, would put too heavy a load on the server:

    q = "SELECT names FROM myTable;"
    cur.execute(q)
    rows = cur.fetchall()
    for row in rows:
        # do something with row
fetchall() loads every remaining row into client memory at once (it is fetchmany() that fetches up to the cursor.arraysize limit per call), so to prevent a massive hit on your database and on your client's memory you can either fetch rows in manageable batches with fetchmany(), or simply step through the cursor until it is exhausted:

    row = cur.fetchone()
    while row:
        # do something with row
        row = cur.fetchone()
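For the batched approach, here is a minimal sketch of the fetchmany() loop. It uses sqlite3 as a stand-in so it runs without a Postgres server; psycopg2 cursors follow the same DB-API 2.0 interface, so the loop itself works unchanged with a psycopg2 cursor. The table and column names are taken from the question, and the batch size of 3 is arbitrary:

    import sqlite3

    # In-memory stand-in database with a small myTable (names) table.
    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()
    cur.execute("CREATE TABLE myTable (names TEXT)")
    cur.executemany("INSERT INTO myTable VALUES (?)",
                    [(f"name{i}",) for i in range(10)])

    cur.execute("SELECT names FROM myTable;")
    processed = []
    while True:
        batch = cur.fetchmany(3)   # fetch at most 3 rows per call
        if not batch:
            break                  # cursor exhausted
        for row in batch:
            processed.append(row[0])   # do something with each row

    conn.close()

If you call fetchmany() with no argument, it fetches cursor.arraysize rows per call. With psycopg2 specifically, you can also use a server-side cursor — conn.cursor(name='some_name') — which keeps the result set on the server, so even plain iteration over the cursor streams rows instead of pulling the whole million-row result into client memory.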