I am working with a Postgres table containing nearly 7 million records. I understand the result of SELECT * FROM table is too large to fit in memory, so the db connection is lost after a long delay waiting for the query result (I can only execute SELECT * FROM table LIMIT n).
I need to process each record sequentially, from the first through the last. What is the best way to do this?
The lost connection is probably not caused by the server taking too long to produce the query result. Instead, it is probably caused by the application processing the first few records and not reading the rest. You can test this by dumping the table to a .csv or .tsv file and working from there.
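If the goal is to stream rows rather than buffer them all client-side, a server-side cursor keeps memory use flat. Here is a minimal sketch assuming a Python client with psycopg2; the DSN, the table name big_table, and the process() handler are placeholders, not anything from the question:

```python
import psycopg2

def process(row):
    # hypothetical per-record work, e.g. transform or write elsewhere
    pass

conn = psycopg2.connect("dbname=mydb user=me")  # placeholder DSN
try:
    # A *named* cursor is a server-side cursor in psycopg2: rows are
    # streamed from the server in batches rather than the whole ~7M-row
    # result set being buffered in client memory.
    with conn.cursor(name="seq_scan") as cur:
        cur.itersize = 10_000  # rows per network round trip (default 2000)
        cur.execute("SELECT * FROM big_table")
        for row in cur:  # walks the result sequentially to the last row
            process(row)
finally:
    conn.close()
```

If you prefer the dump-and-process route suggested above, psql's \copy (e.g. \copy big_table to 'dump.csv' csv) writes the file on the client side, so it works without filesystem access on the server.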