I've got 1.6 GB available to use in a Python process. I'm writing a large CSV file whose data comes from a database. The problem is: after the file is written, the memory (>1.5 GB) is not released immediately, which causes an error in the next bit of code (allocating memory fails because the OS cannot find enough memory to allocate).
Is there any function that would help me release that memory? Or do you have a better way to do it?
This is the script I'm using to write the file; it writes in chunks to deal with the memory issue:
import csv

size_to_read = 20000
sqlData = rs_cursor.fetchmany(size_to_read)  # fetch the first chunk

c = csv.writer(open(fname_location, "wb"))
c.writerow(headers)

print("- Generating file %s ..." % out_fname)

while sqlData:
    for row in sqlData:
        c.writerow(row)
    sqlData = rs_cursor.fetchmany(size_to_read)  # fetch the next chunk
From the comments:

Why do you have sqlData = rs_cursor.fetchmany(size_to_read) on the last line? Everything you just wrote to a file (which you haven't closed), you load it all again?

Try del c after the code above, which should prompt Python to recover the memory used by c and close the file handle, which is only referenced by c. In general, it would be better to keep a handle for the file and then explicitly close it when you're done. See stackoverflow.com/questions/3347775/csv-writer-not-closing-file
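Taking that advice literally, a minimal sketch might look like the following. The file handle is kept explicitly and closed by the with block as soon as writing finishes, and del drops the remaining references so Python can reclaim the memory sooner. It reuses rs_cursor, fname_location, headers, out_fname and size_to_read from the snippet above and assumes Python 3 (hence "w" with newline="" instead of "wb"):

import csv

# Keep an explicit handle so the file is flushed and closed deterministically,
# instead of relying on the writer `c` being garbage-collected.
with open(fname_location, "w", newline="") as f:   # on Python 2, use open(fname_location, "wb")
    c = csv.writer(f)
    c.writerow(headers)
    print("- Generating file %s ..." % out_fname)

    sqlData = rs_cursor.fetchmany(size_to_read)   # fetch the first chunk
    while sqlData:
        c.writerows(sqlData)                      # write the current chunk
        sqlData = rs_cursor.fetchmany(size_to_read)  # fetch the next chunk

# The file is already closed here; dropping the last references to the writer
# and the fetched rows is the `del c` suggestion from the comments.
del c, sqlData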