I'm trying to read and modify each row of a number of files using Python. Each file has thousands to hundreds of thousands of rows, so currently each file is processed only after the previous one finishes. I'm reading the files like this:
    import csv

    with open("file", "r", newline="") as f:
        csvReader = csv.reader(f)
        for row in csvReader:
            handleRow(row)
I want to use multithreading to read each of the files in a separate thread, in parallel, to save time. Can anyone point out whether this would actually help, and how to implement it?