import sys
import glob
import os.path

list_of_files = glob.glob('/Users/Emily/Topics/*.txt')  # 500 files

for file_name in list_of_files:
    print(file_name)

f = open(file_name, 'r')
lst = []
for line in f:
    line.strip()
    line = line.replace("\n", '')
    line = line.replace("//", '')
    lst.append(line)
f.close()

f = open(os.path.join('/Users/Emily/UpdatedTopics',
                      os.path.basename(file_name)), 'w')
for line in lst:
    f.write(line)
f.close()
I was able to read my files and do some pre-processing. The problem I'm facing is that when I write the files out, I can only see one file. I should get 500 files.
You only ever write one file because the writing code runs once and uses a single value of file_name, the last one used in your first for-loop. If you want to work on all of the names, your logic needs to be inside the loop. One way to organize it: (a) pass each input file path to a read_file() function to read the lines of text, (b) pass those lines to a clean_lines() function that cleans up the lines and returns a new list of lines, (c) pass the input file path to an output_file_path() function that returns the output file path, and finally (d) pass the output file path and the cleaned-up lines to a write_file() function that writes an output file. Good luck!
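Here is a minimal sketch of that structure. The function names are the ones suggested above; the bodies are my reading of your original cleaning steps, and the directory paths are copied from your question:

import glob
import os.path


def read_file(path):
    # Read the raw lines from one input file.
    with open(path, 'r') as f:
        return f.readlines()


def clean_lines(lines):
    # Strip whitespace/newlines and drop "//", mirroring the original loop.
    return [line.strip().replace("//", '') for line in lines]


def output_file_path(input_path):
    # Same base name, but under the output directory.
    return os.path.join('/Users/Emily/UpdatedTopics', os.path.basename(input_path))


def write_file(path, lines):
    # Write the cleaned lines out (like the original code, without re-adding newlines).
    with open(path, 'w') as f:
        for line in lines:
            f.write(line)


for file_name in glob.glob('/Users/Emily/Topics/*.txt'):  # 500 files
    cleaned = clean_lines(read_file(file_name))
    write_file(output_file_path(file_name), cleaned)

Because the read-clean-write calls happen inside the loop, you get one output file per input file. If you actually want each cleaned line on its own line in the output, write line + '\n' instead of line.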