I've got a Python script that searches for files in a directory, running in an infinite loop for as long as the computer is on. Here is the code:
import fnmatch
import os
import shutil
import datetime
import time
import gc
# This is a python script that removes the "conflicted" copies of
# files that dropbox creates when one computer has the same copy of
# a file as another computer.
# Written by Alexander Alvonellos
# 05/10/2012
class cleanUpConflicts:
    rootPath = 'D:\Dropbox'
    destDir = 'D:\Conflicted'

    def __init__(self):
        self.removeConflicted()
        return

    def cerr(message):
        f = open('./LOG.txt', 'a')
        date = str(datetime.datetime.now())
        s = ''
        s += date[0:19]  # strip floating point
        s += ' : '
        s += str(message)
        s += '\n'
        f.write(s)
        f.close()
        del f
        del s
        del date
        return

    def removeConflicted(self):
        matches = []
        for root, dirnames, filenames in os.walk(self.rootPath):
            for filename in fnmatch.filter(filenames, '*conflicted*.*'):
                matches.append(os.path.join(root, filename))
                cerr(os.path.join(root, filename))
                shutil.move(os.path.join(root, filename), os.path.join(destDir, filename))
        del matches
        return

def main():
    while True:
        conf = cleanUpConflicts()
        gc.collect()
        del conf
        reload(os)
        reload(fnmatch)
        reload(shutil)
        time.sleep(10)
    return

main()
Anyway, there's a memory leak that adds nearly a megabyte every ten seconds or so, and I don't understand why the memory isn't being deallocated. Left running, this script will eat gigabytes of memory without even trying. This is frustrating. Does anyone have any tips? I think I've tried everything.
Here's the updated version after making some of the changes that were suggested here:
import fnmatch
import os
import shutil
import datetime
import time
import gc
import re
# This is a python script that removes the "conflicted" copies of
# files that dropbox creates when one computer has the same copy of
# a file as another computer.
# Written by Alexander Alvonellos
# 05/10/2012
rootPath = 'D:\Dropbox'
destDir = 'D:\Conflicted'
def cerr(message):
    f = open('./LOG.txt', 'a')
    date = str(datetime.datetime.now())
    s = ''
    s += date[0:19]  # strip floating point
    s += ' : '
    s += str(message)
    s += '\n'
    f.write(s)
    f.close()
    return

def removeConflicted():
    for root, dirnames, filenames in os.walk(rootPath):
        for filename in fnmatch.filter(filenames, '*conflicted*.*'):
            cerr(os.path.join(root, filename))
            shutil.move(os.path.join(root, filename), os.path.join(destDir, filename))
    return

def main():
    #while True:
    for i in xrange(0, 2):
        #time.sleep(1)
        removeConflicted()
        re.purge()
        gc.collect()
    return

main()
I've done some research on this problem, and it seems there might be a bug in fnmatch: its regular-expression engine caches patterns and doesn't purge them after use. That's why I call re.purge(). I've tinkered with this for a couple of hours now.
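Since fnmatch builds and caches a compiled regex for each pattern it sees, one way to take that cache out of the picture entirely is to translate the glob to a regex once and reuse the compiled object. This is only a sketch of the idea (the `filter_conflicted` helper name is mine, not from the original script):

```python
import fnmatch
import re

# Compile the glob's regex once at module level, so fnmatch's internal
# pattern cache is never involved in the hot loop.
CONFLICTED_RE = re.compile(fnmatch.translate('*conflicted*.*'))

def filter_conflicted(filenames):
    # Roughly equivalent to fnmatch.filter(filenames, '*conflicted*.*'),
    # but reuses the single precompiled regex on every call.
    return [name for name in filenames if CONFLICTED_RE.match(name)]
```

With this in place there is nothing for `re.purge()` to clean up, since the pattern is compiled exactly once.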
I've also found that doing:
print gc.collect()
returns 0 on every iteration.
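A return value of 0 means the collector found no unreachable reference cycles at all, which suggests the growth is coming from somewhere gc can't see. One way to check what is actually allocating (a sketch assuming Python 3's tracemalloc module; the stand-in allocation and variable names are mine) is to diff two heap snapshots taken around a few iterations of the scan:

```python
import tracemalloc

tracemalloc.start()
before = tracemalloc.take_snapshot()

# Stand-in for a few iterations of the directory scan.
junk = ['{0}.txt'.format(i) for i in range(1000)]

after = tracemalloc.take_snapshot()
stats = after.compare_to(before, 'lineno')

# Entries with a positive size_diff show where new memory appeared
# between the two snapshots.
for stat in stats[:5]:
    print(stat)
```

If the per-iteration diff keeps growing at the same source line, that line is the leak; if it flattens out, the "leak" may just be the OS-reported process size not shrinking.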
Whoever downvoted me is clearly mistaken. I really need some help here. Here's the link that I was talking about: Why am I leaking memory with this python loop?
Comments from other users:

- You don't need the del statements (although I can see why you might try them if you have a leak), and you certainly don't need return at the end of every function.
- def cerr(message) should really read def cerr(self, message), and the call further down should be self.cerr(...). How can you have a leak if your code isn't even functional?
- Why do you create a new cleanUpConflicts object every time you come into the loop instead of just instantiating it once and re-using it? That seems a bit suspicious to me offhand.
- You're using __init__() to make your class pseudo-callable, then not really using the class as a class at all, as there is no stateful or instance information maintained, since you blow it away and recreate it every loop...
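Putting those comments together, a sketch of the restructuring they suggest (the ConflictCleaner name and find_conflicted helper are mine, not from the original post) would instantiate one long-lived object and reuse it, with no del, reload(), or manual gc calls at all:

```python
import fnmatch
import os
import time

class ConflictCleaner(object):
    # One long-lived object holding the paths as instance state.
    def __init__(self, root_path, dest_dir):
        self.root_path = root_path
        self.dest_dir = dest_dir

    def find_conflicted(self):
        # Collect matching paths; the real script would also
        # shutil.move() each one into dest_dir and log it.
        matches = []
        for root, dirnames, filenames in os.walk(self.root_path):
            for filename in fnmatch.filter(filenames, '*conflicted*.*'):
                matches.append(os.path.join(root, filename))
        return matches

def main():
    cleaner = ConflictCleaner('D:/Dropbox', 'D:/Conflicted')
    while True:
        cleaner.find_conflicted()  # reuse the same instance each pass
        time.sleep(10)
```

The point is less the class itself than the shape of the loop: nothing is created or torn down per iteration, so there is nothing left to "leak".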