
I have a main.py file that uses multiprocessing to run another file, function.py. That second file uses threading to apply a function, f, to every element of a numpy array. function.py reads a file, file.txt, only once during the whole run, to get some data for f, and then clears it (writes an empty file). Do I need to lock file.txt in function.py to avoid problems when N processes created from main.py are executing function.py and reading from and writing to file.txt? If so, how can it be done?

I finally got it working with a semaphore.
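A minimal sketch of that approach, under my own assumptions about how main.py spawns the workers (the names function.run and the worker count are only illustrative, not the real code): a multiprocessing.Semaphore with a count of 1 is created in the parent and passed to each child, and every read/clear of file.txt happens while it is held.

```python
# main.py -- hypothetical sketch
import multiprocessing as mp
import function                    # the worker module described above

if __name__ == "__main__":
    sem = mp.Semaphore(1)          # count 1 => behaves like a mutex
    procs = [mp.Process(target=function.run, args=(sem,)) for _ in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```

```python
# function.py -- only the guarded file access is shown
def run(sem):
    with sem:                              # one process at a time
        with open("file.txt") as fh:
            data = fh.read()               # read the parameters for f
        open("file.txt", "w").close()      # clear the file
    # ... start the threads that apply f to the numpy array ...
```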

1 Answer


Yes, in some way it has to be locked. Having several processes read a file is no problem as long as they are only reading it. But as soon as something writes to the file, you have to make sure the reads and writes happen in the desired order.

Locking could be done with a lockfile that is created atomically. After a process successfully creates the lockfile, it gains access to the text file; when it is done with the text file, it deletes the lockfile. This ensures that only one process can access the text file at a given time. A sketch of the idea follows below.
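For illustration only, a lockfile can be created atomically with os.open and the O_CREAT | O_EXCL flags, which fail if the file already exists. This is just a sketch of the idea; it omits timeouts and stale-lock handling, and the lockfile name is an assumption.

```python
import os
import time

LOCKFILE = "file.txt.lock"         # hypothetical name for the lockfile

def acquire_lock():
    """Spin until we manage to create the lockfile atomically."""
    while True:
        try:
            # O_CREAT | O_EXCL fails if the file already exists,
            # so only one process can succeed at a time.
            fd = os.open(LOCKFILE, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
            os.close(fd)
            return
        except FileExistsError:
            time.sleep(0.05)       # another process holds the lock

def release_lock():
    os.remove(LOCKFILE)

# Usage around the critical section in function.py:
acquire_lock()
try:
    with open("file.txt") as fh:
        data = fh.read()            # read the data for f
    open("file.txt", "w").close()   # clear the file
finally:
    release_lock()
```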


2 Comments

You can also synchronize your processes inside the program to make sure only one process at a time accesses the code where the file is edited.
Best answer to the question.
