
I am starting a new process and redirecting its stdout to a text file. I want to read the progress of the processing from the file while it runs, but the file stays empty until the process finishes executing. Can you explain this behavior, and is there a workaround?

import os
import sys
import time
from multiprocessing import Process

def foo(a):
    sys.stdout = open(str(os.getpid()) + ".out", "w")
    print("test")
    time.sleep(10)

p = Process(target=foo, args=(a,))
p.start()
  • You have to use unbuffered output – Commented Jan 28, 2022 at 14:00

2 Answers


The workaround is to add flush=True to your print call. When stdout is a regular file rather than a terminal, its output is block-buffered: nothing reaches disk until the buffer fills or the stream is closed, which is why the file looks empty until the process exits. flush=True forces each print to write through immediately.

from multiprocessing import Process
import os
import sys
import time


def foo(a):
    sys.stdout = open(str(os.getpid()) + ".out", "w")
    print("test", flush=True)
    time.sleep(10)

if __name__ == '__main__':
    p = Process(target=foo, args=(1,))
    p.start()


@azelcer is correct: I need to use unbuffered output. However, text I/O can't be fully unbuffered in Python (buffering=0 is only allowed in binary mode), so the best option is line buffering, i.e. a buffer size of 1, which flushes the output to the file every time a line is printed.

import os
import sys
import time
from multiprocessing import Process

def foo(a):
    sys.stdout = open(str(os.getpid()) + ".out", "w", buffering=1)  # line buffered
    print("test")
    time.sleep(10)

p = Process(target=foo, args=(a,))
p.start()
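As a quick illustration of why buffering=1 behaves this way (the file name and strings here are just for the demo): with line buffering, every completed line is flushed to disk as soon as the newline is written, while a write without a trailing newline sits in the buffer until the file is closed.

```python
import os

path = "line_buffered_demo.out"

f = open(path, "w", buffering=1)   # 1 selects line buffering (text mode only)
f.write("first line\n")            # the newline triggers an immediate flush

visible = open(path).read()        # a separate reader already sees the line

f.write("no newline yet")          # no newline -> still held in the buffer
partial = open(path).read()        # the second write is not on disk yet

f.close()                          # closing flushes whatever remains
final = open(path).read()
os.remove(path)
```

This is why redirecting sys.stdout to a line-buffered file makes each print visible immediately: print appends "\n" by default, and that newline triggers the flush.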

1 Comment

Do you realize that the reason for buffering is to avoid the performance penalty of writing to output every time? If that’s what you absolutely want, fine, but something to consider.
