
So I have a program where, in the "main" process, I fire off a new Process object which (this is what I want) reads lines from stdin and appends them to a Queue object.

Essentially the basic setup is that there is a "command getting" process in which the user enters commands/queries, and I need to get those queries to the other subsystems running in separate processes. My thinking is to share them via a multiprocessing.Queue which the other systems can read from.

What I have (focusing on just the getting the commands/queries) is basically:

from multiprocessing import Process, Queue

def sub_proc(q):
    some_str = ""
    while True:
        some_str = raw_input("> ")
        if some_str.lower() == "quit":
            return
        q.put_nowait(some_str)

if __name__ == "__main__":
    q = Queue()
    qproc = Process(target=sub_proc, args=(q,))
    qproc.start()
    qproc.join()

    # now at this point q should contain all the strings entered by the user

The problem is that I get:

Process Process-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/lib/python2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "/home/blah/blah/blah/blah.py", line 325, in sub_proc
    some_str = raw_input("> ")
  File "/randompathhere/eclipse/plugins/org.python.pydev_2.1.0.2011052613/PySrc/pydev_sitecustomize/sitecustomize.py", line 181, in raw_input
    ret = original_raw_input(prompt)
EOFError: EOF when reading a line

How do I fix this?
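For reference, the queue-sharing half of this design works fine on its own; it is only stdin in the child that is the problem. A minimal sketch with canned commands standing in for user input (`producer` and `collect` are illustrative names, not from the original code):

```python
from multiprocessing import Process, Queue

def producer(q, commands):
    # stands in for the raw_input loop: feed canned commands instead of stdin
    for cmd in commands:
        if cmd.lower() == "quit":
            break
        q.put(cmd)
    q.put(None)  # sentinel so the parent knows the producer is done

def collect(commands):
    q = Queue()
    p = Process(target=producer, args=(q, commands))
    p.start()
    # drain until the sentinel; a blocking get avoids racing q.empty()
    items = list(iter(q.get, None))
    p.join()
    return items
```

Draining with a sentinel before join() also sidesteps the documented caveat that joining a process which has put items on a queue can block until those items are consumed.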


4 Answers


I solved a similar issue by passing the original stdin file descriptor to the child process and re-opening it there.

import os, sys
from multiprocessing import Process, Queue

def sub_proc(q, fileno):
    sys.stdin = os.fdopen(fileno)  # re-open stdin in this process
    some_str = ""
    while True:
        some_str = raw_input("> ")

        if some_str.lower() == "quit":
            return
        q.put_nowait(some_str)

if __name__ == "__main__":
    q = Queue()
    fn = sys.stdin.fileno() #get original file descriptor
    qproc = Process(target=sub_proc, args=(q,fn))
    qproc.start()
    qproc.join()

This worked for my relatively simple case. I was even able to use the readline module on the re-opened stream. I don't know how robust it is for more complex systems.
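The same trick can be exercised without a real terminal by handing the child the read end of a pipe instead of the terminal's descriptor (a sketch; `read_via_fd` is an illustrative name):

```python
import os
import sys
from multiprocessing import Process, Queue

def child(q, fileno):
    sys.stdin = os.fdopen(fileno)       # re-open the inherited descriptor
    q.put(sys.stdin.readline().strip())

def read_via_fd(line):
    r, w = os.pipe()                    # the pipe stands in for the terminal
    os.write(w, line.encode())
    os.close(w)
    q = Queue()
    p = Process(target=child, args=(q, r))
    p.start()
    result = q.get()                    # get before join to avoid blocking
    p.join()
    os.close(r)
    return result
```

This relies on the child inheriting the file descriptor, which holds for fork-started processes on Unix; the same caveat applies to passing the real stdin descriptor as in the answer above.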




In short, the main process and your second process don't share the same STDIN.

from multiprocessing import Process, Queue
import sys

def sub_proc():
    print sys.stdin.fileno()

if __name__ == "__main__":
    print sys.stdin.fileno()
    qproc = Process(target=sub_proc)
    qproc.start()
    qproc.join()

Run that and you should get two different results for sys.stdin.fileno().

Unfortunately, that doesn't solve your problem. What are you trying to do?

4 Comments

Essentially I want a system where a user can enter textual queries/commands in one process, and have those queries transferred to another process.
Why not read from stdin on the main process and push them into a queue to be handled by a secondary process?
Because of the way the system is structured that is difficult, and it seems like it should be possible to simply share the same stdin between processes.
From the link: stdin used to be closed in the multiprocessing.Process._bootstrap() method, but this caused issues with processes-in-processes. It has been changed to: sys.stdin.close(); sys.stdin = open(os.devnull), i.e. the child's stdin is re-bound to /dev/null.
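That change is easy to observe: a child started by multiprocessing sees immediate EOF on sys.stdin because it has been re-bound to os.devnull. A quick sketch (`child_stdin_contents` is an illustrative name):

```python
import sys
from multiprocessing import Process, Queue

def child(q):
    # multiprocessing re-bound sys.stdin to os.devnull, so read() returns ""
    q.put(sys.stdin.read())

def child_stdin_contents():
    q = Queue()
    p = Process(target=child, args=(q,))
    p.start()
    contents = q.get()
    p.join()
    return contents
```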

If you don't want to pass stdin to the target process's function, as in @Ashelly's answer, or need to do it for many different processes, you can do it with multiprocessing.Pool via the initializer argument:

import os, sys, multiprocessing

def square(num=None):
    if num is None:
        num = int(raw_input('square what? '))
    return num ** 2

def initialize(fd):
    sys.stdin = os.fdopen(fd)

initargs = [sys.stdin.fileno()]
pool = multiprocessing.Pool(initializer=initialize, initargs=initargs)
print pool.apply(square, [3])
print pool.apply(square)

The above example will print the number 9, followed by a prompt for input and then the square of the input number.

Just be careful not to have multiple child processes reading from the same descriptor at the same time or things may get... confusing.
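As with the single-process version, the initializer pattern can be checked without a terminal by feeding the workers a pipe instead of the real stdin descriptor (a sketch with a single worker, per the caution above; `demo` and `read_line` are illustrative names):

```python
import os
import sys
import multiprocessing

def initialize(fd):
    sys.stdin = os.fdopen(fd)           # each worker re-opens the descriptor

def read_line(_):
    return sys.stdin.readline().strip()

def demo(line):
    r, w = os.pipe()                    # pipe stands in for the terminal
    os.write(w, line.encode())
    os.close(w)
    # one worker only, so only one process reads from the descriptor
    pool = multiprocessing.Pool(1, initializer=initialize, initargs=(r,))
    try:
        return pool.apply(read_line, ("unused",))
    finally:
        pool.close()
        pool.join()
        os.close(r)
```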



You could use threading and keep it all on the same process:

from multiprocessing import Queue
from Queue import Empty
from threading import Thread

def sub_proc(q):
    some_str = ""
    while True:
        some_str = raw_input("> ")
        if some_str.lower() == "quit":
            return
        q.put_nowait(some_str)

if __name__ == "__main__":
    q = Queue()
    qproc = Thread(target=sub_proc, args=(q,))
    qproc.start()
    qproc.join()

    while True:
        try:
            print q.get(False)
        except Empty:
            break

1 Comment

But threading suffers from the GIL (only a single thread can execute Python code at any given time). The idea is that once a command is entered and put into the queue, other processes can start working on that item while the "read input" process gets the next command from the user. With threads alone, that parallelism wouldn't be possible because of the GIL.
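The hybrid the comments are circling (read input on a thread in the main process, process commands in a child process) can be sketched like this, with canned lines standing in for raw_input (`reader`, `worker`, and `run` are illustrative names):

```python
from multiprocessing import Process, Queue
from threading import Thread

def reader(q, lines):
    # stands in for the raw_input loop in the main process
    for line in lines:
        if line.lower() == "quit":
            break
        q.put(line)
    q.put(None)                         # tell the worker we are done

def worker(q, out):
    # runs in a separate process, consuming commands as they arrive
    for cmd in iter(q.get, None):
        out.put(cmd.upper())            # stand-in for real command handling
    out.put(None)

def run(lines):
    q, out = Queue(), Queue()
    w = Process(target=worker, args=(q, out))
    w.start()
    t = Thread(target=reader, args=(q, lines))
    t.start()
    t.join()
    results = list(iter(out.get, None))
    w.join()
    return results
```

Here the input-reading thread spends its time blocked on I/O (where the GIL is released anyway), while the heavy lifting happens in a true separate process.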
