
I have a TCP server and client. At some point the server script starts a worker process that needs to receive every new connection and send data over it. To do this, I created a multiprocessing.Queue(): the main process puts each new connection on the queue, and the worker process gets the connections from it and sends data over them. However, it seems that you cannot put arbitrary objects on a Queue. When I try to pass a connection (a socket object), I get:

Traceback (most recent call last):
  File "/usr/lib/python2.7/multiprocessing/queues.py", line 266, in _feed
    send(obj)
TypeError: expected string or Unicode object, NoneType found

Are there any alternatives that I could use?

  • You can't do that. Queue is intended to send Python objects, while sockets involve OS-specific data (such as socket fd), so they cannot be trivially passed to a subprocess. Commented Apr 2, 2016 at 8:02
  • Well, I found a workaround to the issue. However, I'd be really interested in seeing a way of actually doing this, even though you say it's impossible. Commented Apr 2, 2016 at 8:27

1 Answer


Sending a socket through a multiprocessing.Queue works out of the box starting with Python 3.4: since that version a ForkingPickler is used to serialize the objects put on the queue, and that pickler knows how to serialize sockets and other objects wrapping a file handle.

The multiprocessing.reduction.ForkingPickler class already exists in Python 2.7 and can pickle sockets; it is just not used by multiprocessing.Queue.

If you can't switch to Python 3.4+ and really need similar functionality in Python 2.7, a workaround is to create a function that uses the ForkingPickler to serialize objects, e.g.:

from multiprocessing.reduction import ForkingPickler
import StringIO

def forking_dumps(obj):
    # Like pickle.dumps, but uses ForkingPickler, which knows how to
    # reduce a socket to something the other process can rebuild.
    buf = StringIO.StringIO()
    ForkingPickler(buf).dump(obj)
    return buf.getvalue()

Instead of sending the socket directly you then need to send its pickled version and unpickle it in the consumer. Simple example:

from multiprocessing import Queue, Process
from socket import socket
import pickle

def handle(q):
    sock = pickle.loads(q.get())
    print 'rest:', sock.recv(2048)

if __name__ == '__main__':
    sock = socket()
    sock.connect(('httpbin.org', 80))
    sock.send(b'GET /get\r\n')
    # first bytes read in parent
    print 'first part:', sock.recv(50)

    q = Queue()
    proc = Process(target=handle, args=(q,))
    proc.start()
    # use the function from above to serialize socket
    q.put(forking_dumps(sock))
    proc.join()

Making sockets pickleable only makes sense here in the context of multiprocessing: it would not make sense to write the pickle to a file and use it later, or to use it on a different machine or after the original process has ended. Therefore it wouldn't be a good idea to make sockets pickleable globally (e.g. via the copyreg mechanism).


1 Comment

That's exactly what I was looking for! This way, I can pass sockets to the process and it works just fine! Thanks!
