6

How do I run several Python commands in parallel in a Python script? As a simple example, I have several sleep commands:

import time

time.sleep(4)
time.sleep(6)
time.sleep(8)

I want all of the above to be executed in parallel. I expect control back when 8 seconds have passed (the maximum of all the sleeps above). The real commands will be different, but I want to get the idea with the above.

In bash, I could simply have done:

sleep 4 &
pid1=$!
sleep 6 &
pid2=$!
sleep 8 &
pid3=$!
wait $pid1 $pid2 $pid3

Thanks.

3 Comments
  • Are you talking about parallel in a threaded or multiprocessing sense? In python, threading and multiprocessing can accomplish this task. Commented Aug 23, 2012 at 17:55
  • Your bash example spawns three processes. Is this what you want in the Python case as well? Commented Aug 23, 2012 at 18:01
  • In threaded mode. The responses below answer the question. Commented Aug 23, 2012 at 19:19

3 Answers

10

One simple example, using multiprocessing:

import multiprocessing as mp
import time

pool = mp.Pool(3)
pool.map(time.sleep, [4, 6, 8])  # blocks until all three sleeps finish (~8 s)

This spawns separate processes rather than creating separate threads within one process, as demonstrated in the answer by Steven Rumbalski. multiprocessing sidesteps the GIL (in CPython), meaning that you can execute Python code truly simultaneously (which you can't do in CPython with threading). However, it has the downside that anything you send to the other processes must be picklable, and sharing state between processes is a bit more complicated as well (although I could be wrong about the last point -- I've never really used threading).

A word of caution: don't try this in the interactive interpreter.


3 Comments

why that 'word of caution'? I ran it on the interactive interpreter and it worked...
@nephewtom -- Because there can be issues doing that depending on the OS. IIRC, Windows in particular has problems if you try to run the above in an interactive interpreter.
Ok. I ran it on Linux. Thanks for the clarification!
7

Take a look at the threading module. You could have something like:

import time, threading

def mySleep(sec):
    time.sleep(sec)

t1 = threading.Thread(target=mySleep, args=(4,))
t2 = threading.Thread(target=mySleep, args=(6,))
t3 = threading.Thread(target=mySleep, args=(8,))
t1.start()
t2.start()
t3.start()

# All threads running in parallel, now we wait
t1.join()
t2.join()
t3.join()

3 Comments

Excellent. I didn't know about the args argument to threading.Thread, and have appropriated it for my answer. Note that you don't need mySleep. Just do target=time.sleep.
Duh - you're right. target=time.sleep would work just as well.
Threads in the CPython implementation have one major disadvantage: only one thread at a time can execute Python bytecode. See: wiki.python.org/moin/GlobalInterpreterLock So if the main task of the thread is to run bytecode (rather than doing I/O, or using extensions like numpy that are not bound by this limitation), threads will not speed up your program. The multiprocessing module that mgilson mentions does not have that limitation.
5
import time
from threading import Thread

threads = [Thread(target=time.sleep, args=(secs,)) for secs in (4, 6, 8)]
for t in threads: t.start()
for t in threads: t.join()
print('all threads done!')
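On Python 3.5+, the same fan-out-then-join pattern can also be written with the standard-library concurrent.futures module (a sketch, not part of the original answer):

```python
import time
from concurrent.futures import ThreadPoolExecutor

start = time.time()
with ThreadPoolExecutor(max_workers=3) as executor:
    # map schedules all three sleeps; leaving the with-block waits for them.
    list(executor.map(time.sleep, [4, 6, 8]))
elapsed = time.time() - start
print('all threads done!')
```

The `with` block joins the workers for you, so no explicit `start()`/`join()` loop is needed.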

1 Comment

Nice. This does the same thing as my answer but is cleaner and scales better too. +1
