
How do I run several independent processes in parallel when the argument is the same? My current (not nice) solution is:

import time
import multiprocessing

def parse_args():
   ...
   return args

def my_function(args):
    ...

if __name__ == '__main__':
    args = parse_args()
    processes = []
    for i in range(5):
        processes.append(multiprocessing.Process(target=my_function, args=(args,)))
        processes[-1].start()
    time.sleep(200)
    for i in range(5):
       processes[i].terminate()

Also, my_function runs infinitely and doesn't return anything.

1
  • What is "not nice" about your current solution? Seems like it should work just fine.. I personally would use a semaphore to signal my_function to quit gracefully rather than terminate-ing it, but if there's no specific need to gracefully shut down, there's no problem with that. A multiprocessing.Pool would just add lots of overhead and complexity you don't seem to be using anyway. Commented Apr 18, 2022 at 18:46
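Following the comment's suggestion of signaling my_function to quit gracefully, a minimal sketch using a multiprocessing.Event (the names and the placeholder loop body here are illustrative, not the asker's actual code) might look like:

```python
import multiprocessing

def my_function(stop_event):
    # Loop until the parent signals shutdown, instead of being terminate()-d.
    while not stop_event.is_set():
        pass  # placeholder for real work

if __name__ == '__main__':
    stop = multiprocessing.Event()
    processes = [multiprocessing.Process(target=my_function, args=(stop,))
                 for _ in range(5)]
    for p in processes:
        p.start()
    # ... let them run for as long as needed, then ask them to stop:
    stop.set()
    for p in processes:
        p.join()  # each process exits its loop on its own
```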

3 Answers

1

I'd join them, and make sure they terminate on their own, like so :

processes = [multiprocessing.Process(target=my_function, args=(args,), daemon=True)
             for _ in range(nb_processes)]  # create nb_processes running my_function
for p in processes:
    p.start()  # start all processes
for p in processes:
    p.join()   # wait for all processes to end

You'll want to make sure that my_function implements some sort of timeout or exit condition, because join will otherwise wait forever for processes that never end.

As for getting their results back, you could use a queue (see multiprocessing.Queue) or a message broker. I personally like to use Redis for that, but that's very much a matter of opinion.
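A minimal queue-based sketch (assuming my_function can be adapted to push results instead of returning them; the computation shown is a placeholder):

```python
import multiprocessing

def my_function(args, result_queue):
    # Put results on the queue; the parent collects them
    # without needing a return value from the process.
    result_queue.put(args * 2)  # placeholder computation

if __name__ == '__main__':
    q = multiprocessing.Queue()
    processes = [multiprocessing.Process(target=my_function, args=(21, q))
                 for _ in range(3)]
    for p in processes:
        p.start()
    results = [q.get() for _ in processes]  # one result per process
    for p in processes:
        p.join()
    print(results)  # three values of 42
```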

As a side note, you probably want to take a look at asyncio if you haven't yet.



0

Simply use threading, as shown in this example:

import threading as thd

def function_one():
    while True:
        print("alice")

def function_two():
    while True:
        print("bob")

first_thd = thd.Thread(target=function_one)
second_thd = thd.Thread(target=function_two)

first_thd.start()
second_thd.start()

You can start multiple threads at the same time.

1 Comment

threading doesn't run Python code in parallel due to the GIL.
0

Use a process pool and map your calls, with itertools.repeat to keep it short and sweet.

import itertools
import multiprocessing
import time

num_processes = 5

with multiprocessing.Pool(num_processes) as pool:        
    pool.map_async(my_function, itertools.repeat(args, num_processes))
    time.sleep(200)
    pool.terminate()

Note: map_async returns an AsyncResult object that lets you call .get() if you ever need the results.
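For a function that does return something, collecting results from the AsyncResult looks like this (square is an illustrative stand-in for a real worker function):

```python
import multiprocessing

def square(x):
    return x * x

if __name__ == '__main__':
    with multiprocessing.Pool(4) as pool:
        async_result = pool.map_async(square, [1, 2, 3, 4])
        # .get() blocks until all workers finish; a timeout avoids hanging forever
        print(async_result.get(timeout=30))  # [1, 4, 9, 16]
```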

1 Comment

I was searching for approximately this solution, but my_function is supposed to generate a folder and with this approach it doesn't. I suspect the function raises an error that I never see. I'll try to write a minimal reproducible example and post a new question.
