I have a function called `run_3_processes`, which spawns 3 processes (duh) using `multiprocessing.pool.apply`, waits for their results, processes these results, and returns a single result.
I have another function called `run_3_processes_3_times`, which should run `run_3_processes` 3 times in parallel, wait for all of them to return, and then process all their results.
Things I tried:
- using a process pool for `run_3_processes_3_times` - turns out this is complicated because of "Python Process Pool non-daemonic?" (see the first sketch after this list)
- rewriting the entire applicative code to spawn 9 processes with the same pool - this really convoluted my code and breaks encapsulation
- using a `threadpool.apply` for `run_3_processes_3_times` - for some reason this makes it run serially, not in parallel - is it because the `apply` in `run_3_processes` blocks the GIL? (see the second sketch after this list)
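A minimal sketch of the first attempt and why it breaks (the function bodies here are hypothetical stand-ins; only the structure matters). `Pool` workers are daemonic, and `multiprocessing` refuses to let daemonic processes have children, so the nested pool fails at creation time:

```python
from multiprocessing import Pool

def heavy_work(i):
    # hypothetical stand-in for the real per-process job
    return i * i

def run_3_processes(base):
    # building a Pool inside a pool worker fails: workers are daemonic,
    # and daemonic processes are not allowed to have children
    with Pool(3) as inner:
        return sum(inner.map(heavy_work, range(base, base + 3)))

def run_3_processes_3_times():
    with Pool(3) as outer:
        # each worker raises AssertionError("daemonic processes are not
        # allowed to have children") as soon as it builds its own Pool
        return outer.map(run_3_processes, [0, 3, 6])

if __name__ == "__main__":
    run_3_processes_3_times()  # AssertionError from the nested Pool
```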
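And a sketch of the thread-pool attempt, under the same hypothetical names. The serial behavior has a simpler explanation than the GIL: `ThreadPool.apply` is synchronous (it is `apply_async(...).get()` under the hood), so calling it three times in a row runs the three jobs back to back no matter how many worker threads exist:

```python
from multiprocessing import Pool
from multiprocessing.pool import ThreadPool

def heavy_work(i):
    return i * i  # hypothetical stand-in for the real workload

def run_3_processes(base):
    # nesting is fine here: the caller is a plain thread, not a daemon process
    with Pool(3) as pool:
        return sum(pool.map(heavy_work, range(base, base + 3)))

def run_3_processes_3_times():
    with ThreadPool(3) as tpool:
        # ThreadPool.apply blocks the calling thread until the task
        # finishes, so each iteration submits the next call only after
        # the previous one completes -- serial, regardless of the GIL
        return [tpool.apply(run_3_processes, (i,)) for i in (0, 3, 6)]
```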
I'm sure there's a one-liner solution I'm missing... thanks!
`run_3_processes` starts 3 new processes using `fork` (or some other call that eventually calls `fork`), then those processes can truly run in parallel with no GIL problems. But I'm not sure what `run_3_processes_3_times` is actually doing. Can you post the code?

`run_3_processes_3_times` just needs to call `run_3_processes` and wait. That's why I thought lightweight threads are fitting here. `run_3_processes` actually spawns heavy-lifting processes (with `multiprocessing.pool.apply`, but I guess you're right and it calls `fork` down the road). But for some reason it won't give up the GIL, unless I use `apply_async` like below.
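The `apply_async` code the comment points at ("like below") is not included in this excerpt; a sketch of what such a fix typically looks like, reusing `run_3_processes` from the sketches above:

```python
from multiprocessing.pool import ThreadPool

def run_3_processes_3_times():
    # run_3_processes as defined in the previous sketch
    with ThreadPool(3) as tpool:
        # apply_async returns immediately, so all three calls are in
        # flight at once; .get() only waits for the finished results
        jobs = [tpool.apply_async(run_3_processes, (i,)) for i in (0, 3, 6)]
        return [j.get() for j in jobs]
```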