I am trying to call a pool from inside a function that itself runs via apply_async. I got a serialization error when I tried to pass one pool into the function running in another pool, so I moved the second pool to be a global, but it still does not work for me. What am I missing? My code:

from multiprocessing import Pool
b_pool = Pool(1)

def func_a(i):
    global b_pool
    print "a : {}".format(i)
    try:
        res = b_pool.apply_async(func_b, args=(i,))
    except Exception as e:
        print e

def func_b(i):
    print "b : {}".format(i)
    file = "/home/ubuntu/b_apply.txt"
    f = open(file, "a")
    f.write("b : {}".format(i))
    f.close()


if __name__ == '__main__':
    a_pool = Pool(1)
    for i in range(10):
        res = a_pool.apply_async(func_a, args=(i,))

    a_pool.close()
    a_pool.join()

    b_pool.close()
    b_pool.join()

In this code only a prints 0-9; b prints nothing, not even to the file. I am using Python 2.7.

  • Close and join of "b_pool" must be done in "func_a", because that is a different pool in a different process than the "b_pool" in the main process. Commented Jul 18, 2019 at 13:10
  • Not exactly what I want; I want b's function to be added to the pool as a is ending, and that did not work. Can you please add code? Commented Jul 18, 2019 at 13:25
  • One way can be to create a multiprocessing.Queue and send it through an initializer function when creating the first Pool. The initializer stores the queue object (which is then shared among the main process and the created workers), and the workers send job data for the b_pool back to the main process through the queue. Commented Jul 19, 2019 at 3:56
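
For reference, here is a rough sketch of the initializer approach described in that last comment (my own illustration, not the commenter's code; init_worker and job_queue are made-up names, and it assumes Python 2.7 with fork-based pool workers as in the question). The initializer stores the queue in a module-level global inside each a_pool worker, the workers push job data onto it, and the main process reads the data back and submits it to b_pool:

from multiprocessing import Pool, Queue

def init_worker(queue):
    # runs once in each a_pool worker; keep the inherited queue in a global
    global job_queue
    job_queue = queue

def func_a(i):
    print "a : {}".format(i)
    # send the job data for func_b back to the main process
    job_queue.put(i)

def func_b(i):
    print "b : {}".format(i)

if __name__ == '__main__':
    q = Queue()
    a_pool = Pool(1, initializer=init_worker, initargs=(q,))
    b_pool = Pool(1)

    for i in range(10):
        a_pool.apply_async(func_a, args=(i,))
    a_pool.close()

    # the main process pulls the job data out and feeds it to b_pool
    for _ in range(10):          # one item per func_a call; hangs if a job fails
        j = q.get()
        b_pool.apply_async(func_b, args=(j,))

    a_pool.join()
    b_pool.close()
    b_pool.join()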

1 Answer


The queue was a good direction; it is just that a multiprocessing.Queue can't be passed as an argument like this (it raises a RuntimeError about queues only being shared through inheritance), but Manager.Queue is the correct way of doing it. My code that worked:

from multiprocessing import Pool,Manager
def func_a(i,q):
    print "a : {}".format(i)
    try:
        q.put(i)
    except Exception as e:
        print e


def func_b(i,q):
    i = q.get()
    print "b : {}".format(i)


if __name__ == '__main__':
    m = Manager()
    q = m.Queue()
    a_pool = Pool(1)
    b_pool = Pool(1)

    for i in range(10):
        res = a_pool.apply_async(func_a, args=(i, q))
        res_2 = b_pool.apply_async(func_b, args=(i, q))

    a_pool.close()
    a_pool.join()

    b_pool.close()
    b_pool.join()

This answer, Sharing a result queue among several processes, was very helpful.
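
As a small variation on this pattern (my own sketch, not part of the original answer; consume_b is a made-up name): since func_b above ignores the i it is given and just takes whatever func_a put on the queue, b can instead be a single consumer that drains the queue until a sentinel arrives, so each b item is handled only after the corresponding a job has put its result:

from multiprocessing import Pool, Manager

def func_a(i, q):
    print "a : {}".format(i)
    # do a's real work here, then hand the result to b via the queue
    q.put(i)

def consume_b(q):
    # keep taking items until the sentinel None arrives
    while True:
        i = q.get()
        if i is None:
            break
        print "b : {}".format(i)

if __name__ == '__main__':
    m = Manager()
    q = m.Queue()
    a_pool = Pool(1)
    b_pool = Pool(1)

    # one long-running consumer in b_pool
    b_pool.apply_async(consume_b, args=(q,))

    for i in range(10):
        a_pool.apply_async(func_a, args=(i, q))

    a_pool.close()
    a_pool.join()

    q.put(None)      # tell the consumer to stop
    b_pool.close()
    b_pool.join()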
