python - How to run different methods in parallel and obtain the outputs for further processing?


I understand how to run 2 CPU-intensive functions in parallel. The functions require the same input and produce different outputs.

The last time, I used the multiprocessing library, saved the results to a file, and did not need any further processing.

Below is a simple example of the type of code. How do I parallelize the functions so that both outputs a and b can be used for further processing? I am using Python 2.7.

    import numpy as np

    input_dict = {'key1': 'value1', 'key2': 'value2', 'key3': 'value3'}

    def func1(dictionary):
        # work
        return np.array()

    def func2(dictionary):
        # different work
        return np.array()

    a = func1(input_dict)
    b = func2(input_dict)

    result = np.dot(a, b)

Is the code below the correct way to run both functions and collect their outputs?

    from multiprocessing import Process, Queue

    import numpy as np

    input_dict = {'key1': 'value1', 'key2': 'value2', 'key3': 'value3'}

    def func1(dictionary):
        # work
        return q.put(np.array())

    def func2(dictionary):
        # different work
        return q.put(np.array())

    if __name__ == '__main__':
        q1 = Queue()
        q2 = Queue()
        p1 = Process(target=func1, args=(input_dict,))
        p2 = Process(target=func2, args=(input_dict,))
        p1.start()
        p2.start()

        a = q1.get()
        b = q2.get()
        p1.join()
        p2.join()
        result = np.dot(a, b)

It's almost correct, but you have to pass q1 and q2 as arguments when initializing the processes.

Also, replace return q.put(np.array()) with just q.put(np.array()); the return value of a process's target function is discarded, so the queue is what carries the result back to the parent.

When getting results from the queue with a = q1.get(), remember that you are getting a single item from the queue, not a list.
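With those fixes applied, a corrected single-result version of your code might look like the sketch below. The func1/func2 bodies are placeholders (your real work is not shown), and the logic is wrapped in a run_parallel helper of my own naming so the flow is easy to follow:

```python
from multiprocessing import Process, Queue

import numpy as np


def func1(dictionary, q):
    # placeholder work: put the result on the queue instead of returning it
    q.put(np.ones(3))


def func2(dictionary, q):
    # different placeholder work
    q.put(np.array([1.0, 2.0, 3.0]))


def run_parallel(input_dict):
    q1 = Queue()
    q2 = Queue()
    # the queues are passed as arguments so each function can reach its own queue
    p1 = Process(target=func1, args=(input_dict, q1))
    p2 = Process(target=func2, args=(input_dict, q2))
    p1.start()
    p2.start()
    a = q1.get()  # blocks until func1 has put its result
    b = q2.get()  # blocks until func2 has put its result
    p1.join()
    p2.join()
    return np.dot(a, b)


if __name__ == '__main__':
    input_dict = {'key1': 'value1', 'key2': 'value2', 'key3': 'value3'}
    print(run_parallel(input_dict))
```

Note that get() is called before join(): draining the queues first avoids a deadlock when a child puts more data than the queue's pipe buffer can hold.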

If you need to extract a list, you can do this:

    from multiprocessing import Process, Queue

    import numpy as np

    input_dict = {'key1': 'value1', 'key2': 'value2', 'key3': 'value3'}


    def func1(dictionary, q):
        # work
        for x in xrange(5):
            q.put(
                np.random.randint(2, size=20)
            )
        q.put('stop')


    def func2(dictionary, q):
        # different work
        q.put(
            np.array([1, 1, 1, 1, 0, 1, 1, 0, 1, 0,
                      1, 0, 1, 0, 1, 0, 0, 1, 1, 1])
        )


    if __name__ == '__main__':
        q1 = Queue()
        q2 = Queue()
        p1 = Process(target=func1, args=(input_dict, q1))
        p2 = Process(target=func2, args=(input_dict, q2))
        p1.start()
        p2.start()

        f1_results = []
        # keep adding items from func1's results to the list, until it sees the 'stop' item
        for array in iter(q1.get, 'stop'):
            f1_results.append(array)

        b = q2.get()
        p1.join()
        p2.join()

        for a in f1_results:
            result = np.dot(a, b)
            print result
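As an alternative to managing queues by hand, a multiprocessing.Pool with apply_async also delivers both return values back to the parent, so the functions can keep their original return statements. This is a sketch with placeholder func1/func2 bodies; close()/join() is used instead of a with block because Pool only gained context-manager support in Python 3.3:

```python
from multiprocessing import Pool

import numpy as np


def func1(dictionary):
    # placeholder work: return the array directly, as in the original code
    return np.ones(4)


def func2(dictionary):
    # different placeholder work
    return np.array([1.0, 0.0, 1.0, 0.0])


def run_with_pool(input_dict):
    pool = Pool(processes=2)
    # apply_async schedules each call on a worker without blocking;
    # .get() blocks until that worker's return value is available
    r1 = pool.apply_async(func1, (input_dict,))
    r2 = pool.apply_async(func2, (input_dict,))
    a = r1.get()
    b = r2.get()
    pool.close()
    pool.join()
    return np.dot(a, b)


if __name__ == '__main__':
    input_dict = {'key1': 'value1', 'key2': 'value2', 'key3': 'value3'}
    print(run_with_pool(input_dict))
```

The trade-off is that apply_async returns each function's single return value, so the queue-with-sentinel approach above remains the better fit when func1 needs to stream multiple items back.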
