Python: Multithreading using join and Queue sometimes blocks forever


My code follows:

    def predutycyclesolve(self, proccount):
        z = self.crystal.z

        #d1 = np.empty(len(z))
        #d2 = np.empty(len(z))

        d1d2q = multiprocessing.Queue()
        procs = []
        for proc in range(proccount):
            p = multiprocessing.Process(target=self.dutycyclesolve,
                                        args=(proc,
                                              z[proc::proccount],
                                              d1d2q))
            procs.append(p)

        for proc in procs:
            proc.start()

        for proc in procs:
            proc.join()

        while d1d2q.empty() is False:
            x = d1d2q.get()
            print x

I have a function, dutycyclesolve, whose work is divided up and run by several processes (in this case, 4). The issue is that, depending on the length of the array z, the code sometimes gets stuck and never proceeds past proc.join(). I've verified (by printing text in self.dutycyclesolve) that self.dutycyclesolve returns and the process exits the function.

It appears each worker exits the function, yet the parent (sometimes) gets stuck at join().

Any ideas why? I'm new to this.

Thanks.

From the docs:

Bear in mind that a process that has put items in a queue will wait before terminating until all the buffered items are fed by the “feeder” thread to the underlying pipe. [...]

This means that whenever you use a queue you need to make sure that all items which have been put on the queue will eventually be removed before the process is joined. Otherwise you cannot be sure that processes which have put items on the queue will terminate. Remember also that non-daemonic processes will be joined automatically.
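To make the failure mode concrete, here is a minimal, self-contained sketch (not taken from the question) of the same deadlock: the child's feeder thread blocks writing to the pipe once the OS buffer fills, so the child never terminates and join() never returns. The item count is arbitrary; it just needs to exceed the pipe's buffer.

    import multiprocessing

    def worker(q):
        # Put more data than the underlying pipe can buffer; the feeder
        # thread blocks until the other end is read.
        for i in range(100000):
            q.put(i)

    if __name__ == '__main__':
        q = multiprocessing.Queue()
        p = multiprocessing.Process(target=worker, args=(q,))
        p.start()
        p.join()            # may block forever: nothing drains the queue
        while not q.empty():
            print q.get()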

In other words, whenever you use queues, the right way to go is: get() first, then join(). See the docs for an example.
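Applied to the function above, one sketch of the fix is to drain the queue before joining. The question doesn't show how many items each worker puts on the queue, so the assumption below, that each call to self.dutycyclesolve puts exactly one result, is a placeholder; adjust the count to match your worker.

    def predutycyclesolve(self, proccount):
        z = self.crystal.z
        d1d2q = multiprocessing.Queue()
        procs = []
        for proc in range(proccount):
            p = multiprocessing.Process(target=self.dutycyclesolve,
                                        args=(proc,
                                              z[proc::proccount],
                                              d1d2q))
            procs.append(p)

        for proc in procs:
            proc.start()

        # Drain the queue FIRST; this unblocks the workers' feeder threads.
        # Assumes one result per worker -- change the count if each worker
        # puts one item per z element instead.
        results = [d1d2q.get() for _ in range(proccount)]

        # Now joining is safe: nothing is left buffered in the queue.
        for proc in procs:
            proc.join()

        for x in results:
            print x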

