There is a function:
    def checker(syn_list):
        pool = multiprocessing.Pool(PROCESS_COUNT, maxtasksperchild=10)
        async_result = pool.map_async(check_syn, syn_list)
        results = async_result.get()
        pool.close()
        pool.join()
        return results
check_syn fetches a URL (urllib2.urlopen() is called there without a timeout) and returns the response code, or the error if an exception is raised.
Some of the URLs are broken (which ones is not known until the check runs): the server never responds at all, and no exception is raised. Those worker processes hang, and over time new ones simply stop being created.
How can this hang be avoided?
PS: Python 2.7
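For illustration, a hedged sketch of what check_syn could look like with a socket-level timeout passed to urlopen (the timeout argument has existed in urllib2.urlopen since Python 2.6; the function name and return convention are taken from the description above, the 10-second value is an assumption, and the import fallback is only there to make the sketch self-contained on either Python version):

```python
try:
    from urllib2 import urlopen          # Python 2, as in the question
except ImportError:
    from urllib.request import urlopen   # fallback so this sketch also runs on Python 3

def check_syn(url, timeout=10):
    """Fetch url and return the HTTP status code, or the exception on failure.

    timeout (seconds, an assumed value) bounds the socket operations, so a
    server that never answers raises socket.timeout instead of hanging the
    worker process forever.
    """
    try:
        response = urlopen(url, timeout=timeout)
        return response.getcode()
    except Exception as e:   # URLError, socket.timeout, ValueError, ...
        # Return the exception instead of raising, so one bad URL does not
        # abort the whole map_async batch.
        return e
```

Even with a per-request timeout, async_result.get() can still block if a worker dies; as an extra safeguard, get() itself accepts a timeout (async_result.get(timeout=...)), which raises multiprocessing.TimeoutError instead of waiting forever.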