Python hangs when calling requests.get in a multiprocessing pool
I have the following code:
import requests
from multiprocessing import Pool

def process_url(url):
    print '111'
    r = requests.get(url)
    print '222'  # <-- never even gets here
    return

urls_to_download = [list_or_urls]

PARALLEL_WORKERS = 4
pool = Pool(PARALLEL_WORKERS)
pool.map_async(process_url, urls_to_download)
pool.close()
pool.join()
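One detail worth noting here: `map_async` discards worker exceptions unless `.get()` is called on the `AsyncResult` it returns, so a crash inside `requests.get` can look exactly like a silent hang. A minimal sketch of surfacing worker errors, with `requests.get` replaced by a hypothetical stand-in (`process_url` below raises instead of fetching) so it runs without the network:

```python
from multiprocessing import Pool

def process_url(url):
    # Hypothetical stand-in for requests.get(url); raises on a bad URL
    # so the example runs offline.
    if not url.startswith("http"):
        raise ValueError("bad url: %r" % url)
    return len(url)

def fetch_all(urls, workers=4):
    pool = Pool(workers)
    result = pool.map_async(process_url, urls)
    pool.close()
    pool.join()
    # map_async hides worker exceptions until .get() is called; the timeout
    # also stops the parent from waiting forever if a worker really hangs.
    return result.get(timeout=30)
```

Calling `result.get()` re-raises the worker's exception in the parent, which turns an invisible failure into a traceback you can act on.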
What happens if you call the process_url() function from a single process/script, or from an interactive Python shell? Clearly the problem is somewhere inside requests.get(); it is probably blocking on a connection or on a read/fetch operation. What happens if you use a URL that is known to be reachable? You could, for instance, run it in a single thread. Try adding an explicit return. Personally, I would also advise against doing any print or other console I/O from the child processes. A better programming model is to let the parent process act as the dispatcher and console I/O controller, and have the children/workers hand their results back (or, alternatively, have them all write to a DB or some other consistent data store). – Adding an explicit return does not seem to help. – @David542 Was this ever resolved? I am facing the same problem.
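The last suggestion above, keeping all console I/O in the parent and letting workers only return data, could be sketched like this. It uses hypothetical URLs, a stand-in for requests.get so it runs offline, and a blocking pool.map so worker errors propagate straight to the caller:

```python
from multiprocessing import Pool

def fetch(url):
    # In the real code this would be something like:
    #     return (url, requests.get(url, timeout=10).status_code)
    # A stand-in keeps the sketch runnable without the network.
    return (url, 200)

def main(urls):
    pool = Pool(4)
    try:
        # Workers only compute and return; nothing in fetch() touches stdout.
        results = pool.map(fetch, urls)
    finally:
        pool.close()
        pool.join()
    # Only the parent process prints.
    for url, status in results:
        print("%s -> %s" % (url, status))
    return results
```

Because pool.map blocks and re-raises worker exceptions in the parent, a failure in fetch() produces a traceback instead of the silent hang described in the question.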