Python ValueError: too many values to unpack with a multiprocessing pool
I have the following "worker", which originally returned a single JSON object, but I would like it to return multiple JSON objects:
def data_worker(data):
    _cats, index, total = data
    _breeds = {}
    try:
        url = _channels['feedUrl']
        r = get(url, timeout=5)
        rss = etree.XML(r.content)
        tags = rss.xpath('//cats/item')
        _cats['breeds'] = {}
        for t in tags:
            _cats['breeds']["".join(t.xpath('breed/@url'))] = True
            _breeds['url'] = "".join(t.xpath('breed/@url'))
        return [_cats, _breeds]
    except:
        return [_cats, _breeds]
This worker is then passed to a multiprocessing pool:
cats, breeds = pool.map(data_worker, data, chunksize=1)
When I run the pool and the worker with only a single output (i.e. _cats), it works fine, but when I try to return multiple JSON "schemas" I get the error:
  File "crawl.py", line 111, in addFeedData
    [cats, breeds] = pool.map(data_worker, data, chunksize=1)
ValueError: too many values to unpack
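For context, the error itself has nothing to do with multiprocessing: `pool.map` returns one result per input item, so unpacking that list into two names fails whenever there are more than two inputs. A minimal sketch with a plain list (the three pairs below stand in for three worker results):

```python
# Hypothetical results list: three inputs, each worker returning a pair.
results = [(1, 2), (3, 4), (5, 6)]

try:
    cats, breeds = results  # three items cannot unpack into two names
except ValueError as e:
    print(e)  # too many values to unpack
```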
How do I return two separate JSON objects from data_worker? I need them to be separate JSON objects. Note that I have already tried the following, without success:
[cats, breeds] = pool.map(data_worker, data, chunksize=1)
(cats, breeds) = pool.map(data_worker, data, chunksize=1)
return (_cats, _breeds)
First, I think you meant to write this:
cats, breeds = pool.map(data_worker, data, chunksize=1)
But that won't work either, because data_worker
returns a pair, whereas map()
returns a list of whatever the workers return. So you should do this instead:
cats = []
breeds = []
for cat, breed in pool.map(data_worker, data, chunksize=1):
    cats.append(cat)
    breeds.append(breed)
That gives you the two lists you want.
In other words, you were expecting a pair of lists, but you got a list of pairs.
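The same transposition can be written in one line with `zip(*...)`. A minimal runnable sketch, using a hypothetical stand-in worker and `multiprocessing.dummy` (thread-based, so it runs without pickling; the idiom is identical for a process-based `multiprocessing.Pool`):

```python
from multiprocessing.dummy import Pool

def data_worker(item):
    # Hypothetical stand-in for the real worker: returns a (cat, breed) pair.
    return {'cat': item}, {'breed': item * 2}

data = [1, 2, 3]
with Pool(2) as pool:
    results = pool.map(data_worker, data)

# zip(*results) transposes the list of pairs into a pair of tuples.
cats, breeds = zip(*results)
print(list(cats))    # [{'cat': 1}, {'cat': 2}, {'cat': 3}]
print(list(breeds))  # [{'breed': 2}, {'breed': 4}, {'breed': 6}]
```

Note that `zip` returns tuples rather than lists; wrap them in `list()` if you need to append afterwards.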