Python 3.x: Python async loop with concurrent.futures.ThreadPoolExecutor

Tags: python-3.x, asynchronous, aiohttp

I am trying to extract data from a set of URLs asynchronously. I want to execute the requests per group of URLs every 10 seconds (more or less).

But I get this error:

Traceback (most recent call last):
  File "test.py", line 34, in <module>
    loop.run_until_complete(future)
  File "/usr/lib/python3.5/asyncio/base_events.py", line 466, in run_until_complete
    return future.result()
  File "/usr/lib/python3.5/asyncio/futures.py", line 293, in result
    raise self._exception
  File "/usr/lib/python3.5/asyncio/tasks.py", line 241, in _step
    result = coro.throw(exc)
  File "test.py", line 27, in main
    datas_extracted = await asyncio.gather(*tasks, return_exceptions=False)
  File "/usr/lib/python3.5/asyncio/futures.py", line 380, in __iter__
    yield self  # This tells Task to wait for completion.
  File "/usr/lib/python3.5/asyncio/tasks.py", line 304, in _wakeup
    future.result()
  File "/usr/lib/python3.5/asyncio/futures.py", line 293, in result
    raise self._exception
  File "/usr/lib/python3.5/asyncio/tasks.py", line 239, in _step
    result = coro.send(None)
  File "test.py", line 14, in retrieve_datas
    async with session.get(url) as response:
  File "/usr/local/lib/python3.5/dist-packages/aiohttp/client.py", line 603, in __aenter__
    self._resp = yield from self._coro
  File "/usr/local/lib/python3.5/dist-packages/aiohttp/client.py", line 231, in _request
    conn = yield from self._connector.connect(req)
  File "/usr/local/lib/python3.5/dist-packages/aiohttp/connector.py", line 378, in connect
    proto = yield from self._create_connection(req)
  File "/usr/local/lib/python3.5/dist-packages/aiohttp/connector.py", line 687, in _create_connection
    _, proto = yield from self._create_direct_connection(req)
  File "/usr/local/lib/python3.5/dist-packages/aiohttp/connector.py", line 698, in _create_direct_connection
    hosts = yield from self._resolve_host(req.url.raw_host, req.port)
  File "/usr/local/lib/python3.5/dist-packages/aiohttp/connector.py", line 669, in _resolve_host
    self._resolver.resolve(host, port, family=self._family)
  File "/usr/local/lib/python3.5/dist-packages/aiohttp/resolver.py", line 31, in resolve
    host, port, type=socket.SOCK_STREAM, family=family)
  File "/usr/lib/python3.5/asyncio/base_events.py", line 673, in getaddrinfo
    host, port, family, type, proto, flags)
  File "/usr/lib/python3.5/asyncio/base_events.py", line 634, in run_in_executor
    executor = concurrent.futures.ThreadPoolExecutor()
TypeError: __init__() missing 1 required positional argument: 'max_workers'
So my question is how to fix this, but more importantly, I suspect I am not doing the async part the right way. Strangely, if I iterate manually in my IDE (stepping through with the debugger), I can complete one iteration (receiving the data for the first group of URLs) before the error appears, but if I run the code directly the exception triggers immediately.
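The intended pattern (fire one group of URLs concurrently, then pause before the next group) can be sketched as below. This is a minimal, network-free sketch: `fetch` is a hypothetical stand-in for the question's aiohttp request (the real code would use `async with session.get(url) as response:`), and the delay is shortened for illustration.

```python
import asyncio

# Hypothetical stand-in for an aiohttp request; the real code would do
# "async with session.get(url) as response:" inside a ClientSession.
async def fetch(url):
    await asyncio.sleep(0.01)  # simulate network latency
    return "data from " + url

async def main(url_groups, delay=0.1):
    results = []
    for group in url_groups:
        # Run the whole group concurrently, then pause before the next group.
        tasks = [fetch(url) for url in group]
        results.extend(await asyncio.gather(*tasks))
        await asyncio.sleep(delay)  # note the await: without it, no pause happens
    return results

groups = [["http://a", "http://b"], ["http://c"]]
loop = asyncio.new_event_loop()
results = loop.run_until_complete(main(groups))
loop.close()
print(results)
```

`asyncio.gather` preserves the order of its arguments, so `results` lines up with the input URLs regardless of which request finishes first.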

Edit:

If I use Python 3.6, the exception goes away... the code works, but asyncio.sleep(10) is not executed (???), my code never sleeps. If I replace asyncio.sleep(10) with time.sleep(10), it works. I think I am missing something. My problem is solved, but I would appreciate it if someone could explain why I see this behavior with sleep, and whether my code performs the asynchronous requests correctly.
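The "never sleeps" symptom comes from calling `asyncio.sleep(10)` without `await`: that expression only *creates* a coroutine object and discards it, so no pause ever runs (Python also warns "coroutine 'sleep' was never awaited"). A small timing sketch of the bug versus the fix, with a short delay instead of 10 seconds:

```python
import asyncio
import time
import warnings

async def without_await():
    # Bug: builds the sleep coroutine but never runs it, so there is no pause.
    asyncio.sleep(0.2)

async def with_await():
    # Fix: await suspends this coroutine for the full duration.
    await asyncio.sleep(0.2)

loop = asyncio.new_event_loop()

start = time.monotonic()
with warnings.catch_warnings():
    warnings.simplefilter("ignore")  # silence the "never awaited" RuntimeWarning
    loop.run_until_complete(without_await())
buggy = time.monotonic() - start  # returns almost instantly

start = time.monotonic()
loop.run_until_complete(with_await())
fixed = time.monotonic() - start  # takes at least 0.2 s
loop.close()
```

`time.sleep(10)` "worked" only because it blocks the whole thread, event loop included, which defeats the point of async code; `await asyncio.sleep(10)` is the correct fix.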

The error is not caused by aiohttp but by asyncio itself, which is very strange because that code is covered by tests.

What Python version are you using? Is it a custom build?

Regarding asyncio.sleep(): put await before the call.

What version of aiohttp are you using? I have version 2.2.5.

Argh... I forgot the await :( Thanks Andrew. As for my Python version: Python 3.5.3. The code works using Python 3.6.2; on Python 3.5.3 the max_workers parameter is not required either: the parameter was required in Python 3.4 but is optional in 3.5+.
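The comment's point about max_workers can be checked directly: on Python 3.5+ the argument defaults to a value derived from the CPU count, so constructing the executor with no arguments is valid (on 3.4 it was required, which is what the traceback's TypeError complains about).

```python
from concurrent.futures import ThreadPoolExecutor

# On Python 3.5+ ThreadPoolExecutor() needs no arguments; max_workers
# defaults based on the CPU count. On Python 3.4 it was a required argument.
with ThreadPoolExecutor() as pool:
    # map() yields results in input order, even with multiple worker threads.
    squares = list(pool.map(lambda n: n * n, range(5)))

print(squares)  # [0, 1, 4, 9, 16]
```

That the stdlib's own `run_in_executor` hit this TypeError suggests a mixed installation (3.6-era files running under a 3.4/3.5 interpreter) rather than a bug in the question's code.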