
Python 3.4 asyncio task doesn't get fully executed


I'm experimenting with Python 3.4's asyncio module. Since MongoDB doesn't have a production-ready package that uses asyncio, I wrote a small wrapper class that runs all mongo queries in an executor. This is the wrapper:

import asyncio
from functools import wraps
from pymongo import MongoClient


class AsyncCollection(object):
    def __init__(self, client):
        self._client = client
        self._loop = asyncio.get_event_loop()

    def _async_deco(self, name):
        method = getattr(self._client, name)

        @wraps(method)
        @asyncio.coroutine
        def wrapper(*args, **kwargs):
            print('starting', name, self._client)
            r = yield from self._loop.run_in_executor(None, method, *args, **kwargs)
            print('done', name, self._client, r)
            return r

        return wrapper

    def __getattr__(self, name):
        return self._async_deco(name)


class AsyncDatabase(object):
    def __init__(self, client):
        self._client = client
        self._collections = {}


    def __getitem__(self, col):
        return self._collections.setdefault(col, AsyncCollection(self._client[col]))


class AsyncMongoClient(object):
    def __init__(self, host, port):
        self._client = MongoClient(host, port)
        self._loop = asyncio.get_event_loop()
        self._databases = {}

    def __getitem__(self, db):
        return self._databases.setdefault(db, AsyncDatabase(self._client[db]))
I want to perform inserts asynchronously, meaning the coroutine that executes the insert does not want to wait for the execution to finish. The asyncio manual states that a task is automatically scheduled for execution when it is created, and that the event loop stops when all tasks are done, so I built the following test script:

from asyncdb import AsyncMongoClient
import asyncio

@asyncio.coroutine
def main():
    print("Started")
    mongo = AsyncMongoClient("host", 27017)
    asyncio.async(mongo['test']['test'].insert({'_id' : 'test'}))
    print("Done")

loop = asyncio.get_event_loop()
loop.run_until_complete(main())
When I run the script, I get the following output:

Started
Done
starting insert Collection(Database(MongoClient('host', 27017), 'test'), 'test')
There should be a line indicating that the mongo query completed. I do see that line when I yield from this coroutine instead of running it with asyncio.async. What's really strange, though, is that the test item actually exists in MongoDB when I run the coroutine with asyncio.async, so even though it appears to work, I don't understand why I can't see the print statement indicating that the query was performed. Even though I run the event loop with run_until_complete, it should wait for the insert task to complete, even if the main coroutine finishes before that.
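
For illustration, here is a minimal sketch (not part of the original question) of the two variants described above, using the asker's asyncdb wrapper; the '_id' values are hypothetical and differ only to avoid a duplicate-key error:

from asyncdb import AsyncMongoClient
import asyncio

@asyncio.coroutine
def main():
    print("Started")
    mongo = AsyncMongoClient("host", 27017)

    # Variant 1: yield from the insert coroutine directly -- the "done" line prints.
    yield from mongo['test']['test'].insert({'_id': 'test1'})

    # Variant 2: schedule it, but keep the Task and explicitly wait for it.
    task = asyncio.async(mongo['test']['test'].insert({'_id': 'test2'}))
    yield from task

    print("Done")

loop = asyncio.get_event_loop()
loop.run_until_complete(main())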

asyncio.async(mongo…) only schedules the mongo query, and run_until_complete() does not wait for it. The code example below demonstrates this using the asyncio.sleep() coroutine:

#!/usr/bin/env python3
import asyncio
from contextlib import closing
from timeit import default_timer as timer

@asyncio.coroutine
def sleep_BROKEN(n):
    # schedule coroutine; it runs on the next yield
    asyncio.async(asyncio.sleep(n))

@asyncio.coroutine
def sleep(n):
    yield from asyncio.sleep(n)

@asyncio.coroutine
def double_sleep(n):
    f = asyncio.async(asyncio.sleep(n))
    yield from asyncio.sleep(n) # the first sleep is also started
    yield from f

n = 2
with closing(asyncio.get_event_loop()) as loop:
    start = timer()
    loop.run_until_complete(sleep_BROKEN(n))
    print(timer() - start)
    loop.run_until_complete(sleep(n))
    print(timer() - start)
    loop.run_until_complete(double_sleep(n))
    print(timer() - start)
The output (shown below) indicates that run_until_complete(sleep_BROKEN(n)) returns in less than 2 milliseconds instead of 2 seconds. run_until_complete(sleep(n)) works properly: it returns after 2 seconds. double_sleep() shows that coroutines scheduled by asyncio.async() only start running on the next yield from (the two concurrent sleeps run in parallel), i.e., it sleeps for 2 seconds, not 4. If you add a delay (without allowing the event loop to run) before the first yield from, you can see that yield from f does not return any sooner; that is, asyncio.async does not run coroutines, it only schedules them to run. (A sketch of that delay variant follows the output below.)

Output:

0.0001221800921484828
2.002586881048046
4.005100341048092
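
The delay variant mentioned above might be sketched as follows (an illustration added here, not part of the original answer): a blocking time.sleep() before the first yield from keeps the event loop from running, so the scheduled sleep makes no progress and yield from f still takes the full n seconds afterwards.

#!/usr/bin/env python3
# Sketch of the delay variant (illustration only): the event loop never runs
# during the blocking time.sleep(), so the Task scheduled by asyncio.async()
# has not started its timer yet and the coroutine takes ~2*n seconds overall.
import asyncio
import time
from contextlib import closing
from timeit import default_timer as timer

@asyncio.coroutine
def delayed_sleep(n):
    f = asyncio.async(asyncio.sleep(n))  # scheduled, but not running yet
    time.sleep(n)                        # blocks the thread; the loop gets no chance to run f
    yield from f                         # f's sleep only starts now, so this waits ~n more seconds

n = 2
with closing(asyncio.get_event_loop()) as loop:
    start = timer()
    loop.run_until_complete(delayed_sleep(n))
    print(timer() - start)  # ~4 seconds (2 blocking + 2 async), not ~2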