
How to process multiple WebSockets in FIFO order in Python?

Tags: python, websocket, python-asyncio, coroutine

I am writing a function that handles two WebSockets; the responses from each WebSocket modify a shared DataFrame df.

import json
import asyncio
import websockets

@asyncio.coroutine
def printResponse(df, dataSocket, quoteSocket, dataRequest, quoteRequest):

    yield from dataSocket.send(dataRequest)
    yield from quoteSocket.send(quoteRequest)

    response = yield from dataSocket.recv()     # skip first response
    response = yield from quoteSocket.recv()    # skip first response

    while True:

        response = yield from dataSocket.recv()
        print("<< {}".format(json.loads(response)))
        df = changeRecord(df, response)

        response = yield from quoteSocket.recv()
        print("<< {}".format(json.loads(response)))
        df = changeRecord(df, response)
The problem is that you use two yield from statements in the same loop, so it processes them strictly in sequence and then repeats that forever.

That means it will always wait until it gets a response from dataSocket, then wait until it gets a response from quoteSocket, then rinse and repeat.

Tasks work well for what you are trying to do, because they allow coroutines to run independently of one another. If you start the two coroutines in their own Task wrappers, each one waits for its own next response without necessarily interfering with the other.

For example:

import json
import asyncio
import websockets

@asyncio.coroutine
def coroutine_1(df, dataSocket, dataRequest):
    yield from dataSocket.send(dataRequest)
    response = yield from dataSocket.recv()     # skip first response
    while True:
        response = yield from dataSocket.recv()
        print("<< {}".format(json.loads(response)))
        df = changeRecord(df, response)

@asyncio.coroutine
def coroutine_2(df, quoteSocket, quoteRequest):
    yield from quoteSocket.send(quoteRequest)
    response = yield from quoteSocket.recv()    # skip first response
    while True:
        response = yield from quoteSocket.recv()
        print("<< {}".format(json.loads(response)))
        df = changeRecord(df, response)

@asyncio.coroutine
def printResponse(df, dataSocket, quoteSocket, dataRequest, quoteRequest):

    # wrap each coroutine in its own Task so the two sockets are read
    # concurrently instead of strictly alternating
    websocket_task_1 = asyncio.ensure_future(coroutine_1(df, dataSocket, dataRequest))
    websocket_task_2 = asyncio.ensure_future(coroutine_2(df, quoteSocket, quoteRequest))

    yield from asyncio.wait([websocket_task_1, websocket_task_2])
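
A possible way to drive this from an event loop is sketched below. The endpoint URLs, the subscription payloads, and the createInitialFrame() helper are placeholders invented for the example; they do not come from the original thread.

import json
import asyncio
import websockets

@asyncio.coroutine
def main():
    # placeholder endpoints and subscription payloads -- substitute your own
    dataSocket = yield from websockets.connect("wss://example.com/data")
    quoteSocket = yield from websockets.connect("wss://example.com/quotes")
    dataRequest = json.dumps({"type": "subscribe", "channel": "data"})
    quoteRequest = json.dumps({"type": "subscribe", "channel": "quotes"})
    df = createInitialFrame()   # hypothetical helper that builds the shared DataFrame
    try:
        yield from printResponse(df, dataSocket, quoteSocket, dataRequest, quoteRequest)
    finally:
        yield from dataSocket.close()
        yield from quoteSocket.close()

asyncio.get_event_loop().run_until_complete(main())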
In this structure, do dataSocket and quoteSocket work on the same df or on two separate ones? My goal is for the responses from both WebSockets to operate on the same df: the df updated by dataSocket should be available to the responses handled for quoteSocket.

I added a shared DataFrame tb, so we now have coroutine_1(tb, df, dataSocket) and coroutine_2(tb, df, quoteSocket). It turns out that df still works as expected, but the tb updates made in coroutine_1 are not visible in coroutine_2. Can you think of any potential cause of this?

@kinreyli Is it the same tb instance being passed to both coroutines, or could you be creating two separate ones?

The same tb; only the DataFrame did not work. I solved it by putting the DataFrame in a list. Although I do not know why that helps, it did fix the problem.
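
The symptom described in the comments is a consequence of Python name binding rather than of asyncio: df = changeRecord(df, response) rebinds a local name inside one coroutine, so the other coroutine keeps its reference to the old object, whereas mutating a shared container (such as a one-element list) in place is visible to everything holding that container. A minimal sketch, with change_record standing in for the thread's changeRecord (hypothetical, invented for illustration):

def change_record(tb, response):
    # stand-in for the thread's changeRecord: like many pandas operations,
    # it returns a new object instead of modifying tb in place
    return tb + [response]

def update_by_rebinding(tb):
    # rebinds the local name only; the object the other coroutine holds is
    # untouched, so this update is invisible outside this function
    tb = change_record(tb, "data tick")

def update_through_shared_box(box):
    # box is a one-element list shared by both coroutines; replacing box[0]
    # mutates the shared container, so the new table is visible to anything
    # that reads through the same box
    box[0] = change_record(box[0], "quote tick")

tb = []
update_by_rebinding(tb)
print(tb)        # [] -- the rebound table was lost

box = [[]]
update_through_shared_box(box)
print(box[0])    # ['quote tick'] -- visible through the shared list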