
How to run while True in asyncio python?


I am using asyncio/aiohttp to send asynchronous GET requests to different websites. The plan is to take 100 URLs from a redis queue and send GET requests to them asynchronously, then take the next 100 URLs and repeat the process. In addition, if a URL fails (times out or returns HTTP status 403), the process should add it back to the end of the queue. I have written code to do this, but it freezes after a while. Can anyone tell me how to implement this correctly? Here is my code:

import asyncio
from aiohttp import ClientSession
import async_timeout
import aiohttp
import aiosocks
import redis
import json
import time

url_list = []


async def fetch(url, session,r_server):
    agent = 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36'
    headers = {'user-Agent': agent,
               'accept-Language': 'en-US,en;q=0.8',
               'accept': 'text/javascript, application/javascript, application/ecmascript, application/x-ecmascript, */*; q=0.01',
               'accept-Encoding': 'gzip, deflate, sdch, br',
               'x-requested-with': 'XMLHttpRequest'}
    with async_timeout.timeout(100):
        async with session.get(url,headers=headers) as response:
            status = response.status
            # Store status code somewhere
            ...


async def bound_fetch(sem, url, session,r_server):
    # Getter function with semaphore.
    async with sem:
        try:
            await fetch(url, session,r_server)
        except Exception as e:
            print ("In semaphore",e,url)
            # Push url in redis queue
            ...

async def run(url_list,r_server):
    tasks = []
    # create instance of Semaphore
    sem = asyncio.Semaphore(1000)

    # Create client session that will ensure we dont open new connection
    # per request.
    async with ClientSession() as session:
        for url in url_list:
            # pass Semaphore and session to every GET request
            task = asyncio.ensure_future(bound_fetch(sem, url, session,r_server))
            tasks.append(task)

        responses = asyncio.gather(*tasks)
        await responses

def get_url_list(r_server):
        url_list = []
        # Get url list from redis: queue_list
        for docs in queue_list:
            doc = json.loads(docs.decode("utf-8"))
            url = doc["url"]
            url_list.append(url)


        loop = asyncio.get_event_loop()
        future = asyncio.ensure_future(run(url_list,r_server))
        loop.run_until_complete(future)


if __name__ == "__main__":
    r_server = redis.Redis("localhost")
    while True:
      get_url_list(r_server)
      time.sleep(5)
Answer:

Replace

    loop.run_until_complete(future)

with

    loop.run_forever(future)

Comments:

Your function run() expects 3 arguments,

    def run(url_list, headers, r_server):

but you only pass it 2:

    asyncio.ensure_future(run(url_list, r_server))

Thank you for the correction, I have edited the question.

Sorry, I cannot help without finding the point where the script freezes. But using a synchronous redis client and running the event loop over and over is an antipattern. Also, you cannot call run_forever(future): run_forever() does not accept arguments.
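As the last comment points out, run_forever() takes no arguments: the coroutine has to be scheduled on the loop first, and the loop only returns from run_forever() after stop() is called. A minimal sketch of that pattern (the ticker() coroutine is a hypothetical stand-in for the real polling work):

```python
import asyncio

results = []

async def ticker(loop):
    # hypothetical stand-in for the forever-polling coroutine
    for i in range(3):
        results.append(i)
        await asyncio.sleep(0)
    loop.stop()  # run_forever() only returns once stop() is called

loop = asyncio.new_event_loop()
# run_forever() takes no arguments, so schedule the work first...
loop.create_task(ticker(loop))
# ...and then start the loop
loop.run_forever()
loop.close()
print(results)  # [0, 1, 2]
```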
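The antipattern the comment describes is entering the event loop from scratch on every iteration of the outer while True. A cleaner shape is to keep the polling loop inside a single coroutine, enter the event loop once, and pause with asyncio.sleep instead of time.sleep. A sketch of that shape, where get_batch and process_batch are hypothetical stand-ins for the redis read and the run() coroutine from the question:

```python
import asyncio

async def process_batch(urls):
    # hypothetical stand-in for run(url_list, r_server) from the question
    await asyncio.sleep(0)
    return len(urls)

async def poll(get_batch, iterations):
    # the while-True-style loop lives *inside* one coroutine; `iterations`
    # only bounds this demo, a real poller would loop forever
    processed = 0
    for _ in range(iterations):
        urls = await get_batch()
        processed += await process_batch(urls)
        await asyncio.sleep(0)  # a real poller would sleep ~5 seconds here
    return processed

# demo with canned batches instead of a redis queue
batches = iter([["http://a", "http://b"], ["http://c"]])

async def fake_batch():
    return next(batches)

total = asyncio.run(poll(fake_batch, 2))
print(total)  # 3
```

With this shape the synchronous redis call is the remaining blocker; an async redis client would let the loop keep serving other tasks while waiting on the queue.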