
Python: how to retry an async aiohttp request based on its status code


I'm working with an API that sometimes returns odd status codes which can be fixed by simply retrying the same request. I'm using aiohttp to make requests to this API asynchronously.

I'm also using the backoff library to retry requests; however, it seems that 401 responses are still not being retried:

    @backoff.on_exception(backoff.expo, aiohttp.ClientError, max_tries=11, max_time=60)
    async def get_user_timeline(self, session, user_id, count, max_id, trim_user, include_rts, tweet_mode):

        params = {
            'user_id': user_id,
            'trim_user': trim_user,
            'include_rts': include_rts,
            'tweet_mode': tweet_mode,
            'count': count
        }


        if (max_id and max_id != -1):
            params.update({'max_id': max_id})

        headers = {
            'Authorization': 'Bearer {}'.format(self.access_token)    
        }

        users_lookup_url = "/1.1/statuses/user_timeline.json"

        url = self.base_url + users_lookup_url

        async with session.get(url, params=params, headers=headers) as response:
            result = await response.json()
            response = {
                'result': result,
                'status': response.status,
                'headers': response.headers
            }
            return response

I want every request to be retried (with backoff) up to 10 times whenever the response status code is anything other than 200 or 429.

By default, aiohttp does not raise exceptions for non-200 statuses. You can change that by passing raise_for_status=True to the request (or to the ClientSession): it will then raise an exception for any status of 400 or higher, which in turn triggers the backoff retry.
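The interaction is: raise_for_status turns a bad status into an exception, and the retry decorator catches that exception. A minimal pure-Python sketch of the pattern, where StatusError and retry_on_exception are simplified stand-ins for aiohttp.ClientResponseError and backoff.on_exception, and the status sequence is made up:

```python
import asyncio

class StatusError(Exception):
    # Simplified stand-in for aiohttp.ClientResponseError.
    def __init__(self, status):
        super().__init__(f"HTTP {status}")
        self.status = status

def retry_on_exception(max_tries, base_delay=0.001):
    # Simplified stand-in for backoff.on_exception(backoff.expo, ...).
    def decorator(func):
        async def wrapper(*args, **kwargs):
            for attempt in range(max_tries):
                try:
                    return await func(*args, **kwargs)
                except StatusError:
                    if attempt == max_tries - 1:
                        raise
                    # exponential backoff between attempts
                    await asyncio.sleep(base_delay * 2 ** attempt)
        return wrapper
    return decorator

# Fake endpoint: responds 401 twice, then 200 -- like the flaky API in the question.
statuses = iter([401, 401, 200])

@retry_on_exception(max_tries=5)
async def fetch():
    status = next(statuses)
    if status != 200:  # roughly what raise_for_status=True does for 4xx/5xx
        raise StatusError(status)
    return status

result = asyncio.run(fetch())
print(result)  # 200, after two retried 401s
```

Without the raise, the 401 never becomes an exception, so the decorator never sees anything to retry, which is exactly the behaviour described in the question.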

2xx codes probably shouldn't be retried, though, since those indicate success.
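As a tiny illustration, a retry predicate matching this thread (should_retry is a made-up helper, with the asker's 429 exclusion included):

```python
def should_retry(status: int) -> bool:
    # 2xx means success, so no retry; the asker also excludes 429.
    return status != 429 and not (200 <= status < 300)

print([s for s in (200, 204, 401, 429, 500, 503) if should_retry(s)])  # [401, 500, 503]
```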


In any case, if you still want to raise for "anything other than 200 or 429", you can do it manually:

if response.status not in (200, 429):
    raise aiohttp.ClientResponseError(
        response.request_info,
        response.history,
        status=response.status,
        message=response.reason,
        headers=response.headers,
    )

I made a simple library, aiohttp_retry, that may help you.

Code like this should solve your problem:

from aiohttp import ClientSession
from aiohttp_retry import RetryClient

statuses = {x for x in range(100, 600)}
statuses.remove(200)
statuses.remove(429)

async with ClientSession() as client:
    retry_client = RetryClient(client)
    async with retry_client.get("https://google.com", retry_attempts=10,
                                retry_for_statuses=statuses) as response:
        text = await response.text()
        print(text)
    await retry_client.close()

Replace google.com with your own url, of course.

Maybe this is old, but for anyone wondering how to build such a solution themselves:

RequestData and ErrorResponseData are custom classes here, not built-ins:

import asyncio
from typing import List

import aiohttp
import dateutil.parser
from aiohttp import ClientTimeout

class DataAPI:
    def __init__(self, api_data_converter: APIDataConverter):
        self.api_data_converter = api_data_converter

    async def _bound_fetch(self, request_data: RequestData, session):
        try:
            async with session.get(request_data.url, raise_for_status=True) as response:
                return ResponseData(await response.text())
        except aiohttp.ClientConnectionError as e:
            Logging.log_exception('Connection error: {}'.format(str(e)))
            return ErrorResponseData(url=request_data.url, request_data=request_data)
        except Exception as e:
            Logging.log_exception('Data API error: {}'.format(str(e)))
            return ErrorResponseData(url=request_data.url, request_data=request_data)

    async def _run_requests(self, request_data: List[RequestData]):
        for rd in request_data:
            Logging.log_info('Request: {}'.format(rd.url))
        async with aiohttp.ClientSession(timeout=ClientTimeout(total=80)) as session:
            tasks = []
            for rd in request_data:
                task = asyncio.ensure_future(self._bound_fetch(rd, session))
                tasks.append(task)
            responses = asyncio.gather(*tasks)
            return await responses

    def get_data(self, request_data: List[RequestData]):
        loop = asyncio.new_event_loop()
        asyncio.set_event_loop(loop)
        skipped = request_data
        responses: List[ResponseData] = []
        for _ in range(2): # specify your retry count instead of 2
            interm_responses = loop.run_until_complete(asyncio.ensure_future(self._run_requests(skipped)))
            skipped = []
            for resp in interm_responses:
                if isinstance(resp, ErrorResponseData):
                    skipped.append(resp.request_data)
                else:
                    responses.append(resp)
            if not skipped:
                break

        if skipped:
            Logging.log_critical('Failed urls remaining')

        for resp in responses:
            data = self.api_data_converter.convert(resp.response)
            if not data:
                Logging.log_exception('Data API error')
                continue  # skip: data[-1] below would raise IndexError on empty data
            dt = dateutil.parser.parse(data[-1]['dt'])
            resp.response = data
            resp.last_candle_dt = dt
        return responses
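The answer says RequestData and ErrorResponseData are custom classes. A minimal sketch of what they might look like, with field names inferred from how the code above uses them (url, request_data, response, last_candle_dt), so treat these definitions as assumptions:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Any, Optional

@dataclass
class RequestData:
    url: str  # _bound_fetch and the logging calls only use .url

@dataclass
class ResponseData:
    response: Any  # raw text at first; get_data replaces it with converted data
    last_candle_dt: Optional[datetime] = None  # set by get_data

@dataclass
class ErrorResponseData:
    url: str
    request_data: RequestData  # re-queued by get_data for the next retry round

rd = RequestData(url="https://example.com/api")
err = ErrorResponseData(url=rd.url, request_data=rd)
```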

What about using a for loop?
@spectras I use this library because of the backoff strategy.
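Note that the retry loop in get_data above waits zero time between rounds. If you want the backoff behaviour the comment mentions, one option is to precompute capped exponential delays and sleep between rounds; the helper name and delay values here are illustrative:

```python
def backoff_delays(rounds: int, base: float = 0.5, cap: float = 30.0):
    # Delay before retry round i: base * 2**i, capped at `cap` seconds.
    return [min(base * 2 ** i, cap) for i in range(rounds)]

# In get_data, between retry rounds:
#   if skipped:
#       time.sleep(delay)
print(backoff_delays(5))  # [0.5, 1.0, 2.0, 4.0, 8.0]
```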