Python: limiting the number of concurrent requests with aiohttp

I am using aiohttp to download images and would like to know whether there is a way to limit the number of open requests that have not yet finished. This is the code I currently have:

import asyncio
import aiohttp


async def get_images(url, session):

    chunk_size = 100

    # Print statement to show when a request is being made. 
    print(f'Making request to {url}')

    async with session.get(url=url) as r:
        with open('path/name.png', 'wb') as file:
            while True:
                chunk = await r.content.read(chunk_size)
                if not chunk:
                    break
                file.write(chunk)

# List of urls to get images from
urls = [...]

conn = aiohttp.TCPConnector(limit=3)
loop = asyncio.get_event_loop()
session = aiohttp.ClientSession(connector=conn, loop=loop)
loop.run_until_complete(asyncio.gather(*(get_images(url, session=session) for url in urls)))
The problem is that I threw in a print statement to show me when each request is made, and it fires off almost 21 requests at once instead of the 3 I want to limit it to (i.e. once an image has finished downloading, it should move on to the next url in the list to fetch). I'm just wondering what I'm doing wrong.

asyncio.Semaphore solves exactly this problem.

In your case it will look something like this:

semaphore = asyncio.Semaphore(3)


async def get_images(url, session):
    async with semaphore:
        print(f'Making request to {url}')
        # ...

You may also want to take a look at this ready-to-run code example demonstrating how a semaphore works.
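
Since that example isn't reproduced here, below is a minimal self-contained sketch of the same idea (my own illustration, not the linked code): asyncio.Semaphore(3) lets at most three of the placeholder "downloads" run at the same time, with asyncio.sleep() standing in for the real network I/O.

import asyncio


async def fake_download(n, semaphore):
    async with semaphore:
        # At most 3 coroutines can be inside this block at any moment.
        print(f'start {n}')
        await asyncio.sleep(1)  # stands in for the real request
        print(f'done {n}')


async def main():
    # Create the semaphore inside the running loop to avoid
    # loop-binding issues on older Python versions.
    semaphore = asyncio.Semaphore(3)
    await asyncio.gather(*(fake_download(n, semaphore) for n in range(10)))

asyncio.run(main())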

Your limit setting works correctly. You made a mistake while debugging.

As Mikhail Gerasimov pointed out, you put the print() call in the wrong place: it must be inside the session.get() context.
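
For illustration only (this is a sketch of your original function with just the print moved, not code taken from the answer), the rearranged version would look roughly like this:

async def get_images(url, session):
    chunk_size = 100
    async with session.get(url=url) as r:
        # By this point session.get() has acquired one of the
        # connector's 3 connections, so at most 3 downloads are
        # in progress when this line fires.
        print(f'Making request to {url}')
        with open('path/name.png', 'wb') as file:
            while True:
                chunk = await r.content.read(chunk_size)
                if not chunk:
                    break
                file.write(chunk)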

To make sure the limit is respected, I tested your code against a simple logging server, and the test shows that the server receives exactly as many connections as you set in TCPConnector. Here is the test:

import asyncio
import aiohttp
loop = asyncio.get_event_loop()


class SilentServer(asyncio.Protocol):
    def connection_made(self, transport):
        # We will know when the connection is actually made:
        print('SERVER |', transport.get_extra_info('peername'))


async def get_images(url, session):

    chunk_size = 100

    # This log doesn't guarantee that we will connect,
    # session.get() will freeze if you reach TCPConnector limit
    print(f'CLIENT | Making request to {url}')

    async with session.get(url=url) as r:
        while True:
            chunk = await r.content.read(chunk_size)
            if not chunk:
                break

urls = [f'http://127.0.0.1:1337/{x}' for x in range(20)]

conn = aiohttp.TCPConnector(limit=3)
session = aiohttp.ClientSession(connector=conn, loop=loop)


async def test():
    await loop.create_server(SilentServer, '127.0.0.1', 1337)
    await asyncio.gather(*(get_images(url, session=session) for url in urls))

loop.run_until_complete(test())

But why doesn't setting the limit in the TCPConnector work here?

@AndriyMaletsky I think it does work, but the flow only blocks later, after the print line, somewhere inside session.get(). And I think it is more convenient not to delegate this job to the connector, but to use a semaphore right where it is needed.