How to convert synchronous code to asynchronous in Python


My code works fine for fetching data in the range n to q, but it is far too slow when I want to cover a large range (1-100000): each POST request takes 3-4 seconds to complete. I want to speed it up. Here is my code:

import requests
from tqdm import tqdm

url = "https://www.example.com/info/item"
headers = {
    'Content-Type': "application/x-www-form-urlencoded",
    'Access-Control-Allow-Origin': "*",
    'Accept-Encoding': "gzip, deflate",
    'Accept-Language': "en-US",
    }

n = 1
q = 50
sum = 0
for i in tqdm(range(n,q)):
    payload = "item_id={}".format(i+1)
    response = requests.request("POST", url, data=payload, headers=headers)
    print(response.text)
    sum = sum + i
The code above takes more than 150 seconds to finish all 50 requests.


That is why I tried to make it asynchronous; the total time it now takes to send the 50 requests is …

As they already told you in the comments, the best way to achieve your goal is to create separate threads rather than making the code asynchronous:

import requests
from tqdm import tqdm
import threading

url = "https://www.example.com/info/item"
headers = {
    'Content-Type': "application/x-www-form-urlencoded",
    'Access-Control-Allow-Origin': "*",
    'Accept-Encoding': "gzip, deflate",
    'Accept-Language': "en-US",
    }

n = 1
q = 50
sum = 0

def ThreadPOST(payload, headers):
    # url is read from the enclosing module scope
    response = requests.request("POST", url, data=payload, headers=headers)
    print(response.text)



for i in tqdm(range(n,q)):
    payload = "item_id={}".format(i+1)
    threading.Thread(target=ThreadPOST, args=(payload, headers)).start()
    sum = sum + i

This spawns 50 threads, each of which posts the data for its own item ID.
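
For the much larger range mentioned in the question (1-100000), spawning one thread per request gets expensive. Below is a minimal sketch of the same idea with a bounded pool via concurrent.futures.ThreadPoolExecutor; the pool size of 20 is an arbitrary choice, and the URL and headers are simply carried over from the code above.

import requests
from concurrent.futures import ThreadPoolExecutor, as_completed

url = "https://www.example.com/info/item"
headers = {
    'Content-Type': "application/x-www-form-urlencoded",
    'Access-Control-Allow-Origin': "*",
    'Accept-Encoding': "gzip, deflate",
    'Accept-Language': "en-US",
    }

def post_item(item_id):
    # One blocking POST per item id; returns the response body.
    payload = "item_id={}".format(item_id)
    response = requests.post(url, data=payload, headers=headers)
    return response.text

# At most 20 requests run at the same time (arbitrary cap, tune as needed).
with ThreadPoolExecutor(max_workers=20) as executor:
    futures = [executor.submit(post_item, i + 1) for i in range(1, 50)]
    for future in as_completed(futures):
        print(future.result())

The with block also waits for every submitted request to finish before the script continues.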

I found a working fix for the asynchronous code above, and it performs very well:

import asyncio
import aiohttp
import time

async def make_numbers(numbers, _numbers):
    for i in range(numbers, _numbers):
        yield i

async def fetch():
    url = "https://www.example.com/info/item"
    async with aiohttp.ClientSession() as session:
        post_tasks = []
        # prepare the coroutines that post
        async for x in make_numbers(1, 100):
            post_tasks.append(do_post(session, url, x))
        # now execute them all at once
        await asyncio.gather(*post_tasks)

async def do_post(session, url, x):
    headers = {
        'Content-Type': "application/x-www-form-urlencoded",
        'Access-Control-Allow-Origin': "*",
        'Accept-Encoding': "gzip, deflate",
        'Accept-Language': "en-US"
    }
    payload = "item_id={}".format(x)
    async with session.post(url, data=payload, headers=headers) as response:
        data = await response.text()
        print("-> Created account number %d" % x)
        print(data)

s = time.perf_counter()
loop = asyncio.get_event_loop()
try:
    loop.run_until_complete(fetch())
finally:
    loop.close()

elapsed = time.perf_counter() - s
print(f"{__file__} executed in {elapsed:0.2f} seconds.")

Can you solve the above problem for me? Async is not my area of expertise.

Stack Overflow is not a code-writing service; its purpose is to answer specific questions. For parallel POST requests I have used this library before:

The answer you are looking for is multithreading.

This is network I/O, and asyncio is very well suited to this kind of task; the aiohttp version shown above is exactly that approach. A sketch that additionally caps the number of concurrent requests follows below.
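
For the full 1-100000 range from the question, firing every request at once would likely exhaust memory or overwhelm the server. Below is a minimal sketch of the same aiohttp approach with the number of in-flight requests capped by an asyncio.Semaphore; the limit of 50, the URL and the headers are assumptions carried over from the code above, not part of the original answer.

import asyncio
import aiohttp

URL = "https://www.example.com/info/item"
HEADERS = {
    'Content-Type': "application/x-www-form-urlencoded",
    'Accept-Encoding': "gzip, deflate",
    'Accept-Language': "en-US",
    }

async def do_post(session, semaphore, item_id):
    payload = "item_id={}".format(item_id)
    async with semaphore:  # wait for a free slot before sending
        async with session.post(URL, data=payload, headers=HEADERS) as response:
            return await response.text()

async def main():
    semaphore = asyncio.Semaphore(50)  # at most 50 requests in flight (arbitrary cap)
    async with aiohttp.ClientSession() as session:
        tasks = [do_post(session, semaphore, i) for i in range(1, 100001)]
        for text in await asyncio.gather(*tasks):
            print(text)

asyncio.run(main())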