Python 3.x Sanic web framework performance

I have a question about sanic/asyncpg performance.

During testing, strange things kept happening (possibly by design).

First, let me explain the test procedure. It is quite simple:

I use Locust to push the server as hard as possible by setting a high maximum number of users.

The test script is:

from locust import HttpLocust, TaskSet, task, between


class UserActions(TaskSet):
    @task(1)
    def test_point_1(self):
        self.client.get(
            '/json_1',
            headers={'Content-Type': 'application/json'}
        )

    @task(2)
    def test_point_2(self):
        self.client.get(
            '/json_2',
            headers={'Content-Type': 'application/json'}
        )


class ApplicationUser(HttpLocust):
    task_set = UserActions
    wait_time = between(0, 0)

It is used to test the following code. Note that asyncpg calls the PostgreSQL pg_sleep function to simulate load:

import asyncio
import uvloop
from asyncpg import create_pool
from sanic import Sanic, response
from sanic.log import logger
import aiotask_context as context

app = Sanic(__name__)

DATABASE = {
    'type': 'postgresql',
    'host': '127.0.0.1',
    'user': 'test_user',
    'port': '5432',
    'password': 'test_password',
    'database': 'test_database'
}

conn_uri = '{0}://{1}:{2}@{3}:{4}/{5}'.format(
            'postgres',
            DATABASE['user'], DATABASE['password'], DATABASE['host'],
            DATABASE['port'], DATABASE['database'])


@app.route("/json_1")
async def handler_json_1(request):
    async with request.app.pg.acquire() as connection:
        await connection.fetchrow('SELECT pg_sleep(0.85);')
    return response.json({"foo": "bar"})


@app.route("/json_2")
async def handler_json_2(request):
    async with request.app.pg.acquire() as connection:
        await connection.fetchrow('SELECT pg_sleep(0.2);')
    return response.json({"foo": "bar"})


@app.listener("before_server_start")
async def listener_before_server_start(*args, **kwargs):
    try:

        pg_pool = await create_pool(
            conn_uri, min_size=2, max_size=10,
            server_settings={'application_name': 'test_backend'})
        app.pg = pg_pool

    except Exception as bss_error:
        logger.error('before_server_start_test erred with :{}'.format(bss_error))
        app.pg = None


@app.listener("after_server_start")
async def listener_after_server_start(*args, **kwargs):
    # print("after_server_start")
    pass


@app.listener("before_server_stop")
async def listener_before_server_stop(*args, **kwargs):
    # print("before_server_stop")
    pass


@app.listener("after_server_stop")
async def listener_after_server_stop(*args, **kwargs):
    # print("after_server_stop")
    pass


if __name__ == '__main__':
    asyncio.set_event_loop(uvloop.new_event_loop())
    server = app.create_server(host="0.0.0.0", port=8282, return_asyncio_server=True)
    loop = asyncio.get_event_loop()
    loop.set_task_factory(context.task_factory)
    task = asyncio.ensure_future(server)
    try:
        loop.run_forever()
    except Exception as lerr:
        logger.error('Loop run error: {}'.format(lerr))
        loop.stop()

The problem is that after a random amount of time, the server becomes unresponsive (it does not return a 503 or any other status code) for approximately 60 seconds. The process also hangs (I can see it with ps aux, and CTRL+C cannot kill it).

This is problematic because, for one thing, it is hard to detect, and it is also hard to determine a safe rate at which to send requests to the server.

Could this be a configuration issue (sanic/asyncpg)?


Is setting an nginx/sanic request timeout the only option for avoiding this problem?
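
(For reference, a sanic-side timeout would just be a config change like the sketch below; REQUEST_TIMEOUT and RESPONSE_TIMEOUT are standard Sanic config keys, but the values shown are purely illustrative, not tuned recommendations:)

# Illustrative sketch: cap how long Sanic waits for a request to arrive and
# for a handler to finish, so stuck requests are aborted instead of piling up.
app.config.REQUEST_TIMEOUT = 15   # seconds allowed to receive the full request
app.config.RESPONSE_TIMEOUT = 15  # seconds a handler may take before Sanic aborts the response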

Your asyncpg pool is limited to 10 connections. So with at most 10 requests in flight at a time, each taking 0.2 seconds, your maximum possible throughput is 1 second / 0.2 seconds * 10 (pool size) = 50 RPS. Beyond that, every incoming request waits for a connection, the request queue grows far faster than you can serve it, and your server becomes unresponsive.
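
Two mitigations you could combine, sketched below against the question's code (the pool sizes and the 1-second timeout are illustrative, not tuned values): enlarge the asyncpg pool, and pass a timeout to pool.acquire() so a request fails fast with an explicit error instead of queueing behind a saturated pool.

@app.listener("before_server_start")
async def listener_before_server_start(*args, **kwargs):
    # Sketch only: example pool sizes, not tuned values.
    pg_pool = await create_pool(
        conn_uri, min_size=10, max_size=50,
        server_settings={'application_name': 'test_backend'})
    app.pg = pg_pool


@app.route("/json_2")
async def handler_json_2(request):
    try:
        # asyncpg's Pool.acquire() accepts a timeout (in seconds); if no connection
        # frees up in time it raises asyncio.TimeoutError instead of waiting forever.
        async with request.app.pg.acquire(timeout=1.0) as connection:
            await connection.fetchrow('SELECT pg_sleep(0.2);')
    except asyncio.TimeoutError:
        return response.json({"error": "database busy"}, status=503)
    return response.json({"foo": "bar"})

With an acquire timeout the server degrades by returning a fast 503 under overload, which is far easier for a load test (or nginx) to detect than a silent hang.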