Python: can I combine concurrent.futures with Flask to improve performance?


I'm wondering whether it's appropriate to use concurrent.futures with Flask. Here's an example:

import requests
from flask import Flask
from concurrent.futures import ThreadPoolExecutor

executor = ThreadPoolExecutor(max_workers=10)
app = Flask(__name__)

@app.route("/path/<xxx>")
def hello(xxx):
    f = executor.submit(task, xxx)
    return "OK"

def task(xxx):  # must accept the argument passed to executor.submit
    resp = requests.get("some_url")
    # save to mongodb

app.run()
The task is I/O-bound and needs no return value. Requests won't come in often — at most 10/s, I'd guess.


I tested it and it works. What I'd like to know is whether using multithreading this way actually improves performance. Will Flask somehow block the task?

It depends on more than just Flask — for example, on what you're running in front of it (gunicorn, gevent, uwsgi, nginx, etc.). If you find that the request to "some_url" really is a bottleneck, pushing it onto another thread may give you a boost, but that depends on your specific situation; many elements in a web stack can make the process "slow".
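One caveat with this fire-and-forget pattern: if the submitted task raises, the exception is stored on the Future and silently discarded unless something reads it. A minimal sketch (the task body, names, and the `results` list are placeholders, not the asker's real code) that surfaces errors with `add_done_callback`:

```python
from concurrent.futures import ThreadPoolExecutor

executor = ThreadPoolExecutor(max_workers=10)
results = []  # stand-in for real logging

def task(xxx):
    # stand-in for the real I/O work (requests.get + mongo insert)
    if xxx == "bad":
        raise ValueError("fetch failed")
    return f"fetched {xxx}"

def log_outcome(future):
    # invoked once the task finishes; future.exception() is the only
    # place a fire-and-forget error ever surfaces
    err = future.exception()
    if err is not None:
        results.append(("error", str(err)))
    else:
        results.append(("ok", future.result()))

for arg in ("good", "bad"):
    f = executor.submit(task, arg)
    f.add_done_callback(log_outcome)

executor.shutdown(wait=True)
```

In the Flask view you would attach the callback right after `executor.submit(task, xxx)`, so failed background fetches at least get logged instead of vanishing.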

Rather than multithreading inside the Flask process (which can get complicated quickly), pushing the blocking I/O to a helper process may be a better solution. You can publish a Redis message to a process running an asyncio event loop, and that scales nicely.

app.py

from flask import Flask
import redis

r = redis.StrictRedis(host='127.0.0.1', port=6379)
app = Flask(__name__)

@app.route("/")
def hello():
    # send your message to the other process with redis
    r.publish('some-channel', 'some data')
    return "OK"

if __name__ == '__main__':
    app.run(port=4000, debug=True)
helper.py

import asyncio
import asyncio_redis
import aiohttp

async def get_page():
    # get some url without blocking the event loop
    async with aiohttp.ClientSession() as session:
        async with session.get('http://example.com') as resp:
            data = await resp.read()

    # insert into mongo using Motor or some other async DBAPI
    #await insert_into_database(data)

async def run():
    # Create connection
    connection = await asyncio_redis.Connection.create(host='127.0.0.1', port=6379)

    # Create subscriber.
    subscriber = await connection.start_subscribe()

    # Subscribe to channel.
    await subscriber.subscribe(['some-channel'])

    # Inside a while loop, wait for incoming events.
    while True:
        reply = await subscriber.next_published()
        print('Received: ', repr(reply.value), 'on channel', reply.channel)
        await get_page()

if __name__ == '__main__':
    asyncio.run(run())

If you have infrequent I/O-bound requests, you're better off with a web server built on the reactor pattern, like Tornado, or a library like Twisted; this isn't a typical use case for Flask. I'm trying to decide whether I should use it.
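For what it's worth, the reactor idea behind Tornado and Twisted is also available in the standard library as asyncio. A toy sketch, with `asyncio.sleep` standing in for a real non-blocking HTTP fetch, showing a single event loop interleaving two slow "requests" on one thread:

```python
import asyncio
import time

async def fake_fetch(url, delay):
    # stand-in for a non-blocking HTTP request (e.g. aiohttp)
    await asyncio.sleep(delay)
    return f"body of {url}"

async def main():
    start = time.monotonic()
    # the event loop interleaves both "requests" on one thread,
    # so two 0.2s waits finish in roughly 0.2s total, not 0.4s
    bodies = await asyncio.gather(
        fake_fetch("http://example.com/a", 0.2),
        fake_fetch("http://example.com/b", 0.2),
    )
    elapsed = time.monotonic() - start
    return bodies, elapsed

bodies, elapsed = asyncio.run(main())
```

This is the same multiplexing that makes the helper.py process above scale well: waiting on the network costs no threads, only a suspended coroutine.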