
Python: My Flask application response times out

Hi, I'm building a RESTful service with Docker + Flask + Gunicorn + SQLAlchemy. However, executing SQL sometimes times out.

Here is my engine:

def get_sybase_us_engine(self):
    from spa_manager import spa_config
    new_sybase_engine = create_engine(
        spa_config["SYBASE_URL"],
        connect_args=spa_config["SYBASE_CONN_ARGS"],
        echo=True,
        echo_pool=True,
        pool_size=10,
    )
    return new_sybase_engine
Yes, it's Sybase.
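
The two spa_config entries read here look roughly like this (placeholder values only, not my real DSN or connection arguments):

# Hypothetical placeholders for the spa_config entries used by the engine factory;
# the real values are loaded in spa_manager and are not part of this question.
spa_config = {
    "SYBASE_URL": "sybase+pyodbc://user:password@SYBASE_DSN",
    "SYBASE_CONN_ARGS": {"timeout": 30},
}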

Here is how I use the engine:

import yaml
from flask import Blueprint, request
from flask_restful import Api, Resource
us_sql_test_blueprint = Blueprint('us_sql_test_blueprint', __name__)
us_sql_test_blueprint_api = Api(us_sql_test_blueprint)


class USSQLTest(Resource):

    def post(self):
        """
        SQL Test API
        ---
        parameters:
          - name: json_string
            in: body
            type: string
            required: false
        tags:
          - SQL Test API
        """
        from app.util.sql_engine import SQLEngine
        from app.util.spa_logger import get_logger
        logger = get_logger()
        # get data from request
        spa_data_str = request.get_data()  
        # convert to dict
        requestbody = yaml.safe_load(spa_data_str)  
        logger.info("SQL TEST request incoming!")
        logger.info("Req: %s" % str(requestbody))
        # here I establish connection to DB
        sql_engine = SQLEngine() 
        sybase_us_engine = sql_engine.get_sybase_us_engine()
        sybase_us_conn = sybase_us_engine.connect()
        logger.info("US connection established!")
        from app.implement.spa_db_sql_test import us_sql_test
        try:
            # query data from DB
            result_dict = us_sql_test(sybase_us_conn)
            logger.info("No hang here, query successfully for all connections")
            logger.info(str(result_dict))
            return result_dict
        except Exception, e:
            logger.exception(e.message)
            return e.message
        finally:
            # finally close the connection
            logger.info("Close us connection")
            sybase_us_conn.close()
            sybase_us_engine.dispose()

us_sql_test_blueprint_api.add_resource(USSQLTest, '/sql-test-us')
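
us_sql_test lives in app.implement.spa_db_sql_test and is not shown here; simplified, it does roughly this (a sketch; the dict conversion is illustrative, and the query matches the one in my log below):

# Rough sketch of app/implement/spa_db_sql_test.py (not the real module).
def us_sql_test(conn):
    result = conn.execute("select top 10 * from part_master")
    # assumes the result set is small and the values are JSON-serializable
    rows = [dict(row) for row in result]
    return {"rows": rows}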
However, in my Gunicorn log a worker times out at some point (pid 265).

Here is my application log:

[2018-07-23 17:24:25,425] - spa_test_us_view.py [Line:32] - [INFO]-[thread:140514956412672]-[process:265] - US connection established!
[2018-07-23 17:24:25,426] - log.py [Line:109] - [INFO]-[thread:140514956412672]-[process:265] - 
        select top 10 * from part_master

[2018-07-23 17:24:25,426] - log.py [Line:109] - [INFO]-[thread:140514956412672]-[process:265] - ()
This process (265) hangs, and there are no further log entries from process 265.

Here is how I start Gunicorn in my Dockerfile:

ENTRYPOINT gunicorn -w 8 spa_manager:spa_app -b 0.0.0.0:80 --log-level=debug --timeout 900
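
For completeness, spa_app in spa_manager is created roughly like this (a simplified sketch, not the full module; the import path is illustrative):

# Simplified sketch of spa_manager.py; the real module also loads spa_config.
from flask import Flask
from app.view.spa_test_us_view import us_sql_test_blueprint  # illustrative path

spa_app = Flask(__name__)
spa_app.register_blueprint(us_sql_test_blueprint)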
By the way, if I don't use Flask and Gunicorn and just run a plain Python script, the timeout never happens.
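
The standalone version is essentially this sketch (same helpers as above), and it finishes without hanging:

# Sketch of the plain-script path (no Flask/Gunicorn) that never times out for me.
import time
from app.util.sql_engine import SQLEngine
from app.implement.spa_db_sql_test import us_sql_test

start = time.time()
sybase_us_engine = SQLEngine().get_sybase_us_engine()
sybase_us_conn = sybase_us_engine.connect()
try:
    print(us_sql_test(sybase_us_conn))
    print("finished in %.3f seconds" % (time.time() - start))
finally:
    sybase_us_conn.close()
    sybase_us_engine.dispose()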


What is going wrong? I have no idea…

Comment: How long does the query take to execute in Python?

@ThatBird As you can see, it's a very simple SQL statement, so I didn't log its duration. There is no timeout when I run it in plain Python. (I use NiFi to call the script.)