Triggering a Celery task on a RESTful route with curl


I want to set up a route for my Celery tasks and monitor them.

Here is the code in my Flask application, running on localhost:5000:

background.py

Task:

@celery.task(queue='cache')
def cache_user_tracks_with_features():
    return {'status': 'Task completed!'}

Route:

from time import sleep  # needed for the polling loop below

@task_bp.route('/filter', methods=['GET', 'POST'])
def cache_user_with_features():
    # kick off the task, then poll until it finishes
    task = cache_user_tracks_with_features.apply_async()
    while not task.ready():
        sleep(2)

    response_object = {
        'status': 'fail',
        'message': 'User does not exist'
    }
    try:
        user = User.query.filter_by(id=1).first()
        if not user:
            return jsonify(response_object), 404
        else:
            response_object = {
                'status': 'success',
                'data': {
                    'task_id': task.id,
                    'username': user.username,
                    'email': user.email,
                    'active': user.active
                }
            }
            return jsonify(response_object), 200
    except ValueError:
        return jsonify(response_object), 404

Trigger attempt

I tried to test it with curl in the terminal, like so:

$ curl -X POST http://localhost:5001/filter -H "Content-Type: application/json" 
but either I get curl: (52) Empty reply from server, or it just hangs. If I remove the task from the function and POST with curl, I get:

{
  "data": {
    "active": true, 
    "email": "me@mac.com", 
    "username": "me"
  }, 
  "status": "success"
}
The Docker logs tell me:

nginx_1    | 172.21.0.1 - - [03/Apr/2019:22:26:41 +0000] "GET /manifest.json HTTP/1.1" 304 0 "http://localhost/filter" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/73.0.3683.86 Safari/537.36" "-"

web-db_1   | 2019-04-01 19:52:52.415 UTC [1] LOG:  background worker "logical replication launcher" (PID 25) exited with exit code 1

celery_1   | worker: Warm shutdown (MainProcess)
celery_1   |  
celery_1   |  -------------- celery@fb24d4bd2089 v4.2.1 (windowlicker)
celery_1   | ---- **** ----- 
celery_1   | --- * ***  * -- Linux-4.9.125-linuxkit-x86_64-with 2019-04-06 21:34:38
celery_1   | -- * - **** --- 
celery_1   | - ** ---------- [config]
celery_1   | - ** ---------- .> app:         project:0x7f9923d8a9e8
celery_1   | - ** ---------- .> transport:   redis://redis:6379/0
celery_1   | - ** ---------- .> results:     redis://redis:6379/0
celery_1   | - *** --- * --- .> concurrency: 2 (prefork)
celery_1   | -- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
celery_1   | --- ***** ----- 
celery_1   |  -------------- [queues]
celery_1   |                 .> cache            exchange=cache(direct) key=cache
celery_1   |                 
celery_1   | 
celery_1   | [tasks]
celery_1   |   . project.api.routes.background.cache_user_tracks_with_analysis
celery_1   |   . project.api.routes.background.cache_user_tracks_with_features

This is how I configure Celery and Flower (for Celery monitoring) in my docker-compose file:

docker-compose-dev.yml

web/logs/celery_log

Flower shows the worker as active on its dashboard:

Celery instantiation:

Question

What am I missing? How can I trigger this Celery task and monitor it?

I don't know what exactly the problem is (it looks fine)... There are several ways (as is typical with Celery, there are many paths to the same goal) to achieve what you want:

1) Use apply_async() and poll for completion. For example:

res = cache_user_tracks_with_features.apply_async("""parameters here""")
while not res.ready():
    sleep(2)
# business logic
2) Use apply_async() with a link to a task to run once the job finishes:

res = cache_user_tracks_with_features.apply_async(
        """parameters here""", 
        link=task_to_run_when_finished)
Celery also has a link_error parameter, so you can give it a function to execute if an error occurs.

3) Use Celery workflows. Build a chain containing cache_user_tracks_with_features and the other tasks to execute.


Or perhaps something completely different is giving you trouble...

The problem was in config.py:

REDIS_HOST = "0.0.0.0"
REDIS_PORT = 6379
BROKER_URL = os.environ.get('REDIS_URL', "redis://{host}:{port}/0".format(
                                                                host=REDIS_HOST, 
                                                                port=str(REDIS_PORT)))
INSTALLED_APPS = ['routes']
# celery config
CELERYD_CONCURRENCY = 10
CELERY_BROKER_URL = BROKER_URL #<-------- THIS WAS OVERRIDING

Simply setting CELERY_BROKER_URL to redis://redis:6379/0 in config.py, as well as in the docker environment, solved the problem: tasks are now picked up by the worker, and the process is monitored by Flower.
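A corrected config.py could look like the sketch below; it assumes the Redis service is named `redis` in docker-compose, matching the environment shown further down:

```python
import os

REDIS_HOST = "redis"  # the docker-compose service name, not 0.0.0.0
REDIS_PORT = 6379
BROKER_URL = os.environ.get(
    'REDIS_URL', "redis://{host}:{port}/0".format(host=REDIS_HOST,
                                                  port=REDIS_PORT))

# celery config
CELERYD_CONCURRENCY = 10
CELERY_BROKER_URL = BROKER_URL  # now consistent with CELERY_BROKER in docker-compose
```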

I tried solution 1), with a simplified dummy task (see edit), and it does not work with the curl POST. It just hangs. If I remove task = cache_user_tracks_with_features.apply_async() and 'task_id': task.id from the response_object, the object is returned. So the task is still the problem. @dejanlekic

.delay does not poll in the background. It is just a more convenient way of calling apply_async(). You are probably confusing delay with Task.get.

You are completely right. I somehow confused it with .get()! Thanks for pointing that out.

How do you instantiate the Celery application object?

See the edit, with the Celery instantiation and config.py. Your comment pointed me in the right direction and I solved the problem. But I cannot award the bounty for answering my own question, so if you want to post a similar answer, feel free, and you will get the bounty. I will delete my answer afterwards.

Good answer. :) Yes, questions like this pop up all over the place from time to time...
The docker-compose environment section that was being overridden:

environment:
      - CELERY_BROKER=redis://redis:6379/0  #<------- THIS WAS BEING OVERRIDDEN
      - CELERY_RESULT_BACKEND=redis://redis:6379/0