
Python 2.7: Airflow with Celery and RabbitMQ is not executing jobs


Below is the configuration I am using:

[core]
# The home folder for airflow, default is ~/airflow
airflow_home = /root/airflow

# The folder where your airflow pipelines live, most likely a
# subfolder in a code repository
dags_folder = /root/airflow/dags

# The folder where airflow should store its log files. This location
base_log_folder = /root/airflow/logs

# An S3 location can be provided for log backups
# For S3, use the full URL to the base folder (starting with "s3://...")
s3_log_folder = None

# The executor class that airflow should use. Choices include
# SequentialExecutor, LocalExecutor, CeleryExecutor
#executor = SequentialExecutor
#executor = LocalExecutor
executor = CeleryExecutor

# The SqlAlchemy connection string to the metadata database.
# SqlAlchemy supports many different database engines; more information
# on their website
#sql_alchemy_conn = sqlite:////home/centos/airflow/airflow.db
sql_alchemy_conn = mysql://username:password@XXX.XXX.XXX.XXX:3306/airflow_prod

[celery]
# This section only applies if you are using the CeleryExecutor in
# [core] section above


# The app name that will be used by celery
celery_app_name = airflow.executors.celery_executor

# The concurrency that will be used when starting workers with the
# "airflow worker" command. This defines the number of task instances that
# a worker will take, so size up your workers based on the resources on
# your worker box and the nature of your tasks
celeryd_concurrency = 16

# When you start an airflow worker, airflow starts a tiny web server
# subprocess to serve the workers local log files to the airflow main
# web server, who then builds pages and sends them to users. This defines
# the port on which the logs are served. It needs to be unused, and open
# and visible from the main web server so it can connect to the workers.
worker_log_server_port = 8793

# The Celery broker URL. Celery supports RabbitMQ, Redis and experimentally
# a sqlalchemy database. Refer to the Celery documentation for more
# information.
broker_url = pyamqp://guest:guest@XXX.XXX.XXX.XXX:5672/


# Another key Celery setting
celery_result_backend = db+mysql://username:password@XXX.XXX.XXX.XXX:3306/airflow_prod

# Celery Flower is a sweet UI for Celery. Airflow has a shortcut to start
# it `airflow flower`. This defines the port that Celery Flower runs on
flower_port = 5556

# Default queue that tasks get assigned to and that worker listen on.
default_queue = default
But the jobs never run. The scheduler shows that it is checking state, as below:

[2017-05-11 05:09:13,070] {models.py:2274} INFO - Checking state for <DagRun tutorial @ 2015-06-13 00:00:00: scheduled__2015-06-13T00:00:00, externally triggered: False>
[2017-05-11 05:09:13,072] {models.py:2274} INFO - Checking state for <DagRun tutorial @ 2015-06-14 00:00:00: scheduled__2015-06-14T00:00:00, externally triggered: False>
[2017-05-11 05:09:13,074] {models.py:2274} INFO - Checking state for <DagRun tutorial @ 2015-06-15 00:00:00: scheduled__2015-06-15T00:00:00, externally triggered: False>
[2017-05-11 05:09:13,076] {models.py:2274} INFO - Checking state for <DagRun tutorial @ 2015-06-16 00:00:00: scheduled__2015-06-16T00:00:00, externally triggered: False>
[2017-05-11 05:09:13,078] {models.py:2274} INFO - Checking state for <DagRun tutorial @ 2017-05-10 04:46:29: manual__2017-05-10T04:46:28.756946, externally triggered: True>
[2017-05-11 05:09:13,080] {models.py:2274} INFO - Checking state for <DagRun tutorial @ 2017-05-10 05:08:20: manual__2017-05-10T05:08:20.252573, externally triggered: True>
The Airflow UI is up and running. Celery Flower does not show any workers, and my jobs do not run.

Below is the order in which I start the services:

airflow scheduler

airflow webserver -p 8080

airflow worker


Is there anything I am missing?

It is hard to answer your question definitively without knowing which version of Airflow you are running and how you have configured your RabbitMQ server. That said, here are a few things to look into.

The broker URL in your airflow.cfg does not specify a virtual host, so according to the documentation the default virtual host will be used. I did some digging but could not find what the default virtual host is for pyamqp, but it is worth looking into.
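As a quick sanity check (my suggestion, not part of the original answer), you can ask RabbitMQ which virtual hosts exist and what permissions the guest user has on the default vhost "/", assuming rabbitmqctl is available on the broker host:

# List all virtual hosts configured on the broker
rabbitmqctl list_vhosts

# Show which users can configure/write/read on the default vhost "/"
rabbitmqctl list_permissions -p /

If guest does not appear with ".*" permissions on the vhost your broker URL resolves to, the worker will not be able to consume tasks.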

Alternatively, you can explicitly configure a virtual host using rabbitmqctl. I have copied and pasted the relevant commands below:

# Configure RabbitMQ: create user and grant privileges
rabbitmqctl add_user rabbitmq_user_name rabbitmq_password
rabbitmqctl add_vhost rabbitmq_virtual_host_name
rabbitmqctl set_user_tags rabbitmq_user_name rabbitmq_tag_name
rabbitmqctl set_permissions -p rabbitmq_virtual_host_name rabbitmq_user_name ".*" ".*" ".*"
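If you go this route, the broker_url in airflow.cfg should then point at the user and virtual host you just created. A sketch using the placeholder names above (substitute your real credentials and host):

broker_url = pyamqp://rabbitmq_user_name:rabbitmq_password@XXX.XXX.XXX.XXX:5672/rabbitmq_virtual_host_name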
Finally, the version of Celery you are using could be the problem. At the time of posting, Celery 4.X.X does not play nicely with Airflow. Try uninstalling Celery and reinstalling a version that works:

pip uninstall celery
pip install celery==3.1.7
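After reinstalling, it may be worth confirming which version the worker will actually import (a hedged check, not from the original answer):

python -c "import celery; print(celery.__version__)"

Run this with the same Python interpreter that launches "airflow worker", since a different virtualenv could still be picking up Celery 4.X.X.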

When you start the worker, I would check the worker's console. Is there anything in it?

I am not seeing any workers on the Flower UI either. Not sure if I missed anything on the configuration side. Is "airflow worker" enough to start the Celery workers?

"airflow worker" should be enough. Check the console output when you run it; I think you will find more information there about what is going on.
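Beyond the worker console, one more broker-side check (my suggestion, not from the comments above) is to ask RabbitMQ whether any consumers are attached to the task queue at all. If messages pile up with zero consumers, the worker never connected to the broker; if the queue stays empty, the scheduler is not publishing tasks:

# Show every queue on the default vhost with its message and consumer counts
rabbitmqctl list_queues -p / name messages consumers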