
Apache Airflow: after pointing airflow.cfg to Postgres, it still tries to run on MySQL


I am using Apache Airflow. So far my DAGs have been running smoothly on the LocalExecutor. Now I want to scale up and use the CeleryExecutor (I am still running locally, on a Mac). I have configured it to run on the CeleryExecutor, and when the server starts, the logs show CeleryExecutor. But whenever I run airflow worker (so it can be used as a worker) or airflow flower, I get an error: it tries to connect to MySQL and fails because the module cannot be found.

I have configured rabbitMQ locally and installed Airflow in a virtualenv. The relevant updated lines in airflow.cfg are:

broker_url = amqp://myuser:mypassword@localhost/myvhost
result_backend = db+postgresql://localhost:5433/celery_space?user=celery_user&password=celery_user
sql_alchemy_conn = postgresql://localhost:5433/postgres?user=postgres&password=root
executor = CeleryExecutor
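A quick sanity check (my suggestion, not from the original post) is to parse the airflow.cfg you believe is active and confirm these values are really there. The snippet below rebuilds a minimal config with the question's values and reads it back using Python's stdlib configparser; in practice you would point cfg_path at your real file instead:

```python
import configparser
import os
import tempfile

# Minimal stand-in for the [core] and [celery] sections of airflow.cfg,
# using the values from the question. For a real check, set cfg_path to
# something like os.path.join(os.environ["AIRFLOW_HOME"], "airflow.cfg").
cfg_text = """
[core]
executor = CeleryExecutor
sql_alchemy_conn = postgresql://localhost:5433/postgres?user=postgres&password=root

[celery]
broker_url = amqp://myuser:mypassword@localhost/myvhost
result_backend = db+postgresql://localhost:5433/celery_space?user=celery_user&password=celery_user
"""

with tempfile.NamedTemporaryFile("w", suffix=".cfg", delete=False) as f:
    f.write(cfg_text)
    cfg_path = f.name

parser = configparser.ConfigParser()
parser.read(cfg_path)

print(parser.get("core", "executor"))      # CeleryExecutor
print(parser.get("celery", "broker_url"))  # amqp://..., not sqla+mysql://...
os.remove(cfg_path)
```

If this prints the Postgres/amqp values but the worker banner still shows sqla+mysql, the worker is reading a different file than the one you edited.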
The worker error is below:

 -------------- celery@superadmins-MacBook-Pro.local v4.2.1 (windowlicker)
---- **** ----- 
--- * ***  * -- Darwin-18.2.0-x86_64-i386-64bit 2018-12-30 09:15:00
-- * - **** --- 
- ** ---------- [config]
- ** ---------- .> app:         airflow.executors.celery_executor:0x10c682fd0
- ** ---------- .> transport:   sqla+mysql://airflow:airflow@localhost:3306/airflow
- ** ---------- .> results:     mysql://airflow:**@localhost:3306/airflow
- *** --- * --- .> concurrency: 16 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** ----- 
 -------------- [queues]
                .> default          exchange=default(direct) key=default


[tasks]
  . airflow.executors.celery_executor.execute_command

[2018-12-30 09:15:00,290: INFO/MainProcess] Connected to sqla+mysql://airflow:airflow@localhost:3306/airflow
[2018-12-30 09:15:00,304: CRITICAL/MainProcess] Unrecoverable error: ModuleNotFoundError("No module named 'MySQLdb'")
Traceback (most recent call last):
  File "/Users/deepaksaroha/Desktop/apache_2.0/nb-atom-airflow/lib/python3.7/site-packages/celery/worker/worker.py", line 205, in start
    self.blueprint.start(self)
  File "/Users/deepaksaroha/Desktop/apache_2.0/nb-atom-airflow/lib/python3.7/site-packages/celery/bootsteps.py", line 119, in start
    step.start(parent)
  File "/Users/deepaksaroha/Desktop/apache_2.0/nb-atom-airflow/lib/python3.7/site-packages/celery/bootsteps.py", line 369, in start
    return self.obj.start()
  File "/Users/deepaksaroha/Desktop/apache_2.0/nb-atom-airflow/lib/python3.7/site-packages/celery/worker/consumer/consumer.py", line 317, in start
    blueprint.start(self)
  File "/Users/deepaksaroha/Desktop/apache_2.0/nb-atom-airflow/lib/python3.7/site-packages/celery/bootsteps.py", line 119, in start
    step.start(parent)
  File "/Users/deepaksaroha/Desktop/apache_2.0/nb-atom-airflow/lib/python3.7/site-packages/celery/worker/consumer/tasks.py", line 41, in start
    c.connection, on_decode_error=c.on_decode_error,
  File "/Users/deepaksaroha/Desktop/apache_2.0/nb-atom-airflow/lib/python3.7/site-packages/celery/app/amqp.py", line 297, in TaskConsumer
    **kw
  File "/Users/deepaksaroha/Desktop/apache_2.0/nb-atom-airflow/lib/python3.7/site-packages/kombu/messaging.py", line 386, in __init__
    self.revive(self.channel)
  File "/Users/deepaksaroha/Desktop/apache_2.0/nb-atom-airflow/lib/python3.7/site-packages/kombu/messaging.py", line 408, in revive
    self.declare()
  File "/Users/deepaksaroha/Desktop/apache_2.0/nb-atom-airflow/lib/python3.7/site-packages/kombu/messaging.py", line 421, in declare
    queue.declare()
  File "/Users/deepaksaroha/Desktop/apache_2.0/nb-atom-airflow/lib/python3.7/site-packages/kombu/entity.py", line 608, in declare
    self._create_queue(nowait=nowait, channel=channel)
  File "/Users/deepaksaroha/Desktop/apache_2.0/nb-atom-airflow/lib/python3.7/site-packages/kombu/entity.py", line 617, in _create_queue
    self.queue_declare(nowait=nowait, passive=False, channel=channel)
  File "/Users/deepaksaroha/Desktop/apache_2.0/nb-atom-airflow/lib/python3.7/site-packages/kombu/entity.py", line 652, in queue_declare
    nowait=nowait,
  File "/Users/deepaksaroha/Desktop/apache_2.0/nb-atom-airflow/lib/python3.7/site-packages/kombu/transport/virtual/base.py", line 531, in queue_declare
    self._new_queue(queue, **kwargs)
  File "/Users/deepaksaroha/Desktop/apache_2.0/nb-atom-airflow/lib/python3.7/site-packages/kombu/transport/sqlalchemy/__init__.py", line 82, in _new_queue
    self._get_or_create(queue)
  File "/Users/deepaksaroha/Desktop/apache_2.0/nb-atom-airflow/lib/python3.7/site-packages/kombu/transport/sqlalchemy/__init__.py", line 70, in _get_or_create
    obj = self.session.query(self.queue_cls) \
  File "/Users/deepaksaroha/Desktop/apache_2.0/nb-atom-airflow/lib/python3.7/site-packages/kombu/transport/sqlalchemy/__init__.py", line 65, in session
    _, Session = self._open()
  File "/Users/deepaksaroha/Desktop/apache_2.0/nb-atom-airflow/lib/python3.7/site-packages/kombu/transport/sqlalchemy/__init__.py", line 56, in _open
    engine = self._engine_from_config()
  File "/Users/deepaksaroha/Desktop/apache_2.0/nb-atom-airflow/lib/python3.7/site-packages/kombu/transport/sqlalchemy/__init__.py", line 51, in _engine_from_config
    return create_engine(conninfo.hostname, **transport_options)
  File "/Users/deepaksaroha/Desktop/apache_2.0/nb-atom-airflow/lib/python3.7/site-packages/sqlalchemy/engine/__init__.py", line 425, in create_engine
    return strategy.create(*args, **kwargs)
  File "/Users/deepaksaroha/Desktop/apache_2.0/nb-atom-airflow/lib/python3.7/site-packages/sqlalchemy/engine/strategies.py", line 81, in create
    dbapi = dialect_cls.dbapi(**dbapi_args)
  File "/Users/deepaksaroha/Desktop/apache_2.0/nb-atom-airflow/lib/python3.7/site-packages/sqlalchemy/dialects/mysql/mysqldb.py", line 102, in dbapi
    return __import__('MySQLdb')
ModuleNotFoundError: No module named 'MySQLdb'
[2018-12-30 09:15:00,599] {__init__.py:51} INFO - Using executor SequentialExecutor
Starting flask

Please advise how to get the worker to run in sync with the airflow.cfg settings. Any help is appreciated; let me know if further logs or any config files are needed.
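The telling detail in the log is that the worker connects to sqla+mysql://airflow:airflow@localhost:3306/airflow and then reports SequentialExecutor, i.e. it is not reading the edited airflow.cfg at all but a freshly generated default one. Airflow locates its config through the AIRFLOW_HOME environment variable and falls back to ~/airflow when it is unset. A minimal sketch of that lookup (an illustration of the documented behaviour, not Airflow's actual code):

```python
import os

def resolve_airflow_home() -> str:
    # Mirrors Airflow's documented behaviour: use $AIRFLOW_HOME if set,
    # otherwise fall back to ~/airflow, where Airflow generates a default
    # airflow.cfg on first run.
    return os.environ.get("AIRFLOW_HOME", os.path.expanduser("~/airflow"))

def config_path() -> str:
    return os.path.join(resolve_airflow_home(), "airflow.cfg")

# If the worker was started from a shell where AIRFLOW_HOME was never
# exported, this is the file it actually read:
print(config_path())
```

So a worker launched from a shell without AIRFLOW_HOME exported will happily use the default config, no matter what you edited elsewhere.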

Did you restart the airflow webserver after changing airflow.cfg?

Yes, I restarted the webserver, the scheduler, and the worker.

Try grep "sqla+mysql" /Users/deepaksaroha/Desktop/apache_2.0/ -R
You may be running against a different config file, or the config may contain multiple values. Also, make sure you have actually set the Airflow environment variable; see the docs.

@NinoWalker Cool! That actually worked. I had configured it for the first run but had not added it to my .rc file. Defining it there solved the problem. Thanks!
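The accepted fix, sketched as shell (the path is this user's setup; substitute your own, and in practice append to ~/.bashrc or ~/.zshrc rather than the throwaway rc file used here):

```shell
# Persist AIRFLOW_HOME so every new shell -- and therefore `airflow worker`
# and `airflow flower` -- resolves the same airflow.cfg. A throwaway rc
# file stands in for ~/.bashrc / ~/.zshrc in this demo.
RC=./demo_rc
echo 'export AIRFLOW_HOME=/Users/deepaksaroha/Desktop/apache_2.0' > "$RC"
. "$RC"                    # a login shell sources the real rc file itself
echo "AIRFLOW_HOME=$AIRFLOW_HOME"
rm -f "$RC"
```

With the variable exported in the rc file, every terminal (and every Airflow process launched from it) resolves the same airflow.cfg, so the worker, flower, scheduler, and webserver all agree.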