Python 2.7: Airflow scheduler crashes when we trigger a DAG from the Airflow web server


If we turn a DAG on and trigger it from the Airflow web server, the Airflow scheduler process crashes.

Airflow version: v1.10.4

Redis server: v5.0.7

Relevant settings from airflow.cfg:

executor=CeleryExecutor

broker_url = 'redis://:password@redis-host:2287/0'
sql_alchemy_conn = postgresql+psycopg2://user:password@host/dbname

result_backend = 'db+postgresql://user:password@host/dbname'
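
A note on the values above: the traceback below ends with Celery failing to import a module literally named 'db, leading quote included. Airflow reads airflow.cfg with Python's ConfigParser, which does not strip quote characters, so a quoted value like 'db+postgresql://...' reaches Celery verbatim and the backend scheme becomes 'db. A minimal sketch of the same settings written without quotes (user, password, and host are the placeholders from the question):

executor = CeleryExecutor
broker_url = redis://:password@redis-host:2287/0
sql_alchemy_conn = postgresql+psycopg2://user:password@host/dbname
result_backend = db+postgresql://user:password@host/dbname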
With this configuration, the scheduler crashes with the following error message.

scheduler_job.py:1325} ERROR - Exception when executing execute_helper
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/airflow/jobs/scheduler_job.py", line 1323, in _execute
    self._execute_helper()
  File "/usr/lib/python2.7/site-packages/airflow/jobs/scheduler_job.py", line 1412, in _execute_helper
    self.executor.heartbeat()
  File "/usr/lib/python2.7/site-packages/airflow/executors/base_executor.py", line 132, in heartbeat
    self.trigger_tasks(open_slots)
  File "/usr/lib/python2.7/site-packages/airflow/executors/celery_executor.py", line 203, in trigger_tasks
    cached_celery_backend = tasks[0].backend
  File "/usr/lib/python2.7/site-packages/celery/local.py", line 146, in __getattr__
    return getattr(self._get_current_object(), name)
  File "/usr/lib/python2.7/site-packages/celery/app/task.py", line 1037, in backend
    return self.app.backend
  File "/usr/lib/python2.7/site-packages/kombu/utils/objects.py", line 44, in __get__
    value = obj.__dict__[self.__name__] = self.__get(obj)
  File "/usr/lib/python2.7/site-packages/celery/app/base.py", line 1223, in backend
    return self._get_backend()
  File "/usr/lib/python2.7/site-packages/celery/app/base.py", line 940, in _get_backend
    self.loader)
  File "/usr/lib/python2.7/site-packages/celery/app/backends.py", line 74, in by_url
    return by_name(backend, loader), url
  File "/usr/lib/python2.7/site-packages/celery/app/backends.py", line 54, in by_name
    cls = symbol_by_name(backend, aliases)
  File "/usr/lib/python2.7/site-packages/kombu/utils/imports.py", line 57, in symbol_by_name
    module = imp(module_name, package=package, **kwargs)
  File "/usr/lib64/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
ImportError: No module named 'db
Why does the scheduler crash when a DAG is triggered? I tried running pip install DB, but it did not solve the problem.
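
As an aside, the db in result_backend is not a pip-installable package; it is Celery's prefix for its SQLAlchemy result backend, resolved from the URL scheme. A simplified Python sketch of how that scheme gets split, mirroring the partition/split logic in the celery/app/backends.py frames of the traceback above (the stray quote comes from the quoted config value):

# Simplified sketch, not Celery's actual API.
url = "'db+postgresql://user:password@host/dbname'"  # quoted value passed through verbatim
scheme = url.partition('://')[0]                     # "'db+postgresql"
backend = url.split('+', 1)[0] if '+' in scheme else scheme
print(backend)  # 'db  -> importing a module named "'db" raises ImportError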

As the error suggests, you must not have set up the database correctly.

Did you run

$ airflow initdb

before trying to start the web server?
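
For reference, a typical first-run sequence on Airflow 1.10.x looks like this (a sketch; the port is the conventional default, not from the question):

$ airflow initdb            # create/upgrade the metadata database tables
$ airflow webserver -p 8080 # start the web UI
$ airflow scheduler         # start the scheduler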

Also, you seem to be using Python 2.7. Are you sure it is compatible with the latest version of Airflow you are using?

I was on Python 3.5.2 with the latest Airflow and it did not work for me, so I had to downgrade my Airflow version slightly.

Airflow is not compatible with Python 2.7. Run Airflow with Python 3.6, then create the database user and grant privileges, and then run the command "airflow initdb". This initializes the database for Airflow.
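
A sketch of that setup for PostgreSQL (the database, user, and password names below are illustrative placeholders, not values from the question):

# as the postgres superuser, create a database and a user with privileges
$ psql -c "CREATE DATABASE airflow;"
$ psql -c "CREATE USER airflow_user WITH PASSWORD 'airflow_pass';"
$ psql -c "GRANT ALL PRIVILEGES ON DATABASE airflow TO airflow_user;"
# then, with sql_alchemy_conn in airflow.cfg pointing at this database:
$ airflow initdb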