Airflow apply_defaults decorator reports Argument is required

Tags: airflow, inspect

I recently ran into a nasty error where Airflow throws the stack trace below (my **kwargs really does contain job_flow_id):
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/mnt/airflow/dags/zanalytics-airflow/src/main/mysql_import/dags/mysql_import_dag.py", line 23, in <module>
    sync_dag_builder.build_sync_dag()
  File "/mnt/airflow/dags/zanalytics-airflow/src/main/mysql_import/dags/builders/sync_dag_builders/emr_sync_dag_builder.py", line 26, in build_sync_dag
    create_emr_task, terminate_emr_task = self._create_job_flow_tasks()
  File "/mnt/airflow/dags/zanalytics-airflow/src/main/mysql_import/dags/builders/sync_dag_builders/emr_sync_dag_builder.py", line 44, in _create_job_flow_tasks
    task_id=GlobalConstants.EMR_TERMINATE_STEP)
  File "/home/hadoop/.pyenv/versions/3.6.6/lib/python3.6/site-packages/airflow/utils/decorators.py", line 98, in wrapper
    result = func(*args, **kwargs)
  File "/mnt/airflow/dags/zanalytics-airflow/src/main/aws/operators/emr_terminate_ancestor_job_flows_operator.py", line 31, in __init__
    EmrTerminateJobFlowOperator.__init__(self, *args, **kwargs)
  File "/home/hadoop/.pyenv/versions/3.6.6/lib/python3.6/site-packages/airflow/utils/decorators.py", line 98, in wrapper
    result = func(*args, **kwargs)
  File "/home/hadoop/.pyenv/versions/3.6.6/lib/python3.6/site-packages/airflow/contrib/operators/emr_terminate_job_flow_operator.py", line 44, in __init__
    super(EmrTerminateJobFlowOperator, self).__init__(*args, **kwargs)
  File "/home/hadoop/.pyenv/versions/3.6.6/lib/python3.6/site-packages/airflow/utils/decorators.py", line 94, in wrapper
    raise AirflowException(msg)
airflow.exceptions.AirflowException: Argument ['job_flow_id'] is required

The disturbing parts are:

  • The exception currently originates from the built-in __init__
  • Earlier, it was coming from a different __init__, even though that one does not accept a job_flow_id parameter at all; it has since vanished
Looking at decorators.py, I suspect it might be messing something up. In fact, I cannot work out from reading it how the function-signature caching works (at the very least, it does not appear to be working).
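
For context, here is a minimal sketch of what the apply_defaults wrapper in airflow/utils/decorators.py does in 1.10.x, paraphrased and heavily simplified (the real wrapper also merges dag.default_args and params before the check; the AirflowException class below is a stand-in for airflow.exceptions.AirflowException):

    import inspect
    from functools import wraps


    class AirflowException(Exception):
        """Stand-in for airflow.exceptions.AirflowException."""


    def apply_defaults(func):
        # The "sig_cache": the signature is inspected once, at decoration
        # time, and the set of required parameter names is closed over by
        # the wrapper. It lives exactly as long as the imported module, so
        # "clearing" it amounts to forcing a re-import.
        sig_cache = inspect.signature(func)
        non_optional_args = {
            name for name, param in sig_cache.parameters.items()
            if param.default is param.empty
            and name != 'self'
            and param.kind not in (param.VAR_POSITIONAL, param.VAR_KEYWORD)
        }

        @wraps(func)
        def wrapper(*args, **kwargs):
            # Every required parameter must be present in **kwargs at the
            # moment this particular wrapper runs; each decorated __init__
            # in a call chain repeats this check for its own signature.
            missing_args = list(non_optional_args - set(kwargs))
            if missing_args:
                raise AirflowException(
                    "Argument {0} is required".format(missing_args))
            return func(*args, **kwargs)

        return wrapper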


I have already tried deleting every __pycache__ directory and restarting the scheduler and the webserver (I am running them separately); the cleanup was roughly equivalent to the sketch below.
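
A minimal sketch of that cleanup, assuming /mnt/airflow/dags (the DAGs folder from the trace) is the right root; substitute your own path:

    import pathlib
    import shutil

    # Assumption: the DAGs folder seen in the trace above.
    dags_root = pathlib.Path("/mnt/airflow/dags")
    for cache_dir in dags_root.rglob("__pycache__"):
        shutil.rmtree(cache_dir)
    # ...then restart the scheduler and the webserver.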

  • What is causing the error?
  • How does sig_cache work, and does it ever need to be force-cleared? If so, how?

Environment

  • Python 3.6.6
  • Airflow 1.10.2
  • LocalExecutor

This is the answer to my own Airflow question. I had used multiple inheritance in emr_terminate_ancestor_job_flows_operator (which felt risky to begin with) and was calling the individual __init__ methods of both parent operators, as shown in the trace. The decorator wrapper invoked around those inner __init__ calls is what produced the problem. In the end, I had to fall back to using the individual operators separately (rather than fusing them); a reconstruction of the pattern I backed out of follows.
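
Here is a hedged reconstruction of the fused-operator shape, to make the failure mode concrete. Only EmrTerminateJobFlowOperator and the module name emr_terminate_ancestor_job_flows_operator appear in the trace; AncestorLookupOperator and its ancestor_depth parameter are hypothetical stand-ins for the second parent I fused in:

    from airflow.contrib.operators.emr_terminate_job_flow_operator import (
        EmrTerminateJobFlowOperator,
    )
    from airflow.models import BaseOperator
    from airflow.utils.decorators import apply_defaults


    class AncestorLookupOperator(BaseOperator):
        # Hypothetical second parent, standing in for the one I fused in.
        @apply_defaults
        def __init__(self, ancestor_depth=1, *args, **kwargs):
            super(AncestorLookupOperator, self).__init__(*args, **kwargs)
            self.ancestor_depth = ancestor_depth


    class EmrTerminateAncestorJobFlowsOperator(
            AncestorLookupOperator, EmrTerminateJobFlowOperator):
        @apply_defaults
        def __init__(self, *args, **kwargs):
            # The risky part: each explicit parent call below passes back
            # through that parent's apply_defaults wrapper, which re-runs
            # its own required-argument check against **kwargs as it
            # stands at that point. If any frame in between consumes
            # job_flow_id, the inner check fails with
            # "Argument ['job_flow_id'] is required".
            AncestorLookupOperator.__init__(self, *args, **kwargs)
            EmrTerminateJobFlowOperator.__init__(self, *args, **kwargs)

The fix that worked for me was to drop the fused class entirely and schedule the stock operators as separate tasks in the DAG.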
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/mnt/airflow/dags/zanalytics-airflow/src/main/mysql_import/dags/mysql_import_dag.py", line 23, in <module>
    sync_dag_builder.build_sync_dag()
  File "/mnt/airflow/dags/zanalytics-airflow/src/main/mysql_import/dags/builders/sync_dag_builders/emr_sync_dag_builder.py", line 26, in build_sync_dag
    create_emr_task, terminate_emr_task = self._create_job_flow_tasks()
  File "/mnt/airflow/dags/zanalytics-airflow/src/main/mysql_import/dags/builders/sync_dag_builders/emr_sync_dag_builder.py", line 44, in _create_job_flow_tasks
    task_id=GlobalConstants.EMR_TERMINATE_STEP)
  File "/home/hadoop/.pyenv/versions/3.6.6/lib/python3.6/site-packages/airflow/utils/decorators.py", line 98, in wrapper
    result = func(*args, **kwargs)
  File "/mnt/airflow/dags/zanalytics-airflow/src/main/aws/operators/emr_terminate_ancestor_job_flows_operator.py", line 31, in __init__
    EmrTerminateJobFlowOperator.__init__(self, *args, **kwargs)
  File "/home/hadoop/.pyenv/versions/3.6.6/lib/python3.6/site-packages/airflow/utils/decorators.py", line 98, in wrapper
    result = func(*args, **kwargs)
  File "/home/hadoop/.pyenv/versions/3.6.6/lib/python3.6/site-packages/airflow/contrib/operators/emr_terminate_job_flow_operator.py", line 44, in __init__
    super(EmrTerminateJobFlowOperator, self).__init__(*args, **kwargs)
  File "/home/hadoop/.pyenv/versions/3.6.6/lib/python3.6/site-packages/airflow/utils/decorators.py", line 94, in wrapper
    raise AirflowException(msg)
airflow.exceptions.AirflowException: Argument ['job_flow_id'] is required