Airflow: how to create DAGs from a task in Airflow


I have a requirement where a parent DAG has only one task, which creates certain parameters (they are not fixed). Let's call them params1, params2 and params3. From that task in the parent DAG I want to create three DAGs, with those parameters available in the context of each DAG's tasks. I was creating dynamic DAGs by following the link below and tried it -
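For reference, a minimal sketch of the kind of dynamic-DAG pattern I mean (this is not the actual project code; the DAG ids, parameter values and task body below are placeholders, and the parameter list is hard-coded only to keep the sketch self-contained). One common way to register DAGs dynamically is to bind each generated DAG object to a module-level name so the DagBag can find it:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python_operator import PythonOperator  # Airflow 1.x import path


    def make_work(param):
        """Return a placeholder callable that just logs the parameter."""
        def _work(**kwargs):
            print("processing %s" % param)
        return _work


    # In my case these values come out of the parent task; they are hard-coded
    # here only so the sketch runs on its own.
    params_list = ["params1", "params2", "params3"]

    default_args = {"owner": "airflow", "start_date": datetime(2018, 4, 23)}

    for param in params_list:
        dag_id = "child_dag_%s" % param          # hypothetical naming scheme
        child_dag = DAG(dag_id, default_args=default_args, schedule_interval=None)

        PythonOperator(
            task_id="process_%s" % param,
            python_callable=make_work(param),
            provide_context=True,                # Airflow 1.x: pass context into **kwargs
            dag=child_dag,
        )

        # Bind the DAG to a module-level name; otherwise the scheduler's DagBag
        # never picks it up.
        globals()[dag_id] = child_dag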

I am getting the following error -

    [2018-05-01 09:24:27,764] {__init__.py:45} INFO - Using executor SequentialExecutor
[2018-05-01 09:24:27,875] {models.py:189} INFO - Filling up the DagBag from /mnt/test_project /airflow/dags
[2018-05-01 09:25:02,074] {models.py:1197} INFO - Dependencies all met for <TaskInstance: parent_dynamic_job_dag.test_trigger_dag 2018-04-23 00:00:00 [up_for_retry]>
[2018-05-01 09:25:02,081] {base_executor.py:49} INFO - Adding to queue: airflow run parent_dynamic_job_dag test_trigger_dag 2018-04-23T00:00:00 --local -sd DAGS_FOLDER/test_dynamic_parent_child.py
[2018-05-01 09:25:07,003] {sequential_executor.py:40} INFO - Executing command: airflow run parent_dynamic_job_dag test_trigger_dag 2018-04-23T00:00:00 --local -sd DAGS_FOLDER/test_dynamic_parent_child.py
[2018-05-01 09:25:08,235] {__init__.py:45} INFO - Using executor SequentialExecutor
[2018-05-01 09:25:08,431] {models.py:189} INFO - Filling up the DagBag from /mnt/test_project /airflow/dags/test_dynamic_parent_child.py
[2018-05-01 09:26:44,207] {base_task_runner.py:115} INFO - Running: ['bash', '-c', u'airflow run parent_dynamic_job_dag test_trigger_dag 2018-04-23T00:00:00 --job_id 178 --raw -sd DAGS_FOLDER/test_dynamic_parent_child.py']
[2018-05-01 09:26:45,243] {base_task_runner.py:98} INFO - Subtask: [2018-05-01 09:26:45,242] {__init__.py:45} INFO - Using executor SequentialExecutor
[2018-05-01 09:26:45,416] {base_task_runner.py:98} INFO - Subtask: [2018-05-01 09:26:45,415] {models.py:189} INFO - Filling up the DagBag from /mnt/test_project /airflow/dags/test_dynamic_parent_child.py
[2018-05-01 09:27:49,798] {base_task_runner.py:98} INFO - Subtask: [2018-05-01 09:27:49,797] {models.py:189} INFO - Filling up the DagBag from /mnt/test_project /airflow/dags
[2018-05-01 09:27:50,108] {base_task_runner.py:98} INFO - Subtask: Traceback (most recent call last):
[2018-05-01 09:27:50,108] {base_task_runner.py:98} INFO - Subtask:   File "/Users/manishz/anaconda2/bin/airflow", line 27, in <module>
[2018-05-01 09:27:50,109] {base_task_runner.py:98} INFO - Subtask:     args.func(args)
[2018-05-01 09:27:50,109] {base_task_runner.py:98} INFO - Subtask:   File "/Users/manishz/anaconda2/lib/python2.7/site-packages/airflow/bin/cli.py", line 392, in run
[2018-05-01 09:27:50,110] {base_task_runner.py:98} INFO - Subtask:     pool=args.pool,
[2018-05-01 09:27:50,110] {base_task_runner.py:98} INFO - Subtask:   File "/Users/manishz/anaconda2/lib/python2.7/site-packages/airflow/utils/db.py", line 50, in wrapper
[2018-05-01 09:27:50,110] {base_task_runner.py:98} INFO - Subtask:     result = func(*args, **kwargs)
[2018-05-01 09:27:50,111] {base_task_runner.py:98} INFO - Subtask:   File "/Users/manishz/anaconda2/lib/python2.7/site-packages/airflow/models.py", line 1493, in _run_raw_task
[2018-05-01 09:27:50,111] {base_task_runner.py:98} INFO - Subtask:     result = task_copy.execute(context=context)
[2018-05-01 09:27:50,112] {base_task_runner.py:98} INFO - Subtask:   File "/Users/manishz/anaconda2/lib/python2.7/site-packages/airflow/operators/dagrun_operator.py", line 67, in execute
[2018-05-01 09:27:50,112] {base_task_runner.py:98} INFO - Subtask:     dr = trigger_dag.create_dagrun(
[2018-05-01 09:27:50,112] {base_task_runner.py:98} INFO - Subtask: AttributeError: 'NoneType' object has no attribute 'create_dagrun'
[2018-05-01 09:28:14,407] {jobs.py:2521} INFO - Task exited with return code 1
[2018-05-01 09:28:14,569] {jobs.py:1959} ERROR - Task instance <TaskInstance: parent_dynamic_job_dag.test_trigger_dag 2018-04-23 00:00:00 [failed]> failed
[2018-05-01 09:28:14,573] {models.py:4584} INFO - Updating state for <DagRun parent_dynamic_job_dag @ 2018-04-23 00:00:00: backfill_2018-04-23T00:00:00, externally triggered: False> considering 3 task(s)
[2018-05-01 09:28:14,576] {models.py:4631} INFO - Marking run <DagRun parent_dynamic_job_dag @ 2018-04-23 00:00:00: backfill_2018-04-23T00:00:00, externally triggered: False> failed
[2018-05-01 09:28:14,600] {jobs.py:2125} INFO - [backfill progress] | finished run 1 of 1 | tasks waiting: 0 | succeeded: 2 | kicked_off: 0 | failed: 1 | skipped: 0 | deadlocked: 0 | not ready: 0
Traceback (most recent call last):
  File "/Users/manishz/anaconda2/bin/airflow", line 27, in <module>
    args.func(args)
  File "/Users/manishz/anaconda2/lib/python2.7/site-packages/airflow/bin/cli.py", line 185, in backfill
    delay_on_limit_secs=args.delay_on_limit)
  File "/Users/manishz/anaconda2/lib/python2.7/site-packages/airflow/models.py", line 3724, in run
    job.run()
  File "/Users/manishz/anaconda2/lib/python2.7/site-packages/airflow/jobs.py", line 198, in run
    self._execute()
  File "/Users/manishz/anaconda2/lib/python2.7/site-packages/airflow/jobs.py", line 2441, in _execute
    raise AirflowException(err)
airflow.exceptions.AirflowException: ---------------------------------------------------
Some task instances failed:
%s
But the above code does not run the downstream DAGs. Any idea what is happening here?

Thanks in advance,
Manish

I see... I wonder what is wrong with my Airflow configuration... @tobi6: updated the description and completed the logs. @tobi6: added the full code to the description.

More code is still missing: XcomManager, BigQueryManager, and all of the imports. Also, for some reason default_args is indented, but it should not be part of the class. This is not an MVCE. Apart from that, the error comes from TriggerDagRunOperator: your conditionally_trigger function is not returning a DAG run object that can be started. I don't understand why you create a DAG and then use TriggerDagRunOperator later. This code sample is hard to follow and involves a lot of steps. You may want to build an MVCE.