
Python Airflow webserver gives a cron error for DAGs with schedule interval None


I am running Airflow 1.9.0 with a LocalExecutor and a PostgreSQL database on a Linux AMI. I want to trigger DAGs manually, but whenever I create a DAG with schedule_interval set to None or to @once, the webserver tree view crashes with the following error (I only show the last call):

Furthermore, when I manually trigger the DAG, a DAG run gets started, but the tasks themselves are never scheduled. I have looked around, but it seems I am the only one with this kind of error. Has anyone encountered this error before and found a fix?

Minimal example triggering the problem:

import datetime as dt
from airflow import DAG
from airflow.operators.bash_operator import BashOperator

default_args = {
    'owner': 'me'
}

bash_command = """
    echo "this is a test task"
"""

with DAG('schedule_test',
         default_args=default_args,
         start_date=dt.datetime(2018, 7, 24),
         schedule_interval='None',
         catchup=False
         ) as dag:

    first_task = BashOperator(task_id="first_task", bash_command=bash_command)
Try this:

  • Set schedule_interval to None without the quotes ', or simply don't specify schedule_interval in the DAG at all. It defaults to None. More information here: -- search for schedule_interval
  • Set up the orchestration for your tasks at the bottom of the dag

Like this:

from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators.dummy_operator import DummyOperator

default_args = {
    'owner': 'me'
}

bash_command = """
    echo "this is a test task"
"""

with DAG('schedule_test',
         default_args=default_args,
         start_date=datetime(2018, 7, 24),
         schedule_interval=None,
         catchup=False
         ) as dag:

    t1 = DummyOperator(
        task_id='extract_data'
    )

    t2 = BashOperator(
        task_id="first_task",
        bash_command=bash_command
    )

    ##### ORCHESTRATION #####
    ## It is saying that in order for t2 to run, t1 must be done.
    t2.set_upstream(t1)
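The dependency call t2.set_upstream(t1) is one of several equivalent spellings Airflow supports; t1 >> t2 and t1.set_downstream(t2) produce the same edge. As a rough sketch, the tiny stand-in class below (not Airflow's real BaseOperator) shows the relationship these spellings record:

```python
# Hedged sketch: a toy stand-in for an operator, only to illustrate how
# set_upstream and the >> operator express the same dependency. Names and
# internals are illustrative, not Airflow's actual implementation.

class Task:
    def __init__(self, task_id):
        self.task_id = task_id
        self.upstream = set()          # task_ids this task depends on

    def set_upstream(self, other):
        self.upstream.add(other.task_id)

    def __rshift__(self, other):       # enables t1 >> t2, as Airflow does
        other.set_upstream(self)
        return other                   # returning other allows chaining

t1 = Task('extract_data')
t2 = Task('first_task')
t1 >> t2                               # same effect as t2.set_upstream(t1)
print(sorted(t2.upstream))             # ['extract_data']
```

Either spelling works; the bitshift form tends to read more naturally when chaining several tasks (t1 >> t2 >> t3).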

It seems that setting @none yields the same error, but setting the Python object None might do the trick.

Thanks for your help @Zack! I pointed out yesterday that you need to set schedule_interval to the Python object for it to work, haha. The orchestration of the DAG was not really the problem, I was only struggling with the schedule interval. By the way, while BaseOperator may have schedule_interval=None by default, for the DAG object it is set to schedule_interval=timedelta(1), see . This still leaves the problem of setting the DAG to schedule_interval='@once'. I know Airflow will not schedule DAG runs by itself (since I did not set an end date), but it should schedule manually triggered DAG runs.

Thanks @Zack! Your answer helped me, but I believe that if you remove schedule_interval, the default will be '@daily', at least in Airflow 1.10.

Does adding schedule_interval=None to the default arguments work?
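The distinction the thread keeps circling back to is that Airflow dispatches on the type of schedule_interval: the object None disables automatic scheduling, a preset string like '@daily' is looked up, and any other string is treated as a cron expression, so the string 'None' ends up in cron parsing and fails. The toy normalizer below mimics that dispatch; the function and preset table are illustrative assumptions, not Airflow's internals:

```python
# Hedged sketch of how a scheduler might dispatch on schedule_interval's
# type. This is NOT Airflow's code; it only illustrates why the string
# 'None' misbehaves while the object None is fine.
from datetime import timedelta

PRESETS = {'@once': '@once', '@daily': '0 0 * * *'}  # illustrative subset

def normalize_schedule(schedule_interval):
    if schedule_interval is None:
        return None                    # no automatic runs; manual triggers only
    if isinstance(schedule_interval, str):
        if schedule_interval in PRESETS:
            return PRESETS[schedule_interval]
        # Any other string is assumed to be a cron expression. The string
        # 'None' lands here and would fail cron validation downstream.
        raise ValueError(f"not a valid cron expression: {schedule_interval!r}")
    return schedule_interval           # e.g. a timedelta is used as-is

print(normalize_schedule(None))                 # None
print(normalize_schedule('@daily'))             # 0 0 * * *
print(normalize_schedule(timedelta(days=1)))    # 1 day, 0:00:00
```

Calling normalize_schedule('None') raises, which parallels the tree-view crash described in the question.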