Docker Airflow date error: dag.normalize_schedule TypeError

I am running Airflow in Docker and ran into an Apache Airflow datetime problem like the one below:

Process DagFileProcessor238215-Process:
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/local/lib/python3.6/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/local/lib/python3.6/site-packages/airflow/jobs.py", line 388, in helper
    pickle_dags)
  File "/usr/local/lib/python3.6/site-packages/airflow/utils/db.py", line 74, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/airflow/jobs.py", line 1832, in process_file
    self._process_dags(dagbag, dags, ti_keys_to_schedule)
  File "/usr/local/lib/python3.6/site-packages/airflow/jobs.py", line 1422, in _process_dags
    dag_run = self.create_dag_run(dag)
  File "/usr/local/lib/python3.6/site-packages/airflow/utils/db.py", line 74, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/airflow/jobs.py", line 856, in create_dag_run
    next_run_date = dag.normalize_schedule(min(task_start_dates))
TypeError: '<' not supported between instances of 'str' and 'datetime.datetime'
The udf module contains my user-defined functions.
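The TypeError itself just means Python refused to order a str against a datetime when the scheduler took min() over the tasks' start dates. A minimal sketch that reproduces the same message (the list contents are made up for illustration):

from datetime import datetime

# Hypothetical mix of values: one real datetime and one date string.
task_start_dates = [datetime(2014, 1, 1), "20140101"]

# min() has to compare the elements, and str vs datetime is not orderable:
min(task_start_dates)
# TypeError: '<' not supported between instances of 'str' and 'datetime.datetime'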

But strange things happened:

  • When I go to the webserver UI and switch the dag on, it still fails, and I see the error message above in the scheduler, as shown.
  • When I run backfill from the CLI as airflow backfill -s 20140101 -e 20180101 and then go back to the scheduler, the error message disappears and all the tasks start to be scheduled or queued.
I tried several ways to solve this problem, but they all failed (a sketch of these attempts follows the list):

  • Tried setting start_date in default_args to an airflow.utils.dates.days_ago object, e.g. days_ago(2018, 9, 5), but it failed.
  • Tried setting start_date in default_args to an airflow.utils.timezone.datetime object, e.g. datetime(2018, 9, 5), but it failed.
  • Tried setting schedule_interval in the DAG to a preset string such as @daily, but it failed.
  • Tried setting schedule_interval in the DAG to a datetime.timedelta object, but it failed.

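For reference, this is roughly what those start_date attempts looked like (a sketch, not my exact code). Note that days_ago takes a number of days, not a calendar date, so days_ago(2018, 9, 5) means "2018 days ago at 09:05" rather than 2018-09-05:

from airflow.utils.dates import days_ago
from airflow.utils import timezone

default_args = {
    # Attempt 1: relative helper (the argument is a day count, not a date)
    "start_date": days_ago(2),
    # Attempt 2: timezone-aware datetime from Airflow's own helper
    # "start_date": timezone.datetime(2018, 9, 5),
}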
Has anyone run into a problem like this? How can I solve it?

In my DAG file I had defined a task with a param named start_date. I fixed it by renaming that param.

Do any of the tasks in the flow also define a start_date? @joeb Yes, fixed it by renaming the task's param name.
from airflow import DAG
from airflow.models import Variable
from airflow.operators.dummy_operator import DummyOperator
from udf.udf_hive_operator import HiveOperator
from airflow.operators.hive_to_mysql import HiveToMySqlTransfer
from udf.udf_hive_to_oracle import HiveToOracleTransfer
from udf.utils.date_utils import gen_history_date_para, today_belong_business_day
from datetime import datetime, timedelta

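# Shared settings: SQL template root (from an Airflow Variable) and Hive session options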
TMPL_SQL_PATH = Variable.get("sql_path")
HIVE_DB = "default"
NOSTRICT_HIVE_PARTITION_MODE = "set hive.exec.dynamic.partition.mode=nonstrict;\n"

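# default_args are applied as defaults to every task created under this DAG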
default_args = {
    "owner": "xx_monitor",
    "description": "workflow for xx monitor system",
    "depends_on_past": False,
    "start_date": datetime(2014, 1, 1),
    "email": ["airflow@airflow.com"],
    "email_on_failure": False,
    "email_on_retry": False,
    "retries": 3,
    "retry_delay": timedelta(minutes=5),
    # "queue": "bash_queue",
    # "pool": "backfill",
    # "priority_weight": 10,
    # "end_date": datetime(2016, 1, 1),
}

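# Runs once a day at 18:00 (cron "0 18 * * *") and resolves .sql templates from TMPL_SQL_PATH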
dag = DAG(
    dag_id="drug_monitor",
    default_args=default_args,
    schedule_interval="0 18 * * *",
    template_searchpath=TMPL_SQL_PATH
)
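The task definitions themselves are not shown above, but based on the fix (renaming the start_date param), the offending pattern presumably looked something like the sketch below. The operator and its arguments are hypothetical; the point is that passing a string keyword named start_date to an operator shadows BaseOperator's real start_date, so the scheduler ends up comparing a str with a datetime in min(task_start_dates):

from airflow.operators.dummy_operator import DummyOperator

# Hypothetical task that triggers the error: start_date here is a plain
# string, and BaseOperator stores it as the task's start_date attribute.
bad = DummyOperator(
    task_id="example_bad",
    start_date="20140101",   # str -> breaks min() in create_dag_run
    dag=dag,
)

# Hypothetical fix: rename the custom argument (e.g. pass it via params)
# so the operator's own start_date keeps the datetime from default_args.
good = DummyOperator(
    task_id="example_good",
    params={"data_start_date": "20140101"},  # hypothetical param name
    dag=dag,
)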