Airflow task instances stuck in retry mode


Initial state:

We are running Airflow version 1.9 in Celery executor mode. Our task instances are stuck in retry mode. When the job fails, the task instance goes into retry. After that it tries to run the task and then falls back to a new retry time.
After some time:

Task is not ready for retry yet but will be retried automatically. Current date is 2018-08-28T03:46:53.101483 and task will be retried at 2018-08-28T03:47:25.463271.
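For reference, the retry timestamp in that message is just the end time of the failed attempt plus the task's `retry_delay`. A minimal sketch of the arithmetic (not Airflow's actual implementation; the timestamp below is illustrative, chosen to match the 1-minute `retry_delay` configured later in the test DAG):

```python
from datetime import datetime, timedelta

def next_retry_time(failed_at, retry_delay):
    """Next eligible retry = end of the failed attempt + retry_delay."""
    return failed_at + retry_delay

# Illustrative values: attempt ended at 03:46:25, retry_delay is 1 minute
failed_at = datetime(2018, 8, 28, 3, 46, 25)
print(next_retry_time(failed_at, timedelta(minutes=1)))  # 2018-08-28 03:47:25
```

So at 03:46:53 the scheduler is correct to say the task "is not ready for retry yet"; the problem reported here is that the task never runs even after the retry time has passed.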
After some more time, it goes back into retry mode:

Task is not ready for retry yet but will be retried automatically. Current date is 2018-08-28T03:51:48.322424 and task will be retried at 2018-08-28T03:52:57.893430.

This happens for all of our DAGs. We created a test DAG and tried to capture the scheduler and worker logs:

All dependencies are met but the task instance is not running. In most cases this just means that the task will probably be scheduled soon unless:
- The scheduler is down or under heavy load

If this task instance does not start soon please contact your Airflow administrator for assistance


Comments:

- If the job keeps failing, I would expect it to keep retrying... Is your problem that the task always fails?
- So the first time the task fails, the job goes into retry mode. After that it is never queued to run again; it stays in the "up_for_retry" state. To me this looks like a bug in Airflow. Not sure whether Airflow 1.10 behaves better.
- Can you paste the mentioned DAG file and the task logs?
- @SreenathKamath I added the logs in the edit. Please check.
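The lifecycle the comments describe can be sketched as a toy state machine (illustrative only; the state names mirror Airflow's task instance states, but this is not Airflow's code). With a healthy scheduler, `up_for_retry` should lead to another run once the retry time has passed; the bug report is that this transition never happens:

```python
from datetime import datetime, timedelta

class RetryTracker:
    """Toy model of retry bookkeeping; not Airflow's implementation."""

    def __init__(self, retries, retry_delay):
        self.max_retries = retries
        self.retry_delay = retry_delay
        self.try_number = 0
        self.state = "scheduled"
        self.next_retry_at = None

    def record_failure(self, now):
        """A failed attempt either schedules a retry or marks the task failed."""
        self.try_number += 1
        if self.try_number <= self.max_retries:
            self.state = "up_for_retry"
            self.next_retry_at = now + self.retry_delay
        else:
            self.state = "failed"

    def ready_for_retry(self, now):
        """True once the retry delay has elapsed; the scheduler should then re-queue."""
        return self.state == "up_for_retry" and now >= self.next_retry_at

t = RetryTracker(retries=3, retry_delay=timedelta(minutes=1))
now = datetime(2018, 8, 28, 3, 46, 25)
t.record_failure(now)
print(t.state)                                        # up_for_retry
print(t.ready_for_retry(now + timedelta(minutes=2)))  # True
```

In the behavior reported above, the equivalent of `ready_for_retry` eventually becomes true, yet the scheduler never re-queues the task, which is why it looks stuck in "up_for_retry" indefinitely.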
from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.bash_operator import BashOperator

default_args = {
    'owner': 'Pramiti',
    'depends_on_past': False,
    'retries': 3,
    'retry_delay': timedelta(minutes=1)
}

dag = DAG('airflow-examples.test_failed_dag_v2', description='Failed DAG',
          schedule_interval='*/10 * * * *',
          start_date=datetime(2018, 9, 7), default_args=default_args)

b = BashOperator(
    task_id="ls_command",
    bash_command="mdr",  # intentionally invalid command, so the task always fails
    dag=dag
)