Airflow 2 example SubDAG stuck in the running state on the SequentialExecutor

When I run the Airflow SubDAG example, the first SubDAG gets stuck in the running state. The individual task instances inside that first SubDAG have no_status, and the DAG as a whole makes no further progress. I assume something must be wrong with my configuration, but I can't figure out what.

[core]
dags_folder = /home/airflow/dags
hostname_callable = socket.getfqdn
default_timezone = system
executor = SequentialExecutor
sql_alchemy_conn = mysql+mysqldb://****************@localhost:3306/airflow?charset=utf8mb4
sql_engine_encoding = utf-8
sql_alchemy_pool_enabled = True
sql_alchemy_pool_size = 5
sql_alchemy_max_overflow = 10
sql_alchemy_pool_recycle = 3600
sql_alchemy_pool_pre_ping = True
sql_alchemy_schema = 
parallelism = 32
dag_concurrency = 16
dags_are_paused_at_creation = True
max_active_runs_per_dag = 16
load_examples = False
load_default_connections = False
plugins_folder = /home/airflow/plugins
execute_tasks_new_python_interpreter = True
fernet_key = ****************************
donot_pickle = False
dagbag_import_timeout = 30
dagbag_import_error_tracebacks = True
dagbag_import_error_traceback_depth = 2
dag_file_processor_timeout = 50
task_runner = StandardTaskRunner
default_impersonation = 
security = 
unit_test_mode = False
enable_xcom_pickling = True
killed_task_cleanup_time = 60
dag_run_conf_overrides_params = True
dag_discovery_safe_mode = True
default_task_retries = 1
min_serialized_dag_update_interval = 30
min_serialized_dag_fetch_interval = 10
max_num_rendered_ti_fields_per_task = 30
check_slas = False
xcom_backend = airflow.models.xcom.BaseXCom
lazy_load_plugins = True
lazy_discover_providers = True
max_db_retries = 3
remote_log_conn_id = 
encrypt_s3_logs = False
non_pooled_task_slot_count = 128

[scheduler]
job_heartbeat_sec = 5
clean_tis_without_dagrun_interval = 15.0
scheduler_heartbeat_sec = 5
num_runs = -1
processor_poll_interval = 1
min_file_process_interval = 30
dag_dir_list_interval = 300
print_stats_interval = 300
pool_metrics_interval = 5.0
scheduler_health_check_threshold = 30
orphaned_tasks_check_interval = 300.0
child_process_log_directory = /home/airflow/logs/scheduler
scheduler_zombie_task_threshold = 300
catchup_by_default = False
max_tis_per_query = 0
use_row_level_locking = True
parsing_processes = 2
use_job_schedule = True
allow_trigger_in_future = False
run_duration = -1
authenticate = False
max_dagruns_to_create_per_loop = 100
max_dagruns_per_loop_to_schedule = 100
schedule_after_task_execution = True
Above are the [core] and [scheduler] sections of my configuration. If anything else would help to troubleshoot this, I'm happy to provide it. The switch from Airflow 1.10.0 to 2.0.2 has been a bit rough, with errors cropping up.
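For what it's worth, one failure mode I suspect (this is my own toy model, not the Airflow API, and the function names are made up): with `executor = SequentialExecutor` there is only one execution slot, so a parent "subdag" task can occupy that slot while the child tasks it spawned sit queued behind it and never get a turn. A minimal sketch of that single-slot starvation, using a one-worker thread pool in place of the executor:

```python
from concurrent.futures import ThreadPoolExecutor
from concurrent.futures import TimeoutError as FutureTimeout

def demo_single_slot_deadlock() -> str:
    """Toy model (NOT the Airflow API) of a subdag wedging a one-slot executor.

    A SequentialExecutor-style pool has exactly one worker. The parent
    ("subdag") task occupies that worker and then waits for a child task
    it submitted to the same pool -- no worker is free to run the child.
    The wait is capped at 1 second so the demo terminates instead of
    hanging forever the way a real deadlock would.
    """
    pool = ThreadPoolExecutor(max_workers=1)
    try:
        def child() -> str:
            return "done"

        def parent() -> str:
            fut = pool.submit(child)      # child is queued behind the parent
            return fut.result(timeout=1)  # the only worker is busy: times out

        outcome = pool.submit(parent)
        try:
            outcome.result(timeout=2)
            return "completed"
        except FutureTimeout:
            return "stuck"                # the child never got a slot
    finally:
        pool.shutdown(wait=True)
```

Whether this is actually what happens inside Airflow 2.0.2's SubDagOperator I can't say for sure, but it matches the symptom: the parent shows running while its children stay at no_status.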