Airflow scheduler throws BlockingIOError when a DAG is triggered


What happened?

The airflow scheduler seems to throw an IO error whenever any DAG is triggered, and then dies. The heartbeat check marks it as unhealthy indefinitely.

Stack trace

Process ForkProcess-1:
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/local/lib/python3.6/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/local/lib/python3.6/site-packages/airflow/utils/dag_processing.py", line 634, in _run_processor_manager
    processor_manager.start()
  File "/usr/local/lib/python3.6/site-packages/airflow/utils/dag_processing.py", line 917, in start
    self._signal_conn.send(dag_parsing_stat)
  File "/usr/local/lib/python3.6/multiprocessing/connection.py", line 206, in send
    self._send_bytes(_ForkingPickler.dumps(obj))
  File "/usr/local/lib/python3.6/multiprocessing/connection.py", line 404, in _send_bytes
    self._send(header + buf)
  File "/usr/local/lib/python3.6/multiprocessing/connection.py", line 368, in _send
    n = write(self._handle, buf)
BlockingIOError: [Errno 11] Resource temporarily unavailable
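
For context, [Errno 11] is EAGAIN: a write() to a pipe fails with it when the pipe buffer is full and the write end is in non-blocking mode, which is what the final write(self._handle, buf) in the trace appears to be hitting. A minimal plain-Python sketch (hypothetical, not Airflow code) that reproduces the same error:

import os

# Fill a pipe whose write end is non-blocking; once the buffer is full,
# write() raises BlockingIOError (EAGAIN) instead of waiting for a reader.
read_fd, write_fd = os.pipe()
os.set_blocking(write_fd, False)   # put the write end into non-blocking mode

try:
    while True:
        os.write(write_fd, b"x" * 65536)   # pipe buffer is ~64 KiB on Linux
except BlockingIOError as exc:
    print(exc)   # prints: [Errno 11] Resource temporarily unavailable
finally:
    os.close(write_fd)
    os.close(read_fd)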

Setup

Running a dockerized version of Airflow; the error occurs with all of the example DAGs. A reproduction sketch is shown below.
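
A hedged reproduction sketch, assuming the stock example DAGs are loaded (example_bash_operator is used here only as an illustration); it unpauses and triggers one DAG through the 1.10 CLI, after which the scheduler log shows the traceback above:

import subprocess

# Unpause one of the bundled example DAGs and trigger it, then watch the
# scheduler output for the BlockingIOError shown in the stack trace.
subprocess.run(["airflow", "unpause", "example_bash_operator"], check=True)
subprocess.run(["airflow", "trigger_dag", "example_bash_operator"], check=True)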

Airflow info

Apache Airflow [1.10.11]

Platform: [Linux, x86_64] uname_result(system='Linux', node='a0a5e4907572', release='4.19.76-linuxkit', version='#1 SMP Tue May 26 11:42:35 UTC 2020', machine='x86_64', processor='')
Locale: ('en_US', 'UTF-8')
Python Version: [3.6.11 (default, Jun 30 2020, 19:29:13)  [GCC 8.3.0]]
Python Location: [/usr/local/bin/python]

git: [git version 2.20.1]
ssh: [NOT AVAILABLE]
kubectl: [NOT AVAILABLE]
gcloud: [NOT AVAILABLE]
cloud_sql_proxy: [NOT AVAILABLE]
mysql: [NOT AVAILABLE]
sqlite3: [NOT AVAILABLE]
psql: [NOT AVAILABLE]

Airflow Home: [/usr/local/airflow]
System PATH: [/usr/local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin]
Python PATH: [/usr/local/bin:/usr/local/lib/python36.zip:/usr/local/lib/python3.6:/usr/local/lib/python3.6/lib-dynload:/usr/local/lib/python3.6/site-packages:/usr/local/airflow/custom_plugins:/usr/local/airflow/dags:/usr/local/airflow/config:/usr/local/airflow/plugins]
airflow on PATH: [True]

Executor: [bsd_core.BsdExecutor]
SQL Alchemy Conn: [mysql+mysqldb://airflow:airflow@mysql:3306/airflow]
DAGS Folder: [/usr/local/airflow/dags]
Plugins Folder: [/usr/local/airflow/plugins]
Base Log Folder: [/usr/local/airflow/logs]
Note: the /usr/local/airflow/logs directory does not exist, even after starting the webserver and scheduler.