Airflow: live executor logs with DaskExecutor

I have an Airflow installation (running on Kubernetes). My setup uses DaskExecutor. I have also configured remote logging to S3 (the relevant settings are sketched below, after the error output). However, while a task is running I cannot see the log; instead I get the following error:

*** Log file does not exist: /airflow/logs/dbt/run_dbt/2018-11-01T06:00:00+00:00/3.log
*** Fetching from: http://airflow-worker-74d75ccd98-6g9h5:8793/log/dbt/run_dbt/2018-11-01T06:00:00+00:00/3.log
*** Failed to fetch log file from worker. HTTPConnectionPool(host='airflow-worker-74d75ccd98-6g9h5', port=8793): Max retries exceeded with url: /log/dbt/run_dbt/2018-11-01T06:00:00+00:00/3.log (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f7d0668ae80>: Failed to establish a new connection: [Errno -2] Name or service not known',))
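
For context, remote S3 logging in my setup is configured roughly as below. This is a minimal sketch using Airflow's AIRFLOW__SECTION__KEY environment-variable overrides (Airflow 1.10-style, [core] section); the bucket name and connection id are placeholders, not the real values from my deployment:

# Sketch: remote logging settings as environment variables (placeholders)
AIRFLOW__CORE__REMOTE_LOGGING=True
AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER=s3://my-airflow-logs
AIRFLOW__CORE__REMOTE_LOG_CONN_ID=my_s3_conn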
What I have verified:
- the log file exists and is being written to on the executor while the task is running
- netstat -tunlp was run in the executor container, but no extra port that could serve the logs was found (a quick check along these lines is sketched after this list)
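
The error above shows that the webserver tries to pull the file from the worker on port 8793, so the check was essentially whether anything listens there. A sketch of that check, reusing the pod name from the error (kubectl access to the worker pod and netstat in the image are assumed):

# Does anything in the worker pod listen on the log-server port 8793?
kubectl exec airflow-worker-74d75ccd98-6g9h5 -- netstat -tunlp | grep 8793 \
    || echo "nothing listening on 8793"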

UPDATE: have a look at the airflow CLI command - I believe it does exactly the same thing.
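
My guess is that the command meant here is airflow serve_logs, which serves the worker's log directory on port 8793; treat that as an assumption. If so, the worker entrypoint could rely on it instead of the manual http.server, along these lines:

#!/usr/bin/env bash
# Sketch, assuming "airflow serve_logs" is the CLI command referred to above:
# it serves the Airflow log folder on the worker log-server port (8793)
airflow serve_logs &
dask-worker "$@"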


We solved this by simply starting a Python HTTP handler on the worker.

Dockerfile:

# Directory the HTTP server will serve from
RUN mkdir -p $AIRFLOW_HOME/serve
# Symlink the log folder as "log" so the /log/... URLs map to the real log files
RUN ln -s $AIRFLOW_HOME/logs $AIRFLOW_HOME/serve/log
worker.sh (run by the Docker CMD):

#!/usr/bin/env bash

# Serve the (symlinked) log directory on the port the webserver polls (8793)
cd "$AIRFLOW_HOME/serve"
python3 -m http.server 8793 &
cd -

# Then start the actual Dask worker, passing through any arguments
dask-worker "$@"
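
With this in place the webserver can fetch the live logs again. A quick way to verify, reusing the pod name and log path from the error message above (kubectl access to the pod and curl in the image are assumed):

# The worker should now serve its log files on port 8793
kubectl exec airflow-worker-74d75ccd98-6g9h5 -- \
    curl -s http://localhost:8793/log/dbt/run_dbt/2018-11-01T06:00:00+00:00/3.log | head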