
Elasticsearch logging setup in Apache Airflow


I'm having some trouble setting up Elasticsearch logging in Apache Airflow. Elasticsearch logging has been part of the configuration since version 1.10.

Looking at the airflow.cfg file, there are two sections related to Elasticsearch:

# Airflow can store logs remotely in AWS S3, Google Cloud Storage or Elastic Search.
# Users must supply an Airflow connection id that provides access to the storage
# location. If remote_logging is set to true, see UPDATING.md for additional
# configuration requirements.
remote_logging = True
remote_log_conn_id =
remote_base_log_folder =
encrypt_s3_logs = False

[elasticsearch]
elasticsearch_host = xxx.xxx.xxx.xxx
elasticsearch_log_id_template = {dag_id}-{task_id}-{execution_date}-{try_number}
elasticsearch_end_of_log_mark = end_of_log
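For reference, the `elasticsearch_log_id_template` above is a plain Python format string. A minimal sketch of how a log ID would be rendered from it, using hypothetical task values (the DAG ID, task ID, and date below are made up for illustration):

```python
# Hypothetical illustration: rendering the configured log_id_template.
# The template string matches the airflow.cfg value above; the task
# values are invented for this example.
log_id_template = "{dag_id}-{task_id}-{execution_date}-{try_number}"

log_id = log_id_template.format(
    dag_id="example_dag",
    task_id="example_task",
    execution_date="2019-01-01T00:00:00",
    try_number=1,
)
# log_id now identifies the documents for one task attempt,
# e.g. when querying them back from Elasticsearch.
```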

Now I am really not sure how to set this up. Looking at the airflow_local_settings.py file, we can see the following snippet:

if REMOTE_LOGGING and REMOTE_BASE_LOG_FOLDER.startswith('s3://'):
    DEFAULT_LOGGING_CONFIG['handlers'].update(REMOTE_HANDLERS['s3'])
elif REMOTE_LOGGING and REMOTE_BASE_LOG_FOLDER.startswith('gs://'):
    DEFAULT_LOGGING_CONFIG['handlers'].update(REMOTE_HANDLERS['gcs'])
elif REMOTE_LOGGING and REMOTE_BASE_LOG_FOLDER.startswith('wasb'):
    DEFAULT_LOGGING_CONFIG['handlers'].update(REMOTE_HANDLERS['wasb'])
elif REMOTE_LOGGING and ELASTICSEARCH_HOST:
    DEFAULT_LOGGING_CONFIG['handlers'].update(REMOTE_HANDLERS['elasticsearch'])
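The dispatch above can be exercised in isolation to confirm which handler a given configuration selects. A minimal sketch, reusing the same variable names with the values implied by the airflow.cfg shown earlier (remote_base_log_folder left empty, elasticsearch_host set):

```python
# Minimal sketch of the handler-selection logic above, with values
# mirroring the airflow.cfg in this question. REMOTE_BASE_LOG_FOLDER
# is empty, so none of the s3://, gs://, or wasb branches match.
REMOTE_LOGGING = True
REMOTE_BASE_LOG_FOLDER = ''
ELASTICSEARCH_HOST = 'xxx.xxx.xxx.xxx'

if REMOTE_LOGGING and REMOTE_BASE_LOG_FOLDER.startswith('s3://'):
    handler = 's3'
elif REMOTE_LOGGING and REMOTE_BASE_LOG_FOLDER.startswith('gs://'):
    handler = 'gcs'
elif REMOTE_LOGGING and REMOTE_BASE_LOG_FOLDER.startswith('wasb'):
    handler = 'wasb'
elif REMOTE_LOGGING and ELASTICSEARCH_HOST:
    handler = 'elasticsearch'
else:
    handler = None
```

With these values the final `elif` is the one that fires, which is why leaving `remote_base_log_folder` blank while setting `elasticsearch_host` should select the Elasticsearch handler.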

So logically, if I set remote_logging to True and put the Elasticsearch host/IP in the elasticsearch section, it should work. At the moment, no logs are being produced by the Airflow instance.

According to Airflow's ElasticsearchTaskHandler:

Unfortunately, this log handler does not flush logs directly to your ES cluster:

    ElasticsearchTaskHandler is a python log handler that
    reads logs from Elasticsearch. Note logs are not directly
    indexed into Elasticsearch. Instead, it flushes logs
    into local files. Additional software setup is required
    to index the log into Elasticsearch, such as using
    Filebeat and Logstash.
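As the docstring says, a shipper such as Filebeat is needed to index the local log files into Elasticsearch. A rough filebeat.yml sketch of that setup, assuming a Filebeat 6.x-style configuration and a hypothetical base_log_folder of /usr/local/airflow/logs (both the path and the port are assumptions, not values from this question):

```yaml
# Rough sketch only: ship Airflow's local task log files to Elasticsearch.
# The log path and the :9200 port are hypothetical; adjust to your setup.
filebeat.inputs:
  - type: log
    paths:
      - /usr/local/airflow/logs/*/*/*/*.log

output.elasticsearch:
  hosts: ["xxx.xxx.xxx.xxx:9200"]
```

The ElasticsearchTaskHandler then only reads logs back out of Elasticsearch for display in the UI; it never writes them there itself.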