Kubernetes Airflow cannot write logs to S3 (v1.10.9)


I am trying to set up remote logging with the stable/airflow Helm chart on Airflow v1.10.9, using the Kubernetes executor and the puckel/docker-airflow image. Here is my values.yaml file:

airflow:
  image:
    repository: airflow-docker-local
    tag: 1.10.9
  executor: Kubernetes
  service:
    type: LoadBalancer
  config:
    AIRFLOW__KUBERNETES__WORKER_CONTAINER_REPOSITORY: airflow-docker-local
    AIRFLOW__KUBERNETES__WORKER_CONTAINER_TAG: 1.10.9
    AIRFLOW__KUBERNETES__WORKER_CONTAINER_IMAGE_PULL_POLICY: Never
    AIRFLOW__KUBERNETES__WORKER_SERVICE_ACCOUNT_NAME: airflow
    AIRFLOW__KUBERNETES__DAGS_VOLUME_CLAIM: airflow
    AIRFLOW__KUBERNETES__NAMESPACE: airflow
    AIRFLOW__CORE__REMOTE_LOGGING: True
    AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER: "s3://xxx"
    AIRFLOW__CORE__REMOTE_LOG_CONN_ID: "s3://aws_access_key_id:aws_secret_access_key@bucket"
    AIRFLOW__CORE__ENCRYPT_S3_LOGS: False
persistence:
  enabled: true
  existingClaim: ''
postgresql:
  enabled: true
workers:
  enabled: false
redis:
  enabled: false
flower:
  enabled: false
But my logs are not exported to S3, and all I get in the UI is:

*** Log file does not exist: /usr/local/airflow/logs/icp_job_dag/icp-kube-job/2019-02-13T00:00:00+00:00/1.log
*** Fetching from: http://icpjobdagicpkubejob-f4144a374f7a4ac9b18c94f058bc7672:8793/log/icp_job_dag/icp-kube-job/2019-02-13T00:00:00+00:00/1.log
*** Failed to fetch log file from worker. HTTPConnectionPool(host='icpjobdagicpkubejob-f4144a374f7a4ac9b18c94f058bc7672', port=8793): Max retries exceeded with url: /log/icp_job_dag/icp-kube-job/2019-02-13T00:00:00+00:00/1.log (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f511c883710>: Failed to establish a new connection: [Errno -2] Name or service not known'))

I still have the same problem.

Your remote log connection ID needs to be the ID of a connection from the connections form/list, not a connection string.
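
For illustration, here is a minimal Python sketch (Airflow 1.10.x) of registering such a named connection in the metadata DB. The conn_id my_aws is just an assumed name; the point is that AIRFLOW__CORE__REMOTE_LOG_CONN_ID must reference an existing conn_id rather than embed credentials itself:

from airflow import settings
from airflow.models import Connection

# Assumed conn_id "my_aws": remote_log_conn_id must point at a Connection row
# like this one, not at an s3://key:secret@bucket URI.
conn = Connection(
    conn_id="my_aws",
    conn_type="aws",
    extra='{"aws_access_key_id": "xxxx", "aws_secret_access_key": "xxxx", "region_name": "us-west-2"}',
)

session = settings.Session()
if not session.query(Connection).filter(Connection.conn_id == conn.conn_id).first():
    session.add(conn)
    session.commit()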


I ran into the same problem and figured I would follow up with what eventually worked for me. The point about the connection is correct, but you also need to make sure the worker pods get the same environment variables:

airflow:
  image:
    repository: airflow-docker-local
    tag: 1.10.9
  executor: Kubernetes
  service:
    type: LoadBalancer
  connections:
  - id: my_aws
    type: aws
    extra: '{"aws_access_key_id": "xxxx", "aws_secret_access_key": "xxxx", "region_name":"us-west-2"}'
  config:
    AIRFLOW__KUBERNETES__WORKER_CONTAINER_REPOSITORY: airflow-docker-local
    AIRFLOW__KUBERNETES__WORKER_CONTAINER_TAG: 1.10.9
    AIRFLOW__KUBERNETES__WORKER_CONTAINER_IMAGE_PULL_POLICY: Never
    AIRFLOW__KUBERNETES__WORKER_SERVICE_ACCOUNT_NAME: airflow
    AIRFLOW__KUBERNETES__DAGS_VOLUME_CLAIM: airflow
    AIRFLOW__KUBERNETES__NAMESPACE: airflow

    AIRFLOW__CORE__REMOTE_LOGGING: True
    AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER: s3://airflow.logs
    AIRFLOW__CORE__REMOTE_LOG_CONN_ID: my_aws
    AIRFLOW__CORE__ENCRYPT_S3_LOGS: False
    AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AIRFLOW__CORE__REMOTE_LOGGING: True
    AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AIRFLOW__CORE__REMOTE_LOG_CONN_ID: my_aws
    AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER: s3://airflow.logs
    AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AIRFLOW__CORE__ENCRYPT_S3_LOGS: False
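
As a quick sanity check (not part of the chart), something along these lines can be run from inside a worker or webserver pod to confirm that the connection and bucket are reachable; my_aws and airflow.logs mirror the values above, so substitute your own:

from airflow.hooks.S3_hook import S3Hook

hook = S3Hook(aws_conn_id="my_aws")
print(hook.check_for_bucket("airflow.logs"))  # True if credentials and bucket are OK

# Write a small test object to the log bucket.
hook.load_string("connectivity test", key="connectivity-test.txt",
                 bucket_name="airflow.logs", replace=True)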

I also had to set the fernet key for the workers (and in general), otherwise I got an invalid token error:

airflow:
  fernet_key: "abcdefghijkl1234567890zxcvbnmasdfghyrewsdsddfd="

  config:
    AIRFLOW__KUBERNETES_ENVIRONMENT_VARIABLES__AIRFLOW__CORE__FERNET_KEY: "abcdefghijkl1234567890zxcvbnmasdfghyrewsdsddfd="
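
The key above is only a placeholder; a valid Fernet key can be generated with the cryptography package (assuming it is installed, as it is in the puckel image), for example:

from cryptography.fernet import Fernet

# Prints a url-safe base64-encoded 32-byte key; set the same value for the
# webserver/scheduler and the workers.
print(Fernet.generate_key().decode())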

By the way, do I specify the connection ID in values.yaml first and then create the connection from the UI? Does it make a difference?

The connection can be created either way; I prefer the UI. You definitely need an S3 connection ID for this to work.

I updated my question with more details. I tried using a connection as well, but I got the same problem.

Is airflow.logs a bucket in your AWS account? S3 bucket names are globally unique, so I doubt you own airflow.logs. Did you create a bucket for these logs in your AWS account?

That is not my actual bucket name; I edited it when I posted the question. I have since moved away from Airflow (too much hassle), but if someone else can confirm that this does solve the problem, I am happy to accept this answer.