Authorization error when running an Airflow DAG via Cloud Composer


I get an error when trying to run a DAG from Cloud Composer using GoogleCloudStorageToBigQueryOperator.

The last error was: {'reason':'invalid','location':'gs://xxxxxx/xxxx.csv'. When I followed the URL link in the error, I found:

{
  "error": {
    "code": 401,
    "message": "Request is missing required authentication credential. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.",
    "errors": [
      {
        "message": "Login Required.",
        "domain": "global",
        "reason": "required",
        "location": "Authorization",
        "locationType": "header"
      }
    ],
    "status": "UNAUTHENTICATED"
  }
}
I have configured a Cloud Storage connection:

Conn Id: My_Cloud_Storage

Conn Type: Google Cloud Platform

Project Id: xxxxxx

Keyfile Path: /home/airflow/gcs/data/xxx.json

Keyfile JSON: (blank)

Scopes (comma separated): (blank)

Code:

from __future__ import print_function

import datetime

from airflow import models
from airflow.contrib.operators.gcs_to_bq import GoogleCloudStorageToBigQueryOperator

default_dag_args = {
    # The start_date describes when a DAG is valid / can be run. Set this to a
    # fixed point in time rather than dynamically, since it is evaluated every
    # time a DAG is parsed. See:
    # https://airflow.apache.org/faq.html#what-s-the-deal-with-start-date
    'start_date': datetime.datetime(2019, 4, 15),
}
with models.DAG(
        'Ian_gcs_to_BQ_Test',
        schedule_interval=datetime.timedelta(days=1),
        default_args=default_dag_args) as dag:

    load_csv = GoogleCloudStorageToBigQueryOperator(
        task_id='gcs_to_bq_test',
        bucket='xxxxx',
        source_objects=['xxxx.csv'],
        destination_project_dataset_table='xxxx.xxxx.xxxx',
        google_cloud_storage_conn_id='My_Cloud_Storage',
        schema_fields=[
            {'name':'AAAA','type':'INTEGER','mode':'NULLABLE'},
            {'name':'BBB_NUMBER','type':'INTEGER','mode':'NULLABLE'},   
        ],
        write_disposition='WRITE_TRUNCATE',
        dag=dag)
OK, it's fixed now. It turns out it wasn't working because of the header row in the file; once I removed it, it worked fine.
Very annoying: the error messages about an invalid location and authorization were completely misleading.
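For what it's worth, instead of deleting the header row from the file, passing `skip_leading_rows=1` to GoogleCloudStorageToBigQueryOperator (it is forwarded to the BigQuery load job's `skipLeadingRows` option) should let the load succeed with the header in place. The sketch below illustrates the semantics with plain Python and toy data matching the DAG's two INTEGER columns; the data and the `load_rows` helper are hypothetical, not part of the operator:

```python
import csv
import io

# Hypothetical CSV with a header row, matching the AAAA / BBB_NUMBER schema.
data = "AAAA,BBB_NUMBER\n1,100\n2,200\n"

def load_rows(text, skip_leading_rows=0):
    """Parse CSV text into (int, int) tuples, mimicking what the
    BigQuery load job does with skipLeadingRows set."""
    rows = list(csv.reader(io.StringIO(text)))[skip_leading_rows:]
    return [(int(a), int(b)) for a, b in rows]

# With the header left in, the first "value" for an INTEGER column is the
# string 'AAAA', so parsing fails just like the load job did:
try:
    load_rows(data)  # int('AAAA') raises ValueError
except ValueError:
    print("header row breaks an INTEGER column")

# Skipping the leading row makes the same data load cleanly:
print(load_rows(data, skip_leading_rows=1))  # [(1, 100), (2, 200)]
```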

Are you sure the error isn't related to BigQuery, since you didn't specify
bigquery_conn_id
? I tried creating a BigQuery connection, but it made no difference. It's still an authorization error saying the location is invalid, even though I have double-checked that the location is correct.