Google Cloud Platform: saving BigQuery results to JSON in Google Composer

Tags: google-cloud-platform, google-bigquery, airflow, google-cloud-composer

I have created the DAG below, which runs an SQL script daily. How can I save the query results to a JSON file and store it in the DAG folder of Google Composer?

import datetime
import airflow
from airflow.operators import bash_operator
from airflow.contrib.operators import bigquery_operator

START_DATE = datetime.datetime(2020, 3, 1)

default_args = {
    'owner': 'Alen',
    'depends_on_past': False,
    'email': [''],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': datetime.timedelta(minutes=15),
    'start_date': START_DATE,
}

with airflow.DAG(
        'Dag_Name',
        catchup=False,
        default_args=default_args,
        schedule_interval=datetime.timedelta(days=1)) as dag:

    task_name = bigquery_operator.BigQueryOperator(
        task_id='task_name',
        sql='query.sql',
        use_legacy_sql=False,
        write_disposition='WRITE_TRUNCATE',
        destination_dataset_table='Project.Dataset.destination_table')

Another option is to run an export from BQ to GCS, using the DAG folder as the destination.

You can use the Bash or BQ operators.

Then run something like this at the end of your script:

copy_files_to_DAG_folder = bash_operator.BashOperator(
    task_id='Copy_files_to_GCS',
    bash_command='bq extract --destination_format NEWLINE_DELIMITED_JSON '
                 '--print_header=false BQ_TABLE GCS_DAG_FOLDER_LOCATION')
From the documentation:

 bq --location=location extract \
 --destination_format format \
 --compression compression_type \
 --field_delimiter delimiter \
 --print_header=boolean \
 project_id:dataset.table \
 gs://bucket/filename.ext
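
For example, to export the destination table from the question as newline-delimited JSON into the DAGs folder of the Composer environment's bucket, the command could look like the following. The bucket name gs://your-composer-bucket and the location flag are placeholders/assumptions you would need to adjust:

 bq --location=US extract \
 --destination_format NEWLINE_DELIMITED_JSON \
 --print_header=false \
 Project:Dataset.destination_table \
 gs://your-composer-bucket/dags/destination_table.json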

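
Putting the pieces together, here is a minimal sketch of how the export task could be chained after the query task from the question. In Composer the DAG folder maps to the dags/ prefix of the environment's GCS bucket; the bucket name gs://your-composer-bucket and the output file name are placeholders, not verified values:

import datetime

import airflow
from airflow.operators import bash_operator
from airflow.contrib.operators import bigquery_operator

START_DATE = datetime.datetime(2020, 3, 1)

default_args = {
    'owner': 'Alen',
    'retries': 1,
    'retry_delay': datetime.timedelta(minutes=15),
    'start_date': START_DATE,
}

with airflow.DAG(
        'Dag_Name',
        catchup=False,
        default_args=default_args,
        schedule_interval=datetime.timedelta(days=1)) as dag:

    # Materialise the query result into a destination table first.
    run_query = bigquery_operator.BigQueryOperator(
        task_id='task_name',
        sql='query.sql',
        use_legacy_sql=False,
        write_disposition='WRITE_TRUNCATE',
        destination_dataset_table='Project.Dataset.destination_table')

    # Then export that table as newline-delimited JSON into the dags/ folder
    # of the Composer bucket (bucket name is a placeholder).
    export_to_dag_folder = bash_operator.BashOperator(
        task_id='export_to_dag_folder',
        bash_command=(
            'bq extract --destination_format NEWLINE_DELIMITED_JSON '
            '--print_header=false Project:Dataset.destination_table '
            'gs://your-composer-bucket/dags/destination_table.json'))

    run_query >> export_to_dag_folder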