Python 3.x: how to export data stored in Google BigQuery to a GZ file


I use this code to export data to a CSV file, and it works:

from google.cloud import bigquery

project_id = 'project_id'
client = bigquery.Client()
dataset_id = 'dataset_id'
bucket_name = 'bucket_name'
table_id = 'table_id'

destination_uri = 'gs://{}/{}'.format(bucket_name, 'file.csv')
dataset_ref = client.dataset(dataset_id, project=project_id)
table_ref = dataset_ref.table(table_id)

extract_job = client.extract_table(
    table_ref,
    destination_uri)
extract_job.result()  # wait for the export job to finish

But I would prefer a GZ file, because my table is about 700 MB. Can someone help me export the data to a GZ file?

You need to add a job_config, similar to:

job_config = bigquery.job.ExtractJobConfig()
job_config.compression = 'GZIP'
Full code:

from google.cloud import bigquery
client = bigquery.Client()

project_id = 'fh-bigquery'
dataset_id = 'public_dump'
table_id = 'afinn_en_165'


bucket_name = 'your_bucket'

destination_uri = 'gs://{}/{}'.format(bucket_name, 'file.csv.gz')

dataset_ref = client.dataset(dataset_id, project=project_id)
table_ref = dataset_ref.table(table_id)

job_config = bigquery.job.ExtractJobConfig()
job_config.compression = 'GZIP'
extract_job = client.extract_table(
    table_ref,
    destination_uri,
    job_config=job_config,
)
extract_job.result()  # wait for the export job to finish
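Once the job finishes, you can download file.csv.gz from the bucket and decompress it on the fly with the standard library; no special handling is needed beyond opening it through gzip. A minimal local sketch (the sample rows here are made up, standing in for your real export):

```python
import csv
import gzip
import io

# Stand-in for the downloaded export; in practice you would fetch
# gs://your_bucket/file.csv.gz with gsutil or the Cloud Storage client.
rows = [["word", "score"], ["abandon", "-2"], ["ability", "2"]]

# Write a gzip-compressed CSV, the same on-disk format BigQuery's
# GZIP compression setting produces.
buf = io.BytesIO()
with gzip.open(buf, "wt", newline="") as f:
    csv.writer(f).writerows(rows)

# Read it back: gzip.open in text mode decompresses transparently,
# so csv.reader sees plain CSV lines.
buf.seek(0)
with gzip.open(buf, "rt", newline="") as f:
    data = list(csv.reader(f))

print(data[0])  # → ['word', 'score']
```

One caveat: a BigQuery extract to a single file is limited to 1 GB, so if your 700 MB table grows past that, use a wildcard URI such as gs://your_bucket/file-*.csv.gz and BigQuery will shard the output into multiple compressed files.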

Could you post that as a new question?