
Using a GCP connection object in a Google Cloud Composer DAG PythonOperator


I have a series of Python scripts that pull data from Google Sheets using the gspread library, authorized with a JSON service-account key file via the oauth2client library:

import gspread
import pandas as pd
from oauth2client.service_account import ServiceAccountCredentials

scopes = ['https://spreadsheets.google.com/feeds',
          'https://www.googleapis.com/auth/drive']
creds = ServiceAccountCredentials.from_json_keyfile_name(gcp_config_yaml_path, scopes)
client = gspread.authorize(creds)

# config and i come from the surrounding script (loaded from a YAML config file)
cur = config['tables_to_load'][i]

sheet = client.open_by_url(cur['spreadsheet_url']).worksheet(cur['sheet_name'])
df = pd.DataFrame(sheet.get_all_records())
I need to convert this into an Airflow DAG running on Google Cloud Composer, and I'd like to take advantage of Airflow's Connections feature.
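For context, a minimal sketch of how the existing script could be wrapped in a DAG with a PythonOperator. This assumes an Airflow 1.10.x-era Composer environment (hence the older import paths); the DAG id, schedule, and callable name are illustrative, not part of the original question.

```python
# Sketch of a Composer DAG wrapping the Sheets pull; assumes Airflow 1.10.x
# import paths. Names (sheets_to_df, pull_sheet) are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator


def pull_sheet(**context):
    # The gspread extraction logic from the standalone script would go here,
    # with credentials resolved from an Airflow connection instead of a
    # local key file.
    pass


dag = DAG(
    'sheets_to_df',
    start_date=datetime(2020, 1, 1),
    schedule_interval='@daily',
)

pull_task = PythonOperator(
    task_id='pull_sheet',
    python_callable=pull_sheet,
    provide_context=True,
    dag=dag,
)
```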

I have uploaded the JSON key file and created a connection object in the Airflow UI (per option "i" in step "d-iv" of "#2 Creating a new connection"), which I can reference in code with:

from airflow.hooks.base_hook import BaseHook

client = BaseHook.get_connection('google_cloud_default')
But that's about as far as I can get. Every time I try to access a parameter on the connection, I get an error saying it doesn't exist (keyfile_json, keyfile_dict, scopes, keyfile_path, client, spreadsheet, etc.), and I can't find any documentation on which attributes the object actually exposes :(
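One likely reason those attribute lookups fail: Airflow's `Connection` object only exposes generic fields (`login`, `password`, `host`, `extra`, ...), while GCP-specific values such as the keyfile live inside the JSON "extra" blob, reachable via `conn.extra_dejson`. A sketch of pulling the key out that way and handing it to gspread; the `extra__google_cloud_platform__keyfile_dict` field name matches Airflow 1.10.x's google_cloud_platform connection type and should be verified against your Composer version, and `example_extras` below is entirely hypothetical:

```python
# Sketch only: the extras field name matches Airflow 1.10.x's
# google_cloud_platform connection type; verify against your version.
import json

SCOPES = ['https://spreadsheets.google.com/feeds',
          'https://www.googleapis.com/auth/drive']


def keyfile_from_extras(extras):
    """Extract the service-account key dict from a connection's extras.

    GCP-specific values live in the connection's JSON 'extra' blob
    (exposed as the dict conn.extra_dejson), not as top-level attributes,
    which is why conn.keyfile_dict raises an error.
    """
    return json.loads(extras['extra__google_cloud_platform__keyfile_dict'])


def get_gspread_client(conn_id='google_cloud_default'):
    # Imports kept local so the sketch can be read without Airflow installed.
    import gspread
    from airflow.hooks.base_hook import BaseHook
    from oauth2client.service_account import ServiceAccountCredentials

    conn = BaseHook.get_connection(conn_id)
    keyfile_dict = keyfile_from_extras(conn.extra_dejson)
    creds = ServiceAccountCredentials.from_json_keyfile_dict(keyfile_dict, SCOPES)
    return gspread.authorize(creds)


# Hypothetical shape of conn.extra_dejson for such a connection:
example_extras = {
    'extra__google_cloud_platform__project': 'my-project',
    'extra__google_cloud_platform__keyfile_dict': json.dumps(
        {'type': 'service_account',
         'client_email': 'sa@my-project.iam.gserviceaccount.com'}),
}
print(keyfile_from_extras(example_extras)['type'])  # -> service_account
```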

Any insight into how to authorize a Google Sheets connection within a GCP Cloud Composer Airflow environment would be a huge help.

Thanks so much!