
Python 2.7: Memory error when reading data from Google BigQuery


I'm trying to read 1.8 million rows of data from Google BigQuery in Python, but I keep hitting a memory error. I'm using pandas_gbq:

import pandas_gbq as pgq

# Pulls the entire query result into a single in-memory DataFrame.
a = pgq.read_gbq(viewsql,
                 project_id=env_config['projectid'],
                 private_key=env_config['service_account_key_file'] + ".json")

In that case, you can read in chunks: df = pd.read_csv('filename', chunksize=5000), where chunksize is however many rows you want per chunk.

@anky_91 Can we do the same with a BigQuery table?

I think so. You can look it up on the web, or this link may help:
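For reference, a minimal sketch of what chunked reading from BigQuery could look like. This assumes the google-cloud-bigquery client library rather than pandas_gbq (read_gbq itself has no chunksize parameter); on Python 2.7 an older 1.x release of that library would be needed. viewsql and the key-file path are the question's own; process() is a hypothetical per-chunk handler:

from google.cloud import bigquery

client = bigquery.Client.from_service_account_json(
    env_config['service_account_key_file'] + ".json",
    project=env_config['projectid'],
)

# result(page_size=...) pages through the result set instead of
# materializing all 1.8M rows at once; to_dataframe_iterable()
# then yields one DataFrame per page.
rows = client.query(viewsql).result(page_size=50000)
for chunk in rows.to_dataframe_iterable():
    process(chunk)  # hypothetical handler, e.g. aggregate or write to disk

Because each chunk is dropped once processed, peak memory stays roughly one page of rows rather than the full result set.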