Upload a DataFrame as a JSON object to Cloud Storage


I have been trying to upload a Pandas DataFrame as a JSON object to Cloud Storage using a Cloud Function. Here is my code -

import pandas as pd
from google.cloud import storage

def upload_blob(bucket_name, source_file_name, destination_blob_name):
    """Uploads a file to the bucket."""
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)

    blob.upload_from_file(source_file_name)

    print('File {} uploaded to {}.'.format(
        source_file_name,
        destination_blob_name))

final_file = pd.concat([df, df_second], axis=0)
final_file.to_json('/tmp/abc.json')
with open('/tmp/abc.json', 'r') as file_obj:
    upload_blob('test-bucket', file_obj, 'abc.json')
I am getting the following error at the line blob.upload_from_file(source_file_name) -

Deployment failure:
Function failed on loading user code. Error message: Code in file main.py 
can't be loaded.
Detailed stack trace: Traceback (most recent call last):
File "/env/local/lib/python3.7/site- 
packages/google/cloud/functions/worker.py", line 305, in 
check_or_load_user_function
_function_handler.load_user_function()
File "/env/local/lib/python3.7/site- 
packages/google/cloud/functions/worker.py", line 184, in load_user_function
spec.loader.exec_module(main)
File "<frozen importlib._bootstrap_external>", line 728, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/user_code/main.py", line 6, in <module>
import datalab.storage as gcs
File "/env/local/lib/python3.7/site-packages/datalab/storage/__init__.py", 
line 16, in <module>
from ._bucket import Bucket, Buckets
File "/env/local/lib/python3.7/site-packages/datalab/storage/_bucket.py", 
line 21, in <module>
import datalab.context
File "/env/local/lib/python3.7/site-packages/datalab/context/__init__.py", 
line 15, in <module>
from ._context import Context
File "/env/local/lib/python3.7/site-packages/datalab/context/_context.py", 
line 20, in <module>
from . import _project
File "/env/local/lib/python3.7/site-packages/datalab/context/_project.py", 
line 18, in <module>
import datalab.utils
File "/env/local/lib/python3.7/site-packages/datalab/utils/__init__.py", 
line 15
from ._async import async, async_function, async_method
                        ^
SyntaxError: invalid syntax

What could the error be?

Use a bucket object instead of a string.

Something like upload_blob(conn.get_bucket('mybucket'), '/tmp/abc.json', 'abc.json')

You are passing it a string, but this method expects a file object. You probably want to use upload_from_filename instead. Check the samples.
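A minimal sketch of that change. The storage_client parameter and the _Fake* classes below are additions of mine so the call flow can be exercised locally without credentials; in the deployed function you would pass google.cloud.storage.Client() instead:

```python
def upload_blob(storage_client, bucket_name, source_file_name, destination_blob_name):
    """Upload a local file to a bucket by path.

    upload_from_filename accepts a path string, so there is no need
    to open() the file first.
    """
    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)
    blob.upload_from_filename(source_file_name)
    print('File {} uploaded to {}.'.format(
        source_file_name, destination_blob_name))
    return blob


# Minimal in-memory stand-ins (NOT part of the real API) so the flow
# can be run without network access or credentials.
class _FakeBlob:
    def __init__(self, name):
        self.name, self.contents = name, None

    def upload_from_filename(self, path):
        with open(path, 'rb') as f:
            self.contents = f.read()


class _FakeBucket:
    def blob(self, name):
        return _FakeBlob(name)


class _FakeClient:
    def get_bucket(self, name):
        return _FakeBucket()


# Demo with the fakes: write a local file, then "upload" it.
with open('/tmp/abc.json', 'w') as f:
    f.write('{"a": 1}')
blob = upload_blob(_FakeClient(), 'test-bucket', '/tmp/abc.json', 'abc.json')
```

Injecting the client also makes the function straightforward to unit-test, which is why the sketch takes it as a parameter rather than constructing it inside.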

Alternatively, you can get a file object and keep using blob.upload_from_file, but that is an unnecessary extra line:

with open('/tmp/abc.json', 'r') as file_obj:
    upload_blob('test-bucket', file_obj, 'abc.json')
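Either way, the /tmp/abc.json write itself can be sanity-checked locally before involving Cloud Storage at all. A small sketch, with placeholder frames standing in for the question's df and df_second:

```python
import json
import pandas as pd

# Placeholder frames standing in for the question's df and df_second.
df = pd.DataFrame({'a': [1, 2]})
df_second = pd.DataFrame({'a': [3, 4]})

# ignore_index=True avoids duplicate index labels, which to_json's
# default orient='columns' rejects with a ValueError.
final_file = pd.concat([df, df_second], axis=0, ignore_index=True)
final_file.to_json('/tmp/abc.json')

# The file on disk is plain JSON: column -> {index label -> value}.
with open('/tmp/abc.json') as file_obj:
    data = json.load(file_obj)
```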
I used it inside the upload_blob function.

You can do it in one line: import datalab.storage as gcs, then gcs.Bucket('bucketName').item('to/data.csv').write_to(simplified_dataframe, 'text/csv').

After adding this, the function was not deployed. I am getting the error - Function failed on loading user code. Error message: Code in file main.py can't be loaded.

Try running the previous code, but just change blob.upload_from_file to blob.upload_from_filename.
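For what it's worth, the deployment failure in the traceback is unrelated to the upload call: async and await became reserved keywords in Python 3.7, so the datalab release shown, which still uses async as an ordinary name, cannot even be imported on the Python 3.7 runtime. A quick way to see this:

```python
# `async` is a reserved keyword in Python 3.7+, so code that uses it
# as an ordinary name (like the datalab line in the traceback) fails
# at compile time, before anything runs.
source = "from ._async import async, async_function, async_method"
try:
    compile(source, "datalab/utils/__init__.py", "exec")
    outcome = "compiled"
except SyntaxError:
    outcome = "SyntaxError"

print(outcome)
```

That is why dropping the datalab import and sticking with google-cloud-storage (with upload_from_filename, as suggested above) resolves the deployment error.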