
How to append data to an existing CSV file in AWS S3 using Python boto3


I have a CSV file in S3, and every time my function is called I need to append data to that file, but I haven't been able to do so.

df = pd.DataFrame(data_list)
bytes_to_write = df.to_csv(None, header=None, index=False).encode()
file_name = "Words/word_dictionary.csv" # Not working the below line
s3_client.put_object(Body=bytes_to_write, Bucket='recengine', Key=file_name)

This code replaces the data in the file instead of appending to it. Is there any solution?

Neither S3 nor boto3 has an append operation. You need to read the file from S3, append the new data to it in your code, and then upload the complete file back to the same key in S3.

See this for more details.

The code could look something like this:

df = pd.DataFrame(data_list)
bytes_to_write = df.to_csv(None, header=None, index=False).encode()
file_name = "Words/word_dictionary.csv"

# get the existing file contents as bytes
current_data = s3_client.get_object(Bucket='recengine', Key=file_name)['Body'].read()
# append the new rows
appended_data = current_data + bytes_to_write
# overwrite the object with the combined data
s3_client.put_object(Body=appended_data, Bucket='recengine', Key=file_name)
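
If the object might not exist yet (for example, on the very first call), the read step can be wrapped in a try/except so the same function still works. Below is a minimal self-contained sketch of that idea; the helper name append_rows_to_s3_csv is just illustrative, and the bucket and key are the ones from the question.

import boto3
import pandas as pd
from botocore.exceptions import ClientError

s3_client = boto3.client('s3')

def append_rows_to_s3_csv(data_list, bucket='recengine', key='Words/word_dictionary.csv'):
    """Read the existing CSV (if any), append the new rows, and write it back."""
    df = pd.DataFrame(data_list)
    bytes_to_write = df.to_csv(None, header=None, index=False).encode()

    try:
        # fetch the current contents of the object
        current_data = s3_client.get_object(Bucket=bucket, Key=key)['Body'].read()
    except ClientError as e:
        if e.response['Error']['Code'] == 'NoSuchKey':
            # the object does not exist yet, so start from empty
            current_data = b''
        else:
            raise

    # overwrite the object with the old data plus the new rows
    s3_client.put_object(Body=current_data + bytes_to_write, Bucket=bucket, Key=key)

Note that this read-modify-write pattern is not atomic: if two callers append at the same time, one update can overwrite the other.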