Python Boto3 AWS multipart upload syntax


I'm successfully authenticating with AWS and uploading files using the `put_object` method on a Bucket object. Now I'd like to use the multipart API to handle large files. I found the accepted answer to this question:

But when I try to implement it, I get an "unknown method" error. What am I doing wrong? My code is below. Thanks!

## Get an AWS Session
self.awsSession = Session(aws_access_key_id=accessKey,
                          aws_secret_access_key=secretKey,
                          aws_session_token=session_token,
                          region_name=region_type)

 ...          

# Upload the file to S3
s3 = self.awsSession.resource('s3')
s3.Bucket('prodbucket').put_object(Key=fileToUpload, Body=data) # WORKS
#s3.Bucket('prodbucket').upload_file(dataFileName, 'prodbucket', fileToUpload) # DOESNT WORK
#s3.upload_file(dataFileName, 'prodbucket', fileToUpload) # DOESNT WORK

The `upload_file` method has not been ported over to the bucket resource yet. For now, you'll need to use the client object directly to do this:

client = self.awsSession.client('s3')
client.upload_file(...)
This transparently handles splitting the file into parts and uploading them for you.

You can do this with the `upload_object_via_stream` method:

from libcloud.storage.types import Provider
from libcloud.storage.providers import get_driver

# Path to a very large file you want to upload
FILE_PATH = '/home/user/myfile.tar.gz'

cls = get_driver(Provider.S3)
driver = cls('api key', 'api secret key')

container = driver.get_container(container_name='my-backups-12345')

# This method blocks until all the parts have been uploaded.
extra = {'content_type': 'application/octet-stream'}

with open(FILE_PATH, 'rb') as iterator:
    obj = driver.upload_object_via_stream(iterator=iterator,
                                          container=container,
                                          object_name='backup.tar.gz',
                                          extra=extra)

For the official documentation on the S3 multipart functionality, see:

Have you seen the new high-level interface for file uploads in boto3? See for details, but it makes multipart uploads much easier.