
How do I upload very large files to S3?


I have a PostgreSQL backup of about 100 GB that I want to upload to S3 in EU (Frankfurt) and then restore into a cloud database.

I do not have access to the AWS Import/Export service, and I am working from an Ubuntu laptop.

Strategies I have tried:

1) Management console upload: would take at least two weeks.
2) Bucket Explorer multipart upload: the task failed with a Java memory error every time.
3) SDK multipart upload (boto, boto3, the Java SDK): no progress bar is shown, so I cannot estimate how long it will take (see the sketch after this list).
4) Other Windows-only explorer tools: no Linux version available.
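
On the progress-bar complaint in (3): boto3's upload_file accepts a Callback that is invoked with the number of bytes sent so far, which is enough to print a running percentage. A minimal sketch, assuming a bucket named BUCKETNAME in Frankfurt (eu-central-1); the chunk size and concurrency values are illustrative, not tuned:

import os
import sys
import threading

import boto3
from boto3.s3.transfer import TransferConfig

class ProgressPercentage:
    # Prints cumulative upload progress; boto3 calls this from worker threads.
    def __init__(self, filename):
        self._filename = filename
        self._size = float(os.path.getsize(filename))
        self._seen_so_far = 0
        self._lock = threading.Lock()

    def __call__(self, bytes_amount):
        with self._lock:
            self._seen_so_far += bytes_amount
            pct = self._seen_so_far / self._size * 100
            sys.stdout.write("\r%s: %d / %d bytes (%.2f%%)"
                             % (self._filename, self._seen_so_far, self._size, pct))
            sys.stdout.flush()

s3 = boto3.client("s3", region_name="eu-central-1")  # Frankfurt
config = TransferConfig(
    multipart_threshold=64 * 1024 * 1024,  # switch to multipart above 64 MB
    multipart_chunksize=64 * 1024 * 1024,  # 64 MB per part
    max_concurrency=10,                    # parts uploaded in parallel
)
s3.upload_file("/PATH_TO_BACKUP/BACKUP_FILE", "BUCKETNAME", "BACKUP_FILE",
               Config=config, Callback=ProgressPercentage("/PATH_TO_BACKUP/BACKUP_FILE"))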

What is the fastest way to load it into S3? A code snippet in Python or Java would also be welcome. Many thanks.

The simplest solution is to use the AWS CLI:

aws s3 cp /PATH_TO_BACKUP/BACKUP_FILE s3://BUCKETNAME
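
For a file this size the CLI splits the upload into parallel multipart chunks automatically; the transfer settings can be tuned in ~/.aws/config. A minimal sketch with illustrative values:

[default]
s3 =
    multipart_chunksize = 64MB
    max_concurrent_requests = 20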

You can also use the MinIO client (mc):


mc cp backup.zip s3/bucketname

What if the connection is lost partway through uploading a large file, say a 50 GB one? Yes, in that case the whole file has to be uploaded again.
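
That is the worst case for a single-shot copy, but the parts of an unfinished multipart upload stay on S3 until the upload is completed or aborted, so an interrupted transfer can in principle be resumed rather than restarted. A minimal boto3 sketch for inspecting interrupted uploads, reusing the example bucket name from above:

import boto3

s3 = boto3.client("s3", region_name="eu-central-1")

# Multipart uploads that were started but never completed or aborted.
for upload in s3.list_multipart_uploads(Bucket="BUCKETNAME").get("Uploads", []):
    print(upload["Key"], upload["UploadId"])
    # Parts that already reached S3; a resume would call upload_part()
    # only for the part numbers missing from this list, then finish with
    # complete_multipart_upload() and the full part manifest.
    parts = s3.list_parts(Bucket="BUCKETNAME", Key=upload["Key"],
                          UploadId=upload["UploadId"])
    for part in parts.get("Parts", []):
        print(part["PartNumber"], part["Size"])

Note that parts of abandoned multipart uploads are billed as storage until the upload is aborted (or removed by a bucket lifecycle rule), so uploads you do not intend to resume are worth aborting explicitly.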