Python 3.x: How do I delete all files from Linode S3 with boto3?

Tags: python-3.x, amazon-s3, boto3

I can successfully create files in an object storage bucket on Linode, but when I try to delete all the files in that bucket, an error is raised.

import boto3
cfg = {
    "aws_access_key_id":"XXXXXXXXXXXXXXXXXX",
    "aws_secret_access_key": "XXXXXXXXXXXXXXXXXXXXXXXX",
    "endpoint_url": "*********************",
}

S3_BUCKET = "test"

# empty existing bucket
def empty_s3_bucket():
  client = boto3.client(
    's3',
    **cfg,
  )
  response = client.list_objects_v2(Bucket=S3_BUCKET)
  if 'Contents' in response:
    for item in response['Contents']:
      print('deleting file', item['Key'])
      client.delete_object(Bucket=S3_BUCKET, Key=item['Key'])
      while response['KeyCount'] == 1000:
        response = client.list_objects_v2(
          Bucket=S3_BUCKET,
          StartAfter=response['Contents'][0]['Key'],
        )
        for item in response['Contents']:
          print('deleting file', item['Key'])
          client.delete_object(Bucket=S3_BUCKET, Key=item['Key'])

empty_s3_bucket()
The code above fails to delete all the files in the object storage, although deleting a single file with different logic works fine. Running it produces the following error:

Traceback (most recent call last):
  File "c:/********/linode_empty.py", line 30, in <module>
    empty_s3_bucket()
  File "c:/*********/linode_empty.py", line 16, in empty_s3_bucket
    response = client.list_objects_v2(Bucket=S3_BUCKET)        
  File "C:\********\venv\lib\site-packages\botocore\client.py", line 357, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "C:\*******\venv\lib\site-packages\botocore\client.py", line 676, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.errorfactory.NoSuchKey: An error occurred (NoSuchKey) when calling the ListObjectsV2 operation: Unknown
I have tried different code suggested in older Stack Overflow posts, but I get the same error.
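As an aside on the loop in the question: `list_objects_v2` pagination is normally driven by `ContinuationToken`/`NextContinuationToken` rather than by passing `StartAfter` inside the per-item loop, and up to 1000 keys can be removed per `delete_objects` call. A minimal sketch of that approach (the function name is mine; `client` is any boto3 S3 client, e.g. one built from the `cfg` above):

```python
def delete_all_objects(client, bucket):
    """Delete every object in `bucket`, up to 1000 keys per delete_objects call.

    Returns the number of keys deleted.
    """
    kwargs = {"Bucket": bucket}
    deleted = 0
    while True:
        page = client.list_objects_v2(**kwargs)
        contents = page.get("Contents", [])  # absent key means an empty bucket
        if contents:
            client.delete_objects(
                Bucket=bucket,
                Delete={"Objects": [{"Key": obj["Key"]} for obj in contents]},
            )
            deleted += len(contents)
        if not page.get("IsTruncated"):
            return deleted
        # Resume listing where the previous page ended
        kwargs["ContinuationToken"] = page["NextContinuationToken"]
```

This avoids both the nested `while` inside the `for` loop and the `StartAfter=response['Contents'][0]['Key']` restart in the question's code.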

Try this: it collects all the keys and then bulk-deletes them 1000 at a time.

import math
import boto3

s3sr = boto3.resource('s3')
s3sc = boto3.client('s3')

def get_list_of_keys_from_prefix(bucket, prefix):
    """gets list of keys for given bucket and prefix"""
    keys_list = []
    paginator = s3sr.meta.client.get_paginator('list_objects_v2')
    # use Delimiter to limit search to that level of hierarchy
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix, Delimiter='/'):
        # default to [] so an empty page doesn't raise TypeError
        keys = [content['Key'] for content in page.get('Contents', [])]
        # print('keys in page: ', len(keys))
        keys_list.extend(keys)
        # print(keys_list)
    print('total keys in bucket: ', len(keys_list))
    return keys_list

bucket = 'test'
prefix = '' #if you have 'subfolders' enter the prefix, otherwise use ''

keys_list = get_list_of_keys_from_prefix(bucket, prefix)
# print(keys_list)

total_keys = len(keys_list)
chunk_size = 1000
num_batches = math.ceil(total_keys / chunk_size) 



for b in range(0, num_batches):
    batch_to_delete = []
    for k in keys_list[chunk_size*b:chunk_size*b+chunk_size]:
        batch_to_delete.append({'Key': k})
        # print({'Key': k})
    # print(batch_to_delete)
    s3sc.delete_objects(Bucket=bucket, Delete={'Objects': batch_to_delete, 'Quiet': True})