Amazon S3: how to get around SlowDown errors when using put_object

I'm trying to access a file in S3 with boto and then update it, but I still get a SlowDown error even with pauses between the requests, as in the code below. What should I do about this?

body = b'Here we have some more data'
s3.put_object(Body=body,Bucket=bucket, Key=key)
time.sleep(10)
response = s3.get_object(Bucket=bucket, Key=key)
time.sleep(10)
print(response["Body"].read().decode('utf-8'))
currFile = response["Body"].read().decode('utf-8')
newFile = currFile + "\n" + "New Stuff!!!"
newFileB = newFile.encode('utf-8')
time.sleep(60)
s3.put_object(Body=newFileB,Bucket=bucket, Key=key)
time.sleep(10)
response = s3.get_object(Bucket=bucket, Key=key)
print(response["Body"].read().decode('utf-8'))
Here is the error:

Details
The area below shows the result returned by your function execution.
{
  "errorMessage": "An error occurred (SlowDown) when calling the PutObject operation (reached max retries: 4): Please reduce your request rate.",
  "errorType": "ClientError",
  "stackTrace": [
    ["/var/task/lambda_function.py", 43, "lambda_handler", "raise e"],
    ["/var/task/lambda_function.py", 20, "lambda_handler", "s3.put_object(Body=body,Bucket=bucket, Key=key)"],
    ["/var/runtime/botocore/client.py", 314, "_api_call", "return self._make_api_call(operation_name, kwargs)"],
    ["/var/runtime/botocore/client.py", 612, "_make_api_call", "raise error_class(parsed_response, operation_name)"]
  ]
}

I have run into this problem myself. I don't know why it happens, but it is something production code has to deal with. The fix is to keep retrying and not give up for quite a while. On failure, the loop starts with a 1-second delay and then adds one more second on each pass (the delay increment), capping any single wait at 30 seconds; by the time it finally gives up, the maximum total delay works out to about 7.5 minutes. Of course, you can tune the timings however you like. So far this has worked for me.

I have had to do something similar even for a NAS file server, because sometimes I have to wait a while before a file can be read.

import time

import boto3
from botocore.exceptions import ClientError


def put_s3_core(bucket, key, strobj, content_type=None):
    """ Write strobj to s3 bucket, key.
        content_type can be:
            binary/octet-stream (default)
            text/plain
            text/html
            text/csv
            image/png
            image/tiff
            application/pdf
            application/zip
    """
    delay = 1       # initial delay
    delay_incr = 1  # additional delay in each loop
    max_delay = 30  # max delay of one attempt; total delay is roughly (max_delay**2)/2
    last_err = None

    while delay < max_delay:
        try:
            s3 = boto3.resource('s3')
            request_obj = s3.Object(bucket, key)
            if content_type:
                request_obj.put(Body=strobj, ContentType=content_type)
            else:
                request_obj.put(Body=strobj)
            break
        except ClientError as err:
            # Typically a SlowDown error: back off and try again.
            last_err = err
            time.sleep(delay)
            delay += delay_incr
    else:
        # Retries exhausted: re-raise the last error.
        raise last_err
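
For reference, here is a minimal sketch of how the helper might be called from the Lambda handler in the question. The bucket and key values are placeholders, not real resource names, and the text/plain content type is an assumption. With a 1-second start, a 1-second increment and a 30-second cap, the waits sum to 1 + 2 + ... + 29 = 435 seconds, so the helper gives up after roughly 7 minutes of total delay.

import boto3

# Hypothetical bucket and key; placeholders only.
bucket = 'my-bucket'
key = 'some/file.txt'

s3 = boto3.client('s3')
current = s3.get_object(Bucket=bucket, Key=key)['Body'].read().decode('utf-8')
new_body = (current + '\nNew Stuff!!!').encode('utf-8')
put_s3_core(bucket, key, new_body, content_type='text/plain')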
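
The same retry pattern also covers reads, which is what the NAS comparison above is about: sometimes the object simply is not readable yet and you have to wait. A minimal read-side sketch, not part of the original answer, using the same delay settings:

import time

import boto3
from botocore.exceptions import ClientError


def get_s3_core(bucket, key):
    """ Read bytes from s3 bucket, key, retrying on ClientError (e.g. SlowDown). """
    delay = 1       # initial delay
    delay_incr = 1  # additional delay in each loop
    max_delay = 30  # max delay of one attempt
    last_err = None

    while delay < max_delay:
        try:
            s3 = boto3.resource('s3')
            return s3.Object(bucket, key).get()['Body'].read()
        except ClientError as err:
            last_err = err
            time.sleep(delay)
            delay += delay_incr
    # Retries exhausted: re-raise the last error.
    raise last_err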

How long has this been happening? How old is the bucket? What is your current request rate (requests per second)? The bucket has only existed for a few days. Where is the best place to track requests per second?
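
On the last question, about where to track requests per second: one place to look is CloudWatch S3 request metrics, which report an AllRequests count per bucket but have to be enabled first. A minimal sketch, assuming boto3 and a bucket-wide metrics filter with the id 'EntireBucket' (the filter id and bucket name are assumptions, not details from this thread):

import datetime

import boto3

bucket = 'my-bucket'  # placeholder bucket name

s3 = boto3.client('s3')
cloudwatch = boto3.client('cloudwatch')

# Enable request metrics for the whole bucket; they take a while to start reporting.
s3.put_bucket_metrics_configuration(
    Bucket=bucket,
    Id='EntireBucket',
    MetricsConfiguration={'Id': 'EntireBucket'},
)

# Query AllRequests per minute; dividing each Sum by 60 gives requests per second.
end = datetime.datetime.utcnow()
start = end - datetime.timedelta(hours=1)
stats = cloudwatch.get_metric_statistics(
    Namespace='AWS/S3',
    MetricName='AllRequests',
    Dimensions=[
        {'Name': 'BucketName', 'Value': bucket},
        {'Name': 'FilterId', 'Value': 'EntireBucket'},
    ],
    StartTime=start,
    EndTime=end,
    Period=60,
    Statistics=['Sum'],
)
for point in sorted(stats['Datapoints'], key=lambda p: p['Timestamp']):
    print(point['Timestamp'], point['Sum'] / 60.0, 'req/s')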