Amazon web services: Using create_export_task to export logs from CloudWatch to S3 without losing logs

Tags: amazon-web-services, amazon-s3

I am trying to export the last 15 minutes of access logs from CloudWatch to an S3 bucket. When I run the code below, it does store logs into S3, but many logs are missing. For example, over the past 15 minutes there are 30 logs in CloudWatch, while only about 3 of them end up in S3.

import math
import boto3
from datetime import datetime, timedelta

group_name = '/aws/elasticbeanstalk/my-env/var/app/current/storage/logs/laravel.log'

s3 = boto3.client('s3')
log_file = boto3.client('logs')

# Export the window covering the last 15 minutes, truncated to whole minutes.
now = datetime.now()
deletionDate = now - timedelta(minutes=15)

# CloudWatch Logs expects timestamps in milliseconds since the epoch.
start = math.floor(deletionDate.replace(second=0, microsecond=0).timestamp() * 1000)
end = math.floor(now.replace(second=0, microsecond=0).timestamp() * 1000)

destination_bucket = 'past15mins-log'
prefix = 'lambda2-test-log/' + str(start) + '-' + str(end)

response = log_file.create_export_task(
                logGroupName=group_name,
                fromTime=start,
                to=end,
                destination=destination_bucket,
                destinationPrefix=prefix
            )
if response['ResponseMetadata']['HTTPStatusCode'] != 200:
    raise Exception('fail ' + str(start) + ' - ' + str(end))
The documentation says this is an asynchronous call. Could the problem be caused by the fact that there are 3 servers the logs come from?
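Since create_export_task is asynchronous, a 200 response only means the task was accepted, not that the export has finished. A minimal sketch of starting the export and polling describe_export_tasks until the task completes (the function name, parameters, and the idea of passing the client in are my own; the boto3 calls themselves are the documented CloudWatch Logs API):

```python
import time


def export_and_wait(logs_client, group_name, start, end,
                    bucket, prefix, poll_seconds=5, timeout=300):
    """Start a CloudWatch Logs export task and block until it finishes.

    `logs_client` is a boto3 CloudWatch Logs client (or anything with the
    same create_export_task / describe_export_tasks methods).
    """
    response = logs_client.create_export_task(
        logGroupName=group_name,
        fromTime=start,
        to=end,
        destination=bucket,
        destinationPrefix=prefix,
    )
    task_id = response['taskId']

    waited = 0
    while waited < timeout:
        # Poll the task status until it reaches a terminal state.
        tasks = logs_client.describe_export_tasks(taskId=task_id)
        status = tasks['exportTasks'][0]['status']['code']
        if status == 'COMPLETED':
            return task_id
        if status in ('CANCELLED', 'FAILED'):
            raise RuntimeError('Export task %s ended with %s' % (task_id, status))
        time.sleep(poll_seconds)
        waited += poll_seconds

    raise TimeoutError('Export task %s did not finish in %ss' % (task_id, timeout))
```

Note also that only one export task can be active per account at a time, so overlapping invocations (for example from a Lambda firing every 15 minutes while a previous export is still running) would fail or skip windows, which could look like missing logs.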

Thanks in advance.