KeyError: 'awslogs' ... outEvent = str(event['awslogs']['data']) - Python

I get the following error:

Traceback (most recent call last):
  File "/var/task/lambda_function.py", line 22, in lambda_handler
    outEvent = str(event['awslogs']['data'])
KeyError: 'awslogs'

Code being used:
import base64
import gzip
import json
import logging
import time

import boto3

logger = logging.getLogger()
logger.setLevel(logging.INFO)
s3 = boto3.client('s3')

def lambda_handler(event, context):
    # Set the S3 bucket, folder, and file-name prefix
    bucketS3 = 'test-flowlogs'
    folderS3 = 'ArcSight'
    prefixS3 = 'AW1Logs_'
    # Capture the CloudWatch log data (base64-encoded, gzipped JSON)
    outEvent = event['awslogs']['data']
    # Decode and unzip the log data
    outEvent = gzip.decompress(base64.b64decode(outEvent))
    # Convert the log data from JSON into a dictionary
    cleanEvent = json.loads(outEvent)
    # Create the S3 object key
    key = folderS3 + '/' + prefixS3 + str(int(time.time())) + ".log"
    # Loop through the log events and write them to a temp file
    with open('/tmp/file', 'w') as tempFile:
        for t in cleanEvent['logEvents']:
            # Transform each event into a CEF record
            f = t['extractedFields']
            tempFile.write(
                "CEF:0|AWS CloudWatch|FlowLogs|1.0"
                "|src=" + str(f['srcaddr']) +
                "|spt=" + str(f['srcport']) +
                "|dst=" + str(f['dstaddr']) +
                "|dpt=" + str(f['dstport']) +
                "|proto=" + str(f['protocol']) +
                "|start=" + str(f['start']) +
                "|end=" + str(f['end']) +
                "|out=" + str(f['bytes']) + "\n")
    # Upload the file to S3 (upload_file returns None on success)
    s3Results = s3.upload_file('/tmp/file', bucketS3, key)
    print(s3Results)
I'm trying to get CloudWatch logs into an S3 bucket.

Any help is appreciated!

Thanks,
Shane

You are testing your function with an event that does not contain event['awslogs']['data']. That key is only present when CloudWatch Logs itself triggers the Lambda function, as in the example below:
{
"awslogs": {
"data": "H4sIAAAAAAAAAHWPwQqCQBCGX0Xm7EFtK+smZBEUgXoLCdMhFtKV3akI8d0bLYmibvPPN3wz00CJxmQnTO41whwWQRIctmEcB6sQbFC3CjW3XW8kxpOpP+OC22d1Wml1qZkQGtoMsScxaczKN3plG8zlaHIta5KqWsozoTYw3/djzwhpLwivWFGHGpAFe7DL68JlBUk+l7KSN7tCOEJ4M3/qOI49vMHj+zCKdlFqLaU2ZHV2a4Ct/an0/ivdX8oYc1UVX860fQDQiMdxRQEAAA=="
}
}
If you want to test the function manually, make sure you go to "Actions", "Configure test event", and pick the "CloudWatch Logs" option from the sample event templates.
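You can also sanity-check the payload format locally by decoding the sample event's "data" field the same way the handler does. A minimal sketch, using the base64 string from the sample event above:

```python
import base64
import gzip
import json

# The base64 "data" value from the sample CloudWatch Logs event above
sample_data = (
    "H4sIAAAAAAAAAHWPwQqCQGFtK+smZBEUgXoLCdMhFtKV3akI8d0bLYmibvPPN3wz"
    "00CJxmQnTO41whwWQRIctmEcB6sQbFC3CjW3XW8kxpOpP+OC22d1Wml1qZkQGtoM"
)
sample_data = (
    "H4sIAAAAAAAAAHWPwQqCQBCGX0Xm7EFtK+smZBEUgXoLCdMhFtKV3akI8d0bLYmi"
    "bvPPN3wz00CJxmQnTO41whwWQRIctmEcB6sQbFC3CjW3XW8kxpOpP+OC22d1Wml1"
    "qZkQGtoMsScxaczKN3plG8zlaHIta5KqWsozoTYw3/djzwhpLwivWFGHGpAFe7DL"
    "68JlBUk+l7KSN7tCOEJ4M3/qOI49vMHj+zCKdlFqLaU2ZHV2a4Ct/an0/ivdX8oY"
    "c1UVX860fQDQiMdxRQEAAA=="
)

# Decode and decompress exactly as the Lambda handler does
payload = json.loads(gzip.decompress(base64.b64decode(sample_data)))

# The decoded payload is a dict containing a 'logEvents' list
for e in payload['logEvents']:
    print(e['message'])
```

Note that this generic sample's log events carry only id/timestamp/message fields; the handler above also reads extractedFields, so exercising the full loop needs a real VPC Flow Logs subscription event.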