Python AWS Lambda: ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden

Tags: python, amazon-web-services, amazon-s3, amazon-sns

I am new to AWS. I am trying to process an xlsx file using the following AWS services: an S3 bucket to store the file, an SNS subscription, and an AWS Lambda function written in Python. Below is my code:

import json
import boto3
import pandas as pd
import os, tempfile
import sys
import uuid
from urllib.parse import unquote_plus

s3_client = boto3.client('s3')

def lambda_handler(event, context):
    print('coming here')
    message = event['Records'][0]['Sns']['Message']
    bucket = 'bucket1'
    newbucket = 'bucket2'
    jmessage = json.loads(message)
    key = unquote_plus(jmessage["Records"][0]['s3']['object']['key'])
    directory_name = tempfile.mkdtemp()
    download_path = os.path.join(directory_name, 'EXAMPLE2.xlsx')
    print(download_path)
    newkey= 'cleaned.csv'
    upload_path = os.path.join(directory_name, newkey)
    s3_client.download_file(bucket, key, download_path)
    df = pd.read_excel(download_path, skiprows=3)
    header2 = ['K', 'GEN STATUS']
    df.to_csv(upload_path, columns=header2, index=False)
    s3_client.upload_file(upload_path, newbucket, newkey)

    sns = boto3.client('sns')
    response = sns.publish(
        TopicArn='arn:aws:lambda:us-east-1:<id>:function:DataClean',
        Message='Data is cleaned and save into bucket Cleaned-data. Auto data ingestion is running.'
    )
    return {
        'statusCode': 200,
        'body': json.dumps('Done with cleansing!!')
    }
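For reference, the SNS-wrapped event that the handler parses looks roughly like this. This is a sketch with made-up bucket/key placeholders, showing why the code decodes a JSON string out of the SNS message and then undoes the URL-encoding of the object key:

```python
import json
from urllib.parse import unquote_plus

# The S3 notification reaches Lambda wrapped inside an SNS message, so the
# S3 record sits JSON-encoded in event['Records'][0]['Sns']['Message'].
# Bucket and key values here are placeholders, not real resources.
s3_notification = {
    "Records": [
        {"s3": {"bucket": {"name": "bucket1"},
                "object": {"key": "reports/EXAMPLE+2.xlsx"}}}
    ]
}
event = {
    "Records": [
        {"Sns": {"Message": json.dumps(s3_notification)}}
    ]
}

# Same extraction steps as in lambda_handler: decode the inner JSON, then
# undo S3's URL-encoding of the object key (spaces arrive as '+').
jmessage = json.loads(event["Records"][0]["Sns"]["Message"])
key = unquote_plus(jmessage["Records"][0]["s3"]["object"]["key"])
print(key)  # reports/EXAMPLE 2.xlsx
```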
This is what my bucket policy looks like:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::<id>:root"
            },
            "Action": "s3:*",
            "Resource": "arn:aws:s3:::bucket1/*"
        }
    ]
}

So what is going wrong here?

Your Lambda role does not have permission to access the S3 bucket to download the object.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "logs:PutLogEvents",
                "logs:CreateLogGroup",
                "logs:CreateLogStream"
            ],
            "Resource": "arn:aws:logs:*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": "arn:aws:s3:::bucket1/*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject"
            ],
            "Resource": "arn:aws:s3:::bucket2/*"
        }
    ]
}
Note: make sure that both the IAM role and the S3 bucket policy allow the Lambda access.
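As a rough sketch of how the execution-role part could be applied with boto3: the snippet below builds an inline policy granting read on bucket1 and write on bucket2 and shows the (commented-out) call that would attach it. The role name and policy name are made-up placeholders; substitute your function's actual execution role.

```python
import json

# Minimal S3 permissions matching the buckets used in the handler code.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow",
         "Action": ["s3:GetObject"],
         "Resource": "arn:aws:s3:::bucket1/*"},
        {"Effect": "Allow",
         "Action": ["s3:PutObject"],
         "Resource": "arn:aws:s3:::bucket2/*"},
    ],
}

print(json.dumps(policy, indent=2))

# Attaching it as an inline policy would look like this (requires IAM
# permissions, so it is left commented out here):
# import boto3
# iam = boto3.client('iam')
# iam.put_role_policy(
#     RoleName='DataCleanLambdaRole',        # hypothetical role name
#     PolicyName='S3ReadWriteForDataClean',  # hypothetical policy name
#     PolicyDocument=json.dumps(policy),
# )
```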


How do I grant Lambda permission to download objects? I am new to AWS, so please bear with me. I have also edited the question to include the S3 bucket policy.

@Saani I attached the documentation link to the answer, take a look. I can help you solve this.

@Saani In your policy you allow one bucket, but in the code you have two buckets: bucket = 'bucket1' and newbucket = 'bucket2'.

@Saani If this works for you, could you mark this question as solved? That helps the community and anyone who stumbles upon this question in the future.

Yes, I will do that, but I am still working on implementing the solution. I am still getting the same error. I tried to follow the tutorial you suggested.