AWS Lambda: How to pass variables into a Lambda function using CloudFormation


How can I pass variables into a Lambda function with CloudFormation?

I haven't found a way to pass in variables that the function can later read with os.environ['key'].

---
AWSTemplateFormatVersion: '2010-09-09'
Description: 'Copies objects from the Prod bucket to the Dev data bucket'
Parameters:
  CustomerName:
    Description: Customer Name
    Type: String
    Default: incoming
  ProjectName:
    Description: Project Name
    Type: String
    Default: TEST
  ENV:
    Description: Environment (dev, prd)
    Type: String
    Default: dev
  srcBucket:
    Description: Source Bucket that receives data from outside
    Default: source1
    Type: String
  dstBucket:
    Description: Destination Bucket that will receive the copied objects
    Type: String
    Default: destination1
Resources:
  LambdaRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
        - Effect: Allow
          Principal:
            Service:
            - lambda.amazonaws.com
            - s3.amazonaws.com
          Action:
          - sts:AssumeRole
      Path:
        Fn::Sub: "/${ProjectName}/"
      Policies:
      - PolicyName:
          Fn::Sub: "${AWS::StackName}"
        PolicyDocument:
          Version: '2012-10-17'
          Statement:
          - Sid: AllowLogging
            Effect: Allow
            Action:
            - logs:CreateLogGroup
            - logs:CreateLogStream
            - logs:PutLogEvents
            Resource: "*"
          - Sid: SrcBucketPrivs
            Action:
            - s3:GetObject
            - s3:List*
            Resource:
            - Fn::Sub: arn:aws:s3:::${srcBucket}/*
            - Fn::Sub: arn:aws:s3:::${srcBucket}
            Effect: Allow
          - Sid: DstBucketPrivs
            Action:
            - s3:PutObject
            - s3:List*
            Resource:
            - Fn::Sub: arn:aws:s3:::${dstBucket}/*
            - Fn::Sub: arn:aws:s3:::${dstBucket}
            Effect: Allow
  LambdaFunction:
    Type: AWS::Lambda::Function
    DependsOn: LambdaRole
    Properties:
      Code:
        ZipFile: |
           from __future__ import print_function
           import os
           import json
           import boto3
           import time
           import string
           import urllib
           print('Loading function')
           s3 = boto3.client('s3')
           def handler(event, context):
              source_bucket = event['Records'][0]['s3']['bucket']['name']
              key = event['Records'][0]['s3']['object']['key']


              target_bucket = Ref: dstBucket   # this is what I cannot figure out: how do I reference the dstBucket parameter here?
              copy_source = {'Bucket':source_bucket, 'Key':key}

              try:
                s3.copy_object(Bucket=target_bucket, Key=key, CopySource=copy_source)

              except Exception as e:
                print(e)
                print('Error getting object {} from bucket {}. Make sure they exist '
                   'and your bucket is in the same region as this '
                   'function.'.format(key, source_bucket))
                raise e

      Description: Copies objects from srcBucket to dstBucket based on S3 Event Trigger
      FunctionName:
        Fn::Sub: "${AWS::StackName}"
      Handler: index.handler
      MemorySize: 128
      Role:
        Fn::GetAtt:
        - LambdaRole
        - Arn
      Runtime: python3.6
      Timeout: 60
  LambdaInvokePermission:
    Type: AWS::Lambda::Permission
    DependsOn: LambdaFunction
    Properties:
      FunctionName:
        Fn::GetAtt:
        - LambdaFunction
        - Arn
      Action: lambda:InvokeFunction
      Principal: s3.amazonaws.com
      SourceAccount:
        Ref: AWS::AccountId
      SourceArn:
        Fn::Sub: arn:aws:s3:::${srcBucket}

I know how to add them through the console, but I want to set them from the CloudFormation script.

The Lambda part of your template would look like this:

MySnsTopic:
  Type: 'AWS::SNS::Topic'
  Properties:
    DisplayName: MySnsTopic
    TopicName: MySnsTopic    
LambdaFunction:
  Type: AWS::Lambda::Function
  DependsOn: LambdaRole
  Properties:
    Code:
      ZipFile: |
        from __future__ import print_function
        import os
        import json
        import boto3
        import time
        import string
        import urllib
        print('Loading function')
        s3 = boto3.client('s3')
        sns = boto3.client('sns')
        def handler(event, context):
          source_bucket = event['Records'][0]['s3']['bucket']['name']
          key = event['Records'][0]['s3']['object']['key']

          target_bucket = os.environ['dstBucket']   # destination bucket name injected via the Environment Variables below
          copy_source = {'Bucket': source_bucket, 'Key': key}

          try:
            s3.copy_object(Bucket=target_bucket, Key=key, CopySource=copy_source)

            response = sns.publish(
              TopicArn=os.environ['NotificationTopicARN'],
              Message='Andrew is at the bowlo.  Brought to you by http://IsAndrewAtTheBowlo.com'
            )

          except Exception as e:
            print(e)
            print('Error getting object {} from bucket {}. Make sure they exist '
               'and your bucket is in the same region as this '
               'function.'.format(key, source_bucket))
            raise e

    Description: Copies objects from srcBucket to dstBucket based on S3 Event Trigger
    FunctionName:
      Fn::Sub: "${AWS::StackName}"
    Handler: index.handler
    Environment:
      Variables:
        NotificationTopicARN: !Ref MySnsTopic
        dstBucket: !Ref dstBucket   # read by the handler as os.environ['dstBucket']
    MemorySize: 128
    Role:
      Fn::GetAtt:
      - LambdaRole
      - Arn
    Runtime: python3.6
    Timeout: 60
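
Anything that Ref or Fn::Sub can resolve at deploy time can be passed this way: plain parameter values, resource names, ARNs. A minimal sketch of the Environment block (the ProjectEnv entry is only an illustration, not part of the original template):

Environment:
  Variables:
    dstBucket: !Ref dstBucket                  # parameter value from the Parameters section
    NotificationTopicARN: !Ref MySnsTopic      # Ref on an SNS topic returns its ARN
    ProjectEnv: !Sub "${ProjectName}-${ENV}"   # string composed from two parameters

The handler then reads them at runtime with os.environ['dstBucket'], os.environ['NotificationTopicARN'], and so on.
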
You would also need to add a policy like this to the function's role:

    - PolicyDocument:
        Version: 2012-10-17
        Statement:
          - Action:
              - 'SNS:Publish'
            Effect: Allow
            Resource:
              - !Ref MySnsTopic
      PolicyName: lambdaSNS
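
That snippet is one entry in the role's Policies list; in the question's template it sits alongside the existing policy under LambdaRole, roughly like this (a sketch that only spells out the logging statement; the S3 statements stay as before, and the Sid names here are just illustrative):

LambdaRole:
  Type: AWS::IAM::Role
  Properties:
    # AssumeRolePolicyDocument and Path unchanged from the question's template
    Policies:
    - PolicyName:
        Fn::Sub: "${AWS::StackName}"
      PolicyDocument:
        Version: '2012-10-17'
        Statement:
        - Sid: AllowLogging
          Effect: Allow
          Action:
          - logs:CreateLogGroup
          - logs:CreateLogStream
          - logs:PutLogEvents
          Resource: "*"
        # ... SrcBucketPrivs and DstBucketPrivs statements as before ...
    - PolicyName: lambdaSNS
      PolicyDocument:
        Version: '2012-10-17'
        Statement:
        - Sid: AllowSnsPublish
          Effect: Allow
          Action:
          - sns:Publish
          Resource:
          - !Ref MySnsTopic

Without it, the sns.publish call in the handler is denied.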