
Terraform: conditionally create logConfiguration and subsequent container definitions


I have the Terraform configuration below, which I use to build ECS clusters across our prod/uat/dev environments.

In the configuration below, we set the container definition's logConfiguration based on the target environment.

If the infrastructure is launched by a developer with env: dev, we configure logConfiguration to use awslogs as the log driver, along with its driver options (awslogs-region, awslogs-group, awslogs-stream-prefix).

If the infrastructure is launched by our continuous deployment pipeline with env: prod or uat, we configure logConfiguration to use the awsfirelens logDriver with its options, and we also add a fluentbit container definition (we use DataDog as our logging/monitoring solution, so a fluentbit container runs alongside the service container to forward logs to the DataDog endpoint).

As of today, the configuration below works fine. However, because I use a ternary operator to decide which container definition to generate based on the env, the service container definition is duplicated for each log-configuration case. This makes the code unnecessarily long and error-prone: if I add or remove an environment variable for the service, I have to do it in both cases; if I change the service configuration in the future, I have to keep both cases in sync; and so on.

    resource "aws_ecs_task_definition" "this" {
      family                   = "this"
      execution_role_arn       = var.this
      task_role_arn            = var.this
      network_mode             = "awsvpc"
      requires_compatibilities = ["FARGATE"]
      cpu                      = 256
      memory                   = var.env != "prod" ? 512 : 1024
      tags                     = local.common_tags
      # Log to datadog if it's running in the prod/uat environment account.
      container_definitions = (
        local.current_env == "prod" ? <<-TASK_DEFINITION
    [
        {
            "essential": true,
            "image": "this",
            "cpu":0,
            "environment" :[
few environment variables for service to start
],
            "name": "this",
            "mountPoints": [],
            "volumesFrom": [],
            "logConfiguration": {
                "logDriver": "awsfirelens",
                "options": {
                    "Name": "datadog",
                    "apikey": "this",
                    "Host": "http-intake.logs.datadoghq.com",
                    "dd_service": "this",
                    "dd_source": "this",
                    "dd_message_key": "log",
                    "dd_tags": "tags",
                    "TLS": "on",
                    "provider": "ecs"
                }
            },
            "portMappings": [
                {
                    "containerPort": 443,
                    "hostPort": 443,
                    "protocol":"tcp"
                }
            ]
        },
        {
            "essential": true,
            "cpu":0,
            "environment":[],
            "mountPoints":[],
            "portMappings":[],
            "user":"0",
            "volumesFrom":[],
            "image": "amazon/aws-for-fluent-bit:latest",
            "name": "log_router",
            "firelensConfiguration": {
                "type": "fluentbit",
                "options": { "enable-ecs-log-metadata": "true" }
            }
        }
    ]
    TASK_DEFINITION
        : <<-TASK_DEFINITION
    [
        {
            "essential": true,
            "image": "this",
            "cpu":0,
            "mountPoints": [],
            "volumesFrom": [],
            "environment" :[
few environment variables for service to start
],
            "name": "this",
               "logConfiguration" : {
                    "logDriver" :"awslogs",
                    "options" : {
                        "awslogs-region"        : "${var.region}",
                        "awslogs-group"         : "${aws_cloudwatch_log_group.this.name}",
                        "awslogs-stream-prefix" : "this-service"
                    }
            },
            "portMappings": [
                {
                    "containerPort": 443,
                    "hostPort": 443,
                    "protocol":"tcp"
                }
            ]
        }
    ]
    TASK_DEFINITION
      )
    }
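
For reference, one way to remove the duplication is to build the container definitions as Terraform objects and serialize them with `jsonencode`, so the service container is declared exactly once and only the logConfiguration and the fluentbit sidecar vary by environment. The sketch below is an assumption, not the original code: the local names (`service_container`, `fluentbit_container`, `service_log_configuration`) are made up, and the placeholder values ("this", empty environment list) are carried over from the question.

```hcl
locals {
  # Select the log driver and its options per environment.
  # tomap() keeps both branches of the conditional the same type (map(string)).
  service_log_configuration = {
    logDriver = local.current_env == "prod" ? "awsfirelens" : "awslogs"
    options = local.current_env == "prod" ? tomap({
      Name           = "datadog"
      apikey         = "this"
      Host           = "http-intake.logs.datadoghq.com"
      dd_service     = "this"
      dd_source      = "this"
      dd_message_key = "log"
      dd_tags        = "tags"
      TLS            = "on"
      provider       = "ecs"
    }) : tomap({
      awslogs-region        = var.region
      awslogs-group         = aws_cloudwatch_log_group.this.name
      awslogs-stream-prefix = "this-service"
    })
  }

  # The service container is declared once; environment variables
  # only need to be edited here.
  service_container = {
    essential        = true
    image            = "this"
    cpu              = 0
    name             = "this"
    environment      = []
    mountPoints      = []
    volumesFrom      = []
    logConfiguration = local.service_log_configuration
    portMappings = [
      { containerPort = 443, hostPort = 443, protocol = "tcp" }
    ]
  }

  fluentbit_container = {
    essential    = true
    cpu          = 0
    environment  = []
    mountPoints  = []
    portMappings = []
    user         = "0"
    volumesFrom  = []
    image        = "amazon/aws-for-fluent-bit:latest"
    name         = "log_router"
    firelensConfiguration = {
      type    = "fluentbit"
      options = { enable-ecs-log-metadata = "true" }
    }
  }
}

resource "aws_ecs_task_definition" "this" {
  family                   = "this"
  execution_role_arn       = var.this
  task_role_arn            = var.this
  network_mode             = "awsvpc"
  requires_compatibilities = ["FARGATE"]
  cpu                      = 256
  memory                   = var.env != "prod" ? 512 : 1024
  tags                     = local.common_tags

  # Append the fluentbit sidecar only for prod/uat; concat avoids
  # having to write the service container twice.
  container_definitions = jsonencode(concat(
    [local.service_container],
    local.current_env == "prod" ? [local.fluentbit_container] : []
  ))
}
```

With this shape, a change to the service (a new environment variable, a different port) is made in one place, and the per-environment difference is reduced to the two conditionals.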