AWS Data Pipeline MySQL RDS to S3: dataFormat not working for columnSeparator

Tags: mysql, amazon-web-services, amazon-data-pipeline

I set up a data pipeline that runs a simple select statement. The default column separator is a comma. The pipeline runs fine, but I need to change the column separator to a pipe, so I added a new section to the definition.

The new section starts with "id": "CustomDataFormatExample"…

This didn't work. There is no error; the output just continues to use a comma as the column separator. Here is the full definition:

{
  "objects": [
    {
      "*password": "#{*myRDSPassword}",
      "name": "rds_mysql",
      "jdbcProperties": "allowMultiQueries=true",
      "id": "rds_mysql",
      "type": "RdsDatabase",
      "rdsInstanceId": "#{myRDSInstanceId}",
      "username": "#{myRDSUsername}"
    },
    {
      "output": {
        "ref": "S3OutputLocation"
      },
      "input": {
        "ref": "SourceRDSTable"
      },
      "name": "RDStoS3CopyActivity",
      "runsOn": {
        "ref": "Ec2Instance"
      },
      "id": "RDStoS3CopyActivity",
      "type": "CopyActivity"
    },
    {
      "database": {
        "ref": "rds_mysql"
      },
      "name": "SourceRDSTable",
      "id": "SourceRDSTable",
      "type": "SqlDataNode",
      "table": "#{myRDSTableName}",
      "selectQuery": "#{myRDSSqlStatement}"
    },
    {
      "failureAndRerunMode": "CASCADE",
      "schedule": {
        "ref": "DefaultSchedule"
      },
      "resourceRole": "DataPipelineDefaultResourceRole",
      "role": "DataPipelineDefaultRole",
      "scheduleType": "cron",
      "name": "Default",
      "id": "Default"
    },
    {
      "instanceType": "#{myEC2InstanceType}",
      "name": "Ec2Instance",
      "actionOnTaskFailure": "terminate",
      "securityGroups": "#{myEc2RdsSecurityGrps}",
      "id": "Ec2Instance",
      "type": "Ec2Resource",
      "terminateAfter": "2 Hours"
    },
    {
      "period": "1 days",
      "startDateTime": "2020-04-05T23:10:00",
      "name": "Every 1 day",
      "id": "DefaultSchedule",
      "type": "Schedule"
    },
    {
      "filePath": "#{myOutputS3Loc}/#{myOutputS3FileName}",
      "name": "S3OutputLocation",
      "id": "S3OutputLocation",
      "type": "S3DataNode"
    },
    {
      "id" : "CustomDataFormatExample",
      "name" : "CustomDataFormatExample",
      "type" : "TSV",
      "columnSeparator" : "|",
      "column":[
        "aci_code STRING",
        "address_id STRING",
        "created STRING",
        "created_by STRING",
        "additional_address_line STRING",
        "correspondence STRING",
        "line_1 STRING",
        "line_2 STRING",
        "line_3 STRING",
        "line_4 STRING",
        "line_5 STRING",
        "postcode STRING",
        "postcode_nospace STRING",
        "country_code STRING"
        ],
      "escapeChar": "\\",
      "recordSeparator": "\\n"
    }
  ],
  "parameters": [
        // Some parameters in here.
  ],
  "values": {
        // Some values in here.
  }
}

You must explicitly set the dataFormat field on the S3DataNode:

    ...
    {
      "filePath": "#{myOutputS3Loc}/#{myOutputS3FileName}",
      "name": "S3OutputLocation",
      "id": "S3OutputLocation",
      "type": "S3DataNode",
      "dataFormat": { "ref": "CustomDataFormatExample" }
    },
    ...
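For reference, here is a minimal sketch of how that corrected S3DataNode maps onto the pipelineObjects shape the Data Pipeline API expects (e.g. via boto3's put_pipeline_definition). The key detail is that dataFormat is a refValue field pointing at the format object's id, not a stringValue. The pipeline id shown in the comment is hypothetical.

```python
# Sketch: the corrected S3DataNode expressed in the pipelineObjects shape
# used by the Data Pipeline API. References to other objects (like the
# custom data format) use "refValue"; plain settings use "stringValue".
s3_output_node = {
    "id": "S3OutputLocation",
    "name": "S3OutputLocation",
    "fields": [
        {"key": "type", "stringValue": "S3DataNode"},
        {"key": "filePath",
         "stringValue": "#{myOutputS3Loc}/#{myOutputS3FileName}"},
        # The missing piece from the original definition: a reference
        # to the custom data format object's id.
        {"key": "dataFormat", "refValue": "CustomDataFormatExample"},
    ],
}

# With AWS credentials configured, the full definition could then be
# uploaded with boto3 (pipeline id below is a hypothetical placeholder):
# import boto3
# client = boto3.client("datapipeline")
# client.put_pipeline_definition(
#     pipelineId="df-EXAMPLE",
#     pipelineObjects=[s3_output_node],  # plus the other pipeline objects
# )
```

After re-uploading the definition, the pipeline must be activated again for the change to take effect.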
