Azure Data Factory: insert @trigger().startTime into a SQL table

I have a Data Factory pipeline where the source is a CSV file and the sink is Azure SQL Data Warehouse.

The table in Azure SQL Data Warehouse has an additional DateTime column that should record the time the pipeline was triggered.

How can I make this work when mapping the schema?

Note: in Azure SQL Data Warehouse it is not possible to give a column a default value of GETDATE(), as you can in Azure SQL Database.

The column in SQL Data Warehouse is called "InsertedOn".

My pipeline looks like this:

{
    "name": "Pipeline01",
    "properties": {
        "activities": [
            {
                "name": "CopyCSVtoDW",
                "type": "Copy",
                "policy": {
                    "timeout": "7.00:00:00",
                    "retry": 0,
                    "retryIntervalInSeconds": 30,
                    "secureOutput": false,
                    "secureInput": false
                },
                "typeProperties": {
                    "source": {
                        "type": "BlobSource",
                        "recursive": true
                    },
                    "sink": {
                        "type": "SqlDWSink",
                        "allowPolyBase": false,
                        "writeBatchSize": 10000
                    },
                    "enableStaging": false,
                    "enableSkipIncompatibleRow": false,
                    "translator": {
                        "type": "TabularTranslator",
                        "columnMappings": {
                            "Id": "pointconnectnativeid",
                            "ValueDate": "valuedate",
                            "Value": "value",
                            "InsertedOn": "insertedon",
                            "forecastDate": "forecastDate"
                        }
                    }
                },
                "inputs": [
                    {
                        "referenceName": "SourceCSV",
                        "type": "DatasetReference"
                    }
                ],
                "outputs": [
                    {
                        "referenceName": "DestinationDW",
                        "type": "DatasetReference"
                    }
                ]
            }
        ]
    },
    "type": "Microsoft.DataFactory/factories/pipelines"
}
And here is my source dataset:

{
    "name": "SourceCSV",
    "properties": {
        "linkedServiceName": {
            "referenceName": "skdwstorage",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "triggerDateTime": {
                "type": "Object",
                "defaultValue": "@trigger().startTime"
            }
        },
        "type": "AzureBlob",
        "structure": [
            {
                "name": "Id",
                "type": "String"
            },
            {
                "name": "ValueDate",
                "type": "DateTime",
                "format": "dd.MM.yyyy HH:mm:ss"
            },
            {
                "name": "Value",
                "type": "Decimal"
            },
            {
                "name": "InsertedOn",
                "type": "DateTime",
                "description": "@trigger().startTime",
                "format": "dd.MM.yyyy HH:mm:ss"
            },
            {
                "name": "forecastDate",
                "type": "DateTime",
                "format": "dd.MM.yyyy HH:mm:ss"
            }
        ],
        "typeProperties": {
            "format": {
                "type": "TextFormat",
                "columnDelimiter": "|",
                "rowDelimiter": "\n",
                "quoteChar": "\"",
                "nullValue": "\\N",
                "encodingName": null,
                "treatEmptyAsNull": true,
                "skipLineCount": 0,
                "firstRowAsHeader": true
            },
            "fileName": "",
            "folderPath": "csv"
        }
    },
    "type": "Microsoft.DataFactory/factories/datasets"
}

In the trigger's .json definition, you can define a parameter called TriggerStartTime:

"parameters": {
    "TriggerStartTime": "@trigger().startTime"
}

In your case it would look like this:

{
    "name": "Pipeline01Trigger",
    "properties": {
        "runtimeState": "Started",
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "Pipeline01",
                    "type": "PipelineReference"
                },
                "parameters": {
                    "TriggerStartTime": "@trigger().startTime"
                }
            }
        ],
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Hour",
                "interval": 1,
                "startTime": "2019-01-01T00:00:00Z",
                "timeZone": "UTC"
            }
        }
    }
}
In the parameters section of Pipeline01, you have to define this parameter and set a default value for it.
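As a rough sketch, that parameters section inside the properties of Pipeline01 (next to activities) could look like the snippet below; the default value is only an assumed placeholder and is overwritten by the value the trigger passes in at run time:

"parameters": {
    "TriggerStartTime": {
        "type": "String",
        "defaultValue": "1900-01-01T00:00:00Z"
    }
}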

Then, in the copy activity, you can map this parameter as:

@pipeline().parameters.TriggerStartTime
In your case, something like:

"columnMappings": {
    "Id": "pointconnectnativeid",
    "ValueDate": "valuedate",
    "Value": "value",
    "InsertedOn": "@pipeline().parameters.TriggerStartTime",
    "forecastDate": "forecastDate"
}
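Putting the pieces together, and assuming the TriggerStartTime parameter has been added to Pipeline01 as described above, the translator block inside the copy activity of Pipeline01 would then look something like this; the rest of the activity (source, sink, inputs and outputs) stays as in your original definition:

"translator": {
    "type": "TabularTranslator",
    "columnMappings": {
        "Id": "pointconnectnativeid",
        "ValueDate": "valuedate",
        "Value": "value",
        "InsertedOn": "@pipeline().parameters.TriggerStartTime",
        "forecastDate": "forecastDate"
    }
}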
Here you can find some more information:


In theory I understand your answer, but I am not sure where exactly to add these pieces... I have posted my pipeline.

@Data - I have edited my answer accordingly; I hope it is clear now.