Azure Data Factory mapping data flow to CSV sink results in a zero-byte file
I am working on my Azure Data Factory skills, comparing Copy activity performance against a mapping data flow when writing a single CSV file to Azure Blob Storage.

When I write a single CSV through the Azure Blob Storage linked service (azureBlobLinkedService) via the dataset azureBlobSingleCSVFileNameDataset using a Copy activity, I get the output I expect in the blob storage container: for example, the output file MyData.csv under the folder /output/csv/singleFiles in the container MyContainer.

When I write a single CSV through the same blob storage linked service, but via a different dataset (azureBlobSingleCSVNoFileNameDataset), using a mapping data flow, I get the following:
- MyContainer/output/csv/singleFiles (zero-length file)
- MyContainer/output/csv/singleFiles/MyData.csv (contains the data I expect)

I do not understand why the zero-length file is generated when the mapping data flow is used.

linkedService/azureBlobLinkedService
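To make the symptom concrete, here is a minimal sketch (plain Python; the function name and the listing are illustrative, not part of the pipeline) of how a container listing, such as the (name, size) pairs returned by azure-storage-blob's `list_blobs`, can be split into the real CSV output and the zero-byte blob that shares its name with the folder path:

```python
# Hypothetical helper: separate real output blobs from the zero-byte blob
# whose name equals the sink folder path itself.
def split_placeholder_blobs(blobs, folder_path):
    placeholders, outputs = [], []
    for name, size in blobs:
        # A zero-byte blob named exactly like the folder path is the artifact;
        # anything under the folder with content is real output.
        if size == 0 and name.rstrip("/") == folder_path.rstrip("/"):
            placeholders.append(name)
        else:
            outputs.append(name)
    return placeholders, outputs

# Listing shaped like the result described above.
listing = [
    ("output/csv/singleFiles", 0),                # zero-byte artifact
    ("output/csv/singleFiles/MyData.csv", 1024),  # expected data
]
ph, out = split_placeholder_blobs(listing, "output/csv/singleFiles")
```

This only classifies a listing; it does not touch storage, so it is safe to run against any snapshot of blob names and sizes.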
{
"name": "azureBlobLinkedService",
"type": "Microsoft.DataFactory/factories/linkedservices",
"properties": {
"type": "AzureBlobStorage",
"parameters": {
"azureBlobConnectionStringSecretName": {
"type": "string"
}
},
"annotations": [],
"typeProperties": {
"connectionString": {
"type": "AzureKeyVaultSecret",
"store": {
"referenceName": "AzureKeyVaultLinkedService",
"type": "LinkedServiceReference"
},
"secretName": "@{linkedService().azureBlobConnectionStringSecretName}"
}
}
}
}
dataset/azureBlobSingleCSVFileNameDataset
{
"name": "azureBlobSingleCSVFileNameDataset",
"properties": {
"linkedServiceName": {
"referenceName": "azureBlobLinkedService",
"type": "LinkedServiceReference",
"parameters": {
"azureBlobConnectionStringSecretName": {
"value": "@dataset().azureBlobConnectionStringSecretName",
"type": "Expression"
}
}
},
"parameters": {
"azureBlobConnectionStringSecretName": {
"type": "string"
},
"azureBlobSingleCSVFileName": {
"type": "string"
},
"azureBlobSingleCSVFolderPath": {
"type": "string"
},
"azureBlobSingleCSVContainerName": {
"type": "string"
}
},
"annotations": [],
"type": "DelimitedText",
"typeProperties": {
"location": {
"type": "AzureBlobStorageLocation",
"fileName": {
"value": "@dataset().azureBlobSingleCSVFileName",
"type": "Expression"
},
"folderPath": {
"value": "@dataset().azureBlobSingleCSVFolderPath",
"type": "Expression"
},
"container": {
"value": "@dataset().azureBlobSingleCSVContainerName",
"type": "Expression"
}
},
"columnDelimiter": ",",
"escapeChar": "\\",
"firstRowAsHeader": true,
"quoteChar": "\""
},
"schema": []
},
"type": "Microsoft.DataFactory/factories/datasets"
}
pipeline/Azure SQL Table to Blob Single CSV Copy Pipeline (this produces the expected result)
dataset/azureBlobSingleCSVNoFileNameDataset (no file name in the dataset; for mapping data flows, the file name is set in the data flow sink instead)
dataflow/azureSqlDatabaseTableToAzureBlobSingleCSVDataFlow
{
"name": "azureSqlDatabaseTableToAzureBlobSingleCSVDataFlow",
"properties": {
"type": "MappingDataFlow",
"typeProperties": {
"sources": [
{
"dataset": {
"referenceName": "azureSqlDatabaseTableDataset",
"type": "DatasetReference"
},
"name": "readFromAzureSqlDatabase"
}
],
"sinks": [
{
"dataset": {
"referenceName": "azureBlobSingleCSVNoFileNameDataset",
"type": "DatasetReference"
},
"name": "writeToAzureBlobSingleCSV"
}
],
"transformations": [
{
"name": "enrichWithRuntimeMetadata"
}
],
"script": "\nparameters{\n\tsourceConnectionSecretName as string,\n\tsinkConnectionStringSecretName as string,\n\tsourceObjectName as string,\n\tsinkObjectName as string,\n\tdataFactoryName as string,\n\tdataFactoryPipelineName as string,\n\tdataFactoryPipelineRunId as string,\n\tsinkFileNameNoPath as string\n}\nsource(allowSchemaDrift: true,\n\tvalidateSchema: false,\n\tisolationLevel: 'READ_UNCOMMITTED',\n\tformat: 'table') ~> readFromAzureSqlDatabase\nreadFromAzureSqlDatabase derive({__sourceConnectionStringSecretName} = $sourceConnectionSecretName,\n\t\t{__sinkConnectionStringSecretName} = $sinkConnectionStringSecretName,\n\t\t{__sourceObjectName} = $sourceObjectName,\n\t\t{__sinkObjectName} = $sinkObjectName,\n\t\t{__dataFactoryName} = $dataFactoryName,\n\t\t{__dataFactoryPipelineName} = $dataFactoryPipelineName,\n\t\t{__dataFactoryPipelineRunId} = $dataFactoryPipelineRunId) ~> enrichWithRuntimeMetadata\nenrichWithRuntimeMetadata sink(allowSchemaDrift: true,\n\tvalidateSchema: false,\n\tpartitionFileNames:[($sinkFileNameNoPath)],\n\tpartitionBy('hash', 1),\n\tquoteAll: true) ~> writeToAzureBlobSingleCSV"
}
}
}
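For readability, the sink portion of the data flow script above, which forces a single output file by hashing everything into one partition and taking the file name from the $sinkFileNameNoPath parameter, is:

```
enrichWithRuntimeMetadata sink(allowSchemaDrift: true,
    validateSchema: false,
    partitionFileNames: [($sinkFileNameNoPath)],
    partitionBy('hash', 1),
    quoteAll: true) ~> writeToAzureBlobSingleCSV
```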
pipeline/Azure SQL Table to Blob Single CSV Data Flow Pipeline (this produces the expected result, plus a zero-byte file at the folder path)
Answer: A zero-length (zero-byte) file generally means that, although the pipeline may have run successfully, that particular write returned or produced no output. One of the better troubleshooting techniques is to preview the output of each stage to confirm that every stage produces what you expect.

Comments: Is there any update on this issue? I'm experiencing the same thing.

dataset/azureBlobSingleCSVNoFileNameDataset:
{
"name": "azureBlobSingleCSVNoFileNameDataset",
"properties": {
"linkedServiceName": {
"referenceName": "azureBlobLinkedService",
"type": "LinkedServiceReference",
"parameters": {
"azureBlobConnectionStringSecretName": {
"value": "@dataset().azureBlobConnectionStringSecretName",
"type": "Expression"
}
}
},
"parameters": {
"azureBlobConnectionStringSecretName": {
"type": "string"
},
"azureBlobSingleCSVFolderPath": {
"type": "string"
},
"azureBlobSingleCSVContainerName": {
"type": "string"
}
},
"annotations": [],
"type": "DelimitedText",
"typeProperties": {
"location": {
"type": "AzureBlobStorageLocation",
"folderPath": {
"value": "@dataset().azureBlobSingleCSVFolderPath",
"type": "Expression"
},
"container": {
"value": "@dataset().azureBlobSingleCSVContainerName",
"type": "Expression"
}
},
"columnDelimiter": ",",
"escapeChar": "\\",
"firstRowAsHeader": true,
"quoteChar": "\""
},
"schema": []
},
"type": "Microsoft.DataFactory/factories/datasets"
}
{
"name": "Azure SQL Table to Blob Single CSV Data Flow Pipeline",
"properties": {
"activities": [
{
"name": "Copy Sql Database Table To Blob Single CSV Data Flow",
"type": "ExecuteDataFlow",
"dependsOn": [],
"policy": {
"timeout": "7.00:00:00",
"retry": 0,
"retryIntervalInSeconds": 30,
"secureOutput": false,
"secureInput": false
},
"userProperties": [],
"typeProperties": {
"dataflow": {
"referenceName": "azureSqlDatabaseTableToAzureBlobSingleCSVDataFlow",
"type": "DataFlowReference",
"parameters": {
"sourceConnectionSecretName": {
"value": "'@{pipeline().parameters.sourceAzureSqlDatabaseConnectionStringSecretName}'",
"type": "Expression"
},
"sinkConnectionStringSecretName": {
"value": "'@{pipeline().parameters.sinkAzureBlobConnectionStringSecretName}'",
"type": "Expression"
},
"sourceObjectName": {
"value": "'@{concat('[', pipeline().parameters.sourceAzureSqlDatabaseTableSchemaName, '].[', pipeline().parameters.sourceAzureSqlDatabaseTableTableName, ']')}'",
"type": "Expression"
},
"sinkObjectName": {
"value": "'@{concat(pipeline().parameters.sinkAzureBlobSingleCSVContainerName, '/', pipeline().parameters.sinkAzureBlobSingleCSVFolderPath, '/', \npipeline().parameters.sinkAzureBlobSingleCSVFileName)}'",
"type": "Expression"
},
"dataFactoryName": {
"value": "'@{pipeline().DataFactory}'",
"type": "Expression"
},
"dataFactoryPipelineName": {
"value": "'@{pipeline().Pipeline}'",
"type": "Expression"
},
"dataFactoryPipelineRunId": {
"value": "'@{pipeline().RunId}'",
"type": "Expression"
},
"sinkFileNameNoPath": {
"value": "'@{pipeline().parameters.sinkAzureBlobSingleCSVFileName}'",
"type": "Expression"
}
},
"datasetParameters": {
"readFromAzureSqlDatabase": {
"azureSqlDatabaseConnectionStringSecretName": {
"value": "@pipeline().parameters.sourceAzureSqlDatabaseConnectionStringSecretName",
"type": "Expression"
},
"azureSqlDatabaseTableSchemaName": {
"value": "@pipeline().parameters.sourceAzureSqlDatabaseTableSchemaName",
"type": "Expression"
},
"azureSqlDatabaseTableTableName": {
"value": "@pipeline().parameters.sourceAzureSqlDatabaseTableTableName",
"type": "Expression"
}
},
"writeToAzureBlobSingleCSV": {
"azureBlobConnectionStringSecretName": {
"value": "@pipeline().parameters.sinkAzureBlobConnectionStringSecretName",
"type": "Expression"
},
"azureBlobSingleCSVFolderPath": {
"value": "@pipeline().parameters.sinkAzureBlobSingleCSVFolderPath",
"type": "Expression"
},
"azureBlobSingleCSVContainerName": {
"value": "@pipeline().parameters.sinkAzureBlobSingleCSVContainerName",
"type": "Expression"
}
}
}
},
"compute": {
"coreCount": 8,
"computeType": "General"
}
}
}
],
"parameters": {
"sourceAzureSqlDatabaseConnectionStringSecretName": {
"type": "string"
},
"sourceAzureSqlDatabaseTableSchemaName": {
"type": "string"
},
"sourceAzureSqlDatabaseTableTableName": {
"type": "string"
},
"sinkAzureBlobConnectionStringSecretName": {
"type": "string"
},
"sinkAzureBlobSingleCSVContainerName": {
"type": "string"
},
"sinkAzureBlobSingleCSVFolderPath": {
"type": "string"
},
"sinkAzureBlobSingleCSVFileName": {
"type": "string"
}
},
"annotations": []
},
"type": "Microsoft.DataFactory/factories/pipelines"
}
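If the zero-byte blob is a problem downstream, one possible workaround (a sketch only; the dataset azureBlobZeroByteFileDataset is hypothetical and would need to point at the folder-path blob, and the exact Delete activity properties should be checked against the current ADF schema) is to chain a Delete activity after the data flow activity in the pipeline above:

```json
{
  "name": "Delete Zero Byte Placeholder",
  "type": "Delete",
  "dependsOn": [
    {
      "activity": "Copy Sql Database Table To Blob Single CSV Data Flow",
      "dependencyConditions": [ "Succeeded" ]
    }
  ],
  "typeProperties": {
    "dataset": {
      "referenceName": "azureBlobZeroByteFileDataset",
      "type": "DatasetReference"
    },
    "enableLogging": false
  }
}
```

This keeps the data flow itself unchanged and simply cleans up the artifact once the run succeeds.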