
Azure Data Factory: what is the proper way to build JSON in a Data Factory pipeline?

Tags: azure-data-factory, azure-data-factory-2

In my previous post, someone suggested that I use the "proper way" to build a JSON string that gets inserted into a SQL Server table for logging. In that post I was building the JSON string with string concatenation.

What is the proper tool/function for building JSON in a Data Factory pipeline? I have looked at the json() and string() functions, but they still rely on concatenation.

Clarification: I am trying to generate a log message like the one below. Right now I am using string concatenation to generate the log JSON. Is there a better, more elegant (but still lightweight) way to generate the JSON data?

{   "EventType": "DataFactoryPipelineRunActivity",    
    "DataFactoryName":"fa603ea7-f1bd-48c0-a690-73b92d12176c",   
    "DataFactoryPipelineName":"Import Blob Storage Account Key CSV file into generic SQL table using Data Flow Activity Logging to Target SQL Server",   
    "DataFactoryPipelineActivityName":"Copy Generic CSV Source to Generic SQL Sink",   
    "DataFactoryPipelineActivityOutput":"{runStatus:{computeAcquisitionDuration:316446,dsl: source() ~> ReadFromCSVInBlobStorage  ReadFromCSVInBlobStorage derive() ~> EnrichWithDataFactoryMetadata  EnrichWithDataFactoryMetadata sink() ~> WriteToTargetSqlTable,profile:{ReadFromCSVInBlobStorage:{computed:[],lineage:{},dropped:0,drifted:1,newer:1,total:1,updated:0},EnrichWithDataFactoryMetadata:{computed:[],lineage:{},dropped:0,drifted:1,newer:6,total:7,updated:0},WriteToTargetSqlTable:{computed:[],lineage:{__DataFactoryPipelineName:{mapped:false,from:[{source:EnrichWithDataFactoryMetadata,columns:[__DataFactoryPipelineName]}]},__DataFactoryPipelineRunId:{mapped:false,from:[{source:EnrichWithDataFactoryMetadata,columns:[__DataFactoryPipelineRunId]}]},id:{mapped:true,from:[{source:ReadFromCSVInBlobStorage,columns:[id]}]},__InsertDateTimeUTC:{mapped:false,from:[{source:EnrichWithDataFactoryMetadata,columns:[__InsertDateTimeUTC]}]},__DataFactoryName:{mapped:false,from:[{source:EnrichWithDataFactoryMetadata,columns:[__DataFactoryName]}]},__FileName:{mapped:false,from:[{source:EnrichWithDataFactoryMetadata,columns:[__FileName]}]},__StorageAccountName:{mapped:false,from:[{source:EnrichWithDataFactoryMetadata,columns:[__StorageAccountName]}]}},dropped:0,drifted:1,newer:0,total:7,updated:7}},metrics:{WriteToTargetSqlTable:{rowsWritten:4,sinkProcessingTime:1436,sources:{ReadFromCSVInBlobStorage:{rowsRead:4}},stages:[{stage:3,partitionTimes:[621],bytesWritten:0,bytesRead:24,streams:{WriteToTargetSqlTable:{type:sink,count:4,partitionCounts:[4],cached:false},EnrichWithDataFactoryMetadata:{type:derive,count:4,partitionCounts:[4],cached:false},ReadFromCSVInBlobStorage:{type:source,count:4,partitionCounts:[4],cached:false}},target:WriteToTargetSqlTable,time:811}]}}},effectiveIntegrationRuntime:DefaultIntegrationRuntime (East US)}",   
    "DataFactoryPipelineRunID":"63759585-4acb-48af-8536-ae953efdbbb0",   
    "DataFactoryPipelineTriggerName":"Manual",   
    "DataFactoryPipelineTriggerType":"Manual",   
    "DataFactoryPipelineTriggerTime":"2019-11-05T15:27:44.1568581Z",   
    "Parameters":{    
        "StorageAccountName":"fa603ea7",     
        "FileName":"0030_SourceData1.csv",    
        "TargetSQLServerName":"5a128a64-659d-4481-9440-4f377e30358c.database.windows.net",     
        "TargetSQLDatabaseName":"TargetDatabase",     
        "TargetSQLUsername":"demoadmin"   
    },    
    "InterimValues":{    
        "SchemaName":"utils",     
        "TableName":"vw_0030_SourceData1.csv-2019-11-05T15:27:57.643"   
    }  
}
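The idea behind avoiding concatenation can be illustrated outside of ADF: build the message as a structured object and serialize it once, so quoting and escaping are handled for you. A minimal Python sketch, where the field values are placeholders rather than output from a real pipeline run:

```python
import json
from datetime import datetime, timezone

# Build the log record as a nested dictionary instead of
# concatenating string fragments.
log_record = {
    "EventType": "DataFactoryPipelineRunActivity",
    "DataFactoryName": "example-factory",            # placeholder value
    "DataFactoryPipelineName": "Example Pipeline",   # placeholder value
    "DataFactoryPipelineTriggerType": "Manual",
    "DataFactoryPipelineTriggerTime": datetime.now(timezone.utc).isoformat(),
    "Parameters": {
        "StorageAccountName": "examplestorage",
        "FileName": "0030_SourceData1.csv",
    },
}

# Serialize once; the result is guaranteed to be valid JSON,
# with no manual quoting or escaping.
payload = json.dumps(log_record)
print(payload)
```

ADF's pipeline expression language has no equivalent object-literal builder, which is why its json() and string() functions end up wrapped around concat(); the structured-object approach above is what a Data Flow or an external activity can give you instead.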

You can use a Data Flow; it can help you build the JSON string in a Data Factory pipeline.

Here is the Data Flow tutorial:

It can help you:

Hope this helps.

@MarcJellinek You're welcome, you can give this a try. If my answer helped you, could you accept it as the answer? It may benefit other community members. Thank you.

For this to generate JSON for logging, I would need a dataset with a single row, then use the approach above to generate the log message encoded as JSON, then read the JSON elements as logging parameter values. I'll give it a try, but this seems like a heavy lift. Keep in mind that I'm just trying to generate a log message like the one shown (see the edit to the original post).