Kafka Connect JDBC sink: error flattening JSON records


I am using Kafka Connect to sink data stored in a topic into a SQL Server table. The data needs to be flattened. Based on Confluent's example, I created a SQL Server table and a JSON record.

My record looks like this:

{
    "payload":{ 
        "id": 42,
        "name": {
          "first": "David"
        }
    },
    "schema": {
        "fields": [
            {
                "field": "id",
                "optional": true,
                "type": "int32"
            },
            {
                "name": "name",
                "optional": "false",
                "type": "struct",
                "fields": [
                    {
                        "field": "first",
                        "optional": true,
                        "type": "string"
                    }
                ]
            }
        ],
        "name": "Test",
        "optional": false,
        "type": "struct"
    }   
}
As you can see, I want the nested fields flattened, with their names joined by the delimiter "_". My sink connector is therefore configured as follows:

connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
table.name.format=MyTable
transforms.flatten.type=org.apache.kafka.connect.transforms.Flatten$Value
topics=myTopic
tasks.max=1
transforms=flatten
value.converter.schemas.enable=true
value.converter=org.apache.kafka.connect.json.JsonConverter
connection.url=jdbc:sqlserver:[url]
transforms.flatten.delimiter=_
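Conceptually, the Flatten$Value SMT configured above collapses nested structs into top-level fields, joining the path segments with the configured delimiter. A minimal Python sketch of that behavior (illustrative only, not the actual Kafka Connect code):

```python
def flatten(value, delimiter="_", prefix=""):
    """Recursively flatten nested dicts, joining keys with the delimiter,
    roughly as the Flatten$Value SMT does for struct values."""
    flat = {}
    for key, val in value.items():
        name = f"{prefix}{delimiter}{key}" if prefix else key
        if isinstance(val, dict):
            flat.update(flatten(val, delimiter, name))
        else:
            flat[name] = val
    return flat

payload = {"id": 42, "name": {"first": "David"}}
print(flatten(payload))  # {'id': 42, 'name_first': 'David'}
```

So the JDBC sink would write the record into columns `id` and `name_first` of `MyTable`.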
When I write that record to the topic, I get the following exception:

org.apache.kafka.connect.errors.ConnectException: Tolerance exceeded in error handler
    at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:178)
    at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(RetryWithToleranceOperator.java:104)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.convertAndTransformRecord(WorkerSinkTask.java:487)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.convertMessages(WorkerSinkTask.java:464)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:320)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:224)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:192)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:177)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:227)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.kafka.connect.errors.DataException: Struct schema's field name not specified properly
    at org.apache.kafka.connect.json.JsonConverter.asConnectSchema(JsonConverter.java:512)
    at org.apache.kafka.connect.json.JsonConverter.toConnectData(JsonConverter.java:360)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.lambda$convertAndTransformRecord$1(WorkerSinkTask.java:487)
    at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndRetry(RetryWithToleranceOperator.java:128)
    at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:162)
    ... 13 more
The sink connector works fine for records that do not need flattening. Is there something wrong with my configuration? Is it possible to flatten JSON records that carry a schema?

P.S. Kafka Connect version: 5.3.0-css


Any help is greatly appreciated.

OK, the problem was the field name of the nested field. The correct key is "field", not "name":

{
    "payload":{ 
        "id": 42,
        "name": {
          "first": "David"
        }
    },
    "schema": {
        "fields": [
            {
                "field": "id",
                "optional": true,
                "type": "int32"
            },
            {
                "field": "name",
                "optional": "false",
                "type": "struct",
                "fields": [
                    {
                        "field": "first",
                        "optional": true,
                        "type": "string"
                    }
                ]
            }
        ],
        "name": "Test",
        "optional": false,
        "type": "struct"
    }   
}
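The exception message ("Struct schema's field name not specified properly") comes from the JsonConverter rejecting a struct field entry that lacks the "field" key. A simplified, illustrative check in the same spirit (an assumption about the converter's behavior, not its actual source):

```python
def check_struct_fields(schema):
    """Walk a JsonConverter-style schema and verify that every entry in a
    struct's "fields" list names itself via the "field" key."""
    if schema.get("type") == "struct":
        for f in schema.get("fields", []):
            if "field" not in f:
                raise ValueError("Struct schema's field name not specified properly")
            check_struct_fields(f)

# The broken schema used "name" for the nested struct entry:
bad = {"type": "struct", "fields": [
    {"name": "name", "optional": "false", "type": "struct", "fields": []}]}
# The corrected schema uses "field":
good = {"type": "struct", "fields": [
    {"field": "name", "optional": "false", "type": "struct", "fields": []}]}

check_struct_fields(good)  # passes silently
```

With the "field" key in place, the converter can build the Connect struct and the Flatten SMT can then rename `name.first` to `name_first`.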