
Apache Kafka SerializationException: Error deserializing Avro message


I get an error when creating a Kafka JdbcSinkConnector (my task is to stream data from a Kafka topic into a Postgres table):

Caused by: org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id -1

What does id -1 mean?

Connector settings:

{
  "name": "MVM Test",
  "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
  "key.converter": "org.apache.kafka.connect.json.JsonConverter",
  "errors.log.enable": "true",
  "errors.log.include.messages": "true",
  "topics": [
    "mvm_test_events"
  ],
  "connection.url": "jdbc:connection",
  "connection.user": "user",
  "connection.password": "*************"
}
I have also defined the (value) schema of the topic "mvm_test_events" in Control Center:

{
  "type": "record",
  "name": "event",
  "namespace": "mvm",
  "fields": [
    {
      "name": "id",
      "type": "int"
    },
    {
      "name": "series_storage",
      "type": "int"
    },
    {
      "name": "type",
      "type": "int"
    },
    {
      "name": "entity_id",
      "type": "int"
    },
    {
      "name": "processing_ts",
      "type": "double"
    },
    {
      "name": "from_ts",
      "type": "double"
    },
    {
      "name": "to_ts",
      "type": "string"
    },
    {
      "name": "context",
      "type": {
        "type": "record",
        "name": "context",
        "fields": [
          {
            "name": "trainName",
            "type": "string"
          }
        ]
      }
    }
  ]
}
Error log:

> [2020-01-22 06:45:10,380] ERROR Error encountered in task
> mvm-test-events-0. Executing stage 'VALUE_CONVERTER' with class
> 'io.confluent.connect.avro.AvroConverter', where consumed record is
> {topic='mvm_test_events', partition=0, offset=14,
> timestamp=1579615711794, timestampType=CreateTime}.
> (org.apache.kafka.connect.runtime.errors.LogReporter)
> org.apache.kafka.connect.errors.DataException: Failed to deserialize
> data for topic mvm_test_events to Avro:   at
> io.confluent.connect.avro.AvroConverter.toConnectData(AvroConverter.java:107)
>   at
> org.apache.kafka.connect.runtime.WorkerSinkTask.lambda$convertAndTransformRecord$1(WorkerSinkTask.java:487)
>   at
> org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndRetry(RetryWithToleranceOperator.java:128)
>   at
> org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:162)
>   at
> org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(RetryWithToleranceOperator.java:104)
>   at
> org.apache.kafka.connect.runtime.WorkerSinkTask.convertAndTransformRecord(WorkerSinkTask.java:487)
>   at
> org.apache.kafka.connect.runtime.WorkerSinkTask.convertMessages(WorkerSinkTask.java:464)
>   at
> org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:320)
>   at
> org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:224)
>   at
> org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:192)
>   at
> org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:175)
>   at
> org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:219)
>   at
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>   at java.util.concurrent.FutureTask.run(FutureTask.java:266)     at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>   at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>   at java.lang.Thread.run(Thread.java:748) Caused by:
> org.apache.kafka.common.errors.SerializationException: Error
> deserializing Avro message for id -1 Caused by:
> org.apache.kafka.common.errors.SerializationException: Unknown magic
> byte! [2020-01-22 06:45:10,381] ERROR
> WorkerSinkTask{id=mvm-test-events-0} Task threw an uncaught and
> unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask)
> org.apache.kafka.connect.errors.ConnectException: Tolerance exceeded
> in error handler  at
> org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:178)
>   at
> org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(RetryWithToleranceOperator.java:104)
>   at
> org.apache.kafka.connect.runtime.WorkerSinkTask.convertAndTransformRecord(WorkerSinkTask.java:487)
>   at
> org.apache.kafka.connect.runtime.WorkerSinkTask.convertMessages(WorkerSinkTask.java:464)
>   at
> org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:320)
>   at
> org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:224)
>   at
> org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:192)
>   at
> org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:175)
>   at
> org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:219)
>   at
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>   at java.util.concurrent.FutureTask.run(FutureTask.java:266)     at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>   at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>   at java.lang.Thread.run(Thread.java:748) Caused by:
> org.apache.kafka.connect.errors.DataException: Failed to deserialize
> data for topic mvm_test_events to Avro:   at
> io.confluent.connect.avro.AvroConverter.toConnectData(AvroConverter.java:107)
>   at
> org.apache.kafka.connect.runtime.WorkerSinkTask.lambda$convertAndTransformRecord$1(WorkerSinkTask.java:487)
>   at
> org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndRetry(RetryWithToleranceOperator.java:128)
>   at
> org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:162)
>   ... 13 more Caused by:
> org.apache.kafka.common.errors.SerializationException: Error
> deserializing Avro message for id -1 Caused by:
> org.apache.kafka.common.errors.SerializationException: Unknown magic
> byte! [2020-01-22 06:45:10,381] ERROR
> WorkerSinkTask{id=mvm-test-events-0} Task is being killed and will not
> recover until manually restarted
> (org.apache.kafka.connect.runtime.WorkerTask)
As I understand it, it tries to convert the records from the topic with io.confluent.connect.avro.AvroConverter. Should I now specify the schema name (the one I defined in the topic settings) in the connector's value-converter settings?

You see the errors

org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id -1
org.apache.kafka.common.errors.SerializationException: Unknown magic byte!
when you use the AvroConverter in Kafka Connect to read data that has not actually been serialized as Avro.
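The id in the error refers to the Schema Registry schema id embedded in the Confluent Avro wire format: a serialized value starts with a magic byte 0 followed by a 4-byte big-endian schema id, then the Avro payload. If the first byte is not 0, deserialization fails before any id has been read, which is why the message reports id -1. A minimal sketch of that framing check (the helper and sample bytes are illustrative, not the actual deserializer code):

```python
import struct

MAGIC_BYTE = 0

def parse_confluent_header(payload: bytes):
    """Split a Confluent-framed Avro message into (schema_id, avro_body).

    Wire format: byte 0 is the magic byte (0), bytes 1-4 are the
    schema id (big-endian int), the rest is the Avro-encoded record.
    """
    if len(payload) < 5 or payload[0] != MAGIC_BYTE:
        # The deserializer fails here, before a schema id is read,
        # hence "Error deserializing Avro message for id -1".
        raise ValueError("Unknown magic byte!")
    schema_id = struct.unpack(">i", payload[1:5])[0]
    return schema_id, payload[5:]

# A properly framed message: magic byte + schema id 42 + payload.
schema_id, body = parse_confluent_header(b"\x00\x00\x00\x00\x2a" + b"avro-bytes")
print(schema_id)  # 42

# Plain JSON bytes start with '{' (0x7b), not 0x00 -> magic byte error.
try:
    parse_confluent_header(b'{"id": 1}')
except ValueError as e:
    print(e)  # Unknown magic byte!
```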

So you need to either fix the producer (and serialize the data to Avro properly), or, if you don't intend to use Avro, fix the Kafka Connect connector configuration to use the appropriate converter.
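If the topic really does contain plain JSON rather than Avro, one option is to override the value converter on the connector itself. A sketch of the relevant fragment (not a verified config); note the JDBC sink requires a declared schema, so with the JsonConverter each message must carry the schema/payload envelope:

```json
{
  "value.converter": "org.apache.kafka.connect.json.JsonConverter",
  "value.converter.schemas.enable": "true"
}
```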

See this for more information.

Edit: based on your updated question, you do intend to write Avro, so using the AvroConverter is correct. You haven't included it in your connector config, so I assume it is set in the Kafka Connect worker properties ("value.converter": "io.confluent.connect.avro.AvroConverter").
Somehow you are getting data on your topic that is not Avro. I'd suggest setting up a dead letter queue to route those messages to a new topic for inspection, while letting your sink continue to process the messages that are Avro.
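Routing the bad records to a dead letter queue while the sink keeps running can be done with Kafka Connect's error-handling properties on the sink connector; a sketch of the fragment to add, where the DLQ topic name is a placeholder:

```json
{
  "errors.tolerance": "all",
  "errors.deadletterqueue.topic.name": "mvm_test_events_dlq",
  "errors.deadletterqueue.context.headers.enable": "true"
}
```

With "errors.tolerance": "all", records that fail conversion are sent to the DLQ topic instead of killing the task, and the context headers record which stage (e.g. VALUE_CONVERTER) failed.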

Can you edit the question to include the connector config and the full stack trace of the error, please?
@RobinMoffatt, done.
I've updated my answer based on your new information.
Thank you very much for your answer!