
Serde error in a Spring Cloud Stream Kafka Streams process (Java)

Tags: java, apache-kafka, apache-kafka-streams, spring-cloud-stream, confluent-platform

I am trying to process some Kafka records with Spring Cloud Stream 3.0.3.RELEASE, but I am running into a Serdes configuration problem: records fail as soon as they enter the stream pipeline.

This is the stack trace:

30-03-2020 19:28:33 ERROR org.apache.kafka.streams.KafkaStreams              [application-local,,,]: stream-client [joinPriorityData-applicationId-1302980a-a016-4167-9c0b-750ffb5d107a] All stream threads have died. The instance will be in error state and should be closed.
Exception in thread "joinPriorityData-applicationId-1302980a-a016-4167-9c0b-750ffb5d107a-StreamThread-1" org.apache.kafka.streams.errors.StreamsException: Exception caught in process. taskId=0_0, processor=KSTREAM-SOURCE-0000000001, topic=moaii.security.pe.incidence.queue, partition=0, offset=18108, stacktrace=org.apache.kafka.streams.errors.StreamsException: A serializer (key: org.apache.kafka.common.serialization.StringSerializer / value: org.apache.kafka.common.serialization.ByteArraySerializer) is not compatible to the actual key or value type (key type: java.lang.String / value type: com.vcp.moaii.cep.dto.IncidenceState). Change the default Serdes in StreamConfig or provide correct Serdes via method parameters.
    at org.apache.kafka.streams.processor.internals.SinkNode.process(SinkNode.java:94)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:201)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:180)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:133)
    at org.apache.kafka.streams.kstream.internals.KStreamMapValues$KStreamMapProcessor.process(KStreamMapValues.java:41)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:117)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:201)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:180)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:133)
    at org.apache.kafka.streams.kstream.internals.KStreamMap$KStreamMapProcessor.process(KStreamMap.java:42)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:117)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:201)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:180)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:133)
    at org.apache.kafka.streams.kstream.internals.KStreamFilter$KStreamFilterProcessor.process(KStreamFilter.java:43)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:117)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:201)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:180)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:133)
    at org.apache.kafka.streams.kstream.internals.KStreamMapValues$KStreamMapProcessor.process(KStreamMapValues.java:41)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:117)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:201)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:180)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:133)
    at org.apache.kafka.streams.kstream.internals.KTableSource$KTableSourceProcessor.process(KTableSource.java:116)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:117)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:201)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:180)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:133)
    at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:87)
    at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:363)
    at org.apache.kafka.streams.processor.internals.AssignedStreamsTasks.process(AssignedStreamsTasks.java:199)
    at org.apache.kafka.streams.processor.internals.TaskManager.process(TaskManager.java:425)
    at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:912)
    at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:819)
    at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:788)
Caused by: java.lang.ClassCastException: class com.vcp.moaii.cep.dto.IncidenceState cannot be cast to class [B (com.vcp.moaii.cep.dto.IncidenceState is in unnamed module of loader 'app'; [B is in module java.base of loader 'bootstrap')
    at org.apache.kafka.common.serialization.ByteArraySerializer.serialize(ByteArraySerializer.java:19)
    at org.apache.kafka.common.serialization.Serializer.serialize(Serializer.java:62)
    at org.apache.kafka.streams.processor.internals.RecordCollectorImpl.send(RecordCollectorImpl.java:163)
    at org.apache.kafka.streams.processor.internals.RecordCollectorImpl.send(RecordCollectorImpl.java:103)
    at org.apache.kafka.streams.processor.internals.SinkNode.process(SinkNode.java:89)
    ... 35 more

    at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:380)
    at org.apache.kafka.streams.processor.internals.AssignedStreamsTasks.process(AssignedStreamsTasks.java:199)
    at org.apache.kafka.streams.processor.internals.TaskManager.process(TaskManager.java:425)
    at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:912)
    at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:819)
    at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:788)
Caused by: org.apache.kafka.streams.errors.StreamsException: A serializer (key: org.apache.kafka.common.serialization.StringSerializer / value: org.apache.kafka.common.serialization.ByteArraySerializer) is not compatible to the actual key or value type (key type: java.lang.String / value type: com.vcp.moaii.cep.dto.IncidenceState). Change the default Serdes in StreamConfig or provide correct Serdes via method parameters.
    at org.apache.kafka.streams.processor.internals.SinkNode.process(SinkNode.java:94)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:201)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:180)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:133)
    at org.apache.kafka.streams.kstream.internals.KStreamMapValues$KStreamMapProcessor.process(KStreamMapValues.java:41)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:117)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:201)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:180)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:133)
    at org.apache.kafka.streams.kstream.internals.KStreamMap$KStreamMapProcessor.process(KStreamMap.java:42)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:117)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:201)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:180)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:133)
    at org.apache.kafka.streams.kstream.internals.KStreamFilter$KStreamFilterProcessor.process(KStreamFilter.java:43)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:117)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:201)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:180)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:133)
    at org.apache.kafka.streams.kstream.internals.KStreamMapValues$KStreamMapProcessor.process(KStreamMapValues.java:41)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:117)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:201)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:180)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:133)
    at org.apache.kafka.streams.kstream.internals.KTableSource$KTableSourceProcessor.process(KTableSource.java:116)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:117)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:201)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:180)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:133)
    at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:87)
    at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:363)
    ... 5 more
Caused by: java.lang.ClassCastException: class com.vcp.moaii.cep.dto.IncidenceState cannot be cast to class [B (com.vcp.moaii.cep.dto.IncidenceState is in unnamed module of loader 'app'; [B is in module java.base of loader 'bootstrap')
    at org.apache.kafka.common.serialization.ByteArraySerializer.serialize(ByteArraySerializer.java:19)
    at org.apache.kafka.common.serialization.Serializer.serialize(Serializer.java:62)
    at org.apache.kafka.streams.processor.internals.RecordCollectorImpl.send(RecordCollectorImpl.java:163)
    at org.apache.kafka.streams.processor.internals.RecordCollectorImpl.send(RecordCollectorImpl.java:103)
    at org.apache.kafka.streams.processor.internals.SinkNode.process(SinkNode.java:89)
    ... 35 more
This is my function:

@Bean
public Function<KTable<String, IncidenceItem>, KStream<String, ?>> joinPriorityData() {
    return incidenceStream -> incidenceStream
            .toStream()                                                // KTable -> KStream
            .filter((key, value) -> filterZoneTypeAndPriority(value)) // keep only relevant zone types / priorities
            .selectKey((key, value) -> value.getsInc())                // re-key by the value's sInc field
            .mapValues((readOnlyKey, value) -> stateMapper.toIncidendeState(value, null)); // map to IncidenceState
}
I have tried many different configurations in many places, but nothing seems to work. As you can see in the stack trace, the default Serdes are being ignored and ByteArray is used instead.

I have also created some custom Serdes, declared them as beans, and used them in the configuration file, but with the same result.
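
For reference, a minimal sketch of what such a custom Serde bean can look like, assuming Spring Kafka's JsonSerde (the declaration here is illustrative, not the exact one from my project):

import org.apache.kafka.common.serialization.Serde;
import org.springframework.context.annotation.Bean;
import org.springframework.kafka.support.serializer.JsonSerde;

// Illustrative Serde bean for the pipeline's output value type;
// the binder can match Serde beans against the function's generic types.
@Bean
public Serde<IncidenceState> incidenceStateSerde() {
    return new JsonSerde<>(IncidenceState.class);
}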

Anyway, I get the feeling that other settings are being ignored as well: for example, I can see the KTable skip some old records with null keys and then fail when it reads the first non-null record, even though I have
auto.offset.reset: latest

I cannot tell whether this is a Kafka problem or a Spring problem, but I have not been able to solve it.

Edit: in this log you can see how the binder picks the correct Serde for the inbound binding but not for the outbound one:

31-03-2020 11:54:24 INFO  o.s.c.s.b.k.streams.KafkaStreamsFunctionProcessor  [application-local,,,]: Key Serde used for joinPriorityData-in-0: org.apache.kafka.common.serialization.Serdes$StringSerde
31-03-2020 11:54:24 INFO  o.s.c.s.b.k.streams.KafkaStreamsFunctionProcessor  [application-local,,,]: Value Serde used for joinPriorityData-in-0: com.vcp.moaii.cep.broker.serde.MoaiiSerdes$IncidenceItemSerde

31-03-2020 11:54:27 INFO  o.s.c.stream.binder.kafka.streams.KStreamBinder    [application-local,,,]: Key Serde used for (outbound) moaii.security.pe.incidence.state: org.apache.kafka.common.serialization.Serdes$StringSerde    
31-03-2020 11:54:27 INFO  o.s.c.stream.binder.kafka.streams.KStreamBinder    [application-local,,,]: Value Serde used for (outbound) moaii.security.pe.incidence.state: org.apache.kafka.common.serialization.Serdes$ByteArraySerde

For the outbound, the binding "joinPriorityData-out-0" is not even mentioned.

As you can see, IncidenceState is loaded twice with two different class loaders: once with 'app' (the system class loader) and once with the bootstrap class loader.

This is a Kafka issue. Check this and this. There is also work in progress on how to fix it.

I wonder if this issue has to do with your configuration and the functional approach. Try the following function and configuration and see if that makes a difference:

@Bean
public Function<KTable<String, IncidenceItem>, KStream<String, IncidenceState>> joinPriorityData() {
    // same as your original code, but with the concrete output value type
    // IncidenceState instead of the wildcard ?
}
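If I read the binder behavior correctly, the Kafka Streams binder infers the outbound Serde from the generic types in the function signature; a wildcard gives it nothing to infer from, so it falls back to the configured default, which is the ByteArraySerde you see on the outbound in your log.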
I am also not sure you need both the Kafka binder and the Kafka Streams binder. From the setup you shared, that does not seem to be the case. So move all the configuration under the Kafka Streams binder configuration (the full configuration is reproduced at the end of this post).


See if these changes make any difference.

I noticed earlier that I had to remove the wildcard from the output parameter, but I forgot to mention it here; that was not the problem. The configuration, though, worked like a charm! It seems that having both the Kafka and Kafka Streams binders configured overrides some other settings. I am actually only using streams, but I thought configuring both was necessary. Thanks @sobychacko!! Anyway, I will need the wildcard for later stream processing, because I branch the stream into two different output classes, so my output is KStream<String, ?>[]. Will I run into the same problem there?

If you cannot specify that output type, make sure you explicitly specify the Serdes in the configuration (a sketch follows below).

I changed the wildcard to Object in the process and it works fine. Thanks for your help! The problem was my configuration. I guess having configuration for both the Kafka and Kafka Streams binders created conflicts and triggered the classloader issue.
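
A hedged sketch of such a per-binding Serde override, using the Kafka Streams binder's producer.valueSerde property (the IncidenceStateSerde class below is hypothetical, named by analogy with the MoaiiSerdes$IncidenceItemSerde from the inbound log):

spring.cloud.stream:
  kafka:
    streams:
      bindings:
        joinPriorityData-out-0:
          producer:
            # hypothetical Serde class, by analogy with the inbound IncidenceItemSerde
            valueSerde: com.vcp.moaii.cep.broker.serde.MoaiiSerdes$IncidenceStateSerde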
The final working configuration, with everything moved under the Kafka Streams binder:

spring.cloud.stream:
  function.definition: joinPriorityData
  bindings:
    joinPriorityData-in-0:
      destination: moaii.security.pe.incidence.queue
    joinPriorityData-out-0:
      destination: moaii.security.pe.incidence.state
  kafka:
    streams:
      binder:
        applicationId: moaii-cep
        configuration:
          auto.commit.interval.ms: 100
          auto.offset.reset: latest
          default:
            key.serde: org.apache.kafka.common.serialization.Serdes$StringSerde
            value.serde: org.springframework.kafka.support.serializer.JsonSerde
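
If I understand the binder behavior correctly, with only the Kafka Streams binder in play, the default.key.serde and default.value.serde entries above are passed straight into the underlying StreamsConfig, which is exactly what the original error message asked for ("Change the default Serdes in StreamConfig").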