Serialization error in Serdes when running a Streams application

Tags: serialization, apache-kafka, apache-kafka-streams

I am writing an application that processes incoming coordinate records and monitors their entry into and exit from defined geofences. In between, I use an aggregation to maintain state on whether the current coordinate is inside or outside a particular geofence.

Here is the stack trace of the error when I run this:

Exception in thread "geofence-events-990be707-d44d-410d-a8a7-b2a3b90cb84f-StreamThread-1" org.apache.kafka.streams.errors.StreamsException: Exception caught in process. taskId=0_0, processor=KSTREAM-SOURCE-0000000003, topic=vehicle-sensor-data, partition=0, offset=0
    at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:367)
    at org.apache.kafka.streams.processor.internals.AssignedStreamsTasks.process(AssignedStreamsTasks.java:104)
    at org.apache.kafka.streams.processor.internals.TaskManager.process(TaskManager.java:413)
    at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:862)
    at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:777)
    at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:747)
Caused by: org.apache.kafka.streams.errors.StreamsException: A serializer (key: org.apache.kafka.common.serialization.StringSerializer / value: org.apache.kafka.common.serialization.StringSerializer) is not compatible to the actual key or value type (key type: java.lang.String / value type: com.loconav.GeoFenceDTO). Change the default Serdes in StreamConfig or provide correct Serdes via method parameters.
    at org.apache.kafka.streams.processor.internals.SinkNode.process(SinkNode.java:94)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:129)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
    at org.apache.kafka.streams.kstream.internals.KStreamFilter$KStreamFilterProcessor.process(KStreamFilter.java:43)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:129)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
    at org.apache.kafka.streams.kstream.internals.KStreamMap$KStreamMapProcessor.process(KStreamMap.java:42)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:129)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
    at org.apache.kafka.streams.kstream.internals.KStreamPeek$KStreamPeekProcessor.process(KStreamPeek.java:44)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:129)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
    at org.apache.kafka.streams.kstream.internals.KStreamFlatMapValues$KStreamFlatMapValuesProcessor.process(KStreamFlatMapValues.java:42)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:129)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
    at org.apache.kafka.streams.kstream.internals.KStreamKTableJoinProcessor.process(KStreamKTableJoinProcessor.java:73)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:129)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
    at org.apache.kafka.streams.kstream.internals.KStreamFlatMapValues$KStreamFlatMapValuesProcessor.process(KStreamFlatMapValues.java:42)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:129)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
    at org.apache.kafka.streams.kstream.internals.KStreamMap$KStreamMapProcessor.process(KStreamMap.java:42)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:115)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:146)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:129)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:93)
    at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:84)
    at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:351)
    ... 5 more
Caused by: java.lang.ClassCastException: class com.loconav.GeoFenceDTO cannot be cast to class java.lang.String (com.loconav.GeoFenceDTO is in unnamed module of loader 'app'; java.lang.String is in module java.base of loader 'bootstrap')
    at org.apache.kafka.common.serialization.StringSerializer.serialize(StringSerializer.java:28)
    at org.apache.kafka.common.serialization.Serializer.serialize(Serializer.java:60)
    at org.apache.kafka.streams.processor.internals.RecordCollectorImpl.send(RecordCollectorImpl.java:157)
    at org.apache.kafka.streams.processor.internals.RecordCollectorImpl.send(RecordCollectorImpl.java:101)
    at org.apache.kafka.streams.processor.internals.SinkNode.process(SinkNode.java:89)
    ... 45 more
I added a peek after the flatMapValues; the first record was printed and then this exception was thrown. Here is the topology code:

vehicleGeoFences
  .flatMapValues(
      (readOnlyKey, value) -> {
        List<GeoFenceDTO> geoFenceDTOS = new ArrayList<>();
        // some logic
        return geoFenceDTOS;
      })
  .groupBy((key, value) -> value.getGeofenceId() + "," + value.getVehicleId())
  .aggregate(
      GeoFenceDTO::new,
      (key, value, aggregate) -> {

        //setting values
        return geofenceDTO;
      },
      Materialized.with(Serdes.String(), getGeofenceDTOValueSerde()))
  .toStream()
  .mapValues(
      (readOnlyKey, value) -> {
        // changing object
      })
  .to(
      Topics.VEHICLE_EVENTS.name(),
      Produced.with(vehicleEventsKeySerde, vehicleEventsValueSerde));
I wrote a custom Serde for GeoFenceDTO that uses Jackson's ObjectMapper to convert the object to bytes and back. Somehow the serdes supplied via Materialized do not seem to be picked up. The application's default key and value serdes are StringSerde.
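The custom serde itself is not shown in the question; a minimal Jackson-based sketch of what such a serde typically looks like (the class name GeoFenceDTOSerde and its internals are illustrative, not the asker's actual code):

import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.serialization.Deserializer;
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serializer;

public class GeoFenceDTOSerde implements Serde<GeoFenceDTO> {

  private static final ObjectMapper MAPPER = new ObjectMapper();

  @Override
  public Serializer<GeoFenceDTO> serializer() {
    // object -> JSON bytes
    return (topic, dto) -> {
      if (dto == null) {
        return null;
      }
      try {
        return MAPPER.writeValueAsBytes(dto);
      } catch (Exception e) {
        throw new RuntimeException("Failed to serialize GeoFenceDTO", e);
      }
    };
  }

  @Override
  public Deserializer<GeoFenceDTO> deserializer() {
    // JSON bytes -> object
    return (topic, bytes) -> {
      if (bytes == null) {
        return null;
      }
      try {
        return MAPPER.readValue(bytes, GeoFenceDTO.class);
      } catch (Exception e) {
        throw new RuntimeException("Failed to deserialize GeoFenceDTO", e);
      }
    };
  }
}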


Any help is appreciated. TIA. I also tried switching the custom object to an Avro-generated class, but the same error persists.

Comments:

The stack trace does not match your program: it shows a KStreamKTableJoinProcessor, but I don't see any join in your code.

I left that part out. Before this snippet I join a GlobalKTable with a KStream, which produces vehicleGeoFences. Let me know if I should add that too.

I see. It is still unclear to me which processor is the problem. It does not seem to be the aggregation, because there is no corresponding node in the stack trace. What happens if you remove the final to()?

Neither the groupBy nor the aggregate shows up in the stack trace; I assumed that was expected. I do suspect the aggregation, because GeoFenceDTO is what that step returns, and storing the resulting KTable requires serializing the object, which is presumably where it fails. That is why I added the Materialized to make the serdes explicit, but it changed nothing. I did get it working by setting the default value serde to the GeoFenceDTO serde, but that is a hack, and I still haven't found out why the original version does not work: after removing that default and reverting to String as the default serde, the error is back.

Maybe set a breakpoint in Serializer#serialize() and check which topic it is trying to write to, to debug further?
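For completeness, the workaround described above, making the custom serde the application-wide default, would look roughly like this (a sketch assuming the GeoFenceDTOSerde class outlined earlier; a class used as a default serde needs a public no-arg constructor because Kafka Streams instantiates it by reflection):

import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsConfig;

Properties props = new Properties();
props.put(StreamsConfig.APPLICATION_ID_CONFIG, "geofence-events");
// placeholder bootstrap server, not from the question
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.StringSerde.class);
// the "hack": every value without an explicit serde is now (de)serialized
// as GeoFenceDTO, which papers over the missing per-operator serde
// instead of fixing it
props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, GeoFenceDTOSerde.class);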