
Java: Kafka Avro consumer serialization error with Spring Boot


I created a Kafka Avro producer and a consumer, as two separate Spring Boot projects. When consuming the data, I get the following exception:

Caused by: org.apache.kafka.common.errors.SerializationException: Error deserializing key/value 
for partition bookavro-0 at offset 3. If needed, please seek past the record to continue 
consumption.
Caused by: org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message 
for id 1
Caused by: org.apache.kafka.common.errors.SerializationException: Could not find class 
com.dailycodebuffer.kafka.apachekafkaproducerdemo.BookAvro specified in writer's schema whilst 
finding reader's schema for a SpecificRecord.

2020-12-30 18:44:09.032 ERROR 22344 --- [ntainer#0-0-C-1] essageListenerContainer$ListenerConsumer 
: Consumer exception

java.lang.IllegalStateException: This error handler cannot process 'SerializationException's 
directly; please consider configuring an 'ErrorHandlingDeserializer' in the value and/or key deserializer
at org.springframework.kafka.listener.SeekUtils.seekOrRecover(SeekUtils.java:145) ~[spring-kafka-2.6.4.jar:2.6.4]
at org.springframework.kafka.listener.SeekToCurrentErrorHandler.handle(SeekToCurrentErrorHandler.java:113) ~[spring-kafka-2.6.4.jar:2.6.4]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.handleConsumerException(KafkaMessageListenerContainer.java:1425) [spring-kafka-2.6.4.jar:2.6.4]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:1122) [spring-kafka-2.6.4.jar:2.6.4]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_202]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_202]
at java.lang.Thread.run(Thread.java:813) [na:1.8.0_202]
com.dailycodebuffer.kafka.apachekafkaproducerdemo.BookAvro
is the package in the producer project.
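The IllegalStateException above is Spring Kafka's hint to wrap the value deserializer in an ErrorHandlingDeserializer, so that a record that fails deserialization is handed to the error handler instead of making the container loop on the same offset. A minimal sketch of that consumer configuration, assuming spring-kafka 2.6.x and the Confluent Avro deserializer already used above:

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.kafka.support.serializer.ErrorHandlingDeserializer;

// Sketch: the ErrorHandlingDeserializer is the configured deserializer;
// the real Avro deserializer becomes its delegate.
Map<String, Object> props = new HashMap<>();
props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer.class);
// Delegate that does the actual Avro work:
props.put(ErrorHandlingDeserializer.VALUE_DESERIALIZER_CLASS,
        "io.confluent.kafka.serializers.KafkaAvroDeserializer");
```

This does not fix the missing-class error itself, but it stops the poison record from blocking the consumer while you fix it.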

Here is my consumer configuration:

@Bean
public ConsumerFactory<String, BookAvro> BookconsumerFactory() {
    System.out.println("hi");
    Map<String, Object> configProps = new HashMap<>();
    configProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    configProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    // configProps.put(ConsumerConfig.KEY, StringDeserializer.class);
    configProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, "io.confluent.kafka.serializers.KafkaAvroDeserializer");
    // configProps.put("value.deserializer", "org.springframework.kafka.support.serializer.JsonDeserializer");
    // configProps.put(JsonDeserializer.ADD_TYPE_INFO_HEADERS, false);
    configProps.put(ConsumerConfig.GROUP_ID_CONFIG, "group_json");
    configProps.put("auto.offset.reset", "earliest");
    configProps.put(KafkaAvroDeserializerConfig.SCHEMA_REGISTRY_URL_CONFIG, "http://localhost:8081");
    configProps.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, "true");
    System.out.println(configProps.toString());
    return new DefaultKafkaConsumerFactory<>(configProps);
}

@Bean
public ConcurrentKafkaListenerContainerFactory<String, BookAvro> kafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, BookAvro> factory = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(BookconsumerFactory());
    System.out.println(factory.toString());
    // factory.getContainerProperties().setAckMode(AckMode.MANUAL_IMMEDIATE);
    return factory;
}
Here is the producer configuration:

@Bean
public ProducerFactory<String, BookAvro> producerFactory() {
    Map<String, Object> configProps = new HashMap<>();
    configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
    configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class.getName());
    configProps.put(KafkaAvroSerializerConfig.SCHEMA_REGISTRY_URL_CONFIG, "http://localhost:8081");
    // configProps.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, true);
    configProps.put(JsonSerializer.ADD_TYPE_INFO_HEADERS, false);
    // configProps.put(KafkaAvroSerializerConfig., "true");
    return new DefaultKafkaProducerFactory<>(configProps);
}

@Bean
public KafkaTemplate<String, BookAvro> kafkaTemplate() {
    return new KafkaTemplate<>(producerFactory());
}
Here is the Kafka listener:

@KafkaListener(groupId = "group_json", topics = "bookavro")
public void consumeBook(BookAvro book) {
    System.out.println("message3" + book.toString());
}

BookAvro is the Avro class generated from an .avsc file. Can anyone help me resolve this exception?
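Since BookAvro is generated from an .avsc file, one common way to make the same class available in both projects is to copy the schema into the consumer and regenerate the class at build time with the avro-maven-plugin. A sketch of that pom.xml fragment (plugin version and directory paths are assumptions to adapt):

```xml
<!-- Sketch: regenerate BookAvro in the consumer from the same .avsc file. -->
<plugin>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro-maven-plugin</artifactId>
  <version>1.10.1</version>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals>
        <goal>schema</goal>
      </goals>
      <configuration>
        <sourceDirectory>${project.basedir}/src/main/avro/</sourceDirectory>
        <outputDirectory>${project.basedir}/src/main/java/</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
```

The generated class must end up with the same fully qualified name (com.dailycodebuffer.kafka.apachekafkaproducerdemo.BookAvro) recorded in the writer's schema, otherwise the SpecificRecord lookup fails exactly as in the stack trace above.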

"...is the package in the producer project" – then you have to add that class as a dependency of the consumer project. Otherwise, you have to consume GenericRecord objects instead of a SpecificRecord implementation. — Do I need to declare it in the pom file? — That is one way. Or you can copy the same avsc file and compile it.
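Following the GenericRecord suggestion in the comment above: if you leave specific.avro.reader unset (or set it to false), the Confluent deserializer yields GenericRecord and no generated class is needed in the consumer. A minimal sketch, where the field name "title" is a hypothetical example of a field in the BookAvro schema:

```java
import org.apache.avro.generic.GenericRecord;
import org.springframework.kafka.annotation.KafkaListener;

public class BookGenericListener {

    // With specific.avro.reader=false (the default), the value arrives
    // as a GenericRecord built from the schema in the registry.
    @KafkaListener(groupId = "group_json", topics = "bookavro")
    public void consume(GenericRecord record) {
        // Fields are accessed by schema name; "title" is a hypothetical field.
        System.out.println(record.get("title"));
    }
}
```

The trade-off is losing the typed accessors of the generated BookAvro class in exchange for not sharing generated code between the two projects.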