Java: how do I deserialize a JSON list of objects in my Kafka consumer?
My Kafka producer is sending a list of objects in JSON format. I am trying to figure out how to get my consumer to deserialize the list. I can receive a single object and read it, but when I change the code to the type List<X>, I get the following error:
Error:(32, 47) java: incompatible types: cannot infer type arguments for org.springframework.kafka.core.DefaultKafkaConsumerFactory<>
reason: inference variable V has incompatible equality constraints java.util.List<nl.domain.X>,nl.domain.X
EDIT: This error was fixed by adding a TypeReference to the JsonDeserializer.
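For background on why an anonymous TypeReference subclass works where a plain List<X> type parameter does not: Java erases generic type arguments at runtime, but an anonymous subclass keeps its generic superclass available to reflection. Below is a minimal, dependency-free sketch of that trick; the `TypeRef` class and its names are illustrative, not Jackson's actual implementation.

```java
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;
import java.util.List;

// Minimal re-creation of the trick Jackson's TypeReference relies on:
// an anonymous subclass records its generic superclass, so the element
// type of List<T> survives erasure and can be read back via reflection.
abstract class TypeRef<T> {
    final Type captured;

    TypeRef() {
        captured = ((ParameterizedType) getClass().getGenericSuperclass())
                .getActualTypeArguments()[0];
    }
}

public class TypeRefDemo {
    public static void main(String[] args) {
        // The trailing {} creates the anonymous subclass that preserves List<String>.
        Type t = new TypeRef<List<String>>() {}.captured;
        System.out.println(t); // java.util.List<java.lang.String>
    }
}
```

This is why `new JsonDeserializer<>(new TypeReference<List<X>>() {})` can recover the element type while a bare `List<X>` parameter cannot.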
Current problem:
@Service
public class KafkaConsumer {

    @KafkaListener(topics = "test", groupId = "group_json", containerFactory = "xKafkaListenerFactory")
    public void consume(List<X> x) {
        // In reality it is consuming LinkedHashMap, which isn't what I want
        x.forEach(i ->
                System.out.println("Consumed message: " + i.getName()));
    }
}
When consuming the messages, it is not the type that I defined (namely List<X>). This is the configuration of the consumer:

@EnableKafka
@Configuration
public class KafkaConfiguration {

    @Bean
    public ConsumerFactory<String, List<X>> xConsumerFactory() {
        Map<String, Object> config = new HashMap<>();
        config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "127.0.0.1:9092");
        config.put(ConsumerConfig.GROUP_ID_CONFIG, "group_json");
        config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(config, new StringDeserializer(),
                new JsonDeserializer<>(new TypeReference<List<X>>() {}));
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, List<X>> xKafkaListenerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, List<X>> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(xConsumerFactory());
        return factory;
    }
}
This is the configuration of the producer:

@Configuration
public class KafkaConfiguration {

    @Bean
    public ProducerFactory<String, List<X>> producerFactory() {
        Map<String, Object> config = new HashMap<>();
        config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "127.0.0.1:9092");
        config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        return new DefaultKafkaProducerFactory<>(config);
    }

    @Bean
    public KafkaTemplate<String, List<X>> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}
This is the producer:
@Component
public class KafkaProducer {

    private final static String TOPIC = "test";
    private final KafkaTemplate<String, List<X>> kafkaTemplate;

    public KafkaProducer(KafkaTemplate<String, List<X>> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    @Override
    public void publishMessage(List<X> x) {
        // Created 3 instances of X for the current example
        List<X> list = new ArrayList<>();
        list.add(new X("Apple"));
        list.add(new X("Beach"));
        list.add(new X("Money"));
        ListenableFuture<SendResult<String, List<X>>> listenableFuture = kafkaTemplate.send(TOPIC, list);
    }
}
In short:
@Service
public class KafkaConsumer {

    @KafkaListener(topics = "test", groupId = "group_json", containerFactory = "xKafkaListenerFactory")
    public void consume(List<X> x) {
        // In reality it is consuming LinkedHashMap, which isn't what I want
        x.forEach(i ->
                System.out.println("Consumed message: " + i.getName()));
    }
}
The producer seems to work fine. When sending a message like this, I get the following error in my consumer: it cannot cast the LinkedHashMap to a List<X>.
org.springframework.kafka.listener.ListenerExecutionFailedException: Listener method 'public void nl.infrastructure.input.message.kafka.consumer.KafkaConsumer.consume(java.util.List<nl.domain.X>)' threw exception; nested exception is java.lang.ClassCastException: class java.util.LinkedHashMap cannot be cast to class nl.domain.X (java.util.LinkedHashMap is in module java.base of loader 'bootstrap'; nl.domain.X is in unnamed module of loader 'app'); nested exception is java.lang.ClassCastException: class java.util.LinkedHashMap cannot be cast to class nl.domain.X (java.util.LinkedHashMap is in module java.base of loader 'bootstrap'; nl.domain.X is in unnamed module of loader 'app')
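As a side note on why the exception surfaces only inside `forEach`: type erasure lets a list whose elements are really LinkedHashMaps (what a JSON binder produces when it has no target type) be assigned to a `List<X>` variable; the cast to X is inserted, and fails, only where an element is actually used as X. A small stdlib-only sketch, where the `Name` class is a hypothetical stand-in for X:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;

// Hypothetical stand-in for the domain class X from the question.
class Name {
    private final String name;
    Name(String name) { this.name = name; }
    String getName() { return name; }
}

public class ErasureCastDemo {
    @SuppressWarnings("unchecked")
    public static void main(String[] args) {
        List<Object> fromBinder = new ArrayList<>();
        fromBinder.add(new LinkedHashMap<String, Object>()); // stand-in for an untyped JSON object
        List<Name> names = (List<Name>) (List<?>) fromBinder; // unchecked cast, succeeds at runtime
        try {
            // The cast to Name happens per element, inside the lambda:
            names.forEach(n -> System.out.println(n.getName()));
        } catch (ClassCastException e) {
            System.out.println("ClassCastException, as in the listener");
        }
    }
}
```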
This is probably because you are importing the wrong type of JsonDeserializer.
Use the import org.springframework.kafka.support.serializer.JsonDeserializer.
You have to configure the JsonDeserializer as follows:
protected Deserializer<List<X>> kafkaDeserializer() {
    ObjectMapper om = new ObjectMapper();
    om.getTypeFactory().constructParametricType(List.class, X.class);
    return new JsonDeserializer<>(om);
}
Used @AmanGarg's answer and tweaked it slightly (not too sure why it didn't work for me). Converting to a single-class POJO worked, but converting to a List<X> didn't. Adding it in case anyone faces this issue:
protected JsonDeserializer<List<X>> kafkaDeserializer() {
    ObjectMapper om = new ObjectMapper();
    JavaType type = om.getTypeFactory().constructParametricType(List.class, X.class);
    return new JsonDeserializer<List<X>>(type, om, false);
}
Used a different JsonDeserializer constructor.