Spring Boot + Spring Kafka: Java 8 date/time (java.time) serialization


I'm currently using Spring Boot 2.0.4 and Spring Kafka 2.1.8.RELEASE. I'd like to simplify the exchange a bit: send objects to the KafkaTemplate and use JSON as the format. However, some of the messages that need to be deserialized contain java.time.LocalDateTime. So my setup is:

Configuration (application.yml):
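The configuration itself did not survive in this copy of the question; a minimal sketch of what an application.yml using Spring Kafka's property-based JSON (de)serializer setup typically looks like (the trusted package is a placeholder, not taken from the original post):

```
spring:
  kafka:
    producer:
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
    consumer:
      value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
      properties:
        spring.json.trusted.packages: my.package
```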

As for the Jackson dependencies, which should be all that's needed for this to work, my dependency tree is:

[INFO] |  |  +- com.fasterxml.jackson.core:jackson-databind:jar:2.9.6:compile
[INFO] |  |  |  +- com.fasterxml.jackson.core:jackson-annotations:jar:2.9.0:compile
[INFO] |  |  |  \- com.fasterxml.jackson.core:jackson-core:jar:2.9.6:compile
[INFO] |  |  \- com.fasterxml.jackson.datatype:jackson-datatype-jsr310:jar:2.9.6:compile
[INFO] |  |  +- com.fasterxml.jackson.datatype:jackson-datatype-jdk8:jar:2.9.6:compile
[INFO] |  |  \- com.fasterxml.jackson.module:jackson-module-parameter-names:jar:2.9.6:compile
However, this produces the following error:

org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition Foo-0 at offset 4. If needed, please seek past the record to continue consumption.
Caused by: org.apache.kafka.common.errors.SerializationException: Can't deserialize data [[123, 34, 105, 100, 34, 58, 34, 97, 50, 99, 50, 56, 99, 99, 101, 97, 49, 98, 98, 52, 51, 97, 97, 56, 53, 50, 49, 53, 99, 101, 49, 54, 57, 48, 52, 51, 51, 98, 51, 45, 50, 34, 44, 34, 97, 117, 116, 104, 111, 114, 34, 58, 34, 97, 110, 116, 111, 110, 105, 111, 34, 44, 34, 99, 114, 101, 97, 116, 101, 100, 34, 58, 123, 34, 104, 111, 117, 114, 34, 58, 49, 56, 44, 34, 109, 105, 110, 117, 116, 101, 34, 58, 52, 48, 44, 34, 115, 101, 99, 111, 110, 100, 34, 58, 53, 49, 44, 34, 110, 97, 110, 111, 34, 58, 51, 50, 53, 48, 48, 48, 48, 48, 48, 44, 34, 100, 97, 121, 79, 102, 89, 101, 97, 114, 34, 58, 50, 52, 48, 44, 34, 100, 97, 121, 79, 102, 87, 101, 101, 107, 34, 58, 34, 84, 85, 69, 83, 68, 65, 89, 34, 44, 34, 109, 111, 110, 116, 104, 34, 58, 34, 65, 85, 71, 85, 83, 84, 34, 44, 34, 100, 97, 121, 79, 102, 77, 111, 110, 116, 104, 34, 58, 50, 56, 44, 34, 121, 101, 97, 114, 34, 58, 50, 48, 49, 56, 44, 34, 109, 111, 110, 116, 104, 86, 97, 108, 117, 101, 34, 58, 56, 44, 34, 99, 104, 114, 111, 110, 111, 108, 111, 103, 121, 34, 58, 123, 34, 99, 97, 108, 101, 110, 100, 97, 114, 84, 121, 112, 101, 34, 58, 34, 105, 115, 111, 56, 54, 48, 49, 34, 44, 34, 105, 100, 34, 58, 34, 73, 83, 79, 34, 125, 125, 44, 34, 97, 103, 103, 114, 101, 103, 97, 116, 101, 73, 100, 34, 58, 34, 97, 50, 99, 50, 56, 99, 99, 101, 97, 49, 98, 98, 52, 51, 97, 97, 56, 53, 50, 49, 53, 99, 101, 49, 54, 57, 48, 52, 51, 51, 98, 51, 34, 44, 34, 118, 101, 114, 115, 105, 111, 110, 34, 58, 48, 44, 34, 112, 114, 105, 122, 101, 73, 110, 102, 111, 34, 58, 123, 34, 110, 117, 109, 98, 101, 114, 79, 102, 87, 105, 110, 110, 101, 114, 115, 34, 58, 49, 44, 34, 112, 114, 105, 122, 101, 80, 111, 111, 108, 34, 58, 49, 48, 44, 34, 112, 114, 105, 122, 101, 84, 97, 98, 108, 101, 34, 58, 91, 49, 48, 93, 125, 125]] from topic [Foo]
Caused by: com.fasterxml.jackson.databind.exc.MismatchedInputException: Expected array or string.
 at [Source: (byte[])"{"id":"a2c28ccea1bb43aa85215ce1690433b3-2","author":"foo","created":{"hour":18,"minute":40,"second":51,"nano":325000000,"dayOfYear":240,"dayOfWeek":"TUESDAY","month":"AUGUST","dayOfMonth":28,"year":2018,"monthValue":8,"chronology":{"calendarType":"iso8601","id":"ISO"}},"aggregateId":"a2c28ccea1bb43aa85215ce1690433b3","version":0,"prizeInfo":{"numberOfWinners":1,"prizePool":10,"prizeTable":[10]}}"; line: 1, column: 73] (through reference chain: my.package.Foo["created"])
    at com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63) ~[jackson-databind-2.9.6.jar:2.9.6]
    at com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1342) ~[jackson-databind-2.9.6.jar:2.9.6]
    at com.fasterxml.jackson.databind.DeserializationContext.handleUnexpectedToken(DeserializationContext.java:1138) ~[jackson-databind-2.9.6.jar:2.9.6]
    at com.fasterxml.jackson.datatype.jsr310.deser.JSR310DeserializerBase._handleUnexpectedToken(JSR310DeserializerBase.java:99) ~[jackson-datatype-jsr310-2.9.6.jar:2.9.6]
    at com.fasterxml.jackson.datatype.jsr310.deser.LocalDateTimeDeserializer.deserialize(LocalDateTimeDeserializer.java:141) ~[jackson-datatype-jsr310-2.9.6.jar:2.9.6]
    at com.fasterxml.jackson.datatype.jsr310.deser.LocalDateTimeDeserializer.deserialize(LocalDateTimeDeserializer.java:39) ~[jackson-datatype-jsr310-2.9.6.jar:2.9.6]
    at com.fasterxml.jackson.databind.deser.impl.FieldProperty.deserializeAndSet(FieldProperty.java:136) ~[jackson-databind-2.9.6.jar:2.9.6]
    at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:369) ~[jackson-databind-2.9.6.jar:2.9.6]
    at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:159) ~[jackson-databind-2.9.6.jar:2.9.6]
    at com.fasterxml.jackson.databind.ObjectReader._bindAndClose(ObjectReader.java:1611) ~[jackson-databind-2.9.6.jar:2.9.6]
    at com.fasterxml.jackson.databind.ObjectReader.readValue(ObjectReader.java:1234) ~[jackson-databind-2.9.6.jar:2.9.6]
    at org.springframework.kafka.support.serializer.JsonDeserializer.deserialize(JsonDeserializer.java:228) ~[spring-kafka-2.1.8.RELEASE.jar:2.1.8.RELEASE]
    at org.apache.kafka.clients.consumer.internals.Fetcher.parseRecord(Fetcher.java:923) ~[kafka-clients-1.0.2.jar:na]
    at org.apache.kafka.clients.consumer.internals.Fetcher.access$2600(Fetcher.java:93) ~[kafka-clients-1.0.2.jar:na]
    at org.apache.kafka.clients.consumer.internals.Fetcher$PartitionRecords.fetchRecords(Fetcher.java:1100) ~[kafka-clients-1.0.2.jar:na]
    at org.apache.kafka.clients.consumer.internals.Fetcher$PartitionRecords.access$1200(Fetcher.java:949) ~[kafka-clients-1.0.2.jar:na]
    at org.apache.kafka.clients.consumer.internals.Fetcher.fetchRecords(Fetcher.java:570) ~[kafka-clients-1.0.2.jar:na]
    at org.apache.kafka.clients.consumer.internals.Fetcher.fetchedRecords(Fetcher.java:531) ~[kafka-clients-1.0.2.jar:na]
    at org.apache.kafka.clients.consumer.KafkaConsumer.pollOnce(KafkaConsumer.java:1154) ~[kafka-clients-1.0.2.jar:na]
    at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1111) ~[kafka-clients-1.0.2.jar:na]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:699) ~[spring-kafka-2.1.8.RELEASE.jar:2.1.8.RELEASE]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_131]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_131]
    at java.lang.Thread.run(Thread.java:748) [na:1.8.0_131]
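The key detail in that trace: the created field was serialized as an object with hour/minute/second sub-fields, while LocalDateTimeDeserializer expects an array or an ISO-8601 string ("Expected array or string."). A pure-JDK sketch of the string form Jackson emits once the JavaTimeModule is registered and WRITE_DATES_AS_TIMESTAMPS is disabled (the class name is illustrative):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class IsoDemo {
    public static void main(String[] args) {
        // The "created" value from the failing payload, rebuilt from its sub-fields.
        LocalDateTime created = LocalDateTime.of(2018, 8, 28, 18, 40, 51, 325_000_000);
        // With the JavaTimeModule registered and WRITE_DATES_AS_TIMESTAMPS disabled,
        // Jackson writes this ISO-8601 string, which LocalDateTimeDeserializer can read:
        System.out.println(created.format(DateTimeFormatter.ISO_LOCAL_DATE_TIME));
        // -> 2018-08-28T18:40:51.325
    }
}
```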
So I've tried the following, without success so far:

1. A custom ObjectMapper declared as a bean:

@Bean
public ObjectMapper objectMapper() {
    ObjectMapper objectMapper = new ObjectMapper();
    objectMapper.registerModule(new JavaTimeModule());
    objectMapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
    return objectMapper;
}
2. Serializer annotations on the LocalDateTime field.
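For reference, attempt 2 would look roughly like this (a sketch; the class and field names mirror the error payload, not the actual project code), pinning the JSR-310 (de)serializers directly onto the field:

```
// Sketch only: requires jackson-datatype-jsr310 on the classpath.
import com.fasterxml.jackson.databind.annotation.JsonDeserialize;
import com.fasterxml.jackson.databind.annotation.JsonSerialize;
import com.fasterxml.jackson.datatype.jsr310.deser.LocalDateTimeDeserializer;
import com.fasterxml.jackson.datatype.jsr310.ser.LocalDateTimeSerializer;
import java.time.LocalDateTime;

public class Foo {
    @JsonSerialize(using = LocalDateTimeSerializer.class)
    @JsonDeserialize(using = LocalDateTimeDeserializer.class)
    private LocalDateTime created;
}
```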

To make sure I have the correct ObjectMapper setup and the necessary dependencies, I created a REST controller that returns an object with a date-time field as JSON from a REST endpoint, and it renders correctly; sample:

[
    {
        "playerId": "foo",
        "points": 10,
        "entryDateTime": "2018-08-19T09:30:20.051"
    },
    {
        "playerId": "bar",
        "points": 3,
        "entryDateTime": "2018-08-27T09:30:20.051"
    }
]

When you set the serializer/deserializer using properties, Kafka instantiates them, not Spring. Kafka knows nothing about Spring or your customized ObjectMapper.

You need to override Boot's default producer/consumer factories and add the serializer/deserializer using the alternative constructors (or setters).

Important

Only simple configuration can be performed with properties; for more advanced configuration (such as using a custom ObjectMapper in the serializer/deserializer), you should use the producer/consumer factory constructors that accept pre-built serializers and deserializers. For example, with Spring Boot, to override the default factories:

@Bean
public ConsumerFactory<Foo, Bar> kafkaConsumerFactory(KafkaProperties properties,
    JsonDeserializer customDeserializer) {

    return new DefaultKafkaConsumerFactory<>(properties.buildConsumerProperties(),
        customDeserializer, customDeserializer);
}

@Bean
public ProducerFactory<Foo, Bar> kafkaProducerFactory(KafkaProperties properties,
    JsonSerializer customSerializer) {

    return new DefaultKafkaProducerFactory<>(properties.buildProducerProperties(),
        customSerializer, customSerializer);
}

Setters are also provided, as an alternative to using these constructors.

You can extend Spring Kafka's JsonSerializer:
public class JsonSerializerWithJTM<T> extends JsonSerializer<T> {
    public JsonSerializerWithJTM() {
        super();
        objectMapper.registerModule(new JavaTimeModule());
        //whatever you want to configure here
    }
}
Using the JSON (de)serializer constructor that takes an ObjectMapper worked for me. I was having trouble (de)serializing a POJO that has a java.time.Instant field, so after several hours of troubleshooting the same org.apache.kafka.common.errors.SerializationException***, I finally realized (with the help of the answers here) that the issue was not Spring, but Kafka's own serialization. Given the ObjectMapper bean I had, I solved it by autowiring it into the JsonSerializer and JsonDeserializer parameters of the Kafka producer and consumer setup:

@Configuration
public class JacksonConfig {
    @Bean
    @Primary
    public ObjectMapper objectMapper(Jackson2ObjectMapperBuilder builder) {
        ObjectMapper objectMapper = builder.build();
        objectMapper.registerModule(new JavaTimeModule());
        objectMapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
        return objectMapper;
    }
}

@Configuration
public class KafkaProducerConfig {
    @Value(value = "${kafka.bootstrapAddress}")
    private String bootstrapAddress;

    @Autowired
    private ObjectMapper objectMapper;

    @Bean
    public KafkaTemplate<String, Order> orderKafkaTemplate() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        ProducerFactory<String, Order> producerFactory = new DefaultKafkaProducerFactory<>(props, new StringSerializer(), new JsonSerializer<>(objectMapper));
        return new KafkaTemplate<>(producerFactory);
    }
}

@Configuration
public class KafkaConsumerConfig {
    @Value(value = "${kafka.bootstrapAddress}")
    private String bootstrapAddress;

    @Value(value = "${kafka.consumer.groupId}")
    private String groupId;

    @Autowired
    private ObjectMapper objectMapper;

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, Order> orderKafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, Order> factory = new ConcurrentKafkaListenerContainerFactory<>();
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        ConsumerFactory<String, Order> consumerFactory = new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), new JsonDeserializer<>(Order.class, objectMapper));
        factory.setConsumerFactory(consumerFactory);
        return factory;
    }
}
(The Order class is shown for further clarification:)

public class Order {
    private Long accountId;
    private Long assetId;
    private Long quantity;
    private Long price;
    private Instant createdOn = Instant.now();
    // no-args constructor, constructor with args for all fields except createdOn,
    // and getters/setters omitted
}

***The cause was typically:

com.fasterxml.jackson.databind.exc.InvalidDefinitionException: Cannot construct instance of `java.time.Instant` (no Creators, like default construct, exist): cannot deserialize from Object value at [Source: (byte[])"{"accountId":1,"assetId":2,"quantity":100,"price":1000,"createdOn":{"epochSecond":1558570217,"nano":728000000}}"...
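A pure-JDK sketch of what that failing payload encodes (the class name is illustrative): without the JavaTimeModule, Jackson serializes the Instant through its getters, producing the epochSecond/nano object form that it then has no Creator to read back:

```java
import java.time.Instant;

public class InstantDemo {
    public static void main(String[] args) {
        // The failing payload carried {"epochSecond":1558570217,"nano":728000000}:
        // Jackson wrote the Instant via its getters, a form the default
        // deserializer cannot turn back into an Instant.
        Instant createdOn = Instant.ofEpochSecond(1558570217L, 728_000_000L);
        System.out.println(createdOn.getEpochSecond() + "," + createdOn.getNano());
        // -> 1558570217,728000000
        // With the JavaTimeModule, the same value round-trips as an
        // ISO-8601 string instead (Instant.toString() shows that form):
        System.out.println(createdOn);
    }
}
```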

The accepted answer wasn't all that helpful for me; besides containing a number of typos from the source documentation, the generics around KafkaTemplate are so sparsely documented that it's hard to tell which bean signatures are needed to successfully replace the Boot defaults (KafkaAutoConfiguration, for example, looks for an actual ProducerFactory). What did work:
public class JsonSerializerWithJTM<T> extends JsonSerializer<T> {
    public JsonSerializerWithJTM() {
        super();
        objectMapper.registerModule(new JavaTimeModule());
        //whatever you want to configure here
    }
}
spring:
  kafka:
    consumer:
      value-deserializer: com.foo.JsonSerializerWithJTM