
Java Kafka - SerializationException: leading zeroes error


I am trying to consume data from a Kafka topic whose messages look like this:

{
"team_member_id":"0007010267",
"accrual_proc_dt":"2018/06/17",
"timeoff_plans":"ABC",
"length_of_service":357,
"personal_hol_plan":"ABC",
"personal_hol_bal":5.0,
"personal_hol_carry_over_bal":0.0,
"personal_hol_current_accrued_bal":11.04,
"personal_hol_current_taken_bal":0.0,
"wellbeing_hol_plan":"ABC",
"wellbeing_hol_bal":0.0,
"wellbeing_hol_carry_over_bal":0.0,
"wellbeing_hol_current_accrued_bal":0.0,
"wellbeing_hol_current_taken_bal":0.0,
"sick_hol_plan":"ABC",
"sick_hol_bal":35.32,
"sick_hol_carry_over_bal":0.0,
"sick_hol_current_accrued_bal":2.53,
"sick_hol_current_taken_bal":0.0,
"sick_po_hol_plan":"ABC",
"sick_po_hol_bal":240.01,
"sick_po_hol_carry_over_bal":0.0,
"sick_po_hol_current_accrued_bal":0.0,
"sick_po_hol_current_taken_bal":0.0,
"vac_hol_plan":"ABC",
"vac_hol_bal":27.67,
"vac_hol_carry_over_bal":0.0,
"vac_hol_current_accrued_bal":73.59,
"vac_hol_current_taken_bal":0.0}
My Kafka listener code is as follows:

@KafkaListener(topics = "ABC-leave")
    public void receive(LeaveAccurals leaveAccurals) throws SerializationException, SQLException, NoSuchAlgorithmException, KeyStoreException, KeyManagementException {
        LOGGER.info("Data received = '{}'", leaveAccurals.toString());
        consumerSupportService.setLeaveAccurals(leaveAccurals);
        latch.countDown();
        requests.inc();
    }
I have a POJO named LeaveAccurals that mirrors the JSON above. I get the following error:

org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition ABC-leave-0 at offset 0
Caused by: org.apache.kafka.common.errors.SerializationException: Can't deserialize data [[48, 48, 55, 48, 49, 56, 55, 56, 55, 55]] from topic [hr-wdy-leave-accruals]
Caused by: com.fasterxml.jackson.core.JsonParseException: Invalid numeric value: Leading zeroes not allowed
 at [Source: [B@573c7e3c; line: 1, column: 2]
    at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1702) ~[jackson-core-2.8.7.jar:2.8.7]
    at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:558) ~[jackson-core-2.8.7.jar:2.8.7]
    at com.fasterxml.jackson.core.base.ParserBase.reportInvalidNumber(ParserBase.java:1062) ~[jackson-core-2.8.7.jar:2.8.7]
    at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._verifyNoLeadingZeroes(UTF8StreamJsonParser.java:1549) ~[jackson-core-2.8.7.jar:2.8.7]
    at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._parsePosNumber(UTF8StreamJsonParser.java:1395) ~[jackson-core-2.8.7.jar:2.8.7]
    at com.fasterxml.jackson.core.json.UTF8StreamJsonParser._nextTokenNotInObject(UTF8StreamJsonParser.java:876) ~[jackson-core-2.8.7.jar:2.8.7]
    at com.fasterxml.jackson.core.json.UTF8StreamJsonParser.nextToken(UTF8StreamJsonParser.java:772) ~[jackson-core-2.8.7.jar:2.8.7]
    at com.fasterxml.jackson.databind.ObjectReader._initForReading(ObjectReader.java:355) ~[jackson-databind-2.8.7.jar:2.8.7]
    at com.fasterxml.jackson.databind.ObjectReader._bindAndClose(ObjectReader.java:1611) ~[jackson-databind-2.8.7.jar:2.8.7]
    at com.fasterxml.jackson.databind.ObjectReader.readValue(ObjectReader.java:1237) ~[jackson-databind-2.8.7.jar:2.8.7]
    at org.springframework.kafka.support.serializer.JsonDeserializer.deserialize(JsonDeserializer.java:86) ~[spring-kafka-1.2.0.RELEASE.jar:na]
    at org.apache.kafka.clients.consumer.internals.Fetcher.parseRecord(Fetcher.java:866) ~[kafka-clients-0.10.2.0.jar:na]
    at org.apache.kafka.clients.consumer.internals.Fetcher.parseCompletedFetch(Fetcher.java:775) ~[kafka-clients-0.10.2.0.jar:na]
    at org.apache.kafka.clients.consumer.internals.Fetcher.fetchedRecords(Fetcher.java:473) ~[kafka-clients-0.10.2.0.jar:na]
    at org.apache.kafka.clients.consumer.KafkaConsumer.pollOnce(KafkaConsumer.java:1062) ~[kafka-clients-0.10.2.0.jar:na]
    at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:995) ~[kafka-clients-0.10.2.0.jar:na]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:535) ~[spring-kafka-1.2.0.RELEASE.jar:na]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_131]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_131]
    at java.lang.Thread.run(Thread.java:748) [na:1.8.0_131]
I have read an article about handling JSON data with leading zeroes, but that approach is not possible here. I cannot change the Kafka queue because I am a read-only consumer.


Any suggestions are greatly appreciated.
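One observation that may help: the bytes Kafka failed to deserialize, `[48, 48, 55, 48, 49, 56, 55, 56, 55, 55]`, decode to the plain string `0070187877` — that is the record *key*, not the JSON value. If both `key.deserializer` and `value.deserializer` are set to the JSON deserializer, the bare digit string with leading zeroes is rejected as invalid JSON. A sketch of one possible fix, assuming a standard Spring Boot configuration (property names as in Spring Boot's Kafka auto-configuration; adjust to your setup):

```properties
# Deserialize keys as plain strings; only the value is JSON
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.springframework.kafka.support.serializer.JsonDeserializer
```

This leaves the value deserialization unchanged while keys like `0070187877` pass through untouched.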

I have the same problem. Did you find a solution?

It looks like Jackson has a setting for this (ALLOW_COERCION_OF_SCALARS), but I don't see a way to configure it without writing my own code.

Yes, I wasn't able to change that setting either. We ended up handling this job with Redshift instead.
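For later readers: a minimal sketch of the Jackson parser feature that relaxes the leading-zero rule, assuming Jackson 2.x on the classpath (the feature is named `ALLOW_NUMERIC_LEADING_ZEROS` in `jackson-core`). A mapper configured this way could, in principle, be passed to Spring Kafka's `JsonDeserializer` via its `(Class, ObjectMapper)` constructor, though the exact wiring depends on your Spring Kafka version:

```java
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.ObjectMapper;

public class LeadingZeroDemo {
    public static void main(String[] args) throws Exception {
        // The raw record key from the error: a digit string with a
        // leading zero, which strict JSON rejects as a numeric literal.
        String key = "0070187877";

        // A default (strict) mapper reproduces the consumer's failure.
        ObjectMapper strict = new ObjectMapper();
        try {
            strict.readTree(key);
        } catch (Exception e) {
            System.out.println("strict parse failed: " + e.getMessage());
        }

        // Enabling ALLOW_NUMERIC_LEADING_ZEROS makes the parser accept it
        // (the leading zeroes are dropped from the numeric value).
        ObjectMapper lenient = new ObjectMapper();
        lenient.configure(JsonParser.Feature.ALLOW_NUMERIC_LEADING_ZEROS, true);
        System.out.println("lenient parse: " + lenient.readTree(key).asText());
    }
}
```

Note this only helps if the offending payload is meant to be numeric; if the key is really an identifier string (as `0070187877` appears to be), switching the key deserializer to `StringDeserializer` is the cleaner fix.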