
JSON: Can not instantiate value of type [map type; class java.util.LinkedHashMap]

I am pushing a JSON file from Kafka into Elasticsearch and then visualizing the data in Kibana.

Here is the format of my JSON file:

[{
"A": "---",
"B": "---",
"C": "---",
"D": "---",
"ABC": "---",
"CDE": "---",
"FGY": "1110",
"ADF": "226",
"SSS": "nil",
"ASA": "9.5",
"DFGHJKLIWSSFFFSF": "12121",
"sasfasfafasfsa": "0.21212",
"TEST": "12121121",
"AGAIN_TEST": "1.23456",
"SSS": "---",
"ASD": "---",
"ASSDFFF": "---",
"QQQQ": "61.2793",
"UYTR": "3619",
"testing": "58.3649",
"fffff": "1010",
"Fasa_sasfaf": "9.000"
}, {
"A": "1616161",
"B": "0.234",
"C": "---",
"D": "---",
"ABC": "1.11",
"CDE": "---",
"FGY": "323",
"ADF": "121",
"SSS": "---",
"ASA": "9.5",
"DFGHJKLIWSSFFFSF": "12121",
"sasfasfafasfsa": "0.21212",
"TEST": "---",
"AGAIN_TEST": "1.23456",
"SSS": "---",
"ASD": "121212",
"ASSDFFF": "---",
"QQQQ": "61.2793",
"UYTR": "3619",
"testing": "50.3649",
"fffff": "1030",
"Fasa_sasfaf": "123.012"
}]
According to http://jsonlint.com/, the JSON file I am using is valid. But when the file is passed through Kafka, I get the following error in Elasticsearch:

Can not instantiate value of type [map type; class java.util.LinkedHashMap, [simple type, class java.lang.String] -> [simple type, class java.lang.Object]] from JSON String; no single-String constructor/factory method
Here is the full stack trace:

Can not instantiate value of type [map type; class java.util.LinkedHashMap, [simple type, class java.lang.String] -> [simple type, class java.lang.Object]] from JSON String; no single-String constructor/factory method
at org.codehaus.jackson.map.deser.std.StdValueInstantiator._createFromStringFallbacks(StdValueInstantiator.java:379)
at org.codehaus.jackson.map.deser.std.StdValueInstantiator.createFromString(StdValueInstantiator.java:268)
at org.codehaus.jackson.map.deser.std.MapDeserializer.deserialize(MapDeserializer.java:244)
at org.codehaus.jackson.map.deser.std.MapDeserializer.deserialize(MapDeserializer.java:33)
at org.codehaus.jackson.map.ObjectReader._bindAndClose(ObjectReader.java:768)
at org.codehaus.jackson.map.ObjectReader.readValue(ObjectReader.java:473)
at org.elasticsearch.river.kafka.IndexDocumentProducer.addMessagesToBulkProcessor(IndexDocumentProducer.java:71)
at org.elasticsearch.river.kafka.KafkaWorker.consumeMessagesAndAddToBulkProcessor(KafkaWorker.java:107)
at org.elasticsearch.river.kafka.KafkaWorker.run(KafkaWorker.java:78)
at java.lang.Thread.run(Thread.java:745)
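
The exception is thrown by Jackson 1.x (org.codehaus.jackson), which the elasticsearch-river-kafka plugin uses to bind each Kafka message to a Map<String, Object>. That binding only succeeds when the top-level JSON value of the message is a single object: a top-level array cannot be bound to a Map, and a top-level JSON string (for example, a document that was serialized twice) produces exactly the "no single-String constructor/factory method" message in the trace. A minimal sketch of the difference, assuming only Jackson 1.x on the classpath and a trimmed-down version of the payload above:

import java.util.List;
import java.util.Map;

import org.codehaus.jackson.map.ObjectMapper;
import org.codehaus.jackson.type.TypeReference;

public class ArrayVsMapDemo {
    public static void main(String[] args) throws Exception {
        // A trimmed-down version of the payload: the top-level value is a JSON array.
        String json = "[{\"A\": \"---\", \"FGY\": \"1110\"}, {\"A\": \"1616161\", \"FGY\": \"323\"}]";

        ObjectMapper mapper = new ObjectMapper();

        // This mirrors what the river does: it binds each message to a Map, which
        // only works when the message is a single JSON object. The next line would
        // fail for this payload, because the top-level value is an array:
        // Map<String, Object> doc = mapper.readValue(json, Map.class);

        // The exact message in the question ("... from JSON String; no
        // single-String constructor/factory method") appears when the top-level
        // token is a JSON *string*, e.g. a document that was serialized twice:
        // String doubleEncoded = mapper.writeValueAsString(json);
        // mapper.readValue(doubleEncoded, Map.class); // throws the question's error

        // Binding the array to a List of Maps succeeds:
        List<Map<String, Object>> docs =
                mapper.readValue(json, new TypeReference<List<Map<String, Object>>>() {});
        System.out.println(docs.size() + " documents parsed"); // prints: 2 documents parsed
    }
}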

I think this is an instantiation problem. Paste the full error stack trace…

Updated the question with the stack trace. Thanks.

These links might help you: [...] and [...].

The document you show above is actually a JSON array. Would you rather store each element of that array as a separate document, or should the array be stored as a single document?

I was wondering: can I pass the document through Kafka as an array and still be able to see it in Elasticsearch?
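
Following the commenters' suggestion to store each array element as its own document, one option is to split the array on the producer side so that every Kafka message carries exactly one JSON object. This is only a sketch under assumptions, not a confirmed fix: it uses the newer org.apache.kafka.clients producer API, and the file name data.json, the topic my-topic, and the broker address are all placeholders.

import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;
import java.util.Map;
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.codehaus.jackson.map.ObjectMapper;
import org.codehaus.jackson.type.TypeReference;

public class SplitArrayProducer {
    public static void main(String[] args) throws Exception {
        // "data.json" is a placeholder for the file holding the JSON array shown above.
        String jsonArray = new String(Files.readAllBytes(Paths.get("data.json")));

        // Parse the top-level array, then re-serialize each element so that every
        // Kafka message is a single JSON object, which is what the river's
        // Map-based deserialization expects.
        ObjectMapper mapper = new ObjectMapper();
        List<Map<String, Object>> docs =
                mapper.readValue(jsonArray, new TypeReference<List<Map<String, Object>>>() {});

        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            for (Map<String, Object> doc : docs) {
                // Each array element becomes its own message, and therefore its own
                // Elasticsearch document ("my-topic" is a placeholder topic name).
                producer.send(new ProducerRecord<>("my-topic", mapper.writeValueAsString(doc)));
            }
        }
    }
}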