Problem getting a JSON data stream from Kafka with Flink in Java


I'm trying to retrieve data from Kafka using a Flink data stream. When I start the program, I get the following log output:

    11:57:16,891 INFO  org.apache.flink.api.java.typeutils.TypeExtractor             - class org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode does not contain a getter for field _children
    
    11:57:16,892 INFO  org.apache.flink.api.java.typeutils.TypeExtractor             - class org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode does not contain a setter for field _children
    
    11:57:16,892 INFO  org.apache.flink.api.java.typeutils.TypeExtractor             - Class class org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode cannot be used as a POJO type because not all fields are valid POJO fields, and must be processed as GenericType. Please read the Flink documentation on "Data Types & Serialization" for details of the effect on performance.
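(For context, these INFO messages say Flink's TypeExtractor falls back to GenericType serialization because ObjectNode doesn't follow Flink's POJO conventions: a public no-arg constructor plus public fields or getter/setter pairs. A hedged sketch of a class that would qualify as a Flink POJO instead; the class name is hypothetical and the field names are taken from the JSON used below:)

```java
// Sketch of a class satisfying Flink's POJO rules: public class, public
// no-argument constructor, and a getter and setter for every private field.
// "Messaggio" is an illustrative name; campo1/campo2 mirror the JSON keys.
public class Messaggio {
    private String campo1;
    private String campo2;

    public Messaggio() {}  // required no-arg constructor

    public String getCampo1() { return campo1; }
    public void setCampo1(String campo1) { this.campo1 = campo1; }

    public String getCampo2() { return campo2; }
    public void setCampo2(String campo2) { this.campo2 = campo2; }
}
```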
So I get no errors, only these log messages, but when I try to print the data stream nothing appears. Here is the code that produces to the Kafka topic and consumes it with Flink:

    String topicName = "prova3";

    // Producer: serialize a JSON message and send it to the topic
    Properties props = new Properties();
    props.put("bootstrap.servers", "192.168.1.22:9092");
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);

    String msg = new JSONObject()
            .put("campo1", "Italy")
            .put("campo2", "Technology")
            .toString();
    System.out.println(msg);

    Producer<String, JsonNode> producer = new KafkaProducer<String, JsonNode>(props);

    ObjectMapper objectMapper = new ObjectMapper();
    JsonNode jsonNode = objectMapper.readTree(msg);
    System.out.println(jsonNode);

    producer.send(new ProducerRecord<String, JsonNode>(topicName, jsonNode));
    producer.close();

    // Consumer: read the topic back as a Flink data stream
    final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

    Properties properties = new Properties();
    properties.setProperty("bootstrap.servers", "192.168.1.22:9092");
    properties.setProperty("group.id", topicName);
    DataStreamSource<ObjectNode> stream = env
            .addSource(new FlinkKafkaConsumer<>("prova3", new JSONKeyValueDeserializationSchema(false), properties));

    stream.print();
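(Editorial note: a Flink DataStream program only starts running once `env.execute()` is called; `stream.print()` merely adds a sink to the job graph. Assuming the snippet above is the whole program, a minimal sketch of the consumer side with that call added, reusing the topic name and broker address from above; the job name string is hypothetical:)

```java
// Sketch only: assumes the Flink and Kafka connector dependencies used in the
// snippet above are on the classpath. Without env.execute() the job graph is
// built but never submitted, so stream.print() produces no output.
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

Properties properties = new Properties();
properties.setProperty("bootstrap.servers", "192.168.1.22:9092");
properties.setProperty("group.id", "prova3");

DataStreamSource<ObjectNode> stream = env
        .addSource(new FlinkKafkaConsumer<>("prova3",
                new JSONKeyValueDeserializationSchema(false), properties));
stream.print();

env.execute("kafka-json-consumer");  // triggers execution of the pipeline
```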