Kafka consumer receives null values when sending customer objects

Tags: serialization, apache-kafka, deserialization, kafka-consumer-api, kafka-producer-api

I want to implement an application that reads data from a JSON-formatted file. I created a customer object (pharmacyData) for the data in the JSON, and I want to send these objects through a Kafka topic. So far I have successfully sent plain String messages from the producer to the consumer, but when I try to send the objects instead, calling .value().toString() on the consumer side gives me null. Here is the code I am using:
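For reference, the pharmacyData class itself is not shown in the post. The snippets below assume a plain POJO roughly like the following; the field names and constructor are made up purely so the later examples have something concrete to refer to.

public class pharmacyData {
    private String storeId;
    private String drugName;

    // Only a parameterized constructor is defined here, which turns out to
    // matter for the fix described at the end of the post.
    public pharmacyData(String storeId, String drugName) {
        this.storeId = storeId;
        this.drugName = drugName;
    }

    public String getStoreId() { return storeId; }
    public void setStoreId(String storeId) { this.storeId = storeId; }
    public String getDrugName() { return drugName; }
    public void setDrugName(String drugName) { this.drugName = drugName; }

    @Override
    public String toString() {
        return "pharmacyData{storeId='" + storeId + "', drugName='" + drugName + "'}";
    }
}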

Here is the producer:

import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;
import org.apache.kafka.clients.producer.Callback;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;

import java.io.Reader;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;
import java.util.Properties;

public class MyProducer {

    public static void main(String[] args) throws Exception {

        Properties properties = new Properties();
        properties.put("bootstrap.servers", "kafka.kafka-cluster-shared.non-prod-5-az-scus.prod.us.walmart.net:9092");
        properties.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        properties.put("value.serializer", "xxxxxxxxx.KafkaJsonSerializer");
        properties.put("acks", "1");
        properties.put("retries", "2");
        properties.put("batch.size", "16384");
        properties.put("linger.ms", "1");
        properties.put("buffer.memory", "33554432");

        KafkaProducer<String, pharmacyData> kafkaProducer = new KafkaProducer<String, pharmacyData>(
                properties);

        String topic = "insights";

        // Read the JSON file and map it to a list of pharmacyData objects.
        //try {
        Gson gson = new Gson();

        Reader reader = Files.newBufferedReader(Paths.get("......./part.json"));

        List<pharmacyData> pdata = gson.fromJson(reader, new TypeToken<List<pharmacyData>>() {}.getType());

        //pdata.forEach(System.out::println);

        reader.close();
        //} catch (Exception e) {
        //    e.printStackTrace();
        //}

        // Send each record asynchronously; the callback logs the partition and
        // serialized size on success, or the error on failure.
        for (pharmacyData data : pdata) {
            kafkaProducer.send(new ProducerRecord<String, pharmacyData>(topic, data), new Callback() {
                @Override
                public void onCompletion(RecordMetadata recordMetadata, Exception e) {
                    if (e == null) {
                        System.out.println(recordMetadata.partition() + "--" + recordMetadata.serializedValueSize());
                    } else {
                        e.printStackTrace();
                    }
                }
            });
        }
        kafkaProducer.close();
    }
}
Here is the custom serializer:

import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.serialization.Serializer;
// Logging imports assume Log4j 2; adjust if a different logging API is used.
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

import java.util.Map;

public class KafkaJsonSerializer implements Serializer {

    private Logger logger = LogManager.getLogger(this.getClass());
    @Override
    public void configure(Map map, boolean b) {

    }

    @Override
    public byte[] serialize(String s, Object o) {
        byte[] retVal = null;
        ObjectMapper objectMapper = new ObjectMapper();
        try {
            retVal = objectMapper.writeValueAsBytes(o);
        } catch (Exception e) {
            logger.error(e.getMessage());
        }
        return retVal;
    }

    @Override
    public void close() {

    }
}
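As a side note, ObjectMapper is thread-safe once configured and fairly expensive to construct, so it is common to keep a single instance in the serializer instead of creating one per record. A drop-in variation of the class above (error handling kept minimal for brevity):

import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.serialization.Serializer;

import java.util.Map;

public class KafkaJsonSerializer implements Serializer<Object> {

    // One mapper shared across all serialize() calls instead of one per record.
    private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
    }

    @Override
    public byte[] serialize(String topic, Object data) {
        try {
            return OBJECT_MAPPER.writeValueAsBytes(data);
        } catch (Exception e) {
            return null;
        }
    }

    @Override
    public void close() {
    }
}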
Here is the custom deserializer:

import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.serialization.Deserializer;

import java.util.Map;

public class KafkaJsonDeserializer implements Deserializer {


    @Override
    public void configure(Map map, boolean b) {

    }

    @Override
    public Object deserialize(String s, byte[] bytes) {
        ObjectMapper mapper = new ObjectMapper();
        pharmacyData pdata = null;
        try {
            pdata = mapper.readValue(bytes, pharmacyData.class);
        } catch (Exception e) {
            // If readValue fails (e.g. pharmacyData has no default constructor),
            // the exception is only printed here and null is returned below.
            e.printStackTrace();
        }
        return pdata;
    }

    @Override
    public void close() {

    }
}
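Before involving Kafka at all, it can help to round-trip the two classes directly: if the second print below shows null, the problem is in the Jackson mapping rather than in the producer/consumer wiring. A minimal sketch, reusing the hypothetical pharmacyData fields from the top of the post:

import java.util.Arrays;

public class SerdeRoundTripCheck {

    public static void main(String[] args) {
        // Sample object; constructor arguments match the hypothetical POJO above.
        pharmacyData original = new pharmacyData("store-1", "aspirin");

        KafkaJsonSerializer serializer = new KafkaJsonSerializer();
        KafkaJsonDeserializer deserializer = new KafkaJsonDeserializer();

        byte[] bytes = serializer.serialize("insights", original);
        System.out.println("serialized bytes: " + Arrays.toString(bytes));

        // If pharmacyData has no default constructor, readValue throws inside
        // the deserializer, the exception is swallowed, and this prints null.
        Object roundTripped = deserializer.deserialize("insights", bytes);
        System.out.println("deserialized: " + roundTripped);
    }
}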
Here is the consumer:

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import java.util.Collections;
import java.util.Properties;

public class MyConsumer {
    public static void main(String[] args) {

        Properties properties = new Properties();
        properties.put("bootstrap.servers", "kafka.kafka-cluster-shared.non-prod-5-az-scus.prod.us.walmart.net:9092");
        properties.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        properties.put("value.deserializer", "xxxxxxxx.KafkaJsonDeserializer");
        properties.put("group.id", "consumer-group-1");
        properties.put("enable.auto.commit", "true");
        properties.put("auto.commit.interval.ms", "1000");
        properties.put("auto.offset.reset", "earliest");
        properties.put("session.timeout.ms", "30000");

        KafkaConsumer<String, pharmacyData> consumer = new KafkaConsumer<>(properties);

        String topic = "insights";

        consumer.subscribe(Collections.singletonList(topic));
        while (true) {
            ConsumerRecords<String, pharmacyData> consumerRecords = consumer.poll(100);

            for (ConsumerRecord<String, pharmacyData> consumerRecord : consumerRecords) {
                System.out.println(consumerRecord.key() + "--" + consumerRecord.toString());
                //System.out.println(consumerRecord.offset() + "--" + consumerRecord.partition());
            }
        }
    }
}
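Unrelated to the null issue: on kafka-clients 2.0 and later, poll(long) is deprecated in favor of the Duration overload. The loop reads the same either way; a small sketch of that variant, assuming the same consumer as above:

import java.time.Duration;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class PollLoop {

    // Same loop as in MyConsumer, using the non-deprecated Duration overload.
    static void pollForever(KafkaConsumer<String, pharmacyData> consumer) {
        while (true) {
            ConsumerRecords<String, pharmacyData> records = consumer.poll(Duration.ofMillis(100));
            for (ConsumerRecord<String, pharmacyData> record : records) {
                // value() is the deserialized pharmacyData (or null if deserialization failed)
                System.out.println(record.key() + "--" + record.value());
            }
        }
    }
}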
Can someone help me figure out what I am doing wrong? Thank you very much.

Problem solved:

The solution to this problem was simply to add a default constructor to the pharmacyData class, as shown below:

// No-argument constructor required by Jackson to instantiate pharmacyData
// during deserialization.
public pharmacyData() {

}
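For context: by default, Jackson's ObjectMapper instantiates the target type through its no-argument constructor and then populates the fields, so a class with only a parameterized constructor makes readValue throw. Because KafkaJsonDeserializer catches that exception and returns null, the consumer silently ends up with null values. An alternative to adding the empty constructor is to point Jackson at the existing constructor with @JsonCreator and @JsonProperty; a sketch, again with hypothetical field names:

import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonProperty;

public class pharmacyData {

    private final String storeId;
    private final String drugName;

    // Tells Jackson to use this constructor instead of a no-arg one.
    @JsonCreator
    public pharmacyData(@JsonProperty("storeId") String storeId,
                        @JsonProperty("drugName") String drugName) {
        this.storeId = storeId;
        this.drugName = drugName;
    }

    public String getStoreId() { return storeId; }
    public String getDrugName() { return drugName; }
}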
For more details, see this page.