Apache Kafka and Avro: org.apache.avro.generic.GenericData$Record cannot be cast to com.harmeetsingh13.java.Customer

Whenever I try to read messages from the Kafka queue, I get the following exception:

[error] (run-main-0) java.lang.ClassCastException: org.apache.avro.generic.GenericData$Record cannot be cast to com.harmeetsingh13.java.Customer
java.lang.ClassCastException: org.apache.avro.generic.GenericData$Record cannot be cast to com.harmeetsingh13.java.Customer
        at com.harmeetsingh13.java.consumers.avrodesrializer.AvroSpecificDeserializer.infiniteConsumer(AvroSpecificDeserializer.java:79)
        at com.harmeetsingh13.java.consumers.avrodesrializer.AvroSpecificDeserializer.main(AvroSpecificDeserializer.java:87)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
Kafka producer code:

import java.io.IOException;
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

import io.confluent.kafka.serializers.KafkaAvroSerializer;

public class AvroSpecificProducer {
    private static Properties kafkaProps = new Properties();
    private static KafkaProducer<String, Customer> kafkaProducer;

    static {
        kafkaProps.put("bootstrap.servers", "localhost:9092");
        kafkaProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
        kafkaProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
        kafkaProps.put("schema.registry.url", "http://localhost:8081");
        kafkaProducer = new KafkaProducer<>(kafkaProps);
    }

    public static void fireAndForget(ProducerRecord<String, Customer> record) {
        kafkaProducer.send(record);
    }

    public static void asyncSend(ProducerRecord<String, Customer> record) {
        kafkaProducer.send(record, (recordMetaData, ex) -> {
            System.out.println("Offset: " + recordMetaData.offset());
            System.out.println("Topic: " + recordMetaData.topic());
            System.out.println("Partition: " + recordMetaData.partition());
            System.out.println("Timestamp: " + recordMetaData.timestamp());
        });
    }

    public static void main(String[] args) throws InterruptedException, IOException {
        Customer customer1 = new Customer(1002, "Jimmy");
        ProducerRecord<String, Customer> record1 = new ProducerRecord<>("CustomerSpecificCountry",
                "Customer One 11 ", customer1
        );

        asyncSend(record1);

        Thread.sleep(1000);
    }
}
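
For context, Customer is a class generated from an Avro schema; the question never shows the actual .avsc file. A schema consistent with the new Customer(1002, "Jimmy") call above, written out with Avro's SchemaBuilder, might look like the following sketch (the field names id and name, and the class name CustomerSchemaSketch, are assumptions):

import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;

public class CustomerSchemaSketch {
    public static void main(String[] args) {
        // Hypothetical reconstruction of the schema behind the generated Customer class;
        // the real .avsc is not part of the question, so the field names are guesses.
        Schema customerSchema = SchemaBuilder.record("Customer")
                .namespace("com.harmeetsingh13.java") // should match the generated class's package
                .fields()
                .requiredInt("id")
                .requiredString("name")
                .endRecord();
        System.out.println(customerSchema.toString(true));
    }
}

The namespace is worth watching: the specific reader can only resolve the generated class when the record's namespace matches its package, and the accepted fix below calls out exactly this namespace adjustment.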
Kafka consumer code:

import java.io.IOException;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;

import org.apache.avro.generic.GenericRecord;
import org.apache.avro.specific.SpecificData;
import org.apache.kafka.clients.consumer.ConsumerConfig;

import io.confluent.kafka.serializers.KafkaAvroDecoder;
import kafka.consumer.ConsumerIterator;
import kafka.consumer.KafkaStream;
import kafka.javaapi.consumer.ConsumerConnector;
import kafka.message.MessageAndMetadata;
import kafka.utils.VerifiableProperties;

import static kafka.consumer.Consumer.createJavaConsumerConnector;

public class AvroSpecificDeserializer {

    private static Properties kafkaProps = new Properties();

    static {
        kafkaProps.put(ConsumerConfig.GROUP_ID_CONFIG, "CustomerCountryGroup1");
        kafkaProps.put("zookeeper.connect", "localhost:2181");
        kafkaProps.put("schema.registry.url", "http://localhost:8081");
    }

    public static void infiniteConsumer() throws IOException {
        VerifiableProperties properties = new VerifiableProperties(kafkaProps);
        KafkaAvroDecoder keyDecoder = new KafkaAvroDecoder(properties);
        KafkaAvroDecoder valueDecoder = new KafkaAvroDecoder(properties);

        Map<String, Integer> topicCountMap = new HashMap<>();
        topicCountMap.put("NewTopic", 1);

        ConsumerConnector consumer = createJavaConsumerConnector(new kafka.consumer.ConsumerConfig(kafkaProps));
        Map<String, List<KafkaStream<Object, Object>>> consumerMap = consumer.createMessageStreams(topicCountMap, keyDecoder, valueDecoder);

        KafkaStream stream = consumerMap.get("NewTopic").get(0);
        ConsumerIterator it = stream.iterator();

        System.out.println("???????????????????????????????????????????????? ");
        while (it.hasNext()) {
            System.out.println(">>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ");
            MessageAndMetadata messageAndMetadata = it.next();
            String key = (String) messageAndMetadata.key();
            GenericRecord record = (GenericRecord) messageAndMetadata.message();
            // This cast is where the ClassCastException above is thrown:
            Customer customer = (Customer) SpecificData.get().deepCopy(Customer.SCHEMA$, record);
            System.out.println("Key: " + key);
            System.out.println("Value: " + customer);
        }

    }

    public static void main(String[] args) throws IOException {
        infiniteConsumer();
    }
}
These are the examples I am following:


  • After discussing with @harmeen, this is the code that finally works:

    static {
        kafkaProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "smallest");
        kafkaProps.put(ConsumerConfig.GROUP_ID_CONFIG, "CustomerCountryGroup1");
        kafkaProps.put("zookeeper.connect", "localhost:2181");
        kafkaProps.put("schema.registry.url", "http://localhost:8081");
        kafkaProps.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, true);
    }

    public static void infiniteConsumer() throws IOException {
        VerifiableProperties properties = new VerifiableProperties(kafkaProps);
        // A plain StringDecoder for keys; KafkaAvroDecoder stays on the value side.
        StringDecoder keyDecoder = new StringDecoder(properties);
        KafkaAvroDecoder valueDecoder = new KafkaAvroDecoder(properties);

        Map<String, Integer> topicCountMap = new HashMap<>();
        topicCountMap.put("BrandNewTopics", 1);

        ConsumerConnector consumer = createJavaConsumerConnector(new kafka.consumer.ConsumerConfig(kafkaProps));
        Map<String, List<KafkaStream<String, Object>>> consumerMap = consumer.createMessageStreams(topicCountMap, keyDecoder, valueDecoder);

        KafkaStream stream = consumerMap.get("BrandNewTopics").get(0);
        ConsumerIterator it = stream.iterator();

        while (it.hasNext()) {
            MessageAndMetadata messageAndMetadata = it.next();
            String key = (String) messageAndMetadata.key();
            GenericRecord record = (GenericRecord) messageAndMetadata.message();
            Customer customer = (Customer) SpecificData.get().deepCopy(Customer.SCHEMA$, record);
            System.out.println("Key: " + key);
            System.out.println("Value: " + customer);
        }
    }
    
    
    Things that changed:

    • The SPECIFIC_AVRO_READER_CONFIG property is added as true (see the KafkaConsumer sketch after this list).
    • auto.offset.reset is set to smallest, so consumption starts from the beginning of the topic.
    • StringSerializer and StringDeserializer are used for the keys.
    • The producer and the consumer are changed to reflect the previous changes.
    • The namespace of the Customer class that represents the Avro record is adjusted.
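
For comparison, the same fix maps onto the newer KafkaConsumer API, where Confluent's KafkaAvroDeserializer honors the same specific.avro.reader flag. Below is a minimal sketch, assuming a broker on localhost:9092 and reusing the topic, group id, and registry URL from the answer; the class name AvroSpecificConsumerSketch is made up:

import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import io.confluent.kafka.serializers.KafkaAvroDeserializer;
import io.confluent.kafka.serializers.KafkaAvroDeserializerConfig;

public class AvroSpecificConsumerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "CustomerCountryGroup1");
        // "earliest" is the new-consumer spelling of the old "smallest"
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class);
        props.put("schema.registry.url", "http://localhost:8081");
        // Without this flag the deserializer hands back GenericData$Record,
        // which is exactly the ClassCastException from the question.
        props.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, true);

        try (KafkaConsumer<String, Customer> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("BrandNewTopics"));
            while (true) {
                ConsumerRecords<String, Customer> records = consumer.poll(100);
                for (ConsumerRecord<String, Customer> record : records) {
                    System.out.println("Key: " + record.key());
                    System.out.println("Value: " + record.value());
                }
            }
        }
    }
}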

    If this doesn't solve the problem, consider the following steps: 1. Check that your Schema Registry contains the schema for the corresponding topic. 2. Use kafka-avro-console-consumer to consume your events; this will narrow the problem down to either your producer or your consumer.

    Hey @Javier, I am using ./kafka-avro-console-consumer --bootstrap-server localhost:2181 --topic CustomerSpecificCountry --from-beginning --property schema.registry.url=http://localhost:8081 to run the consumer, but I don't receive anything.