
Java Avro backward schema evolution throws ClassCastException


When I try to test Avro schema evolution with a simple Java program, I get a ClassCastException.

Avro version: 1.10.0

customer-v1.avsc

{
  "type": "record",
  "namespace": "com.practice.kafka",
  "name": "Customer",
  "doc": "Avro schema for Customer",
  "fields": [
    {"name":  "first_name", "type":  "string", "doc": "Customer first name"},
    {"name":  "last_name", "type":  "string", "doc": "Customer last name"},
    {"name":  "automated_email", "type":  "boolean", "default": true, "doc": "Receive marketing emails or not"}
  ]
}
customer-v2.avsc

{
  "type": "record",
  "namespace": "com.practice.kafka",
  "name": "CustomerV2",
  "doc": "Avro schema for Customer",
  "fields": [
    {"name":  "first_name", "type":  "string", "doc": "Customer first name"},
    {"name":  "last_name", "type":  "string", "doc": "Customer last name"},
    {"name":  "phone_number", "type":  ["null","boolean"], "default": null, "doc": "Optional phone number"},
    {"name":  "email", "type":  "string", "default":  "missing@example.com", "doc":  "Optional email address"}
  ]
}
Program that serializes with v1 and deserializes with v2:

package com.practice.kafka;

import org.apache.avro.file.DataFileReader;
import org.apache.avro.file.DataFileWriter;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.DatumWriter;
import org.apache.avro.specific.SpecificDatumReader;
import org.apache.avro.specific.SpecificDatumWriter;

import java.io.File;
import java.io.IOException;

public class BackwardSchemaEvolutionSample {

    public static void main(String[] args) {

        // Step 1 - Create specific record
        Customer customer = Customer.newBuilder().setFirstName("John").setLastName("Doe").setAutomatedEmail(false).build();

        // Step 2 - Write specific record to a file
        final DatumWriter<Customer> datumWriter = new SpecificDatumWriter<>();
        try (DataFileWriter<Customer> dataFileWriter = new DataFileWriter<>(datumWriter)) {
            dataFileWriter.create(customer.getSchema(), new File("customer-v1.avro"));
            dataFileWriter.append(customer);
        } catch (IOException e) {
            e.printStackTrace();
        }

        // Step 3 - Read specific record from a file
        final File file = new File("customer-v1.avro");
        final DatumReader<CustomerV2> datumReader = new SpecificDatumReader<>();
        CustomerV2 customerRecord;
        try (DataFileReader<CustomerV2> dataFileReader = new DataFileReader<>(file, datumReader)) {
            customerRecord = dataFileReader.next(); // throws ClassCastException: Customer cannot be cast to CustomerV2
            System.out.println(customerRecord.toString());
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}


Can you tell me how to fix this error?

You defined two distinct data types, Customer and CustomerV2, and no cast between them is possible because they have no inheritance relationship. That is why Java cannot perform the cast and you get a ClassCastException. Within your current code, the only workaround is to catch the ClassCastException and convert the Customer to a CustomerV2 in the catch block.
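To make the point concrete, here is a minimal sketch using plain stand-in classes (not the actual Avro-generated ones) with hypothetical names mirroring the question: two unrelated classes cannot be cast to each other, so the only way across is an explicit field-by-field conversion that fills new fields with defaults.

```java
// Stand-in classes illustrating why the cast fails and how a manual
// conversion works. These are hypothetical; the real classes are
// generated by the Avro compiler.
public class CastDemo {
    static class Customer {
        final String firstName, lastName;
        Customer(String firstName, String lastName) {
            this.firstName = firstName;
            this.lastName = lastName;
        }
    }

    static class CustomerV2 {
        final String firstName, lastName, email;
        CustomerV2(String firstName, String lastName, String email) {
            this.firstName = firstName;
            this.lastName = lastName;
            this.email = email;
        }
    }

    // Manual conversion: copy the shared fields, supply a default for the new one.
    static CustomerV2 toV2(Customer c) {
        return new CustomerV2(c.firstName, c.lastName, "missing@example.com");
    }

    public static void main(String[] args) {
        Object record = new Customer("John", "Doe");
        try {
            CustomerV2 v2 = (CustomerV2) record; // unrelated types: throws ClassCastException
            System.out.println(v2.email);
        } catch (ClassCastException e) {
            CustomerV2 v2 = toV2((Customer) record); // convert explicitly instead
            System.out.println(v2.email); // prints missing@example.com
        }
    }
}
```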

I assume you are simulating schema changes in a Kafka environment. In that scenario you evolve an existing Avro schema by adding new fields or removing old ones.

Avro schema changes work as long as the record name stays the same.
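A sketch of the working approach, assuming customer-v2.avsc keeps `"name": "Customer"` (instead of renaming the record to CustomerV2) and the Customer class is regenerated from it, so it carries the v2 fields. An Avro data file stores the writer's schema in its header, so the reader only needs to supply its expected schema; Avro's schema resolution then fills the new email field from its declared default.

```java
import org.apache.avro.file.DataFileReader;
import org.apache.avro.io.DatumReader;
import org.apache.avro.specific.SpecificDatumReader;

import java.io.File;
import java.io.IOException;

public class BackwardReadSketch {
    public static void main(String[] args) throws IOException {
        // Passing the class sets the reader (v2) schema; the writer (v1)
        // schema comes from the data file header. Fields missing in the
        // v1 data are populated from the v2 defaults.
        DatumReader<Customer> datumReader = new SpecificDatumReader<>(Customer.class);
        try (DataFileReader<Customer> dataFileReader =
                 new DataFileReader<>(new File("customer-v1.avro"), datumReader)) {
            while (dataFileReader.hasNext()) {
                System.out.println(dataFileReader.next());
            }
        }
    }
}
```

Note this only resolves cleanly if every field added in v2 has a default value, which is exactly what "backward compatible" means for an Avro schema change.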