
Java KafkaAvroDeserializer does not return SpecificRecord, but returns GenericRecord


My KafkaProducer is able to serialize objects to my topic using KafkaAvroSerializer. However, KafkaConsumer.poll() returns a deserialized GenericRecord instead of my serialized class.

MyKafkaProducer

KafkaProducer<CharSequence, MyBean> producer;
try (InputStream props = Resources.getResource("producer.props").openStream()) {
  Properties properties = new Properties();
  properties.load(props);
  properties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
      io.confluent.kafka.serializers.KafkaAvroSerializer.class);
  properties.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
      io.confluent.kafka.serializers.KafkaAvroSerializer.class);
  properties.put("schema.registry.url", "http://localhost:8081");

  MyBean bean = new MyBean();
  producer = new KafkaProducer<>(properties);
  producer.send(new ProducerRecord<>(topic, bean.getId(), bean));
}
MyKafkaConsumer

try (InputStream props = Resources.getResource("consumer.props").openStream()) {
  Properties properties = new Properties();
  properties.load(props);
  properties.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, io.confluent.kafka.serializers.KafkaAvroDeserializer.class);
  properties.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, io.confluent.kafka.serializers.KafkaAvroDeserializer.class);
  properties.put("schema.registry.url", "http://localhost:8081");
  consumer = new KafkaConsumer<>(properties);
}
consumer.subscribe(Arrays.asList(topic));
try {
  while (true) {
    ConsumerRecords<CharSequence, MyBean> records = consumer.poll(100);
    if (records.isEmpty()) {
      continue;
    }
    for (ConsumerRecord<CharSequence, MyBean> record : records) {
      MyBean bean = record.value(); // <-------- This throws a ClassCastException because GenericRecord cannot be cast to MyBean
      System.out.println("consumer received: " + bean);
    }
  }
} finally {
  consumer.close();
}
KafkaAvroDeserializer supports SpecificData

It is not enabled by default. To enable it:

properties.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, true);
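For context, here is a minimal consumer-side sketch with the specific reader enabled; the bootstrap server, group id, and topic values are placeholders (not from the original question), and MyBean stands in for the Avro-generated SpecificRecord class:

// Sketch: consumer configured to deserialize into the generated class.
Properties properties = new Properties();
properties.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
properties.put(ConsumerConfig.GROUP_ID_CONFIG, "my-group");
properties.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
    io.confluent.kafka.serializers.KafkaAvroDeserializer.class);
properties.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
    io.confluent.kafka.serializers.KafkaAvroDeserializer.class);
properties.put("schema.registry.url", "http://localhost:8081");
// With this flag set, the deserializer builds SpecificRecord instances
// (e.g. MyBean) instead of GenericRecord.
properties.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, true);

KafkaConsumer<CharSequence, MyBean> consumer = new KafkaConsumer<>(properties);
consumer.subscribe(Arrays.asList(topic));
// record.value() now returns MyBean, so the cast in the question no longer fails.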
KafkaAvroDeserializer does not support ReflectData

Confluent's KafkaAvroDeserializer does not know how to deserialize with Avro ReflectData. I had to extend it to support Avro ReflectData:

/**
 * Extends deserializer to support ReflectData.
 *
 * @param <V>
 *     value type
 */
public abstract class ReflectKafkaAvroDeserializer<V> extends KafkaAvroDeserializer {

  private Schema readerSchema;
  private DecoderFactory decoderFactory = DecoderFactory.get();

  protected ReflectKafkaAvroDeserializer(Class<V> type) {
    readerSchema = ReflectData.get().getSchema(type);
  }

  @Override
  protected Object deserialize(
      boolean includeSchemaAndVersion,
      String topic,
      Boolean isKey,
      byte[] payload,
      Schema readerSchemaIgnored) throws SerializationException {

    if (payload == null) {
      return null;
    }

    int schemaId = -1;
    try {
      // Confluent wire format: magic byte, 4-byte schema id, then the Avro payload.
      ByteBuffer buffer = ByteBuffer.wrap(payload);
      if (buffer.get() != MAGIC_BYTE) {
        throw new SerializationException("Unknown magic byte!");
      }

      schemaId = buffer.getInt();
      Schema writerSchema = schemaRegistry.getByID(schemaId);

      // Decode with the writer schema from the registry and the reader schema
      // derived from the target class via ReflectData.
      int start = buffer.position() + buffer.arrayOffset();
      int length = buffer.limit() - 1 - idSize;
      DatumReader<Object> reader = new ReflectDatumReader<>(writerSchema, readerSchema);
      BinaryDecoder decoder = decoderFactory.binaryDecoder(buffer.array(), start, length, null);
      return reader.read(null, decoder);
    } catch (IOException e) {
      throw new SerializationException("Error deserializing Avro message for id " + schemaId, e);
    } catch (RestClientException e) {
      throw new SerializationException("Error retrieving Avro schema for id " + schemaId, e);
    }
  }
}
Configure the KafkaConsumer to use the custom deserializer class:

properties.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, MyBeanDeserializer.class);
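
Note that ReflectKafkaAvroDeserializer above is abstract, so MyBeanDeserializer refers to a concrete subclass. A minimal sketch of what it might look like (the class name and wiring are assumptions, not part of the original answer):

// Hypothetical concrete subclass. Kafka instantiates deserializers reflectively,
// so it must be public with a no-arg constructor.
public class MyBeanDeserializer extends ReflectKafkaAvroDeserializer<MyBean> {
  public MyBeanDeserializer() {
    super(MyBean.class);
  }
}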

EDIT: ReflectData support merged (see below)

To complement Chin Huang's answer, for minimal code and better performance, you should probably implement it like this:

/**
 * Extends deserializer to support ReflectData.
 *
 * @param <T>
 *     value type
 */
public abstract class SpecificKafkaAvroDeserializer<T extends SpecificRecordBase> extends AbstractKafkaAvroDeserializer implements Deserializer<T> {
  private final Schema schema;
  private Class<T> type;
  private DecoderFactory decoderFactory = DecoderFactory.get();

  protected SpecificKafkaAvroDeserializer(Class<T> type, Map<String, ?> props) {
    this.type = type;
    this.schema = ReflectData.get().getSchema(type);
    this.configure(this.deserializerConfig(props));
  }

  public void configure(Map<String, ?> configs) {
    this.configure(new KafkaAvroDeserializerConfig(configs));
  }

  @Override
  protected T deserialize(
          boolean includeSchemaAndVersion,
          String topic,
          Boolean isKey,
          byte[] payload,
          Schema readerSchemaIgnore) throws SerializationException {

    if (payload == null) {
      return null;
    }

    int schemaId = -1;
    try {
      ByteBuffer buffer = ByteBuffer.wrap(payload);
      if (buffer.get() != MAGIC_BYTE) {
        throw new SerializationException("Unknown magic byte!");
      }

      schemaId = buffer.getInt();
      Schema writerSchema = schemaRegistry.getByID(schemaId);

      Schema readerSchema = ReflectData.get().getSchema(type);

      int start = buffer.position() + buffer.arrayOffset();
      int length = buffer.limit() - 1 - idSize;
      SpecificDatumReader<T> reader = new SpecificDatumReader<>(writerSchema, readerSchema);
      BinaryDecoder decoder = decoderFactory.binaryDecoder(buffer.array(), start, length, null);
      return reader.read(null, decoder);
    } catch (IOException e) {
      throw new SerializationException("Error deserializing Avro message for id " + schemaId, e);
    } catch (RestClientException e) {
      throw new SerializationException("Error retrieving Avro schema for id " + schemaId, e);
    }
  }
}
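
This class is also abstract, so it would likewise be used through a concrete subclass. A minimal sketch under the same assumptions (MyBean is the Avro-generated class; the subclass name is hypothetical). Because the constructor takes arguments, the deserializer would be passed to KafkaConsumer as an instance rather than via VALUE_DESERIALIZER_CLASS_CONFIG:

// Hypothetical concrete subclass for the generated MyBean class.
public class MyBeanSpecificDeserializer extends SpecificKafkaAvroDeserializer<MyBean> {
  public MyBeanSpecificDeserializer(Map<String, ?> props) {
    super(MyBean.class, props);
  }
}

// Passing the configured instance directly, using the KafkaConsumer constructor
// overload that accepts deserializer instances:
// consumer = new KafkaConsumer<>(properties, new StringDeserializer(),
//     new MyBeanSpecificDeserializer(config));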

Thanks! I thought the generated bean extends SpecificRecordBase and implements SpecificRecord, so what does it have to do with Avro ReflectData? I'm new to Avro and just trying to understand this better.

I tried your code and got the following exception: Caused by: org.apache.avro.AvroTypeException: Found string, expecting com.MyBean at org.apache.avro.io.ResolvingDecoder.doAction(ResolvingDecoder.java:292) at org.apache.avro.io.Parser.advance(Parser.java:88) at org.apache.avro.io.ResolvingDecoder.readFieldOrder(ResolvingDecoder.java:130) at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:223) at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:174) at org.apache.avro.generic.GenericDatumReader.read(Gene… I kept everything the same and only switched to the new deserializer you provided.

I didn't know you were using SpecificData. I updated my answer.

properties.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, true) worked for me. By the way, is there any documentation for this? I have actually been solving the problem by trial and error.

I feel V needs to be renamed to T.

ReflectData support merged.