Java: Convert a POJO to a GenericRecord in confluent.io to send through a KafkaProducer

Tags: java, apache-kafka, avro, kafka-producer-api

I'm completely new to Kafka and Avro, and I'm trying to use the Confluent packages. We have existing POJOs that we use for JPA, and I'd like to be able to simply produce instances of my POJOs without having to reflect each value into a GenericRecord by hand. I can't seem to find how this is done in the documentation.

The examples use a GenericRecord and set each value one by one, like so:

String key = "key1";
String userSchema = "{\"type\":\"record\"," +
                    "\"name\":\"myrecord\"," +
                    "\"fields\":[{\"name\":\"f1\",\"type\":\"string\"}]}";
Schema.Parser parser = new Schema.Parser();
Schema schema = parser.parse(userSchema);
GenericRecord avroRecord = new GenericData.Record(schema);
avroRecord.put("f1", "value1");

ProducerRecord<Object, Object> record = new ProducerRecord<>("topic1", key, avroRecord);
try {
  producer.send(record);
} catch(SerializationException e) {
  // may need to do something with it
}
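
The producer in that snippet is just the standard one from the Confluent examples, configured with their KafkaAvroSerializer; a minimal setup sketch, with placeholder broker and schema registry addresses, looks something like this:

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;

public class AvroProducerSetup {
   // Builds a producer wired to Confluent's Avro serializer.
   // The broker and schema registry addresses are placeholders.
   public static KafkaProducer<Object, Object> createProducer() {
      Properties props = new Properties();
      props.put("bootstrap.servers", "localhost:9092");
      props.put("schema.registry.url", "http://localhost:8081");
      props.put("key.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
      props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
      return new KafkaProducer<>(props);
   }
}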
There are several examples of getting a schema from a class, and I found comments about modifying that schema when necessary. Now, how do I take an instance of my POJO and just send it to the serializer as-is, letting the library do the work of matching the schema from the class and then copying the values into a generic record? Am I going about this completely wrong? What I ultimately want to do is something like this:

String key = "key1";
Schema schema = ReflectData.get().getSchema(myObject.getClass());
GenericRecord avroRecord = ReflectData.get().getRecord(myObject, schema);

ProducerRecord<Object, Object> record = new ProducerRecord<>("topic1", key, avroRecord);
try {
  producer.send(record);
} catch(SerializationException e) {
  // may need to do something with it
}
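
(As far as I can tell, ReflectData.get().getSchema(...) is a real Avro call, but ReflectData has no getRecord(...) method; that line is the API I wish existed. One way to get a GenericRecord from a POJO with stock Avro is to round-trip it through the binary encoding; a rough sketch, where the class and method names are mine rather than any library's:)

import java.io.ByteArrayOutputStream;
import java.io.IOException;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;
import org.apache.avro.reflect.ReflectData;
import org.apache.avro.reflect.ReflectDatumWriter;

public final class PojoToGenericRecord {
   // Writes the POJO with a reflection-based writer, then reads the
   // bytes back with a generic reader to produce a GenericRecord.
   public static GenericRecord toGenericRecord(Object pojo) throws IOException {
      Schema schema = ReflectData.get().getSchema(pojo.getClass());

      ByteArrayOutputStream out = new ByteArrayOutputStream();
      BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
      new ReflectDatumWriter<Object>(schema).write(pojo, encoder);
      encoder.flush();

      BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(out.toByteArray(), null);
      return new GenericDatumReader<GenericRecord>(schema).read(null, decoder);
   }
}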

Thanks!

I ended up creating my own serializer for this case:

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;

import org.apache.avro.Schema;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DatumWriter;
import org.apache.avro.io.EncoderFactory;
import org.apache.avro.reflect.ReflectData;
import org.apache.avro.reflect.ReflectDatumWriter;
import org.apache.kafka.common.errors.SerializationException;

import io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException;
import io.confluent.kafka.serializers.KafkaAvroSerializer;

public class KafkaAvroReflectionSerializer extends KafkaAvroSerializer {
   private final EncoderFactory encoderFactory = EncoderFactory.get();

   @Override
   protected byte[] serializeImpl(String subject, Object object) throws SerializationException {
      //TODO: consider caching schemas
      Schema schema = null;

      if(object == null) {
         return null;
      } else {
         try {
            // Derive the Avro schema from the POJO's class via reflection
            schema = ReflectData.get().getSchema(object.getClass());
            // Register the schema under the subject and get back its id
            int e = this.schemaRegistry.register(subject, schema);
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            // Confluent wire format: magic byte 0, then the 4-byte schema id
            out.write(0);
            out.write(ByteBuffer.allocate(4).putInt(e).array());

            // Write the POJO itself using a reflection-based datum writer
            BinaryEncoder encoder = encoderFactory.directBinaryEncoder(out, null);
            DatumWriter<Object> writer = new ReflectDatumWriter<>(schema);
            writer.write(object, encoder);
            encoder.flush();
            out.close();

            byte[] bytes = out.toByteArray();
            return bytes;
         } catch (IOException ioe) {
            throw new SerializationException("Error serializing Avro message", ioe);
         } catch (RestClientException rce) {
            throw new SerializationException("Error registering Avro schema: " + schema, rce);
         } catch (RuntimeException re) {
            throw new SerializationException("Error serializing Avro message", re);
         }
      }
   }
}
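
To wire it in, point the producer's value serializer at this class; a minimal usage sketch, with placeholder broker/registry addresses and a hypothetical MyPojo standing in for the existing JPA entity:

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ReflectionProducerExample {
   public static void main(String[] args) {
      Properties props = new Properties();
      // Placeholder broker and schema registry addresses
      props.put("bootstrap.servers", "localhost:9092");
      props.put("schema.registry.url", "http://localhost:8081");
      props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
      // The reflection-based serializer defined above
      props.put("value.serializer", KafkaAvroReflectionSerializer.class.getName());

      try (KafkaProducer<String, Object> producer = new KafkaProducer<>(props)) {
         // MyPojo is a stand-in for the existing JPA entity
         producer.send(new ProducerRecord<>("topic1", "key1", new MyPojo()));
         producer.flush();
      }
   }
}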