How can a Java producer write a Kafka Avro record exactly like one produced with the avro-console-producer?


I am using Confluent 3.3.0. My intention is to use Kafka Connect to insert values from a Kafka topic into an Oracle table. My connector works fine with an Avro record I produced using the avro-console-producer, like this:

./kafka-avro-console-producer --broker-list 192.168.0.1:9092 --topic topic6 --property value.schema='{"type":"record","name":"flights3","fields":[{"name":"flight_id","type":"string"},{"name":"flight_to", "type": "string"}, {"name":"flight_from", "type": "string"}]}'
and I insert values like the following:

{"flight_id":"1","flight_to":"QWE","flight_from":"RTY"}
What I am trying to achieve is to insert the same data using a Java application and an object. Here is my producer code:

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import vo.Flight;

public class Sender {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "192.168.0.1:9092");
        props.put("acks", "all");
        props.put("retries", 0);
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "serializers.custom.FlightSerializer");
        props.put("schema.registry.url", "http://192.168.0.1:8081");
        Producer<String, Flight> producer = new KafkaProducer<String, Flight>(props);
        Flight myflight = new Flight("testflight1","QWE","RTY");
        ProducerRecord<String, Flight> record = new ProducerRecord<String, Flight>("topic5","key",myflight);

        try {
            producer.send(record).get();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
And finally, the serializer:

package serializers.custom;

import java.util.Map;
import org.apache.kafka.common.serialization.Serializer;
import vo.Flight;
import com.fasterxml.jackson.databind.ObjectMapper;

public class FlightSerializer implements Serializer<Flight> {
    @Override
    public void close() {
    }

    @Override
    public void configure(Map<String, ?> arg0, boolean arg1) {
    }

    @Override
    public byte[] serialize(String arg0, Flight arg1) {
        byte[] retVal = null;
        ObjectMapper objectMapper = new ObjectMapper();

        try {
            retVal = objectMapper.writeValueAsString(arg1).getBytes();
        } catch (Exception e) {
            e.printStackTrace();
        }

        return retVal;
    }
}
But my understanding is that something like a schema needs to be defined, and some Avro serializer used, to get the exact same data as when I use the avro-console-consumer. I have read some sample code, but none of it worked for me.
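One reason a plain JSON serializer can never match the console producer's output is the wire format: `KafkaAvroSerializer` prefixes every message with a magic byte (0) and the 4-byte big-endian schema id it obtained from the registry, followed by the Avro binary encoding of the record. Below is a minimal sketch of that framing using only the JDK; the class name, schema id, and payload are placeholders for illustration, not real registry values:

```java
import java.nio.ByteBuffer;

// Sketch of the Confluent wire format produced by KafkaAvroSerializer:
// [magic byte 0][4-byte big-endian schema id][Avro binary payload].
// A Jackson-based serializer produces none of this framing, which is
// why an Avro-aware consumer cannot read its output.
public class ConfluentFraming {
    public static byte[] frame(int schemaId, byte[] avroPayload) {
        ByteBuffer buf = ByteBuffer.allocate(1 + 4 + avroPayload.length);
        buf.put((byte) 0);     // magic byte, always 0 in this format
        buf.putInt(schemaId);  // schema id assigned by the registry
        buf.put(avroPayload);  // Avro binary-encoded record
        return buf.array();
    }
}
```

In practice you never build this header yourself; configuring `KafkaAvroSerializer` as the value serializer does it for you, which is what the answers below set up.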

EDIT

I tried the following code, but nothing shows up in the avro-console-consumer.

package producer.serialized.avro;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import java.util.Properties;

public class Sender {
    public static void main(String[] args) {
        String flightSchema = "{\"type\":\"record\"," + "\"name\":\"flights\","
                + "\"fields\":[{\"name\":\"flight_id\",\"type\":\"string\"},{\"name\":\"flight_to\",\"type\":\"string\"},{\"name\":\"flight_from\",\"type\":\"string\"}]}";
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "192.168.0.1:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, io.confluent.kafka.serializers.KafkaAvroSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, io.confluent.kafka.serializers.KafkaAvroSerializer.class);
        props.put("schema.registry.url", "http://192.168.0.1:8081");
        KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props);
        Schema.Parser parser = new Schema.Parser();
        Schema schema = parser.parse(flightSchema);
        GenericRecord avroRecord = new GenericData.Record(schema);
        avroRecord.put("flight_id", "1");
        avroRecord.put("flight_to", "QWE");
        avroRecord.put("flight_from", "RTY");
        ProducerRecord<String, GenericRecord> record = new ProducerRecord<>("topic6", avroRecord);

        try {
            producer.send(record);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

The schema is not defined up front, so when
KafkaAvroSerializer
has to contact the schema registry to submit the schema, it will not have one.

You have to create a schema for your object
Flight

The sample file.avdl below (Avro IDL, one of the Avro schema definition formats) would do:

@namespace("vo")
protocol FlightSender {

    record Flight {
       union{null, string} flight_id = null;
       union{null, string} flight_to = null;
       union{null, string} flight_from = null;
    }
}
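One common way to turn that .avdl file into the generated Java class is the avro-maven-plugin's idl-protocol goal. A sketch of the plugin configuration follows; the version and the directory paths are assumptions you should adapt to your build:

```xml
<plugin>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro-maven-plugin</artifactId>
  <version>1.8.2</version>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals>
        <!-- compiles .avdl files into Java SpecificRecord classes -->
        <goal>idl-protocol</goal>
      </goals>
      <configuration>
        <sourceDirectory>${project.basedir}/src/main/avro</sourceDirectory>
        <outputDirectory>${project.basedir}/src/main/java</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With the @namespace("vo") declaration above, the generated class lands in the vo package, matching the import in your producer.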

The Avro schema above will generate your Java
Flight
class, so you have to delete the one you created earlier.

When it comes to the main class, you must set these two properties:

props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,io.confluent.kafka.serializers.KafkaAvroSerializer.class);
props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,io.confluent.kafka.serializers.KafkaAvroSerializer.class); 
and you can use the generated specific Avro class:

Producer<String, Flight> producer = new KafkaProducer<String, Flight>(props);
Hope this helps :-)

the exact same data as when I use the avro-console-consumer

You can.

Assuming you want to use a generic record, this is all correct:

Properties props = new Properties();
props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "192.168.0.1:9092");
...
props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,io.confluent.kafka.serializers.KafkaAvroSerializer.class);
props.put("schema.registry.url", "http://192.168.0.1:8081");

Producer<String, GenericRecord> producer = new KafkaProducer<>(props);

...

GenericRecord avroRecord = new GenericData.Record(schema);
avroRecord.put("flight_id", "1");
avroRecord.put("flight_to", "QWE");
avroRecord.put("flight_from", "RTY");
ProducerRecord<String, GenericRecord> record = new ProducerRecord<>("topic6", avroRecord);

try {
    producer.send(record);
} catch (Exception e) {
    e.printStackTrace();
}
Properties=newproperties();
props.put(ProducerConfig.BOOTSTRAP\u SERVERS\u CONFIG,“192.168.0.1:9092”);
...
put(ProducerConfig.VALUE\u SERIALIZER\u CLASS\u CONFIG,io.confluent.kafka.serializers.KafkaAvroSerializer.CLASS);
put(“schema.registry.url”http://192.168.0.1:8081");
制作人=新卡夫卡制作人(道具);
...
GenericRecord avroRecord=新的GenericData.Record(模式);
avroRecord.put(“航班号”,“1”);
avroRecord.put(“飞往”、“QWE”的航班);
avroRecord.put(“航班从”,“RTY”);
生产记录=新生产记录(“topic6”,avroRecord);
试一试{
制作人。发送(记录);
}捕获(例外e){
e、 printStackTrace();
}

But you are missing a call to
producer.flush()
or
producer.close()
, so the batch of records is never actually sent.

"Nothing in the avro-console-consumer." -> Do you get an error when you run the Java code? @RobinMoffatt No, no messages and not even errors. I restarted the Kafka server in debug mode, but still nothing arrives. KafkaProducer keeps messages in a buffer and transmits them later; the program terminates immediately, so the transmission never happens. Call the
close()
or
flush()
method.