
How can I produce a tombstone Avro record in Kafka using .NET?

Tags: .net, apache-kafka, avro, kafka-producer-api, confluent-schema-registry

My sink.properties:

{
  "name": "jdbc-oracle",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "1",
    "topics": "orders",
    "connection.url": "jdbc:oracle:thin:@10.1.2.3:1071/orac",
    "connection.user": "ersin",
    "connection.password": "ersin!",
    "auto.create": "true",
    "delete.enabled": "true",
    "pk.mode": "record_key",
    "pk.fields": "id",
    "insert.mode": "upsert",
    "plugin.path": "/home/ersin/confluent-5.4.1/share/java/",
    "name": "jdbc-oracle"
  },
  "tasks": [
    {
      "connector": "jdbc-oracle",
      "task": 0
    }
  ],
  "type": "sink"
}
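The JSON above (including the tasks and type fields) is what the Connect REST API reports for the connector. For reference, a minimal sketch of (re)creating it from .NET, assuming Connect's REST interface listens on port 8083 and that a local file holds only the name and config fields; both names are assumptions, not taken from the question:

using System;
using System.IO;
using System.Net.Http;
using System.Text;

// "jdbc-oracle.json" is a hypothetical file containing only the "name" and "config"
// fields shown above; "tasks" and "type" are reported by Connect, not submitted.
var payload = File.ReadAllText("jdbc-oracle.json");

using var http = new HttpClient();
var response = await http.PostAsync(
    "http://10.0.0.0:8083/connectors",                          // assumed Connect REST address
    new StringContent(payload, Encoding.UTF8, "application/json"));
response.EnsureSuccessStatusCode();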
My connect-avro-distributed.properties:

bootstrap.servers=10.0.0.0:9092

group.id=connect-cluster

key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://10.0.0.0:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://10.0.0.0:8081

config.storage.topic=connect-configs
offset.storage.topic=connect-offsets
status.storage.topic=connect-statuses

config.storage.replication.factor=1
offset.storage.replication.factor=1
status.storage.replication.factor=1

internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false
Code:

var messageToSend = new Message
{
    Key = recordKey
    //, Value = recordValue
};
When I try to send the data with a null value, it gives an error (NullReferenceException).

How can I solve this error?

Thanks in advance.
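The NullReferenceException most likely comes from the value serializer: an Avro serializer has nothing to encode when the value is null. A minimal sketch of a producer configured that way (the generic types, broker address and variable names are assumptions, not taken from the question):

using Confluent.Kafka;
using Confluent.SchemaRegistry;
using Confluent.SchemaRegistry.Serdes;
using Avro.Generic;

var schemaRegistry = new CachedSchemaRegistryClient(
    new SchemaRegistryConfig { Url = "http://10.0.0.0:8081" });     // from connect-avro-distributed.properties

var producer = new ProducerBuilder<int, GenericRecord>(
        new ProducerConfig { BootstrapServers = "10.0.0.0:9092" })  // assumed broker address
    .SetKeySerializer(new AvroSerializer<int>(schemaRegistry))
    .SetValueSerializer(new AvroSerializer<GenericRecord>(schemaRegistry))  // fails when Value is null
    .Build();

// Producing Value = null through the Avro value serializer is what presumably throws;
// the answer below avoids this by using Serializers.Null for the value instead.
await producer.ProduceAsync("orders",
    new Message<int, GenericRecord> { Key = 100, Value = null });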

var tombstoner = new ProducerBuilder<int, Null>(_kafkaConfiguration.ProducerConfiguration)
    .SetKeySerializer(new AvroSerializer<int>(_schemaRegistryClient))
    .SetValueSerializer(Serializers.Null)
    .Build();

var tasks = properties.Select(property => tombstoner.ProduceAsync(
    "yourTopicName",
    new Message<int, Null> {
        Key = 100,
        Value = null,
        Timestamp = Timestamp.Default
    }
));
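A small usage note on the snippet above (same assumed names): ProduceAsync returns unawaited tasks, so you would typically await them before disposing the producer.

await Task.WhenAll(tasks);                    // wait for the delivery report of every tombstone
tombstoner.Flush(TimeSpan.FromSeconds(10));   // optional: drain anything still queued before disposing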

Note, however, that at the moment it is not possible to consume tombstone records with the confluent-kafka-dotnet client.
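If the consuming side also uses the .NET client, one possible workaround (my assumption, not part of the original answer) is to read the value as raw bytes so that a null payload never reaches an Avro deserializer; the broker address, group id and topic name below are placeholders:

using System;
using Confluent.Kafka;
using Confluent.Kafka.SyncOverAsync;
using Confluent.SchemaRegistry;
using Confluent.SchemaRegistry.Serdes;

var schemaRegistry = new CachedSchemaRegistryClient(
    new SchemaRegistryConfig { Url = "http://10.0.0.0:8081" });

using var consumer = new ConsumerBuilder<int, byte[]>(new ConsumerConfig
    {
        BootstrapServers = "10.0.0.0:9092",       // assumed broker address
        GroupId = "tombstone-reader",             // hypothetical group id
        AutoOffsetReset = AutoOffsetReset.Earliest
    })
    .SetKeyDeserializer(new AvroDeserializer<int>(schemaRegistry).AsSyncOverAsync())
    .SetValueDeserializer(Deserializers.ByteArray)   // raw bytes: a tombstone simply comes back as null
    .Build();

consumer.Subscribe("yourTopicName");

var result = consumer.Consume(TimeSpan.FromSeconds(10));
if (result != null && result.Message.Value == null)
{
    // A null value with a non-null key is the tombstone; the JDBC sink
    // (delete.enabled=true, pk.mode=record_key) turns it into a DELETE.
    Console.WriteLine($"Tombstone for key {result.Message.Key}");
}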
