Apache Kafka: can I define a schema for a Kafka topic?
Tags: apache-kafka, apache-kafka-connect, confluent-schema-registry

I produce data to a Kafka topic with a key schema and a value schema, as follows:
./bin/kafka-avro-console-producer \
--broker-list 10.0.0.0:9092 --topic orders \
--property parse.key="true" \
--property key.schema='{"type":"record","name":"key_schema","fields":[{"name":"id","type":"int"}]}' \
--property key.separator="$" \
--property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"id","type":["null","int"],"default": null},{"name":"product","type": ["null","string"],"default": null}, {"name":"quantity", "type": ["null","int"],"default": null}, {"name":"price","type": ["null","int"],"default": null}]}' \
--property schema.registry.url=http://10.0.0.0:8081
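With parse.key="true" and key.separator="$" set as above, each line fed to the console producer is the Avro JSON encoding of the key, then the separator, then the value. A minimal sketch (plain Python, no Kafka involved) of how such an input line is assembled; note that Avro's JSON encoding wraps values of union types such as ["null","int"] in a {"type": value} object, which is why the sample data later in this question has that shape:

```python
import json

# Sample key and value matching the key.schema and value.schema
# passed to kafka-avro-console-producer above.
key = {"id": 9}

# Union-typed fields (["null","int"], ["null","string"]) are wrapped
# as {"type": value} in Avro's JSON encoding.
value = {
    "id": {"int": 9},
    "product": {"string": "Yağız Gülbahar"},
    "quantity": {"int": 1071},
    "price": {"int": 61},
}

# key.separator="$" joins the two JSON documents on a single line.
line = (
    json.dumps(key, separators=(",", ":"), ensure_ascii=False)
    + "$"
    + json.dumps(value, separators=(",", ":"), ensure_ascii=False)
)
print(line)
```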
Then I consume this data from Kafka with this sink connector configuration:
{
  "name": "jdbc-oracle",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "1",
    "topics": "orders",
    "connection.url": "jdbc:oracle:thin:@10.1.2.3:1071/orac",
    "connection.user": "[redact]",
    "connection.password": "[redact]",
    "auto.create": "true",
    "delete.enabled": "true",
    "pk.mode": "record_key",
    "pk.fields": "id",
    "insert.mode": "upsert",
    "name": "jdbc-oracle"
  },
  "tasks": [
    {
      "connector": "jdbc-oracle",
      "task": 0
    }
  ],
  "type": "sink"
}
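Note that the JSON above looks like the response of Kafka Connect's GET /connectors/jdbc-oracle endpoint rather than a request body: the "tasks" and "type" fields are returned by the REST API, not submitted by you. When creating or updating the connector, only the "config" object is sent (PUT /connectors/{name}/config), or name plus config (POST /connectors). A small sketch of separating the two, using the values from the question with credentials omitted:

```python
import json

# What GET /connectors/jdbc-oracle returns (as shown above).
status = {
    "name": "jdbc-oracle",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
        "tasks.max": "1",
        "topics": "orders",
        "connection.url": "jdbc:oracle:thin:@10.1.2.3:1071/orac",
        "auto.create": "true",
        "delete.enabled": "true",
        "pk.mode": "record_key",
        "pk.fields": "id",
        "insert.mode": "upsert",
        "name": "jdbc-oracle",
    },
    "tasks": [{"connector": "jdbc-oracle", "task": 0}],
    "type": "sink",
}

# What you would PUT to /connectors/jdbc-oracle/config: only the "config" object.
request_body = status["config"]
print(json.dumps(request_body, indent=2))
```

For example: `curl -X PUT -H "Content-Type: application/json" --data @config.json http://10.0.0.0:8083/connectors/jdbc-oracle/config` (8083 is the default Connect REST port; adjust to your cluster).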
But I want to produce plain JSON to Kafka, without a value.schema. If I put this JSON data on the Kafka topic:
{"id":9}${"id": {"int":9}, "product": {"string":"Yağız Gülbahar"}, "quantity": {"int":1071}, "price": {"int":61}}
how can I consume this data from Kafka and sink it into Oracle with the Confluent JDBC sink connector?
I want to define the schema on the Kafka Connect side instead.
One more thing: can I get two different types of data from a single Kafka topic and have the JDBC sink write them to two different tables on the Oracle side?

If your source topic contains JSON data and has no schema declared for it, you must add the schema before you can use the JDBC sink. Options include:
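The list of options itself appears to have been lost from this copy of the answer; one option the comments below mention is ksqlDB, which can re-serialise a schema-less JSON topic as Avro so the JDBC sink can consume it. A rough sketch (the stream and topic names here are illustrative, not from the original answer):

```sql
-- Declare the existing JSON topic, stating the schema by hand.
CREATE STREAM orders_json (id INT, product VARCHAR, quantity INT, price INT)
  WITH (KAFKA_TOPIC='orders', VALUE_FORMAT='JSON');

-- Re-serialise it as Avro; ksqlDB registers the schema in Schema Registry.
CREATE STREAM orders_avro
  WITH (KAFKA_TOPIC='orders_avro', VALUE_FORMAT='AVRO') AS
  SELECT * FROM orders_json;
```

The JDBC sink's `topics` would then point at `orders_avro` instead of `orders`.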
Yes, each topic can be consumed by more than one sink. Configuration options are available to route topics to different table names as needed.

You mean, how do you produce Avro messages without providing the payload schema?
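As a sketch of the routing mentioned above: you can run a second sink connector against the same topic, giving each its own target table via the JDBC sink's table.name.format setting (the connector and table names here are made up for illustration):

```json
{
  "name": "jdbc-oracle-second-table",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "1",
    "topics": "orders",
    "connection.url": "jdbc:oracle:thin:@10.1.2.3:1071/orac",
    "table.name.format": "ORDERS_COPY",
    "pk.mode": "record_key",
    "pk.fields": "id",
    "insert.mode": "upsert"
  }
}
```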
plugin.path does not belong in the connector configuration.

Thanks for your answer. So the topic consumed by the Confluent JDBC sink must have a schema, and the Confluent JDBC sink performs one conversion per topic. Am I right in saying "the Confluent JDBC sink must convert topic by topic"?

Could you explain your question more clearly? "The topic for the Confluent JDBC sink must have a schema?" Correct. I have updated my answer. On StackOverflow, if you have new questions beyond your original one, you really should post a new question :)

Thanks for your answer. One more thing, and maybe I am wrong: with the JDBC sink I can only move data from Kafka to Oracle (or any DB with JDBC), right? Are ksql and ksqlDB something different?