
Kafka Connect Elasticsearch: creating the document ID from multiple fields is not working


I am asking this here because the original question never got an answer:

I have a similar situation.

The Elasticsearch sink creates records when a request with a single key field is sent through Kafka Connect, but not when multiple fields are used.

I get the exception "Key is used as document id and can not be null".

My connector config:

{
  "name": "test-connector33",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "tasks.max": "1",
    "topics": "test-connector33",
    "connection.url": "http://localhost:9200",
    "type.name": "aggregator",
    "schema.ignore": "true",
    "topic.schema.ignore": "true",
    "topic.key.ignore": "false",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "false",
    "key.converter": "org.apache.kafka.connect.json.JsonConverter",
    "key.converter.schemas.enable": "false",
    "key.ignore": "false",
    "name": "test-connector33",
    "transforms": "InsertKey,extractKey",
    "transforms.InsertKey.type": "org.apache.kafka.connect.transforms.ValueToKey",
    "transforms.InsertKey.fields": "customerId,city",
    "transforms.extractKey.type": "org.apache.kafka.connect.transforms.ExtractField$Key",
    "transforms.extractKey.field": "customerId,city"
  }
}
Any idea how to solve this?

Thanks in advance.

org.apache.kafka.connect.transforms.ExtractField$Key only supports a single field.

Assuming your JSON object is deserialized into a HashMap, there is no field named customerId,city, so the map.get(field) operation returns null, which in turn sets the key to null.

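Given that constraint, one possible workaround (a sketch, not a verified fix) is to extract just a single field as the document ID, since ExtractField$Key takes exactly one field name; a composite customerId+city ID would have to be assembled into a single field upstream, for example by the producer, before these transforms run:

"transforms": "InsertKey,extractKey",
"transforms.InsertKey.type": "org.apache.kafka.connect.transforms.ValueToKey",
"transforms.InsertKey.fields": "customerId",
"transforms.extractKey.type": "org.apache.kafka.connect.transforms.ExtractField$Key",
"transforms.extractKey.field": "customerId"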

If you want to send keys through the console producer, you can add --property parse.key=true as a flag, then type the key, press Tab (the default key.separator), then type the value. If you want to echo data into the process instead, you can also set --property key.separator='|' to use a vertical bar.
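Put together, a keyed run might look like this (a sketch using the broker and topic from the question; the pipe separator is an illustrative choice, and the key is written as a JSON string so the JsonConverter configured for keys in the question's config can parse it):

echo '"Jishnu1534465795885"|{"customerId":"Jishnu1534465795885","city":"fremont"}' | \
  ./kafka-console-producer --broker-list localhost:9092 --topic test-connector33 \
  --property parse.key=true --property key.separator='|'

Note that with the ValueToKey transform from the config in place, the sink rebuilds the key from the value fields anyway, so producing an explicit key mainly matters once that transform chain is removed or fixed.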

I am submitting the following data to the topic:

echo "{\"customerId\":\"Jishnu1534465795885\",\"city\":\"fremont\",\"name\":\"Jishnu\",\"age\":31,\"address\":[{\"addressId\":\"215344465795884\",\"city\":\"Dallas\",\"state\":\"TX\"},{\"addressId\":\"11534465795884\",\"city\":\"Detroit\",\"state\":\"MI\"}]}" | ./kafka-console-producer --broker-list localhost:9092 --topic test-connector33

@rmoff, any idea how to solve this?
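One way to verify what key actually landed on the topic is the console consumer, where print.key=true is the consumer-side counterpart of the producer flags above (a sketch using the same broker and topic):

./kafka-console-consumer --bootstrap-server localhost:9092 --topic test-connector33 \
  --from-beginning --property print.key=true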