
Confluent 4.0.0 Kafka Connect: Schema Registry subject not found (org.apache.kafka.connect.errors.DataException)


I have already checked two similar questions, but they did not help.

node 1 [/appl/node1/confluent-4.0.0] ./bin/confluent status elasticsearch-sink


{"name":"elasticsearch-sink","connector":{"state":"RUNNING","worker_id":"10.192.226.24:8083"},"tasks":[{"state":"FAILED","trace":"org.apache.kafka.connect.errors.DataException:
emailfilters\n\tat
io.confluent.connect.avro.AvroConverter.toConnectData(AvroConverter.java:96)\n\tat
org.apache.kafka.connect.runtime.WorkerSinkTask.convertMessages(WorkerSinkTask.java:453)\n\tat
org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:287)\n\tat
org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:198)\n\tat
org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:166)\n\tat
org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:170)\n\tat
org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:214)\n\tat
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)\n\tat
java.util.concurrent.FutureTask.run(FutureTask.java:266)\n\tat
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)\n\tat
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)\n\tat
java.lang.Thread.run(Thread.java:748)\nCaused by:
org.apache.kafka.common.errors.SerializationException: Error
retrieving Avro schema for id 21\nCaused by:
io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException:
Subject not found.; error code: 40401\n\tat
io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:191)\n\tat
io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:218)\n\tat
io.confluent.kafka.schemaregistry.client.rest.RestService.lookUpSubjectVersion(RestService.java:284)\n\tat
io.confluent.kafka.schemaregistry.client.rest.RestService.lookUpSubjectVersion(RestService.java:272)\n\tat
io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getVersionFromRegistry(CachedSchemaRegistryClient.java:71)\n\tat
io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getVersion(CachedSchemaRegistryClient.java:182)\n\tat
io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer.deserialize(AbstractKafkaAvroDeserializer.java:152)\n\tat
io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer.deserializeWithSchemaAndVersion(AbstractKafkaAvroDeserializer.java:194)\n\tat
io.confluent.connect.avro.AvroConverter$Deserializer.deserialize(AvroConverter.java:121)\n\tat
io.confluent.connect.avro.AvroConverter.toConnectData(AvroConverter.java:84)\n\tat
org.apache.kafka.connect.runtime.WorkerSinkTask.convertMessages(WorkerSinkTask.java:453)\n\tat
org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:287)\n\tat
org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:198)\n\tat
org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:166)\n\tat
org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:170)\n\tat
org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:214)\n\tat
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)\n\tat
java.util.concurrent.FutureTask.run(FutureTask.java:266)\n\tat
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)\n\tat
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)\n\tat
java.lang.Thread.run(Thread.java:748)\n","id":0,"worker_id":"10.192.226.24:8083"}],"type":"sink"}
My connector properties:

name=elasticsearch-sink
connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
tasks.max=1
topics=emailfilters
key.ignore=true
connection.url=http://127.0.0.1:9197
type.name=kafka-connect
I tried adding the following, but I still get the same error:

value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://node1:9193

My topic is being populated from a KSQL stream.

This is the cause of the failure:

Caused by:
org.apache.kafka.common.errors.SerializationException: Error
retrieving Avro schema for id 21\nCaused by:
io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException:
Subject not found.; error code: 40401

This means that the Schema Registry you have configured does not contain a schema for the topic's data. Are you pointing the connector at the same Schema Registry that KSQL uses?
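A quick way to verify is to query the Schema Registry's REST API directly. A minimal sketch, assuming the registry is reachable at http://node1:9193 as in the converter config above:

```python
import json
from urllib.request import urlopen

def subjects_url(registry: str) -> str:
    """Build the REST endpoint that lists all registered subjects."""
    return registry.rstrip("/") + "/subjects"

def list_subjects(registry: str = "http://node1:9193") -> list:
    """Return every subject registered in the Schema Registry.
    For topic 'emailfilters', expect 'emailfilters-value' here
    (and 'emailfilters-key' if the keys are Avro as well)."""
    with urlopen(subjects_url(registry)) as resp:
        return json.loads(resp.read())
```

You can also fetch the schema behind the failing ID directly with GET /schemas/ids/21; a 404 there, or a subject list without the topic, confirms the connector is talking to a registry that never saw this schema.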

I have noticed that if you set key.converter to Avro, it will still try to extract a schema even when the keys are non-Avro types such as strings or ints, and that produces these confusing errors. The deserializer only checks whether the byte payload starts with 0 and then reads the next 4 bytes as a schema ID, which is not enough to tell whether the payload is really Avro.
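The check described above corresponds to the Confluent wire format: a zero magic byte followed by a 4-byte big-endian schema ID. A minimal illustrative sketch (not the actual deserializer code):

```python
import struct

def parse_wire_format(payload: bytes) -> int:
    """Return the schema ID from a Confluent-framed message.
    The framing is: magic byte 0x00, then a 4-byte big-endian
    schema ID, then the Avro-encoded body."""
    if len(payload) < 5 or payload[0] != 0:
        raise ValueError("payload is not in Confluent wire format")
    return struct.unpack(">I", payload[1:5])[0]

# A message framed for the failing schema in the trace above:
parse_wire_format(b"\x00\x00\x00\x00\x15body")  # -> 21
```

Any non-Avro key that happens to begin with a zero byte passes this check, and the converter then looks up whatever the next four bytes decode to as a schema ID, which is how a plain String or int key can trigger a "schema not found" error.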


Check the data type of the topic's keys, then adjust the Connect properties accordingly (for example, key.converter=org.apache.kafka.connect.storage.StringConverter if the keys are plain strings).

Yes, I am using the same Schema Registry in KSQL and in Connect. I tested the Elasticsearch connector today with a topic that was not produced by KSQL, and the schema was written correctly to the target Elasticsearch index. I will test KSQL again and let you know.

I hit the same error with a different connector. The schema exists, and the Schema Registry log shows it returning HTTP 200 for the response, but when the client POSTs to
/subjects/{topic name}-key?deleted=true
it gets a 404. I am not sure why it does that, since it can decode the value just fine with the schema. Essentially, my problem was that I had encoded the messages under a subject name that did not match the one the connector assumed.

@tytho, you would be better off starting a new question with full details about your setup, versions, configuration, etc.

Sorry, I was not trying to ask more about my own issue. We already found a different workaround; I just wanted to add some detail about the connector's behavior.
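For context on that subject-name mismatch: by default the Avro converter derives the subject from the topic name (TopicNameStrategy), so if the producer registered the schema under a different subject, the connector's lookup will 404 even though the schema exists. A rough sketch of the default derivation (assumed behavior, not Confluent's actual code):

```python
def topic_name_subject(topic: str, is_key: bool) -> str:
    """TopicNameStrategy: '<topic>-key' for keys, '<topic>-value' for values."""
    return f"{topic}-{'key' if is_key else 'value'}"

# The connector in this question therefore looks up:
topic_name_subject("emailfilters", is_key=False)  # -> 'emailfilters-value'
```

If the producer registered the schema under any other subject name, that is exactly the "Subject not found; error code: 40401" situation described above.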