Apache Kafka: KSQL Avro format issue with the Kafka Connect Elasticsearch sink: org.apache.kafka.connect.errors.DataException: de*******ense

Tags: apache-kafka, apache-kafka-connect, confluent-platform

The following exception is being thrown for the Elasticsearch sink connector:

[2018-05-07 11:40:38,975] ERROR WorkerSinkTask{id=elasticsearch-sink-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:172)
org.apache.kafka.connect.errors.DataException: de******ense
        at io.confluent.connect.avro.AvroConverter.toConnectData(AvroConverter.java:95)
        at org.apache.kafka.connect.runtime.WorkerSinkTask.convertMessages(WorkerSinkTask.java:467)
        at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:301)
        at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:205)
        at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:173)
        at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:170)
        at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:214)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id -1
Caused by: org.apache.kafka.common.errors.SerializationException: Unknown magic byte!
[2018-05-07 11:40:38,976] ERROR WorkerSinkTask{id=elasticsearch-sink-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:173)
My configuration in quickstart-elasticsearch.properties:

name=elasticsearch-sink
connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
tasks.max=1
topics=de******ense
key.ignore=true
compact.map.entries=false
connection.url=http://127.0.0.1:9197
type.name=kafka-connect
I am passing key.ignore=true, but it still tries to deserialize the key.

From WorkerSinkTask.java:467:

SchemaAndValue keyAndSchema = keyConverter.toConnectData(msg.topic(), msg.key());
The connector is trying to parse the key, but the topic has no keys.
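Note that key.ignore=true only controls whether the Elasticsearch connector uses the record key as the document ID; the key converter configured on the worker still runs on every record before the connector ever sees it. One way to sidestep Avro deserialization of the key, without touching the worker file, is to override the converter in the connector properties themselves (a sketch; the rest of quickstart-elasticsearch.properties stays as above):

```properties
# Per-connector override: treat the key as a plain string instead of Avro.
# The value converter configured on the worker is still used for the value.
key.converter=org.apache.kafka.connect.storage.StringConverter
```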

Sample data in the topic:

{"EXPENSE_CODE":{"string":"NL1230"},"EXPENSE_CODE_DESCRIPTION":{"string":"ABC Company"},"NO_OF_DEALS":{"long":7}}
{"EXPENSE_CODE":{"string":"NL1220"},"EXPENSE_CODE_DESCRIPTION":{"string":"XYZ Company"},"NO_OF_DEALS":{"long":308}}
{"EXPENSE_CODE":{"string":"NL1210"},"EXPENSE_CODE_DESCRIPTION":{"string":"Alberthijn - Amsterdam"},"NO_OF_DEALS":{"long":287}}
{"EXPENSE_CODE":{"string":"NL1200"},"EXPENSE_CODE_DESCRIPTION":{"string":"KLM - ADAM"},"NO_OF_DEALS":{"long":609}}
{"EXPENSE_CODE":{"string":"NL1240"},"EXPENSE_CODE_DESCRIPTION":{"string":"EXIDS- Global Limit"},"NO_OF_DEALS":{"long":9786}}
schema-registry/connect-avro-distributed.properties:

bootstrap.servers=localhost:9192


#schema.registry.url=http://localhost:9193
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://localhost:9193
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:9193

# tried schema enable true as well for keys
key.converter.schemas.enable=false
value.converter.schemas.enable=true

config.storage.topic=connect-configs
offset.storage.topic=connect-offsets
status.storage.topic=connect-statuses

config.storage.replication.factor=1
offset.storage.replication.factor=1
status.storage.replication.factor=1

#offset.storage.partitions=25
#status.storage.partitions=5


internal.key.converter.schema.registry.url=http://localhost:9193
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter.schema.registry.url=http://localhost:9193
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false
plugin.path=bin/../share/java # tried share/java as well

It is pointing to the correct Schema Registry URL.

The problem is that when KSQL writes to a table or stream, it writes the key as a String and the value as Avro.

If you change the configuration as shown below, it works fine:

vi etc/schema-registry/connect-avro-distributed.properties

bootstrap.servers=lrv141rq:9192

group.id=connect-cluster

key.converter=org.apache.kafka.connect.storage.StringConverter
key.converter.schema.registry.url=http://localhost:9193

value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:9193

key.converter.schemas.enable=false
value.converter.schemas.enable=true

config.storage.topic=connect-configs
offset.storage.topic=connect-offsets
status.storage.topic=connect-statuses

config.storage.replication.factor=1
offset.storage.replication.factor=1
status.storage.replication.factor=1


#offset.storage.partitions=25
#status.storage.partitions=5


internal.key.converter=org.apache.kafka.connect.storage.StringConverter
internal.key.converter.schema.registry.url=http://localhost:9193

internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter.schema.registry.url=http://localhost:9193

internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false

plugin.path=bin/../share/java
vi etc/kafka-connect-elasticsearch/quickstart-elasticsearch.properties

name=elasticsearch-sink
connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
tasks.max=1
topics=deal-expense,emailfilters
key.ignore=true
compact.map.entries=false
connection.url=http://127.0.0.1:9197
type.name=kafka-connect
The change is the following:


key.converter=org.apache.kafka.connect.storage.StringConverter

The actual error is:

Caused by: org.apache.kafka.common.errors.SerializationException: Error deserializing Avro message for id -1
Caused by: org.apache.kafka.common.errors.SerializationException: Unknown magic byte!

The messages you are receiving either are not Avro messages, or they were not serialized with the Confluent serializer. Confluent follows the format

[(magic byte)(schema ID)(data)]

Check the link. In your case, it is unable to find the magic byte.

Thanks @Explorer for the response. The data is generated by KSQL; it is the topic backing a KSQL table, and I am trying to read that table with Connect. The topic has no keys, and the exception is thrown while parsing the key because it is null. I disabled key parsing in the configuration, but it still does not work.

ksql> SHOW TABLES;
 Table Name   | Kafka Topic  | Format | Windowed
-------------------------------------------------
 DEAL-EXPENSE | deal-expense | AVRO   | true
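The wire format described above can be illustrated with a short sketch (a minimal illustration of the framing, not the actual Confluent serializer code): the first byte is the magic byte 0x00, followed by a 4-byte big-endian schema ID, then the Avro payload. A plain string key written by KSQL has no such header, which is why the AvroConverter fails with "Unknown magic byte!" when asked to decode it.

```python
import struct

MAGIC_BYTE = 0x00  # Confluent wire format: [magic byte][4-byte schema id][payload]

def parse_confluent_header(message: bytes):
    """Return (schema_id, payload), or raise if the magic byte is missing."""
    if len(message) < 5 or message[0] != MAGIC_BYTE:
        # This is effectively what the AvroConverter hits for a plain string key
        raise ValueError("Unknown magic byte!")
    schema_id = struct.unpack(">I", message[1:5])[0]
    return schema_id, message[5:]

# A value framed like the Confluent Avro serializer (schema id 42, fake payload):
avro_message = bytes([MAGIC_BYTE]) + struct.pack(">I", 42) + b"\x02\x0cNL1230"
print(parse_confluent_header(avro_message)[0])  # 42

# A KSQL string key such as b"NL1230" carries no header, so decoding fails:
try:
    parse_confluent_header(b"NL1230")
except ValueError as e:
    print(e)  # Unknown magic byte!
```

This is why switching key.converter to StringConverter fixes the sink: the string key is passed through as-is instead of being parsed for a header it does not have.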