Kafka JDBC sink connector standalone error


I am trying to insert data from a Kafka topic into a Postgres database. I am loading the connector with the following command:

./bin/connect-standalone etc/schema-registry/connect-avro-standalone.properties etc/kafka-connect-jdbc/sink-quickstart-mysql.properties
sink-quickstart-mysql.properties looks like this:

name=test-sink-mysql-jdbc-autoincrement
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
tasks.max=1
topics=third_topic
connection.url=jdbc:postgresql://localhost:5432/postgres
connection.user=postgres
connection.password=postgres
auto.create=true
The error I get is:

[2019-01-29 13:16:48,859] ERROR Failed to create job for /home/ashley/confluent-5.1.0/etc/kafka-connect-jdbc/sink-quickstart-mysql.properties (org.apache.kafka.connect.cli.ConnectStandalone:102)
[2019-01-29 13:16:48,862] ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone:113)
java.util.concurrent.ExecutionException: org.apache.kafka.connect.errors.ConnectException: Failed to find any class that implements Connector and which name matches io.confluent.connect.jdbc.JdbcSinkConnector, available connectors are: PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSinkConnector, name='org.apache.kafka.connect.file.FileStreamSinkConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSourceConnector, name='org.apache.kafka.connect.file.FileStreamSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockConnector, name='org.apache.kafka.connect.tools.MockConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=connector, typeName='connector', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSinkConnector, name='org.apache.kafka.connect.tools.MockSinkConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSourceConnector, name='org.apache.kafka.connect.tools.MockSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.SchemaSourceConnector, name='org.apache.kafka.connect.tools.SchemaSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSinkConnector, name='org.apache.kafka.connect.tools.VerifiableSinkConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSourceConnector, name='org.apache.kafka.connect.tools.VerifiableSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}
    at org.apache.kafka.connect.util.ConvertingFutureCallback.result(ConvertingFutureCallback.java:79)
    at org.apache.kafka.connect.util.ConvertingFutureCallback.get(ConvertingFutureCallback.java:66)
    at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:110)
Caused by: org.apache.kafka.connect.errors.ConnectException: Failed to find any class that implements Connector and which name matches io.confluent.connect.jdbc.JdbcSinkConnector, available connectors are: PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSinkConnector, name='org.apache.kafka.connect.file.FileStreamSinkConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSourceConnector, name='org.apache.kafka.connect.file.FileStreamSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockConnector, name='org.apache.kafka.connect.tools.MockConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=connector, typeName='connector', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSinkConnector, name='org.apache.kafka.connect.tools.MockSinkConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSourceConnector, name='org.apache.kafka.connect.tools.MockSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.SchemaSourceConnector, name='org.apache.kafka.connect.tools.SchemaSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSinkConnector, name='org.apache.kafka.connect.tools.VerifiableSinkConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSourceConnector, name='org.apache.kafka.connect.tools.VerifiableSourceConnector', version='2.1.0-cp1', encodedVersion=2.1.0-cp1, type=source, typeName='source', location='classpath'}
    at org.apache.kafka.connect.runtime.isolation.Plugins.newConnector(Plugins.java:179)
    at org.apache.kafka.connect.runtime.AbstractHerder.getConnector(AbstractHerder.java:382)
    at org.apache.kafka.connect.runtime.AbstractHerder.validateConnectorConfig(AbstractHerder.java:261)
    at org.apache.kafka.connect.runtime.standalone.StandaloneHerder.putConnectorConfig(StandaloneHerder.java:189)
    at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:107)
[2019-01-29 13:16:48,886] INFO Kafka Connect stopping (org.apache.kafka.connect.runtime.Connect:65)
[2019-01-29 13:16:48,886] INFO Stopping REST server (org.apache.kafka.connect.runtime.rest.RestServer:223)
[2019-01-29 13:16:48,894] INFO Stopped http_8083@dc4fee1{HTTP/1.1,[http/1.1]}{0.0.0.0:8083} (org.eclipse.jetty.server.AbstractConnector:341)
[2019-01-29 13:16:48,895] INFO node0 Stopped scavenging (org.eclipse.jetty.server.session:167)
[2019-01-29 13:16:48,930] INFO Stopped o.e.j.s.ServletContextHandler@3c46dcbe{/,null,UNAVAILABLE} (org.eclipse.jetty.server.handler.ContextHandler:1040)
[2019-01-29 13:16:48,943] INFO REST server stopped (org.apache.kafka.connect.runtime.rest.RestServer:241)
[2019-01-29 13:16:48,943] INFO Herder stopping (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:95)
[2019-01-29 13:16:48,944] INFO Worker stopping (org.apache.kafka.connect.runtime.Worker:184)
[2019-01-29 13:16:48,944] INFO Stopped FileOffsetBackingStore (org.apache.kafka.connect.storage.FileOffsetBackingStore:66)
[2019-01-29 13:16:48,947] INFO Worker stopped (org.apache.kafka.connect.runtime.Worker:205)
[2019-01-29 13:16:48,950] INFO Herder stopped (org.apache.kafka.connect.runtime.standalone.StandaloneHerder:112)
[2019-01-29 13:16:48,951] INFO Kafka Connect stopped (org.apache.kafka.connect.runtime.Connect:70)

The Postgres JDBC driver JAR is already present in the folder. Can anyone advise?

These are the most important lines in your log:

java.util.concurrent.ExecutionException: org.apache.kafka.connect.errors.ConnectException: Failed to find any class that implements Connector and which name matches io.confluent.connect.jdbc.JdbcSinkConnector, available connectors are:

It seems that you do not have the kafka-connect-jdbc connector installed. Install it, and make sure that the plugin.path line in your worker properties file is uncommented.
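For example, in etc/schema-registry/connect-avro-standalone.properties the uncommented line could look like this (the install root below is taken from the path in the log above; adjust it to your own layout):

```properties
plugin.path=/home/ashley/confluent-5.1.0/share/java
```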

If you are not using Confluent Platform, you need to create a separate directory for the JDBC plugin under the plugin.path directory, e.g. kafka-connect-jdbc, and put all the required JARs there: e.g. kafka-connect-jdbc-5.1.0.jar, its dependencies, and the JDBC driver.
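As a minimal sketch of that layout, assuming a plugin.path root of /tmp/demo-plugins and the 5.1.0 JAR names (both are illustrative examples, not values from the question):

```shell
# Create a dedicated directory for the JDBC plugin under the plugin.path root
PLUGIN_DIR=/tmp/demo-plugins/kafka-connect-jdbc
mkdir -p "$PLUGIN_DIR"
# Put the connector JAR, its dependencies, and the JDBC driver inside, e.g.:
#   cp kafka-connect-jdbc-5.1.0.jar postgresql-42.2.5.jar "$PLUGIN_DIR"/
ls -d "$PLUGIN_DIR"
```

After restarting the worker, a GET on the Connect REST API at http://localhost:8083/connector-plugins should then list io.confluent.connect.jdbc.JdbcSinkConnector alongside the PluginDesc entries shown in the log.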


More details can be found in the Confluent documentation on installing connector plugins.


Note: plugin.path should be an absolute path pointing to the full …share/java location, not just the share/java written in the default properties file.