Apache Kafka: Failed to find any class that implements Connector and which name matches io.confluent.connect.elasticsearch.ElasticsearchSinkConnector

Tags: apache-kafka, apache-kafka-connect, aws-msk

I have MSK running on AWS, and I am able to produce records to and consume records from MSK. I just want to use Kafka Connect so that records going into MSK also go into Elasticsearch. I did the following, but I am not sure my connector is working, because I cannot see the records in Elasticsearch. This is the record I am sending:

{
        "data": {
                "RequestID":    517082653,
                "ContentTypeID":        9,
                "OrgID":        16145,
                "UserID":       4,
                "PromotionStartDateTime":       "2019-12-14T16:06:21Z",
                "PromotionEndDateTime": "2019-12-14T16:16:04Z",
                "SystemStartDatetime":  "2019-12-14T16:17:45.507000000Z"
        },
        "metadata":     {
                "timestamp":    "2019-12-29T10:37:31.502042Z",
                "record-type":  "data",
                "operation":    "insert",
                "partition-key-type":   "schema-table",
                "schema-name":  "dbo",
                "table-name":   "TRFSDIQueue"
        }
}
I installed Kafka Connect like this:

wget /usr/local http://packages.confluent.io/archive/5.2/confluent-5.2.0-2.11.tar.gz -P ~/Downloads/

tar -zxvf ~/Downloads/confluent-5.2.0-2.11.tar.gz -C ~/Downloads/
sudo mv ~/Downloads/confluent-5.2.0 /usr/local/confluent

vim /usr/local/confluent/etc/kafka-connect-elasticsearch/quickstart-elasticsearch.properties
and changed the URL and topic name.
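
For reference, the stock quickstart file looks roughly like this (the topic and Elasticsearch URL below are placeholders for my actual values, and the exact defaults may differ slightly between Confluent versions):

name=elasticsearch-sink
connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
tasks.max=1
topics=my-msk-topic
key.ignore=true
connection.url=http://my-elasticsearch-host:9200
type.name=kafka-connect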

After that, I just started Kafka Connect like this:

/usr/local/confluent/bin/connect-standalone /usr/local/confluent/etc/kafka-connect-elasticsearch/quickstart-elasticsearch.properties
which gave me this output:

INFO Usage: ConnectStandalone worker.properties connector1.properties [connector2.properties ...] (org.apache.kafka.connect.cli.ConnectStandalone:62)

So I never provided my broker details or ZooKeeper details anywhere; how will this connect to MSK?

Please help me understand this. What am I missing? I am not doing any schema conversion, so I am not modifying schema-registry.properties.

When I try the command below, I get the following error:

/usr/local/confluent/bin/connect-standalone /home/ec2-user/kafka_2.12-2.2.1/config/connect-standalone.properties /usr/local/confluent/etc/kafka-connect-elasticsearch/quickstart-elasticsearch.properties


[2019-12-30 03:19:30,149] ERROR Failed to create job for /usr/local/confluent/etc/kafka-connect-elasticsearch/quickstart-elasticsearch.properties (org.apache.kafka.connect.cli.ConnectStandalone:108)
[2019-12-30 03:19:30,149] ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone:119)
java.util.concurrent.ExecutionException: org.apache.kafka.connect.errors.ConnectException: Failed to find any class that implements Connector and which name matches io.confluent.connect.elasticsearch.ElasticsearchSinkConnector, available connectors are: PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSinkConnector, name='org.apache.kafka.connect.file.FileStreamSinkConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSourceConnector, name='org.apache.kafka.connect.file.FileStreamSourceConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockConnector, name='org.apache.kafka.connect.tools.MockConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=connector, typeName='connector', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSinkConnector, name='org.apache.kafka.connect.tools.MockSinkConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSourceConnector, name='org.apache.kafka.connect.tools.MockSourceConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.SchemaSourceConnector, name='org.apache.kafka.connect.tools.SchemaSourceConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSinkConnector, name='org.apache.kafka.connect.tools.VerifiableSinkConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSourceConnector, name='org.apache.kafka.connect.tools.VerifiableSourceConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=source, typeName='source', location='classpath'}
        at org.apache.kafka.connect.util.ConvertingFutureCallback.result(ConvertingFutureCallback.java:79)
        at org.apache.kafka.connect.util.ConvertingFutureCallback.get(ConvertingFutureCallback.java:66)
        at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:116)
Caused by: org.apache.kafka.connect.errors.ConnectException: Failed to find any class that implements Connector and which name matches io.confluent.connect.elasticsearch.ElasticsearchSinkConnector, available connectors are: PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSinkConnector, name='org.apache.kafka.connect.file.FileStreamSinkConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.file.FileStreamSourceConnector, name='org.apache.kafka.connect.file.FileStreamSourceConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockConnector, name='org.apache.kafka.connect.tools.MockConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=connector, typeName='connector', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSinkConnector, name='org.apache.kafka.connect.tools.MockSinkConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=sink, typeName='sink', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.MockSourceConnector, name='org.apache.kafka.connect.tools.MockSourceConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.SchemaSourceConnector, name='org.apache.kafka.connect.tools.SchemaSourceConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSinkConnector, name='org.apache.kafka.connect.tools.VerifiableSinkConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=source, typeName='source', location='classpath'}, PluginDesc{klass=class org.apache.kafka.connect.tools.VerifiableSourceConnector, name='org.apache.kafka.connect.tools.VerifiableSourceConnector', version='2.2.0-cp1', encodedVersion=2.2.0-cp1, type=source, typeName='source', location='classpath'}
        at org.apache.kafka.connect.runtime.isolation.Plugins.newConnector(Plugins.java:175)
        at org.apache.kafka.connect.runtime.AbstractHerder.getConnector(AbstractHerder.java:382)
        at org.apache.kafka.connect.runtime.AbstractHerder.validateConnectorConfig(AbstractHerder.java:261)
        at org.apache.kafka.connect.runtime.standalone.StandaloneHerder.putConnectorConfig(StandaloneHerder.java:188)
        at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:113)

Put the bootstrap servers and the other Connect-related properties into the etc/kafka/connect-standalone.properties file.
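
A minimal sketch of such a worker file, assuming plain JSON records with no Schema Registry (the broker addresses are placeholders for your MSK bootstrap string):

bootstrap.servers=b-1.mycluster.abc123.kafka.us-east-1.amazonaws.com:9092,b-2.mycluster.abc123.kafka.us-east-1.amazonaws.com:9092
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter.schemas.enable=false
offset.storage.file.filename=/tmp/connect.offsets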

To load the connector, you must use plugin.path, as discussed here.
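
With the tarball layout you installed, the Elasticsearch connector jars normally sit under share/java, so the worker property would look roughly like this (the path assumes your /usr/local/confluent install):

plugin.path=/usr/local/confluent/share/java

Check that this directory actually contains a kafka-connect-elasticsearch folder before restarting the worker.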

You can also download the Confluent Hub CLI to set up any of the available Apache Kafka connectors (you don't need the full Confluent Platform, so you can ignore the Schema Registry).
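
A sketch of that route, assuming the Confluent Hub client is on your PATH and you reuse the plugin directory and worker file from above:

confluent-hub install confluentinc/kafka-connect-elasticsearch:latest \
  --component-dir /usr/local/confluent/share/java \
  --worker-configs /home/ec2-user/kafka_2.12-2.2.1/config/connect-standalone.properties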

Or, as mentioned before, use the Confluent Kafka Connect Docker image, which comes preloaded with the Elasticsearch connector.
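
A rough sketch of the Docker route (the broker address is a placeholder; the cp-kafka-connect image is configured through CONNECT_* environment variables, and some image versions also expect the internal converter settings shown here):

docker run -d --name connect -p 8083:8083 \
  -e CONNECT_BOOTSTRAP_SERVERS=b-1.mycluster.abc123.kafka.us-east-1.amazonaws.com:9092 \
  -e CONNECT_GROUP_ID=connect-cluster \
  -e CONNECT_CONFIG_STORAGE_TOPIC=connect-configs \
  -e CONNECT_OFFSET_STORAGE_TOPIC=connect-offsets \
  -e CONNECT_STATUS_STORAGE_TOPIC=connect-status \
  -e CONNECT_KEY_CONVERTER=org.apache.kafka.connect.json.JsonConverter \
  -e CONNECT_VALUE_CONVERTER=org.apache.kafka.connect.json.JsonConverter \
  -e CONNECT_INTERNAL_KEY_CONVERTER=org.apache.kafka.connect.json.JsonConverter \
  -e CONNECT_INTERNAL_VALUE_CONVERTER=org.apache.kafka.connect.json.JsonConverter \
  -e CONNECT_REST_ADVERTISED_HOST_NAME=connect \
  -e CONNECT_PLUGIN_PATH=/usr/share/java \
  confluentinc/cp-kafka-connect:5.2.1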

Also, rather than tarballs, I suggest using APT/YUM to install the Confluent packages.
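
On Debian/Ubuntu that roughly means adding Confluent's repository and installing the community meta-package (the version path and package name here follow the 5.x naming and may differ for other releases):

wget -qO - https://packages.confluent.io/deb/5.2/archive.key | sudo apt-key add -
sudo add-apt-repository "deb [arch=amd64] https://packages.confluent.io/deb/5.2 stable main"
sudo apt-get update && sudo apt-get install confluent-community-2.11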


Does this answer your question? I showed you the correct command in that answer.