Apache Kafka / Kafka Connect Avro: No suitable driver found for jdbc:mysql://127.0.0.1:3306/connect_test

Tags: apache-kafka, apache-kafka-connect, confluent-platform

I am following the Docker tutorial.

I start the Kafka Connect Docker image and check in the Docker logs that it started up fine:

docker run -d \
  --name=kafka-connect-avro \
  --net=host \
  -e CONNECT_BOOTSTRAP_SERVERS=localhost:29092 \
  -e CONNECT_REST_PORT=28083 \
  -e CONNECT_GROUP_ID="quickstart-avro" \
  -e CONNECT_CONFIG_STORAGE_TOPIC="quickstart-avro-config" \
  -e CONNECT_OFFSET_STORAGE_TOPIC="quickstart-avro-offsets" \
  -e CONNECT_STATUS_STORAGE_TOPIC="quickstart-avro-status" \
  -e CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR=1 \
  -e CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR=1 \
  -e CONNECT_STATUS_STORAGE_REPLICATION_FACTOR=1 \
  -e CONNECT_KEY_CONVERTER="io.confluent.connect.avro.AvroConverter" \
  -e CONNECT_VALUE_CONVERTER="io.confluent.connect.avro.AvroConverter" \
  -e CONNECT_KEY_CONVERTER_SCHEMA_REGISTRY_URL="http://localhost:8081" \
  -e CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL="http://localhost:8081" \
  -e CONNECT_INTERNAL_KEY_CONVERTER="org.apache.kafka.connect.json.JsonConverter" \
  -e CONNECT_INTERNAL_VALUE_CONVERTER="org.apache.kafka.connect.json.JsonConverter" \
  -e CONNECT_REST_ADVERTISED_HOST_NAME="localhost" \
  -e CONNECT_LOG4J_ROOT_LOGLEVEL=DEBUG \
  -e CONNECT_PLUGIN_PATH=/usr/share/java,/etc/kafka-connect/jars \
  -v /tmp/quickstart/file:/tmp/quickstart \
  -v /tmp/quickstart/jars:/etc/kafka-connect/jars \
  confluentinc/cp-kafka-connect:latest
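
As a quick sanity check that the worker really came up, here is a minimal sketch, assuming the container name and REST port from the docker run above (kafka-connect-avro, 28083):

# the worker log should contain "started" lines once the herder/REST server is up
docker logs kafka-connect-avro 2>&1 | grep -i started

# the Connect REST API should answer on the advertised port
curl -s http://localhost:28083/connectors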
When I try to post the source connector for the database:

curl -X POST \
  -H "Content-Type: application/json" \
  --data '{
    "name": "quickstart-jdbc-source",
    "config": {
      "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
      "tasks.max": 1,
      "connection.url": "jdbc:mysql://127.0.0.1:3306/connect_test?user=root&password=confluent",
      "mode": "incrementing",
      "incrementing.column.name": "id",
      "timestamp.column.name": "modified",
      "topic.prefix": "quickstart-jdbc-",
      "poll.interval.ms": 1000
    }
  }' \
  http://$CONNECT_HOST:28083/connectors
I get the "No suitable driver found for jdbc:mysql://127.0.0.1:3306/connect_test" error.
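
When a connector or its task fails, the Connect REST API exposes the state and the stack trace, which can help pin down where the driver lookup happens. A sketch, assuming the same host/port and the connector name from the POST above:

# shows connector/task state plus a "trace" field with the full exception on failure
curl -s http://localhost:28083/connectors/quickstart-jdbc-source/status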

Is there something wrong with the MySQL JDBC jar?
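
One way to check that directly is to look inside the running container. A sketch under two assumptions: the bind-mount path from the docker run above, and the directory where the bundled kafka-connect-jdbc plugin typically sits in cp-kafka-connect images:

# is the MySQL driver jar visible through the bind mount?
docker exec kafka-connect-avro ls -l /etc/kafka-connect/jars

# where does the bundled JDBC connector live? (this path is an assumption)
docker exec kafka-connect-avro ls /usr/share/java/kafka-connect-jdbc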

To elaborate, from the logs below it looks like CONNECT_PLUGIN_PATH has no effect:

[2019-08-28 17:19:27,113] INFO Added alias 'BasicAuthSecurityRestExtension' to plugin 'org.apache.kafka.connect.rest.basic.auth.extension.BasicAuthSecurityRestExtension' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader)
    plugin.path = [/usr/share/java, /etc/kafka-connect/jars]
[2019-08-28 17:19:27,231] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig)
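
For what it's worth, that WARN is generally harmless: plugin.path is a worker setting, not an AdminClient one, and the config dump above shows it was parsed as [/usr/share/java, /etc/kafka-connect/jars]. Whether the path actually did something can be checked through the REST API (same host/port assumption as before):

# lists every connector plugin the worker loaded from plugin.path
curl -s http://localhost:28083/connector-plugins

If io.confluent.connect.jdbc.JdbcSourceConnector shows up in that list, the plugin path is working and the failure is only about the MySQL driver class not being visible to that plugin's classloader.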
From the output below, the mount was successful:

docker inspect kafka-connect-avro

{
       "Type": "bind",
       "Source": "/tmp/quickstart/jars",
       "Destination": "/etc/kafka-connect/jars",
       "Mode": "",
       "RW": true,
       "Propagation": "rprivate"
},

I got this working by copying the jar into the Docker container:

docker cp /tmp/quickstart/jars/mysql-connector-java-8.0.17.jar kafka-connect-avro:/usr/share/java/kafka
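
One caveat worth adding: the worker scans its classpath and plugin path at startup, so a jar copied into a running container is, as far as I know, not picked up until the worker restarts. A sketch under that assumption:

# restart the worker so it rescans /usr/share/java, then recreate the connector
docker restart kafka-connect-avro
# re-run the curl POST from above once the REST API answers again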

Did you see this? Check whether the MySQL driver can be found inside the container, in the directory /etc/kafka-connect/jars. That was exactly my problem (putting it in /etc/kafka-connect/jars did the trick). Command:

command:
  - /bin/bash
  - -c
  - |
    cd /tmp
    curl … | tar xz
    cd mysql-connector-java-8.0.21
    cp *.jar /etc/kafka-connect/jars
    confluent-hub install --no-prompt confluentinc/kafka-connect-jdbc:latest
    /etc/confluent/docker/run