Apache Kafka Snowflake Kafka Connect -> Error: Caused by: java.lang.ClassNotFoundException: org.bouncycastle.jcajce.provider.BouncyCastleFipsProvider


When I send the POST request that starts the Kafka Snowflake connector, I get this error message:

[ec2-user@ip-10-0-64-123 tmp]$ curl -X POST -H "Content-Type: application/json" --data @snowflake.json http://internal-test-dev-alb-39351xyz.eu-central-1.elb.amazonaws.com:80/connectors
<html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
<title>Error 500 Request failed.</title>
</head>
<body><h2>HTTP ERROR 500</h2>
<p>Problem accessing /connectors. Reason:
<pre>    Request failed.</pre></p><hr><a href="http://eclipse.org/jetty">Powered by Jetty:// 9.4.18.v20190429</a><hr/>

</body>
</html>
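
The Jetty 500 page does not show the underlying cause; the ClassNotFoundException in the title is what the Connect worker log reports. A minimal sketch of how to pull it out (the container name connect is an assumption; substitute whatever your worker runs as):

# Grab the stack trace around the ClassNotFoundException from the worker log.
docker logs connect 2>&1 | grep -B 2 -A 10 ClassNotFoundException

# Listing the plugins the worker actually loaded shows whether the Snowflake
# connector class was picked up at all.
curl -s http://internal-test-dev-alb-39351xyz.eu-central-1.elb.amazonaws.com:80/connector-plugins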

Here is the Dockerfile used to build the connector:

FROM openjdk:8-jre

# Add Confluent Repository and install Confluent Platform
RUN wget -qO - http://packages.confluent.io/deb/5.3/archive.key | apt-key add -
RUN echo "deb [arch=amd64] http://packages.confluent.io/deb/5.3 stable main" > /etc/apt/sources.list.d/confluent.list
RUN apt-get update && apt-get install -y --no-install-recommends confluent-kafka-connect-* confluent-schema-registry gettext confluent-kafka-2.12
RUN mkdir -p /usr/share/java/kafka-connect-pubsub/
RUN mkdir -p /etc/kafka-connect-pubsub/

# Script to configure properties in various config files.
COPY config-templates/configure.sh /configure.sh
# RUN wget -q -P /usr/share/java/kafka-connect-jdbc/ https://s3.amazonaws.com/datahub-public-repo/ojdbc10.jar
# RUN wget -q -P /usr/share/java/kafka-connect-jdbc/ https://s3.amazonaws.com/datahub-public-repo/mysql-connector-java-5.1.41-bin.jar
# RUN wget -q -P /usr/share/java/kafka-connect-jdbc/ https://s3.amazonaws.com/datahub-public-repo/terajdbc4.jar
# RUN wget -q -P /usr/share/java/kafka-connect-jdbc/ https://s3.amazonaws.com/datahub-public-repo/tdgssconfig.jar
RUN wget -q -P /usr/share/java/ https://repo1.maven.org/maven2/com/snowflake/snowflake-kafka-connector/1.4.4/snowflake-kafka-connector-1.4.4.jar
RUN wget -q -P /usr/share/java/ https://repo1.maven.org/maven2/org/bouncycastle/bc-fips/1.0.2/bc-fips-1.0.2.jar
RUN wget -q -P /usr/share/java/ https://repo1.maven.org/maven2/org/bouncycastle/bcpkix-fips/1.0.4/bcpkix-fips-1.0.4.jar

COPY config-templates/connect-standalone.properties.template /etc/kafka/connect-standalone.properties.template
COPY config-templates/snowflake.properties.template /etc/kafka/snowflake.properties.template
COPY config-templates/connect-distributed.properties.template /etc/kafka/connect-distributed.properties.template

# COPY config-templates/jdbc-source.properties.template /etc/kafka-connect-jdbc/jdbc-source.properties.template
# COPY config-templates/jdbc-sink.properties.template /etc/kafka-connect-jdbc/jdbc-sink.properties.template
# COPY config-templates/pubsub-sink-connector.properties.template /etc/kafka-connect-pubsub/pubsub-sink-connector.properties.template
COPY config-templates/kafka-run-class /usr/bin/kafka-run-class
# Modify these lines to reflect your client Keystore and Truststore.
# COPY replicantSuperUser.kafka.client.keystore.jks /replicantSuperUser.kafka.client.keystore.jks
# COPY kafka.client.truststore.jks /tmp/kafka.client.truststore.jks

RUN chmod 755 configure.sh /usr/bin/kafka-run-class
ENTRYPOINT /configure.sh && $KC_CMD && bash
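
For context on why this layout can fail: Kafka Connect treats each immediate child of a plugin.path directory as an isolated plugin with its own classloader, so a loose snowflake-kafka-connector jar in /usr/share/java cannot see the sibling bc-fips and bcpkix-fips jars. A sketch of an alternative layout that keeps the connector and its dependencies in one plugin directory (the kafka-connect-snowflake directory name is my own choice, and this assumes /usr/share/java is on the worker's plugin.path):

# Keep the connector and its BouncyCastle dependencies together so they
# share one plugin classloader.
RUN mkdir -p /usr/share/java/kafka-connect-snowflake/
RUN wget -q -P /usr/share/java/kafka-connect-snowflake/ https://repo1.maven.org/maven2/com/snowflake/snowflake-kafka-connector/1.4.4/snowflake-kafka-connector-1.4.4.jar
RUN wget -q -P /usr/share/java/kafka-connect-snowflake/ https://repo1.maven.org/maven2/org/bouncycastle/bc-fips/1.0.2/bc-fips-1.0.2.jar
RUN wget -q -P /usr/share/java/kafka-connect-snowflake/ https://repo1.maven.org/maven2/org/bouncycastle/bcpkix-fips/1.0.4/bcpkix-fips-1.0.4.jar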


And here is my snowflake.json:

{
   "name":"Snowflaketest",
   "config":{
      "connector.class":"com.snowflake.kafka.connector.SnowflakeSinkConnector",
      "tasks.max":"8",
      "topics":"dat.slt.isc.incoming.json",
      "buffer.count.records":"10000",
      "buffer.flush.time":"60",
      "buffer.size.bytes":"5000000",
      "snowflake.url.name":"https://t1234.eu-central-1.snowflakecomputing.com:443",
      "snowflake.user.name":"asdf_CONNECT",
      "snowflake.private.key":"MIIFLTBXBgkqhkiG9w0BBQ0wSjApBgkqhkiG9w0BBQwwHAQIWi2iAjGL9JsCAggAMAw********",
      "snowflake.private.key.passphrase":"hchajvdzSvcmqamIWe1jvrF***",
      "snowflake.database.name":"db_SANDBOX",
      "snowflake.schema.name":"LOADDB",
      "key.converter":"org.apache.kafka.connect.storage.StringConverter",
      "value.converter":"com.snowflake.kafka.connector.records.SnowflakeJsonConverter"
   }
}
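
As an aside on the config above: snowflake.private.key and snowflake.private.key.passphrase come from Snowflake's key-pair authentication, and the encrypted private key is what the connector needs the BouncyCastle FIPS provider for. A sketch along the lines of the documented openssl procedure (not the exact commands used here):

# Generate an encrypted PKCS#8 private key; the passphrase you choose becomes
# snowflake.private.key.passphrase.
openssl genrsa 2048 | openssl pkcs8 -topk8 -v2 des3 -inform PEM -out rsa_key.p8

# Derive the public key and register it on the Snowflake user, e.g.
#   ALTER USER asdf_CONNECT SET RSA_PUBLIC_KEY='MIIBIj...';
openssl rsa -in rsa_key.p8 -pubout -out rsa_key.pub

# snowflake.private.key is then the base64 body of rsa_key.p8 with the
# -----BEGIN/END----- header and footer lines stripped.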

Any pointers?

My guess is that the BouncyCastle jars are not in the right location for OpenJDK? Do you know where they should go, or whether there is another way to fix this?


Any help is much appreciated.

How do you know which JARs you need to download from Maven? I don't think this is the right way to set up the Snowflake connector. Also, why not use Confluent's existing kafka-connect container?

According to the documentation I think you are doing it right: "It is recommended to install these JAR files into the /libs folder."

I placed the Snowflake connector jar and the Bouncy Castle jars in the plugin path configured for Kafka Connect in Docker, and that worked. Maybe you can try that too.

Thanks for the pointers. I put the jars here and managed to get it working:

RUN wget -q -P /usr/share/java/kafka/ https://repo1.maven.org/maven2/org/bouncycastle/bc-fips/1.0.1/bc-fips-1.0.1.jar
RUN wget -q -P /usr/share/java/kafka/ https://repo1.maven.org/maven2/org/bouncycastle/bcpkix-fips/1.0.3/bcpkix-fips-1.0.3.jar
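
A quick sanity check after rebuilding (the container name connect is again an assumption): confirm the jars landed in the classpath directory, check that the worker now lists the Snowflake connector, then re-POST the config.

docker exec connect ls /usr/share/java/kafka/ | grep fips
curl -s http://internal-test-dev-alb-39351xyz.eu-central-1.elb.amazonaws.com:80/connector-plugins | grep -i snowflake
curl -X POST -H "Content-Type: application/json" --data @snowflake.json http://internal-test-dev-alb-39351xyz.eu-central-1.elb.amazonaws.com:80/connectors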