Apache Kafka: configuring a Kafka client to connect with an issued SSL key/certificate


I'm using Heroku Kafka, which runs 0.10.1.1 and uses SSL. They only support the latest protocol.

Heroku Kafka uses SSL for authentication, issues a client certificate and key, and provides a CA certificate. I put them in client_cert.pem, client_key.pem, and trusted_cert.pem respectively, then ran the following to build the stores:

# Bundle the issued client certificate and key into a PKCS12 file
openssl pkcs12 -export -in client_cert.pem -inkey client_key.pem -certfile client_cert.pem -out client.p12
# Convert the PKCS12 bundle into a JKS keystore
keytool -importkeystore -srckeystore client.p12 -srcstoretype pkcs12 -destkeystore kafka.keystore.jks -deststoretype JKS
# Import the CA certificate into a JKS truststore
keytool -keystore kafka.truststore.jks -alias CARoot -import -file trusted_cert.pem
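Before going further, it can be worth checking what actually ended up in the two stores and whether a broker completes a TLS handshake at all. A minimal sanity check (not part of the original setup), using the file names, placeholder password, and broker address from this question:

# List what the keystore and truststore contain
keytool -list -keystore kafka.keystore.jks -storepass xxxx
keytool -list -keystore kafka.truststore.jks -storepass xxxx

# Attempt a TLS handshake against one broker, validating it with the issued CA certificate
openssl s_client -connect a.a.a.a:9096 -CAfile trusted_cert.pem </dev/null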
Then I created client-ssl.properties with the following contents:

ssl.protocol=SSL
security.protocol=SSL
ssl.truststore.location=kafka.truststore.jks
ssl.truststore.type=JKS
ssl.truststore.password=xxxx
ssl.keystore.location=kafka.keystore.jks
ssl.keystore.type=JKS
ssl.keystore.password=xxxx
ssl.key.password=xxxx
I then use the kafka-console-producer (version 0.10.1.1) with the following:

kafka-console-producer --broker-list kafka+ssl://a.a.a.a:9096,kafka+ssl://b.b.b.b:9096,kafka+ssl://c.c.c.c:9096 --producer.config client-ssl.properties --topic robintest
(The robintest topic has been created.)
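(As a side note, the same properties file can be used to exercise the connection from the consumer side as well. This is only a sketch: the flags are the standard 0.10.x console-consumer ones, and the kafka+ssl:// form of the broker address is assumed to be accepted here because the producer command above uses it.)

kafka-console-consumer --bootstrap-server kafka+ssl://a.a.a.a:9096 --consumer.config client-ssl.properties --topic robintest --from-beginning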

At this point I send a record, press enter, and get:

[2017-01-31 10:06:53,194] DEBUG Initialize connection to node -2 for sending metadata request (org.apache.kafka.clients.NetworkClient)
[2017-01-31 10:06:53,194] DEBUG Initiating connection to node -2 at b.b.b.b:9096. (org.apache.kafka.clients.NetworkClient)
[2017-01-31 10:06:53,457] DEBUG Added sensor with name node--2.bytes-sent (org.apache.kafka.common.metrics.Metrics)
[2017-01-31 10:06:53,457] DEBUG Added sensor with name node--2.bytes-received (org.apache.kafka.common.metrics.Metrics)
[2017-01-31 10:06:53,458] DEBUG Added sensor with name node--2.latency (org.apache.kafka.common.metrics.Metrics)
[2017-01-31 10:06:53,460] DEBUG Created socket with SO_RCVBUF = 33304, SO_SNDBUF = 102808, SO_TIMEOUT = 0 to node -2 (org.apache.kafka.common.network.Selector)
[2017-01-31 10:06:53,463] DEBUG Completed connection to node -2 (org.apache.kafka.clients.NetworkClient)
[2017-01-31 10:06:53,692] DEBUG Sending metadata request {topics=[robintest]} to node -2 (org.apache.kafka.clients.NetworkClient)
[2017-01-31 10:06:53,724] DEBUG Connection with ec2-34-194-25-39.compute-1.amazonaws.com/b.b.b.b disconnected (org.apache.kafka.common.network.Selector)
java.io.EOFException
    at org.apache.kafka.common.network.SslTransportLayer.read(SslTransportLayer.java:488)
    at org.apache.kafka.common.network.NetworkReceive.readFromReadableChannel(NetworkReceive.java:81)
    at org.apache.kafka.common.network.NetworkReceive.readFrom(NetworkReceive.java:71)
    at org.apache.kafka.common.network.KafkaChannel.receive(KafkaChannel.java:154)
    at org.apache.kafka.common.network.KafkaChannel.read(KafkaChannel.java:135)
    at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:343)
    at org.apache.kafka.common.network.Selector.poll(Selector.java:291)
    at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:260)
    at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:236)
    at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:135)
    at java.lang.Thread.run(Thread.java:745)
[2017-01-31 10:06:53,728] DEBUG Node -2 disconnected. (org.apache.kafka.clients.NetworkClient)
[2017-01-31 10:06:53,728] WARN Bootstrap broker b.b.b.b:9096 disconnected (org.apache.kafka.clients.NetworkClient)
[2017-01-31 10:06:53,729] DEBUG Initialize connection to node -1 for sending metadata request (org.apache.kafka.clients.NetworkClient)
[2017-01-31 10:06:53,729] DEBUG Initiating connection to node -1 at a.a.a.a:9096. (org.apache.kafka.clients.NetworkClient)
[2017-01-31 10:06:53,791] DEBUG Added sensor with name node--1.bytes-sent (org.apache.kafka.common.metrics.Metrics)
[2017-01-31 10:06:53,792] DEBUG Added sensor with name node--1.bytes-received (org.apache.kafka.common.metrics.Metrics)
[2017-01-31 10:06:53,792] DEBUG Added sensor with name node--1.latency (org.apache.kafka.common.metrics.Metrics)
[2017-01-31 10:06:53,792] DEBUG Created socket with SO_RCVBUF = 33304, SO_SNDBUF = 102808, SO_TIMEOUT = 0 to node -1 (org.apache.kafka.common.network.Selector)
[2017-01-31 10:06:53,792] DEBUG Completed connection to node -1 (org.apache.kafka.clients.NetworkClient)
[2017-01-31 10:06:53,994] DEBUG Sending metadata request {topics=[robintest]} to node -1 (org.apache.kafka.clients.NetworkClient)
[2017-01-31 10:06:54,025] DEBUG Connection with ec2-34-194-39-35.compute-1.amazonaws.com/a.a.a.a disconnected (org.apache.kafka.common.network.Selector)
java.io.EOFException
    at org.apache.kafka.common.network.SslTransportLayer.read(SslTransportLayer.java:488)
    at org.apache.kafka.common.network.NetworkReceive.readFromReadableChannel(NetworkReceive.java:81)
    at org.apache.kafka.common.network.NetworkReceive.readFrom(NetworkReceive.java:71)
    at org.apache.kafka.common.network.KafkaChannel.receive(KafkaChannel.java:154)
    at org.apache.kafka.common.network.KafkaChannel.read(KafkaChannel.java:135)
    at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:343)
    at org.apache.kafka.common.network.Selector.poll(Selector.java:291)
    at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:260)
    at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:236)
    at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:135)
    at java.lang.Thread.run(Thread.java:745)
[2017-01-31 10:06:54,026] DEBUG Node -1 disconnected. (org.apache.kafka.clients.NetworkClient)
[2017-01-31 10:06:54,026] WARN Bootstrap broker a.a.a.a:9096 disconnected (org.apache.kafka.clients.NetworkClient)
[2017-01-31 10:06:54,027] DEBUG Initialize connection to node -3 for sending metadata request (org.apache.kafka.clients.NetworkClient)
[2017-01-31 10:06:54,027] DEBUG Initiating connection to node -3 at c.c.c.c:9096. (org.apache.kafka.clients.NetworkClient)
[2017-01-31 10:06:54,102] DEBUG Added sensor with name node--3.bytes-sent (org.apache.kafka.common.metrics.Metrics)
[2017-01-31 10:06:54,103] DEBUG Added sensor with name node--3.bytes-received (org.apache.kafka.common.metrics.Metrics)
[2017-01-31 10:06:54,103] DEBUG Added sensor with name node--3.latency (org.apache.kafka.common.metrics.Metrics)
[2017-01-31 10:06:54,104] DEBUG Created socket with SO_RCVBUF = 33304, SO_SNDBUF = 102808, SO_TIMEOUT = 0 to node -3 (org.apache.kafka.common.network.Selector)
[2017-01-31 10:06:54,104] DEBUG Completed connection to node -3 (org.apache.kafka.clients.NetworkClient)
[2017-01-31 10:06:54,309] DEBUG Sending metadata request {topics=[robintest]} to node -3 (org.apache.kafka.clients.NetworkClient)
[2017-01-31 10:06:54,342] DEBUG Connection with ec2-34-194-45-119.compute-1.amazonaws.com/c.c.c.c disconnected (org.apache.kafka.common.network.Selector)
java.io.EOFException
    at org.apache.kafka.common.network.SslTransportLayer.read(SslTransportLayer.java:488)
    at org.apache.kafka.common.network.NetworkReceive.readFromReadableChannel(NetworkReceive.java:81)
    at org.apache.kafka.common.network.NetworkReceive.readFrom(NetworkReceive.java:71)
    at org.apache.kafka.common.network.KafkaChannel.receive(KafkaChannel.java:154)
    at org.apache.kafka.common.network.KafkaChannel.read(KafkaChannel.java:135)
    at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:343)
    at org.apache.kafka.common.network.Selector.poll(Selector.java:291)
    at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:260)
    at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:236)
    at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:135)
    at java.lang.Thread.run(Thread.java:745)
[2017-01-31 10:06:54,342] DEBUG Node -3 disconnected. (org.apache.kafka.clients.NetworkClient)
[2017-01-31 10:06:54,343] WARN Bootstrap broker c.c.c.c:9096 disconnected (org.apache.kafka.clients.NetworkClient)
[2017-01-31 10:06:54,343] DEBUG Initialize connection to node -1 for sending metadata request (org.apache.kafka.clients.NetworkClient)
[2017-01-31 10:06:54,343] DEBUG Initiating connection to node -1 at a.a.a.a:9096. (org.apache.kafka.clients.NetworkClient)
[2017-01-31 10:06:54,348] DEBUG Initialize connection to node -2 for sending metadata request (org.apache.kafka.clients.NetworkClient)
[2017-01-31 10:06:54,348] DEBUG Initiating connection to node -2 at b.b.b.b:9096. (org.apache.kafka.clients.NetworkClient)
[2017-01-31 10:06:54,376] DEBUG Created socket with SO_RCVBUF = 33304, SO_SNDBUF = 102808, SO_TIMEOUT = 0 to node -2 (org.apache.kafka.common.network.Selector)
[2017-01-31 10:06:54,377] DEBUG Completed connection to node -2 (org.apache.kafka.clients.NetworkClient)
[2017-01-31 10:06:54,379] DEBUG Created socket with SO_RCVBUF = 33304, SO_SNDBUF = 102808, SO_TIMEOUT = 0 to node -1 (org.apache.kafka.common.network.Selector)
[2017-01-31 10:06:54,379] DEBUG Completed connection to node -1 (org.apache.kafka.clients.NetworkClient)
These entries continue until I kill the process.


I've tried every combination of configuration I can think of, including prefixing all of the settings in the properties file with producer., removing the configuration entirely (which didn't seem to make a difference), and setting the passwords to incorrect values (which also didn't seem to make a difference). I also tried connecting to another provider (www.cloudkarafka.com) with their credentials and got the same result. So this is clearly a configuration problem.

It turns out my Kafka cluster (the Heroku add-on) wasn't actually running 0.10.1.1 but 0.10.0.1, and the two seem to have incompatible consumer APIs. (I have to say: this is exactly what semantic versioning is for.)


Upgrading with heroku kafka:upgrade --version 0.10 takes you to the latest 0.10.x release, so if you're on 0.9 and you want 0.10.0.1, good luck.
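If upgrading the cluster isn't an option, the other way to line the versions up is to run a console producer that matches the broker. A sketch, assuming the standard Apache archive layout for the 0.10.0.1 release:

# Fetch the 0.10.0.1 distribution and use its console producer instead of the 0.10.1.1 one
curl -O https://archive.apache.org/dist/kafka/0.10.0.1/kafka_2.11-0.10.0.1.tgz
tar xzf kafka_2.11-0.10.0.1.tgz
kafka_2.11-0.10.0.1/bin/kafka-console-producer.sh --broker-list kafka+ssl://a.a.a.a:9096 --producer.config client-ssl.properties --topic robintest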

I noticed this (you are creating both a truststore and a keystore), but in your Kafka config you have ssl.truststore.type=JKS. Is there a conflict there?

The reason I say that is that I built a pkcs12 truststore, but I never configured ssl.truststore.type (I never added that line to the config).

Got it. And what I got was:

2018-04-13 19:45:04,495 [main] ERROR c.my.special.package.MyApp: Exception occurred while starting SpecialConsumer
java.io.IOException: Invalid keystore format
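A quick way to check which format a store really is: list it with keytool. Newer JDKs detect and print the type; on older ones, listing with the wrong -storetype fails with the same "Invalid keystore format" error shown above. Store name and password here are placeholders:

keytool -list -keystore kafka.truststore.jks -storepass xxxx
keytool -list -keystore kafka.truststore.jks -storetype PKCS12 -storepass xxxx
# Whatever type keytool reports (or whichever -storetype succeeds) is what
# ssl.truststore.type / ssl.keystore.type must be set to in the client config.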


For what it's worth, for anyone who runs into trouble connecting a client to Kafka when SSL authentication is required (ssl.client.auth), I found this snippet very useful:

cd ssl

# Create a java keystore and get a signed certificate for the broker. Then copy the certificate to the VM where the CA is running.

keytool -genkey -keystore kafka.client.keystore.jks -validity 365 -storepass "MyClientPassword123" -keypass "MyClientPassword123" -dname "CN=mylaptop1" -alias my-local-pc1 -storetype pkcs12

keytool -keystore kafka.client.keystore.jks -certreq -file client-cert-sign-request -alias my-local-pc1 -storepass "MyClientPassword123" -keypass "MyClientPassword123"

# Copy the cert to the CA
scp client-cert-sign-request sshuser@HeadNode0_Name:~/tmp1/client-cert-sign-request

# Switch to the CA machine (hn0) to sign the client certificate.
cd ssl
openssl x509 -req -CA ca-cert -CAkey ca-key -in /tmp1/client-cert-sign-request -out /tmp1/client-cert-signed -days 365 -CAcreateserial -passin pass:MyServerPassword123

# Return to the client machine (hn1), navigate to ~/ssl folder and copy signed cert from the CA (hn0) to client machine
scp -i ~/kafka-security.pem sshuser@HeadNode0_Name:/tmp1/client-cert-signed .

# Import CA cert to trust store
keytool -keystore kafka.client.truststore.jks -alias CARoot -import -file ca-cert -storepass "MyClientPassword123" -keypass "MyClientPassword123" -noprompt

# Import CA cert to key store
keytool -keystore kafka.client.keystore.jks -alias CARoot -import -file ca-cert -storepass "MyClientPassword123" -keypass "MyClientPassword123" -noprompt

# Import signed client (cert client-cert-signed1) to keystore
keytool -keystore kafka.client.keystore.jks -import -file client-cert-signed -alias my-local-pc1 -storepass "MyClientPassword123" -keypass "MyClientPassword123" -noprompt
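To point a client at the resulting stores, the properties look much like the file in the question above. The paths and password are taken from the snippet; the commented ssl.keystore.type line is a guess, since the keystore was created with -storetype pkcs12 despite its .jks name:

security.protocol=SSL
ssl.truststore.location=kafka.client.truststore.jks
ssl.truststore.password=MyClientPassword123
ssl.keystore.location=kafka.client.keystore.jks
ssl.keystore.password=MyClientPassword123
ssl.key.password=MyClientPassword123
# Possibly needed, because keytool -genkey above used -storetype pkcs12:
# ssl.keystore.type=PKCS12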