
Apache Kafka: How to access a password-protected Confluent Schema Registry server using Spring Cloud Stream?


I am using Aiven's Kafka service alongside Spring Cloud Stream's schema registry support. Aiven's schema registry is password-protected. According to the instructions, the following two configuration parameters need to be set to successfully access the schema registry server:

 props.put("basic.auth.credentials.source", "USER_INFO");
 props.put("basic.auth.user.info", "avnadmin:schema-reg-password");
Everything works fine when I use the vanilla Java Kafka driver, but with Spring Cloud Stream I cannot figure out how to inject these two parameters. At the moment I am putting "basic.auth.user.info" and "basic.auth.credentials.source" under "spring.cloud.stream.kafka.binder.configuration" in the application.yml file.

Doing so, I get a "401 Unauthorized" on the line where the schema wants to register itself.
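What was tried above can be sketched in application.yml roughly like this (broker settings omitted; the credential values are the ones from the snippet above). As described, this placement still results in the 401 Unauthorized:

```yaml
spring:
  cloud:
    stream:
      kafka:
        binder:
          configuration:
            basic.auth.credentials.source: USER_INFO
            basic.auth.user.info: avnadmin:schema-reg-password
```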

Update 1:

Following Ali n's suggestion, I updated the way the SchemaRegistryClient bean is configured so that it is aware of the SSL context:

import java.io.File;
import java.io.FileInputStream;
import java.security.KeyStore;
import java.security.cert.X509Certificate;

import javax.net.ssl.SSLContext;

import org.apache.http.client.HttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.ssl.SSLContextBuilder;
import org.apache.http.ssl.TrustStrategy;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.cloud.stream.schema.client.ConfluentSchemaRegistryClient;
import org.springframework.cloud.stream.schema.client.SchemaRegistryClient;
import org.springframework.context.annotation.Bean;
import org.springframework.http.client.ClientHttpRequestFactory;
import org.springframework.http.client.HttpComponentsClientHttpRequestFactory;
import org.springframework.web.client.RestTemplate;

@Bean
public SchemaRegistryClient schemaRegistryClient(
    @Value("${spring.cloud.stream.schemaRegistryClient.endpoint}") String endpoint) {
  try {
    // Client certificate (PKCS12) used to authenticate against Aiven
    final KeyStore keyStore = KeyStore.getInstance("PKCS12");
    keyStore.load(new FileInputStream(new File("path/to/client.keystore.p12")),
        "secret".toCharArray());

    // Trust store (JKS) containing the Aiven CA certificate
    final KeyStore trustStore = KeyStore.getInstance("JKS");
    trustStore.load(new FileInputStream(new File("path/to/client.truststore.jks")),
        "secret".toCharArray());

    // Accept any certificate chain; tighten this for production use
    TrustStrategy acceptingTrustStrategy = (X509Certificate[] chain, String authType) -> true;

    SSLContext sslContext = SSLContextBuilder
        .create()
        .loadKeyMaterial(keyStore, "secret".toCharArray())
        .loadTrustMaterial(trustStore, acceptingTrustStrategy)
        .build();

    // Back the registry client's RestTemplate with an SSL-aware Apache HttpClient
    HttpClient httpClient = HttpClients.custom().setSSLContext(sslContext).build();
    ClientHttpRequestFactory requestFactory =
        new HttpComponentsClientHttpRequestFactory(httpClient);
    ConfluentSchemaRegistryClient schemaRegistryClient =
        new ConfluentSchemaRegistryClient(new RestTemplate(requestFactory));
    schemaRegistryClient.setEndpoint(endpoint);
    return schemaRegistryClient;
  } catch (Exception ex) {
    // Note: swallowing the exception and returning null will surface later as a confusing NPE
    ex.printStackTrace();
    return null;
  }
}
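The bean above resolves its endpoint from the "spring.cloud.stream.schemaRegistryClient.endpoint" property. A minimal sketch of the corresponding application.yml entry, with a made-up host and port standing in for the real Aiven registry address, would be:

```yaml
spring:
  cloud:
    stream:
      schemaRegistryClient:
        endpoint: https://my-project.aivencloud.com:13546
```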

This helped to get rid of the error on application startup, and the schema was registered. However, whenever the application wanted to push a message to Kafka, a new error was thrown again. In the end, mmelsen's answer solved this as well.

The binder configuration only handles well-known consumer and producer properties.

You can set arbitrary properties at the binding level:

spring.cloud.stream.kafka.bindings.<binding>.consumer.configuration.basic.auth...
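As a sketch, assuming a consumer binding named "input", the binding-level placement of the two basic-auth keys would look like this in application.yml (the binding name and credential values are placeholders):

```yaml
spring:
  cloud:
    stream:
      kafka:
        bindings:
          input:
            consumer:
              configuration:
                basic.auth.credentials.source: USER_INFO
                basic.auth.user.info: avnadmin:schema-reg-password
```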

Since Aiven uses SSL as the Kafka security protocol, you need to authenticate with certificates.

You can follow the guide to see how it works. In short, you need to run the following commands to generate the certificates and import them:

openssl pkcs12 -export -inkey service.key -in service.cert -out client.keystore.p12 -name service_key
keytool -import -file ca.pem -alias CA -keystore client.truststore.jks

Then you can use the following properties to make use of the certificates:

spring.cloud.stream.kafka.streams.binder:
  configuration:
    security.protocol: SSL
    ssl.truststore.location: client.truststore.jks
    ssl.truststore.password: secret
    ssl.keystore.type: PKCS12
    ssl.keystore.location: client.keystore.p12
    ssl.keystore.password: secret
    ssl.key.password: secret
    key.serializer: org.apache.kafka.common.serialization.StringSerializer
    value.serializer: org.apache.kafka.common.serialization.StringSerializer

I ran into the same problem in the same situation: connecting to a secured schema registry hosted by Aiven and protected by basic auth. To get it working, I had to configure the following properties:

spring.kafka.properties.schema.registry.url=https://***.aiven***.com:port
spring.kafka.properties.basic.auth.credentials.source=USER_INFO
spring.kafka.properties.basic.auth.user.info=username:password
The other properties of my binder are:

spring.cloud.stream.binders.input.type=kafka
spring.cloud.stream.binders.input.environment.spring.cloud.stream.kafka.binder.brokers=https://***.aiven***.com:port <-- different from the before mentioned port
spring.cloud.stream.binders.input.environment.spring.cloud.stream.kafka.binder.configuration.security.protocol=SSL
spring.cloud.stream.binders.input.environment.spring.cloud.stream.kafka.binder.configuration.ssl.truststore.location=truststore.jks
spring.cloud.stream.binders.input.environment.spring.cloud.stream.kafka.binder.configuration.ssl.truststore.password=secret
spring.cloud.stream.binders.input.environment.spring.cloud.stream.kafka.binder.configuration.ssl.keystore.type=PKCS12
spring.cloud.stream.binders.input.environment.spring.cloud.stream.kafka.binder.configuration.ssl.keystore.location=clientkeystore.p12
spring.cloud.stream.binders.input.environment.spring.cloud.stream.kafka.binder.configuration.ssl.keystore.password=secret
spring.cloud.stream.binders.input.environment.spring.cloud.stream.kafka.binder.configuration.ssl.key.password=secret
spring.cloud.stream.binders.input.environment.spring.cloud.stream.kafka.binder.configuration.value.deserializer=io.confluent.kafka.serializers.KafkaAvroDeserializer
spring.cloud.stream.binders.input.environment.spring.cloud.stream.kafka.streams.binder.autoCreateTopics=false

Comments:

- This works fine for Kafka itself, but it doesn't help with the schema registry server. I moved the parameters under "spring.cloud.stream.kafka.bindings.output.producer.configuration", but the error and stack trace did not change.
- Could you share all the properties you have in your application.yml file?
- @Milad did you find a solution for this?
- @mmelsen I chatted with Ali n. I think he has found the solution. He will post it once we have confirmed it.
- If you are using this, can you tell me how to force the schema to be pulled from the registry instead of using the local one?