
Spring Boot: Kafka SASL_PLAINTEXT/SCRAM authentication fails on startup


I tried Kafka authentication with SASL_PLAINTEXT/SCRAM, but authentication fails when Spring Boot starts.

If I switch to SASL_PLAINTEXT/PLAIN it works, but with SCRAM (both SHA-512 and SHA-256) authentication fails.

I've tried many different things, but none of them worked. How can I fix this?

Broker logs

broker1       | [2020-12-31 02:57:37,831] INFO [SocketServer brokerId=1] Failed authentication with /172.29.0.1 (Authentication failed during authentication due to invalid credentials with SASL mechanism SCRAM-SHA-512) (org.apache.kafka.common.network.Selector)
broker2       | [2020-12-31 02:57:37,891] INFO [SocketServer brokerId=2] Failed authentication with /172.29.0.1 (Authentication failed during authentication due to invalid credentials with SASL mechanism SCRAM-SHA-512) (org.apache.kafka.common.network.Selector)
Spring Boot logs

2020-12-31 11:57:37.438  INFO 82416 --- [  restartedMain] o.a.k.c.s.authenticator.AbstractLogin    : Successfully logged in.
2020-12-31 11:57:37.497  INFO 82416 --- [  restartedMain] o.a.kafka.common.utils.AppInfoParser     : Kafka version: 2.6.0
2020-12-31 11:57:37.499  INFO 82416 --- [  restartedMain] o.a.kafka.common.utils.AppInfoParser     : Kafka commitId: 62abe01bee039651
2020-12-31 11:57:37.499  INFO 82416 --- [  restartedMain] o.a.kafka.common.utils.AppInfoParser     : Kafka startTimeMs: 1609383457495
2020-12-31 11:57:37.502  INFO 82416 --- [  restartedMain] o.a.k.clients.consumer.KafkaConsumer     : [Consumer clientId=consumer-Test-Consumer-1, groupId=Test-Consumer] Subscribed to topic(s): test
2020-12-31 11:57:37.508  INFO 82416 --- [  restartedMain] o.s.s.c.ThreadPoolTaskScheduler          : Initializing ExecutorService
2020-12-31 11:57:37.528  INFO 82416 --- [  restartedMain] o.s.b.w.embedded.tomcat.TomcatWebServer  : Tomcat started on port(s): 8080 (http) with context path ''
2020-12-31 11:57:37.546  INFO 82416 --- [  restartedMain] i.m.k.p.KafkaProducerScramApplication    : Started KafkaProducerScramApplication in 2.325 seconds (JVM running for 3.263)
2020-12-31 11:57:37.833  INFO 82416 --- [ntainer#0-0-C-1] o.apache.kafka.common.network.Selector   : [Consumer clientId=consumer-Test-Consumer-1, groupId=Test-Consumer] Failed authentication with localhost/127.0.0.1 (Authentication failed during authentication due to invalid credentials with SASL mechanism SCRAM-SHA-512)
2020-12-31 11:57:37.836 ERROR 82416 --- [ntainer#0-0-C-1] org.apache.kafka.clients.NetworkClient   : [Consumer clientId=consumer-Test-Consumer-1, groupId=Test-Consumer] Connection to node -1 (localhost/127.0.0.1:9091) failed authentication due to: Authentication failed during authentication due to invalid credentials with SASL mechanism SCRAM-SHA-512
2020-12-31 11:57:37.837  WARN 82416 --- [ntainer#0-0-C-1] org.apache.kafka.clients.NetworkClient   : [Consumer clientId=consumer-Test-Consumer-1, groupId=Test-Consumer] Bootstrap broker localhost:9091 (id: -1 rack: null) disconnected
2020-12-31 11:57:37.842 ERROR 82416 --- [ntainer#0-0-C-1] essageListenerContainer$ListenerConsumer : Consumer exception

java.lang.IllegalStateException: This error handler cannot process 'org.apache.kafka.common.errors.SaslAuthenticationException's; no record information is available
    at org.springframework.kafka.listener.SeekUtils.seekOrRecover(SeekUtils.java:151) ~[spring-kafka-2.6.4.jar:2.6.4]
    at org.springframework.kafka.listener.SeekToCurrentErrorHandler.handle(SeekToCurrentErrorHandler.java:113) ~[spring-kafka-2.6.4.jar:2.6.4]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.handleConsumerException(KafkaMessageListenerContainer.java:1425) ~[spring-kafka-2.6.4.jar:2.6.4]
    at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:1122) ~[spring-kafka-2.6.4.jar:2.6.4]
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[na:na]
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[na:na]
    at java.base/java.lang.Thread.run(Thread.java:832) ~[na:na]
Caused by: org.apache.kafka.common.errors.SaslAuthenticationException: Authentication failed during authentication due to invalid credentials with SASL mechanism SCRAM-SHA-512
My docker-compose.yml

...
...

  zookeeper3:
    image: confluentinc/cp-zookeeper:6.0.1
    hostname: zookeeper3
    container_name: zookeeper3
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper3:2183
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://zookeeper:2183
      ZOOKEEPER_CLIENT_PORT: 2183
      ZOOKEEPER_TICK_TIME: 2000
      ZOOKEEPER_SERVER_ID: 3
      KAFKA_OPTS: "-Djava.security.auth.login.config=/etc/kafka/secrets/sasl/zookeeper_jaas.conf \ 
          -Dzookeeper.authProvider.1=org.apache.zookeeper.server.auth.SASLAuthenticationProvider \
          -Dzookeeper.authProvider.2=org.apache.zookeeper.server.auth.DigestAuthenticationProvider \
          -Dquorum.auth.enableSasl=true \
          -Dquorum.auth.learnerRequireSasl=true \
          -Dquorum.auth.serverRequireSasl=true \
          -Dquorum.auth.learner.saslLoginContext=QuorumLearner \
          -Dquorum.auth.server.saslLoginContext=QuorumServer \
          -Dquorum.cnxn.threads.size=20 \
          -DrequireClientAuthScheme=sasl"
    volumes: 
      - /etc/kafka/secrets/sasl:/etc/kafka/secrets/sasl

  broker1:
    image: confluentinc/cp-kafka:6.0.1
    hostname: broker1
    container_name: broker1
    depends_on:
      - zookeeper1
      - zookeeper2
      - zookeeper3
    ports:
      - "9091:9091"
      - "9101:9101"
      - "29091:29091"
    expose: 
      - "29090"
    environment:
      KAFKA_OPTS: "-Dzookeeper.sasl.client=true -Djava.security.auth.login.config=/etc/kafka/secrets/sasl/kafka_server_jaas.conf"
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: 'zookeeper1:2181,zookeeper2:2182,zookeeper3:2183'
      KAFKA_INTER_BROKER_LISTENER_NAME: INSIDE
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INSIDE:PLAINTEXT,OUTSIDE:PLAINTEXT,SASL_PLAINHOST:SASL_PLAINTEXT
      KAFKA_LISTENERS: INSIDE://:29090,OUTSIDE://:29091,SASL_PLAINHOST://:9091
      KAFKA_ADVERTISED_LISTENERS: INSIDE://broker1:29090,OUTSIDE://localhost:29091,SASL_PLAINHOST://localhost:9091
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_JMX_PORT: 9101
      KAFKA_JMX_HOSTNAME: localhost
      KAFKA_SECURITY_INTER_BROKER_PROTOCAL: SASL_PLAINTEXT
      KAFKA_SASL_ENABLED_MECHANISMS: SCRAM-SHA-512
      KAFKA_SASL_MECHANISM_INTER_BROKER_PROTOCOL: PLAINTEXT
    volumes: 
      - /etc/kafka/secrets/sasl:/etc/kafka/secrets/sasl
  broker2:
    image: confluentinc/cp-kafka:6.0.1
    hostname: broker2
    container_name: broker2
    depends_on:
      - zookeeper1
      - zookeeper2
      - zookeeper3
    ports:
      - "9092:9092"
      - "9102:9102"
      - "29092:29092"
    expose: 
      - "29090"
    environment:
      KAFKA_OPTS: "-Dzookeeper.sasl.client=true -Djava.security.auth.login.config=/etc/kafka/secrets/sasl/kafka_server_jaas.conf"
      KAFKA_BROKER_ID: 2
      KAFKA_ZOOKEEPER_CONNECT: 'zookeeper1:2181,zookeeper2:2182,zookeeper3:2183'
      KAFKA_INTER_BROKER_LISTENER_NAME: INSIDE
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INSIDE:PLAINTEXT,OUTSIDE:PLAINTEXT,SASL_PLAINHOST:SASL_PLAINTEXT
      KAFKA_LISTENERS: INSIDE://:29090,OUTSIDE://:29092,SASL_PLAINHOST://:9092
      KAFKA_ADVERTISED_LISTENERS: INSIDE://broker2:29090,OUTSIDE://localhost:29092,SASL_PLAINHOST://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_JMX_PORT: 9102
      KAFKA_JMX_HOSTNAME: localhost
      KAFKA_SECURITY_INTER_BROKER_PROTOCAL: SASL_PLAINTEXT
      KAFKA_SASL_ENABLED_MECHANISMS: SCRAM-SHA-512
      KAFKA_SASL_MECHANISM_INTER_BROKER_PROTOCOL: PLAINTEXT
    volumes: 
      - /etc/kafka/secrets/sasl:/etc/kafka/secrets/sasl
kafka_server_jaas.conf

KafkaServer {
    org.apache.kafka.common.security.scram.ScramLoginModule required
    username="admin"
    password="password"
    user_admin="password"
    user_client="password";
};

Client {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="admin"
    password="password";
};

KafkaClient {
    org.apache.kafka.common.security.scram.ScramLoginModule required
    username="client"
    password="password";
};
Server {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    user_admin="password";
};
QuorumServer {
       org.apache.zookeeper.server.auth.DigestLoginModule required
       user_admin="password";
};
 
QuorumLearner {
       org.apache.zookeeper.server.auth.DigestLoginModule required
       username="admin"
       password="password";
};
zookeeper_jaas.conf

KafkaServer {
    org.apache.kafka.common.security.scram.ScramLoginModule required
    username="admin"
    password="password"
    user_admin="password"
    user_client="password";
};

Client {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="admin"
    password="password";
};

KafkaClient {
    org.apache.kafka.common.security.scram.ScramLoginModule required
    username="client"
    password="password";
};
Server {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    user_admin="password";
};
QuorumServer {
       org.apache.zookeeper.server.auth.DigestLoginModule required
       user_admin="password";
};
 
QuorumLearner {
       org.apache.zookeeper.server.auth.DigestLoginModule required
       username="admin"
       password="password";
};
ConsumerConfig.java

private static final String BOOTSTRAP_ADDRESS = "localhost:9091,localhost:9092";
private static final String JAAS_TEMPLATE = "org.apache.kafka.common.security.scram.ScramLoginModule required username=\"%s\" password=\"%s\";";

public Map<String, Object> consumerConfigs() {
    Map<String, Object> props = new HashMap<>();
    
    String jaasCfg = String.format(JAAS_TEMPLATE, "client", "password");
    
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, BOOTSTRAP_ADDRESS);
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    props.put(ConsumerConfig.MAX_POLL_INTERVAL_MS_CONFIG, "1000");
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "Test-Consumer");
    props.put("sasl.jaas.config", jaasCfg);
    props.put("sasl.mechanism", "SCRAM-SHA-512");
    props.put("security.protocol", "SASL_PLAINTEXT");
    return props;
}

@Bean
public ConsumerFactory<String, String> consumerFactory() {
    return new DefaultKafkaConsumerFactory<>(consumerConfigs());
}

@Bean
public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, String> factory =
      new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    return factory;
}
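To rule out the Spring side entirely, it can help to test the same SCRAM credentials with the stock Kafka console consumer. A minimal sketch, assuming the same `client`/`password` credentials used in `ConsumerConfig.java` above:

```shell
# Write a standalone client.properties mirroring the Java consumer's
# security settings (assumption: same user/password as ConsumerConfig.java).
cat > client.properties <<'EOF'
security.protocol=SASL_PLAINTEXT
sasl.mechanism=SCRAM-SHA-512
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="client" password="password";
EOF

# Show what was generated
cat client.properties
```

With the brokers running, point the console consumer at this file: `kafka-console-consumer --bootstrap-server localhost:9091 --topic test --consumer.config client.properties`. If this also fails with the same `SaslAuthenticationException`, the problem is broker-side (missing SCRAM credentials), not in the Spring configuration.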
Solved

The cause: I had not added the user credentials to Zookeeper. Adding this service fixed it:

  zookeeper-add-kafka-users:
    image: confluentinc/cp-kafka:6.0.1
    container_name: "zookeeper-add-kafka-users"
    depends_on:
      - zookeeper1
      - zookeeper2
      - zookeeper3
    command: "bash -c 'echo Waiting for Zookeeper to be ready... && \
                          cub zk-ready zookeeper1:2181 120 && \
                          cub zk-ready zookeeper2:2182 120 && \
                          cub zk-ready zookeeper3:2183 120 && \
                          kafka-configs --zookeeper zookeeper1:2181 --alter --add-config 'SCRAM-SHA-512=[iterations=4096,password=password]' --entity-type users --entity-name admin && \
                          kafka-configs --zookeeper zookeeper1:2181 --alter --add-config 'SCRAM-SHA-512=[iterations=4096,password=password]' --entity-type users --entity-name client '"
    environment:
      KAFKA_BROKER_ID: ignored
      KAFKA_ZOOKEEPER_CONNECT: ignored
      KAFKA_OPTS: -Djava.security.auth.login.config=/etc/kafka/secrets/sasl/kafka_server_jaas.conf
    volumes:
      - /home/mysend/dev/docker/kafka/sasl:/etc/kafka/secrets/sasl

If you are not using Docker, you can run the equivalent command directly:

bin/kafka-configs --zookeeper localhost:2181 --alter --add-config 'SCRAM-SHA-256=[password=admin-secret],SCRAM-SHA-512=[password=admin-secret]' --entity-type users --entity-name admin
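After altering the config, you can confirm the credential was actually stored. This needs a running cluster, so it is shown here only as a reference sketch:

```shell
# List the SCRAM credentials registered for a user in Zookeeper.
# A user with stored credentials shows entries like
# SCRAM-SHA-512=salt=...,stored_key=...,server_key=...,iterations=4096
kafka-configs --zookeeper localhost:2181 --describe \
  --entity-type users --entity-name admin
```

If `--describe` returns no SCRAM entries for the user, the broker will reject that user's SCRAM handshake exactly as in the logs above.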