Java Kafka Streams not processing messages with multiple instances


When I run multiple instances of my Kafka Streams application, only the first instance receives messages correctly. If I start new instances, they do not receive any messages.

Are there any suggestions for solving this problem?

This is my Kafka Streams application:

package test.kafkastream;

import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.processor.TopologyBuilder;

public class Main {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "streams-wordcount-processor");
        //props.put(ConsumerConfig.GROUP_ID_CONFIG, "streams-wordcount-processor");

        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "192.168.2.38:45983,192.168.2.112:45635,192.168.2.116:39571");
        //props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        props.put(StreamsConfig.KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        //props.put(StreamsConfig.TIMESTAMP_EXTRACTOR_CLASS_CONFIG, MyEventTimeExtractor.class);


        // setting offset reset to earliest so that we can re-run the demo code
        // with the same pre-loaded data
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        TopologyBuilder builder = new TopologyBuilder();

        builder.addSource("Source", "topic6");

        builder.addProcessor("Process", new ProcessMessage(), "Source");

        KafkaStreams streams = new KafkaStreams(builder, props);
        streams.start();
    }

}
This is my producer:

package test.kafkamesos;

import java.util.Date;
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ExecutionException;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.ByteArraySerializer;

public class Producer {

    public static void main(String[] args) throws InterruptedException, ExecutionException {
        Map<String, Object> producerConfig = new HashMap<String, Object>();
        producerConfig.put("bootstrap.servers", "192.168.2.38:45983,192.168.2.112:45635,192.168.2.116:39571");
        //producerConfig.put("bootstrap.servers", "localhost:9092");

        // optional:
        producerConfig.put("metadata.fetch.timeout.ms", "3000");
        producerConfig.put("request.timeout.ms", "3000");
        // ... other options:
        // http://kafka.apache.org/documentation.html#producerconfigs
        ByteArraySerializer serializer = new ByteArraySerializer();
        KafkaProducer<byte[], byte[]> kafkaProducer = new KafkaProducer<byte[], byte[]>(producerConfig, serializer,
                serializer);

        int i = 0;
        while (true) {
            String message = "{data:success,g:" + i + "}";
            ProducerRecord<byte[], byte[]> record = new ProducerRecord<byte[], byte[]>("topic6", message.getBytes());
            kafkaProducer.send(record).get();
            System.out.println("sending " + message);
            Thread.sleep(1000);
            i++;
        }
    }
}

I believe you are running into this problem because the Kafka broker is configured with only one partition for the topic you are consuming (topic6). From the Confluent blog:

For example, if your application reads from a single topic that has 10 partitions, then you can run up to 10 instances of your application (note that you can run further instances but these will be idle). In summary, the number of topic partitions is the upper limit for the parallelism of your Streams API application and thus for the number of running instances of your application.
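The consequence for this question: if topic6 was created with the broker default of a single partition, only one instance is ever assigned work and every additional instance idles. As a simplified sketch of that rule (not Kafka's actual partition assignor, just the upper-bound arithmetic from the quote above):

```java
public class ParallelismSketch {

    // With P partitions and N instances sharing one application.id,
    // at most min(P, N) instances are assigned partitions; the rest idle.
    static int activeInstances(int partitions, int instances) {
        return Math.min(partitions, instances);
    }

    public static void main(String[] args) {
        // topic6 with 1 partition and 2 instances: only 1 instance gets messages
        System.out.println(activeInstances(1, 2));  // prints 1
        // a 10-partition topic keeps up to 10 instances busy
        System.out.println(activeInstances(10, 3)); // prints 3
    }
}
```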


Source:
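So the fix is to give topic6 more partitions than one. A sketch with the stock Kafka CLI (host/port are placeholders; very old releases of this era take --zookeeper <host:2181> instead of --bootstrap-server):

```shell
# Inspect how many partitions topic6 currently has
kafka-topics.sh --bootstrap-server localhost:9092 --describe --topic topic6

# Increase the partition count so several Streams instances can share the work
kafka-topics.sh --bootstrap-server localhost:9092 --alter --topic topic6 --partitions 4
```

Note that increasing partitions changes the key-to-partition mapping for keyed data; for this producer, which sends records with a null key, that is not a concern.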

I can only assume this is because you added a link to the code instead of the code itself. If that is the reason, though, it should be stated. @AleksandarStojadinovic Thank you. I will add the code now....
This is my Dockerfile:
FROM openjdk:8-jre
COPY ./target/*-with-dependencies.jar /jars/service-jar.jar
CMD java -cp /jars/service-jar.jar test.kafkastream.Main
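Once topic6 has more than one partition, additional containers built from this image should each receive a share of the partitions, because every instance uses the same application.id ("streams-wordcount-processor"). A sketch (the image tag is an assumption):

```shell
# Build the image from the Dockerfile above
docker build -t kafka-streams-app .

# Each container is one Streams instance in the same consumer group,
# so Kafka assigns each a subset of topic6's partitions
docker run -d kafka-streams-app
docker run -d kafka-streams-app
```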