Java Spark Kafka consumer only receives messages after it is stopped


When I send messages to Kafka they are delivered, but my consumer only reads them after I stop it. Only then do all the events show up in the log and get forwarded to the Kafka producer.

public static void main(String[] args) throws InterruptedException {
        SparkConf sparkConf = new SparkConf();
        sparkConf.setAppName("kafakTest2");

        JavaStreamingContext streamingContext = new JavaStreamingContext(
                sparkConf, Durations.seconds(10));

        Map<String, Object> kafkaParams = new HashMap<String, Object>();
        kafkaParams.put("bootstrap.servers", "kafka.kafka:9092");
        kafkaParams.put("key.deserializer", StringDeserializer.class);
        kafkaParams.put("value.deserializer", StringDeserializer.class);
        kafkaParams.put("group.id", "spark_group1");
        kafkaParams.put("auto.offset.reset", "earliest");
        kafkaParams.put("enable.auto.commit", true);
        kafkaParams.put("partition.assignment.strategy", "range");

  

      Map<String, Integer> topics = new HashMap<String, Integer>();
            topics.put("dev_parcels_belt_scan", 1);
    
            Properties props = new Properties();
            props.put("bootstrap.servers", "kafka.kafka:9092");
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("partition.assignment.strategy", "range");
    
            JavaPairReceiverInputDStream<String, String> lines = KafkaUtils.createStream(streamingContext, "kafka-zookeeper.kafka:2181", "group1", topics);

lines.foreachRDD(rdd -> {
            rdd.foreachPartition(iterator -> {
                KafkaProducer producer = new KafkaProducer<String,String>(props);
                        while (iterator.hasNext()) {
                            Tuple2<String, String> next = iterator.next();
                            System.out.println(next._1() + " --> " + next._2());
                            ProducerRecord record = new ProducerRecord<String,String>("spark","key",next._2());
                            producer.send(record);
                        }
                    }


            );
            });


streamingContext.start()            
streamingContext.awaitTermination() 

 }
}
Only after I stop the Spark job do I see the event log lines, and only then are the events sent to the other topic:

test --> {"Id":1,"StateId":1635749,"barcode":"10099817543400111299","trackingNumber":null,"timeStamp":"2020-10-09T13:29:05Z"}
pom:


<dependencies>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>2.11.12</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>${spark.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.11</artifactId>
            <version>${spark.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
            <version>2.4.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>${spark.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector_2.11</artifactId>
            <version>${spark.version}</version>
        </dependency>
    </dependencies>
Hi, what is the value of the consumer poll timeout?
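For reference, in the newer spark-streaming-kafka-0-10 integration the poll timeout is bounded by a Spark setting rather than a Kafka consumer property; a sketch of where it would be set, assuming that integration were used (it does not apply to the 0-8 receiver API shown in the question):

    // Assumption: only meaningful with the spark-streaming-kafka-0-10 direct stream.
    // Caps how long each underlying KafkaConsumer.poll() may block (here 10 s).
    sparkConf.set("spark.streaming.kafka.consumer.poll.ms", "10000");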
test --> {"Id":1,"StateId":1635749,"barcode":"10099817543400111299","trackingNumber":null,"timeStamp":"2020-10-09T13:29:05Z"}
<dependencies>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>2.11.12</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>${spark.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.11</artifactId>
            <version>${spark.version}</version>
            <scope>provided</scope>
        </dependency>
       <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
            <version>2.4.3</version>
      </dependency>
      <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>${spark.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector_2.11</artifactId>
            <version>${spark.version}</version>
        </dependency>
     </dependencies>
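A mismatch worth flagging between the code and this pom: kafkaParams is filled with new-consumer settings (bootstrap.servers, deserializer classes, enable.auto.commit), but the receiver-based KafkaUtils.createStream from spark-streaming-kafka-0-8 never sees that map; it consumes through ZooKeeper with the group id passed in the call. A rough sketch of the equivalent consumer with the 0-10 direct stream, assuming the dependency were switched to spark-streaming-kafka-0-10_2.11 (topic and kafkaParams taken from the question; note the new consumer would reject the plain "range" value for partition.assignment.strategy):

    import java.util.Collections;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.spark.streaming.api.java.JavaInputDStream;
    import org.apache.spark.streaming.kafka010.ConsumerStrategies;
    import org.apache.spark.streaming.kafka010.KafkaUtils;
    import org.apache.spark.streaming.kafka010.LocationStrategies;

    // Direct stream: each executor polls the brokers listed in bootstrap.servers,
    // using the deserializers and group.id from kafkaParams.
    JavaInputDStream<ConsumerRecord<String, String>> stream =
            KafkaUtils.createDirectStream(
                    streamingContext,
                    LocationStrategies.PreferConsistent(),
                    ConsumerStrategies.<String, String>Subscribe(
                            Collections.singletonList("dev_parcels_belt_scan"), kafkaParams));

    // Same logging as the original job, record by record.
    stream.foreachRDD(rdd -> rdd.foreach(record ->
            System.out.println(record.key() + " --> " + record.value())));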