
Java: Not sure how to activate the Kafka consumer consume() method


EDIT: The root cause was an @ConditionalOnProperty that referred to a property path that does not exist. The Consumer.java class now has this, and it is working:

@ConditionalOnProperty(value = "aaronshaver.kafka.consumer-enabled", havingValue = "true")
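
For that condition to hold, a matching entry has to exist in the application's configuration. A minimal sketch, assuming the property is set in application.properties (the key is taken from the annotation above):

    # application.properties
    aaronshaver.kafka.consumer-enabled=true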


I have a simple Kafka setup with a producer that is definitely producing. I confirmed that with the following code:

SendMessageTask.java

        ListenableFuture<SendResult<String, String>> listenableFuture = this.producer.sendMessage("INPUT_DATA", "IN_KEY", LocalDate.now().toString());

        SendResult<String, String> result = listenableFuture.get();
        logger.info(String.format("\nProduced:\ntopic: %s\noffset: %d\npartition: %d\nvalue size: %d\n", result.getRecordMetadata().topic(), result.getRecordMetadata().offset(), result.getRecordMetadata().partition(), result.getRecordMetadata().serializedValueSize()));
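
The Producer.sendMessage helper referenced above is not shown in the question; a minimal sketch of what it might look like, assuming it simply wraps a KafkaTemplate<String, String> bean (everything here beyond the class and method names used above is an assumption):

// imports: org.springframework.kafka.core.KafkaTemplate, org.springframework.kafka.support.SendResult,
//          org.springframework.util.concurrent.ListenableFuture
@Service
public class Producer {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public Producer(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // Delegates to KafkaTemplate; in Spring Kafka 2.x this returns a ListenableFuture,
    // which matches the listenableFuture.get() call in SendMessageTask above.
    public ListenableFuture<SendResult<String, String>> sendMessage(String topic, String key, String value) {
        return kafkaTemplate.send(topic, key, value);
    }
}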
Consumer.java

@Service
@ConditionalOnProperty(value = "example.kafka.consumer-enabled", havingValue = "true")
public class Consumer {

    private final Logger logger = LoggerFactory.getLogger(Producer.class);

    @KafkaListener(topics = "INPUT_DATA", groupId = "fooGroup")
    public void listenGroupFoo(String message) {
        System.out.println("Received Message in group fooGroup: " + message);
    }

    @KafkaListener(topics = {"INPUT_DATA"})
    public void consume(
            final @Payload String message,
            final @Header(KafkaHeaders.OFFSET) Integer offset,
            final @Header(KafkaHeaders.RECEIVED_MESSAGE_KEY) String key,
            final @Header(KafkaHeaders.RECEIVED_PARTITION_ID) int partition,
            final @Header(KafkaHeaders.RECEIVED_TOPIC) String topic,
            final @Header(KafkaHeaders.RECEIVED_TIMESTAMP) long ts,
            final Acknowledgment acknowledgment
            ) {
        logger.info(String.format("#### -> Consumed message -> TIMESTAMP: %d\n%s\noffset: %d\nkey: %s\npartition: %d\ntopic: %s", ts, message, offset, key, partition, topic));
        acknowledgment.acknowledge();
    }
}
I tried this configuration file:

KafkaConsumerConfig.java

@EnableKafka
@Configuration
public class KafkaConsumerConfig {

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(
                ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG,
                "localhost:9092");
        props.put(
                ConsumerConfig.GROUP_ID_CONFIG,
                "fooGroup");
        props.put(
                ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
                StringDeserializer.class);
        props.put(
                ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String>
    kafkaListenerContainerFactory() {

        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}
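
Nothing in the question shows whether the listener was actually registered. One way to check (an addition of mine, not part of the original post) is to log the registered listener containers once the application is up, assuming access to the KafkaListenerEndpointRegistry bean that @EnableKafka provides and an SLF4J logger like the ones in the classes above:

@Autowired
private KafkaListenerEndpointRegistry registry;

@EventListener(ApplicationReadyEvent.class)
public void logListenerContainers() {
    // If the Consumer bean was skipped by @ConditionalOnProperty, no containers are listed here.
    registry.getListenerContainers().forEach(container ->
            logger.info("Kafka listener container {} running={}", container, container.isRunning()));
}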
Any ideas? Thanks!

Your problem is here:

@ConditionalOnProperty(value = "example.kafka.consumer-enabled", havingValue = "true")
and there is no such configuration property.

When I removed this condition, it started consuming records from the topic.

Please fix either the configuration or the logic in your application.
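
Alternatively, if the consumer should be enabled even when the property is absent, the condition can be told to match on a missing property (this variant is my suggestion, not something the answer proposes):

// matchIfMissing = true lets the bean load when the property is not defined at all
@ConditionalOnProperty(value = "example.kafka.consumer-enabled", havingValue = "true", matchIfMissing = true)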

Nothing jumps out. Maybe you could share a simple project with us to play with and reproduce… Thanks, happy to. What's the best way? I could link to a GitHub project, I suppose. Great! Let me check it out locally and take a look inside! Thank you very much. That was it. It is trying to consume now. Now I have to figure out why the acknowledgment isn't working.
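
On that last point about acknowledgment: consume() declares an Acknowledgment parameter, but the KafkaConsumerConfig shown above leaves the container in its default ack mode, so that parameter will not be populated. A minimal sketch of the change (my assumption about what is missing, written for Spring Kafka 2.3+):

@Bean
public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, String> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    // Manual ack mode is required for the Acknowledgment argument to be injected;
    // acknowledge() then commits the offset. Spring Kafka defaults enable.auto.commit
    // to false when it is not set explicitly, which is what manual acks need.
    factory.getContainerProperties().setAckMode(ContainerProperties.AckMode.MANUAL);
    return factory;
}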