
Apache Kafka consumer by date


Is there a low-level or high-level consumer for Kafka 0.8.2 that can consume from a specific time period/date?
(kafka client, spark Kafka, …)

No, there isn't. You have to move to 0.10 to get that kind of functionality.

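For context, the 0.10-era functionality referred to here is the consumer's timestamp-to-offset lookup, KafkaConsumer#offsetsForTimes, which the answer below uses. A minimal sketch, assuming a 0.10+ client and an existing consumer; the topic name and timestamp are placeholders:

// Assumes a 0.10+ kafka-clients dependency and an existing
// KafkaConsumer<String, String> named 'consumer'; the topic "events",
// partition 0 and the timestamp are placeholder values.
long targetTs = Instant.parse("2016-01-01T00:00:00Z").toEpochMilli();
Map<TopicPartition, Long> query =
        Collections.singletonMap(new TopicPartition("events", 0), targetTs);
// Returns, per partition, the earliest offset whose record timestamp is
// >= targetTs, or null if no such record exists.
Map<TopicPartition, OffsetAndTimestamp> offsets = consumer.offsetsForTimes(query);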

Use OffsetAndTimestamp:

// Fragment from inside the poll loop. Assumes 'consumer' is a
// KafkaConsumer<String, String>, 'date' is the target timestamp in epoch
// milliseconds, and 'flag' is a boolean that is true only on the first
// iteration so the seek runs exactly once.
ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));

Set<TopicPartition> assignments = consumer.assignment();
Map<TopicPartition, Long> query = new HashMap<>();
for (TopicPartition topicPartition : assignments) {
    query.put(topicPartition, date);
}

// For each partition, find the earliest offset whose timestamp is >= 'date'.
Map<TopicPartition, OffsetAndTimestamp> result = consumer.offsetsForTimes(query);

// Seek each partition to that offset; partitions with no matching record
// (null value) fall back to offset 0.
result.forEach((partition, offsetAndTimestamp) -> consumer.seek(partition,
        Optional.ofNullable(offsetAndTimestamp).map(OffsetAndTimestamp::offset).orElse(0L)));

flag = false;

// On this first iteration the records were polled before the seek; later
// iterations of the loop poll from the newly sought positions.
for (ConsumerRecord<String, String> record : records)
    System.out.printf("offset = %d, key = %s, value = %s%n", record.offset(), record.key(), record.value());

Could you elaborate on which API call you mean?