
Spring Boot / Spring Kafka: how to get the timestamp (event time) at which a message was produced


I need to get the timestamp (event time) at which a message was produced, in a Kafka consumer application. I know a TimestampExtractor can be used with Kafka Streams, but my requirement is different: I am not using Streams to consume the messages.

My Kafka producer is as follows:

@Override
public void run(ApplicationArguments args) throws Exception {

    List<String> names = Arrays.asList("priya", "dyser", "Ray", "Mark", "Oman", "Larry");
    List<String> pages = Arrays.asList("blog", "facebook", "instagram", "news", "youtube", "about");
    Runnable runnable = () -> {
        String rPage = pages.get(new Random().nextInt(pages.size()));
        String rName = names.get(new Random().nextInt(names.size()));
        PageViewEvent pageViewEvent = new PageViewEvent(rName, rPage, Math.random() > .5 ? 10 : 1000);

        Message<PageViewEvent> message = MessageBuilder
                .withPayload(pageViewEvent)
                .setHeader(KafkaHeaders.MESSAGE_KEY, pageViewEvent.getUserId().getBytes())
                .build();

        try {
            this.pageViewsOut.send(message);
            log.info("sent " + message);
        } catch (Exception e) {
            log.error(e);
        }
    };
Container factory configuration:

    @Bean
    public ConsumerFactory<String, PageViewEvent> priceEventConsumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "json");
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), new JsonDeserializer<>(PageViewEvent.class));
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, PageViewEvent> priceEventsKafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, PageViewEvent> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(priceEventConsumerFactory());
        return factory;
    }
When I print the message sent by the producer, it gives me the following data:

[payload=PageViewEvent(userId=blog, page=about, duration=10), headers={id=8ebdad85-e2f7-958f-500e-4560ac0970e5, kafka_messageKey=[B@71975e1a, contentType=application/json, timestamp=1553041963803}]
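As a side note, the timestamp header above is epoch milliseconds; a minimal sketch (plain JDK, hypothetical helper name) of decoding it into a readable instant:

```java
import java.time.Instant;

public class TimestampDecode {

    // Converts an epoch-millis Kafka timestamp into an ISO-8601 UTC string.
    public static String toIso(long epochMillis) {
        return Instant.ofEpochMilli(epochMillis).toString();
    }

    public static void main(String[] args) {
        // The value printed in the message headers above.
        System.out.println(toIso(1553041963803L));
    }
}
```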


This does include a produce-time timestamp. How do I get the timestamp at which the message was produced, using Spring Kafka?

RECEIVED_TIMESTAMP means it is the timestamp from the record that was received, not the time at which it was received. We avoid putting it into TIMESTAMP to prevent it from being accidentally propagated to an outbound message.
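In a @KafkaListener that header can be injected directly. A minimal sketch (the topic name "pageviews" is an assumption; the containerFactory refers to the bean defined in the question):

```java
@KafkaListener(topics = "pageviews", containerFactory = "priceEventsKafkaListenerContainerFactory")
public void listen(PageViewEvent event,
        @Header(KafkaHeaders.RECEIVED_TIMESTAMP) long timestamp,
        @Header(KafkaHeaders.TIMESTAMP_TYPE) String timestampType) {
    // When the timestamp type is CreateTime (the broker default), this is
    // the event time at which the producer created the record.
    log.info("received {} with timestamp {} ({})", event, timestamp, timestampType);
}
```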

You can use something like below:

final Producer<String, String> producer = new KafkaProducer<String, String>(properties);
long time = System.currentTimeMillis();
// One count per record sent in the loop below (10 records).
final CountDownLatch countDownLatch = new CountDownLatch(10);
int count = 0;
try {
    for (long index = time; index < time + 10; index++) {
        count++;
        // The first five records share one key, the rest another.
        String key = count <= 5 ? "id_1" : "id_2";
        final ProducerRecord<String, String> record =
                new ProducerRecord<>(TOPIC, key, "B2B Sample Message: " + count);
        producer.send(record, (metadata, exception) -> {
            long elapsedTime = System.currentTimeMillis() - time;
            if (metadata != null) {
                System.out.printf("sent record(key=%s value=%s) " +
                                "meta(partition=%d, offset=%d) time=%d timestamp=%d%n",
                        record.key(), record.value(), metadata.partition(),
                        metadata.offset(), elapsedTime, metadata.timestamp());
                System.out.println("Timestamp:: " + metadata.timestamp());
            } else {
                exception.printStackTrace();
            }
            countDownLatch.countDown();
        });
    }
    try {
        countDownLatch.await(25, TimeUnit.SECONDS);
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
} finally {
    producer.flush();
    producer.close();
}
Does
headers.get(KafkaHeaders.RECEIVED_TIMESTAMP)
in a
@KafkaListener
not work for you? @Gary Russell it works... but is it the timestamp at which the message was produced? As I understand it, it is a timestamp from when the consumer received it. Let me know if my understanding is wrong. But KafkaHeaders.timestampType is CreateTime, so does that mean RECEIVED_TIMESTAMP is the time at which the message was produced?? Thanks @Gary Russell. So it all depends on the timestamp type... which determines what value is populated. The timestamp of what?
ConsumerRecord.timestamp()
- it can be set by the producer or the broker. There is also a timestampType field which describes the timestamp:
NO_TIMESTAMP_TYPE(-1, "NoTimestampType"), CREATE_TIME(0, "CreateTime"), LOG_APPEND_TIME(1, "LogAppendTime");
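The comment's conclusion, that what the timestamp means depends on the timestamp type, can be sketched as a small helper (plain JDK, hypothetical class name; the type names are the string forms quoted above):

```java
import java.time.Instant;

public class KafkaTimestampInterpreter {

    // Interprets a record timestamp according to its TimestampType name:
    //   "CreateTime"    -> set by the producer when the record was created (event time).
    //   "LogAppendTime" -> set by the broker when the record was appended to the log.
    //   anything else   -> no meaningful timestamp.
    public static String describe(long timestampMillis, String timestampType) {
        switch (timestampType) {
            case "CreateTime":
                return "produced at " + Instant.ofEpochMilli(timestampMillis);
            case "LogAppendTime":
                return "appended to the broker log at " + Instant.ofEpochMilli(timestampMillis);
            default:
                return "no meaningful timestamp";
        }
    }

    public static void main(String[] args) {
        // The timestamp value from the printed message above.
        System.out.println(describe(1553041963803L, "CreateTime"));
    }
}
```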