Apache Kafka: how to get the Kafka headers in Kafka Streams
I have the following Kafka Streams code:
public class KafkaStreamHandler implements Processor<String, String>{
private ProcessorContext context;
@Override
public void init(ProcessorContext context) {
// TODO Auto-generated method stub
this.context = context;
}
public KeyValue<String, KafkaStatusRecordWrapper> process(String key, String value) {
Headers contexts = context.headers();
contexts.forEach(header -> System.out.println(header));
return null; // TODO: wrap the value based on the headers
}
public void StartFailstreamHandler() {
StreamsBuilder builder = new StreamsBuilder();
KStream<String, String> userStream = builder.stream("usertopic",Consumed.with(Serdes.String(), Serdes.String()));
Properties props = new Properties();
props.put(StreamsConfig.APPLICATION_ID_CONFIG, "failed-streams-userstream");
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "ALL my bootstrap servers");
props.put(StreamsConfig.NUM_STREAM_THREADS_CONFIG, 4);
props.put("enable.auto.commit", "true");
props.put("auto.commit.interval.ms", "500");
props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "latest");
//consumer_timeout_ms
props.put(ConsumerConfig.MAX_POLL_INTERVAL_MS_CONFIG, 2000);
props.put("state.dir", "/tmp/kafka/stat");
userStream.peek((key,value)->System.out.println("key :"+key+" value :"+value));
/* take a few decisions based on the header */
/* how to get the header? */
userStream.map(this::process);
KafkaStreams kafkaStreams = new KafkaStreams(builder.build(), props);
kafkaStreams.setUncaughtExceptionHandler(new Thread.UncaughtExceptionHandler() {
@Override
public void uncaughtException(Thread t, Throwable e) {
logger.error("Thread Name :" + t.getName() + " Error while processing:", e);
}
});
kafkaStreams.cleanUp();
kafkaStreams.start();
}
}
Now one of our clients is sending version information in the Kafka headers, like this:
ProducerRecord<Integer, String> record = new ProducerRecord<Integer, String>("topic", 1, "message");
record.headers().add(new RecordHeader("version", "v1".getBytes()));
producer.send(record);
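The header only carries raw bytes, so the consuming side has to map it back to a parser. The dispatch step itself needs no Kafka types; here is a minimal, Kafka-free sketch of it (the class, method, and parser names are illustrative assumptions, not from the original code):

```java
import java.nio.charset.StandardCharsets;
import java.util.Map;
import java.util.function.Function;

public class VersionDispatch {
    // hypothetical parsers keyed by the value of the "version" header
    static final Map<String, Function<String, String>> PARSERS = Map.of(
            "v1", msg -> "parsed-v1:" + msg,
            "v2", msg -> "parsed-v2:" + msg);

    static String parse(byte[] versionHeader, String message) {
        // fall back to "v1" when the header is absent (an assumed default)
        String version = versionHeader == null
                ? "v1"
                : new String(versionHeader, StandardCharsets.UTF_8);
        // unknown versions pass the message through unchanged
        return PARSERS.getOrDefault(version, msg -> msg).apply(message);
    }

    public static void main(String[] args) {
        System.out.println(parse("v1".getBytes(StandardCharsets.UTF_8), "hello"));
        // → parsed-v1:hello
    }
}
```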
Based on this header I need to select a parser for my message. How can I read this header using KStream operators?
I have looked through all of the Streams APIs, but no method exposes the headers.
I cannot change to a plain Kafka consumer, because my application already depends on several KStream APIs. Processor does not allow you to chain new operators further down in the DSL; you should use transformValues instead, so that you can keep using the Streams DSL (code below).
Alternatively, we can get the headers from the record context:
userStream.to((key, value, recordContext) -> {
recordContext.headers(); // headers are available on the record context
return destinationTopic;
});
Does this answer your question? I am using version 2.1.0; this may be a duplicate, but the suggested approach does not work. Have you tried getting the headers with ProcessorContext#headers()? Can you share the code you are trying? I have added the code I tried. Are you missing the code that invokes the processor, i.e. userStream.process(KafkaStreamHandler::new)?
userStream
.transformValues(ExtractHeaderThenDoSomethingTransformer::new)
.map(this::process);
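The transformer name above comes from the answer, but its body is never shown. The following is only a sketch of what it could look like, assuming kafka-streams 2.0+ (where ProcessorContext#headers() was added) is on the classpath; the version-tagging logic is an illustrative assumption:

```java
import org.apache.kafka.common.header.Header;
import org.apache.kafka.streams.kstream.ValueTransformer;
import org.apache.kafka.streams.processor.ProcessorContext;

public class ExtractHeaderThenDoSomethingTransformer
        implements ValueTransformer<String, String> {

    private ProcessorContext context;

    @Override
    public void init(ProcessorContext context) {
        // transformValues wires the context in, unlike a bare map()
        this.context = context;
    }

    @Override
    public String transform(String value) {
        // read the "version" header set by the producer
        Header version = context.headers().lastHeader("version");
        if (version != null) {
            // choose a parser based on the version; here we just tag the value
            return new String(version.value()) + ":" + value;
        }
        return value;
    }

    @Override
    public void close() { }
}
```

Because ValueTransformerSupplier#get() returns a ValueTransformer, the constructor reference ExtractHeaderThenDoSomethingTransformer::new fits directly as the supplier argument of transformValues.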