Apache Flink: how do I iterate over each message in a Flink DataStream?
Tags: apache-flink, flink-streaming

I have a stream of messages coming from Kafka, like this:
DataStream<String> messageStream = env
.addSource(new FlinkKafkaConsumer09<>(topic, new MsgPackDeserializer(), props));
How can I iterate over each message in the stream and process it? I see an iterate() method on DataStream, but it does not return an Iterator.

Answer: iterate() is for building feedback loops in the dataflow, not for pulling elements out of the stream. I think you are looking for a MapFunction:
DataStream<String> messageStream = env.addSource(
new FlinkKafkaConsumer09<>(topic, new MsgPackDeserializer(), props));
DataStream<Y> mappedMessages = messageStream
    .map(new MapFunction<String, Y>() {
        @Override
        public Y map(String message) {
            // do something with each message and return a Y
        }
    });
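Because MapFunction is a plain single-method interface, the per-message logic can be unit-tested by calling map() directly, without a running Flink cluster. Below is a minimal sketch: the MapFunction interface is re-declared locally as a stand-in for Flink's org.apache.flink.api.common.functions.MapFunction (assumed to have the same single-method shape) so the snippet compiles without the Flink dependency, and MessageLengthMapper is a hypothetical example function.

```java
// Stand-in for org.apache.flink.api.common.functions.MapFunction
// (assumption: same single-method shape as the real Flink interface),
// so this sketch compiles without the Flink dependency.
interface MapFunction<T, O> {
    O map(T value) throws Exception;
}

// Hypothetical example: map each message to its length.
public class MessageLengthMapper implements MapFunction<String, Integer> {
    @Override
    public Integer map(String message) {
        return message.length();
    }

    public static void main(String[] args) throws Exception {
        // Exercise the function directly, exactly one output per input.
        MapFunction<String, Integer> fn = new MessageLengthMapper();
        System.out.println(fn.map("hello")); // 5
    }
}
```

Testing user functions this way (invoking them as ordinary objects) is the usual first step before wiring them into a pipeline.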
If you do not want to emit exactly one record per incoming message, have a look at the flatMap function:
DataStream<String> messageStream = env.addSource(
    new FlinkKafkaConsumer09<>(topic, new MsgPackDeserializer(), props));

DataStream<Y> flatMappedMessages = messageStream
    .flatMap(new FlatMapFunction<String, Y>() {
        @Override
        public void flatMap(String message, Collector<Y> out) {
            // do something with each message and emit zero, one,
            // or many Y records via out.collect(...)
        }
    });
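The flatMap logic can likewise be tested in isolation. This sketch re-declares minimal stand-ins for Flink's FlatMapFunction and Collector interfaces (assumed to have the same single-method shapes as the real API) so it compiles without the Flink dependency; WordSplitter is a hypothetical example that emits one record per token, demonstrating the zero-to-many output contract:

```java
import java.util.ArrayList;
import java.util.List;

// Stand-ins for org.apache.flink.util.Collector and
// org.apache.flink.api.common.functions.FlatMapFunction
// (assumption: same single-method shapes as the real Flink interfaces).
interface Collector<T> {
    void collect(T record);
}

interface FlatMapFunction<T, O> {
    void flatMap(T value, Collector<O> out) throws Exception;
}

// Hypothetical example: split each message into whitespace-separated words,
// emitting zero records for an empty message and many for a long one.
public class WordSplitter implements FlatMapFunction<String, String> {
    @Override
    public void flatMap(String message, Collector<String> out) {
        for (String word : message.split("\\s+")) {
            if (!word.isEmpty()) {
                out.collect(word);
            }
        }
    }

    public static void main(String[] args) throws Exception {
        // Collect the emitted records into a list via a method reference.
        List<String> results = new ArrayList<>();
        new WordSplitter().flatMap("hello flink world", results::add);
        System.out.println(results); // [hello, flink, world]
    }
}
```

Note how the Collector decouples the function from where its output goes, which is what lets one input produce any number of output records.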