Apache Flink: how do I get from a Java ObjectNode back to a JSON string?

Tags: apache-flink, flink-streaming, flink-cep

This takes a JSON string -> Java ObjectNode:
final DataStream<ObjectNode> inputStream = env
.addSource(new RMQSource<ObjectNode>(
connectionConfig, // config for the RabbitMQ connection
"start", // name of the RabbitMQ queue to consume
true, // use correlation ids; can be false if only at-least-once is required
new JSONDeserializationSchema())) // deserialization schema to turn messages into Java objects
.setParallelism(1); // non-parallel source is only required for exactly-once
How do I turn them back from a Java ObjectNode -> JSON string?
stream.addSink(new RMQSink<ObjectNode>(
connectionConfig,
"stop",
new JSONSerializationSchema()
));
JSONSerializationSchema does not exist, but I need something like it.

Use a custom SerializationSchema, like this:
stream.addSink(new RMQSink<ObjectNode>(
connectionConfig,
"stop",
new SerializationSchema<ObjectNode>() {
@Override
public byte[] serialize( ObjectNode element ) {
return element.toString().getBytes();
}
}
));
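One caveat with the custom schema above (a side note, not part of the original answer): `element.toString().getBytes()` uses the platform default charset, so it is safer to pass `StandardCharsets.UTF_8` explicitly, since JSON on the wire is conventionally UTF-8. A minimal round-trip sketch of that serialize/deserialize pair outside of Flink (class and method names are illustrative, not from any Flink API):

```java
import java.nio.charset.StandardCharsets;

public class JsonBytesRoundTrip {
    // Mirrors the custom SerializationSchema body: JSON text -> bytes,
    // but with an explicit charset instead of the platform default.
    static byte[] serialize(String jsonText) {
        return jsonText.getBytes(StandardCharsets.UTF_8);
    }

    // Mirrors the first step of a JSON deserialization schema: bytes -> text.
    static String deserialize(byte[] message) {
        return new String(message, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        String json = "{\"name\":\"flink\",\"count\":1}";
        byte[] bytes = serialize(json);
        // Lossless because both sides agree on UTF-8.
        System.out.println(deserialize(bytes).equals(json)); // prints true
    }
}
```

Inside the sink you would apply the same change by replacing `element.toString().getBytes()` with `element.toString().getBytes(StandardCharsets.UTF_8)`.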