Java: How do I define an Apache Flink table with a rowtime attribute?

I have JSON rows as my data and I want to create a table from them:
StreamTableEnvironment fsTableEnv = StreamTableEnvironment.create(streamExecutionEnvironment, fsSettings);
String allEventsTable = "allEventsTable";
fsTableEnv.connect(new Kafka()
.version("0.11")
.topic("events")
.property("bootstrap.servers", "localhost:9092")
.property("group.id", "dummyquery").startFromLatest())
.withSchema(new Schema()
.field("rule_id", Types.INT)
.field("sourceAddress", Types.STRING)
.field("deviceProduct", Types.STRING)
.field("destHost", Types.STRING)
.field("extra", Types.STRING)
.field("rowtime", Types.SQL_TIMESTAMP)
.rowtime(new Rowtime().timestampsFromField("rowtime").watermarksPeriodicBounded(2000))
)
.withFormat(new Json().failOnMissingField(false).deriveSchema())
.inAppendMode()
.registerTableSource(allEventsTable);
Table result = fsTableEnv.sqlQuery("select * from allEventsTable where sourceAddress='12345431'");
DataStream<Row> alert = fsTableEnv.toAppendStream(result, Row.class);
alert.print();
However, when running the job I get the following error:
Exception in thread "main" org.apache.flink.table.api.ValidationException: Field 'rowtime' could not be resolved by the field mapping.
at org.apache.flink.table.sources.TableSourceValidation.resolveField(TableSourceValidation.java:245)
at org.apache.flink.table.sources.TableSourceValidation.lambda$validateTimestampExtractorArguments$6(TableSourceValidation.java:202)
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:545)
at java.util.stream.AbstractPipeline.evaluateToArrayNode(AbstractPipeline.java:260)
at java.util.stream.ReferencePipeline.toArray(ReferencePipeline.java:438)
Also, I am using Flink 1.9.

The JSON data I am feeding into the Kafka topic `events` looks like this:
{"rule_id":"", "rowtime":"2020-07-23 13:10:13","sourceAddress":"12345433","deviceProduct":"234r5t", "destHost":"876543", "extra":"dummy"}
I'm afraid this is a bug; I created a ticket to track it. You should be able to work around it by changing one of the field names in the rowtime definition. Either change the name of the logical field:
.field("timeAttribute", Types.SQL_TIMESTAMP)
.rowtime(new Rowtime().timestampsFromField("rowtime").watermarksPeriodicBounded(2000))
or the name of the physical field it originates from:
.field("rowtime", Types.SQL_TIMESTAMP)
.rowtime(new Rowtime().timestampsFromField("timestamp").watermarksPeriodicBounded(2000))
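Putting the first workaround into the full descriptor from the question, the schema might look like this. This is a sketch against the Flink 1.9 descriptor API used above; only the logical field name `timeAttribute` is new, everything else is taken from the question:

```java
// Workaround sketch: the logical rowtime attribute is renamed to
// "timeAttribute", while timestampsFromField still references the
// physical "rowtime" field present in the JSON payload.
fsTableEnv.connect(new Kafka()
        .version("0.11")
        .topic("events")
        .property("bootstrap.servers", "localhost:9092")
        .property("group.id", "dummyquery")
        .startFromLatest())
    .withSchema(new Schema()
        .field("rule_id", Types.INT)
        .field("sourceAddress", Types.STRING)
        .field("deviceProduct", Types.STRING)
        .field("destHost", Types.STRING)
        .field("extra", Types.STRING)
        // logical name differs from the physical JSON field...
        .field("timeAttribute", Types.SQL_TIMESTAMP)
        // ...which is referenced here by its original name
        .rowtime(new Rowtime()
            .timestampsFromField("rowtime")
            .watermarksPeriodicBounded(2000)))
    .withFormat(new Json().failOnMissingField(false).deriveSchema())
    .inAppendMode()
    .registerTableSource(allEventsTable);
```

Queries then refer to the logical name, e.g. `select timeAttribute, sourceAddress from allEventsTable`. The second workaround is the mirror image: keep the logical name `rowtime` and rename the field in the JSON payload (e.g. to `timestamp`) so the two names no longer collide.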