Scala: error when using a Table (DataStream) in Flink
Tags: scala, csv, sbt, apache-flink, flink-cep

I am using Flink 1.9.0 and cannot import or resolve the Table API classes. I have tried importing different SBT dependencies related to it:
import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api.Types
import org.apache.flink.table.api.scala._
import org.apache.flink.table.sources.CsvTableSource

def main(args: Array[String]): Unit = {
  val env = StreamExecutionEnvironment.getExecutionEnvironment
  val tEnv = StreamTableEnvironment.create(env)

  val tempSource = CsvTableSource.builder()
    .path("/home/amulya/Desktop/csvForBroadcast/CSV.csv")
    .fieldDelimiter(",")
    .field("locationID", Types.STRING())
    .field("temp", Types.DOUBLE())
    .build()

  tEnv.registerTableSource("Temperatures", tempSource)

  val askTable = tEnv
    .scan("Temperatures")
    .where("temp >= 50")          // the column is named temp, not Temperature
    .select("locationID, temp")   // string expressions use plain field names

  tEnv.toAppendStream[Events](askTable).print()

  env.execute()
}

// temp must be Double to match Types.DOUBLE() in the source definition
case class Events(locationID: String, temp: Double)
}
I have some simple data in CSV format:
locationID,temp
"1",25
"2",25
"3",35
"4",45
"5",55
This is the error:
Error:scalac: missing or invalid dependency detected while loading class file 'ScalaCaseClassSerializer.class'.
Could not access type SelfResolvingTypeSerializer in object org.apache.flink.api.common.typeutils.TypeSerializerConfigSnapshot,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'ScalaCaseClassSerializer.class' was compiled against an incompatible version of org.apache.flink.api.common.typeutils.TypeSerializerConfigSnapshot.
I am trying to run CEP on this basic data in order to get started with Apache Flink. Any help would be greatly appreciated.
Answer: At the moment the Scala bridge module only provides the basic conversions between Tables and DataSets/DataStreams.

Comment (asker): Thank you very much, I will try it and let you know whether it works for me. It works, but a further problem appears when using a filter:

cannot resolve overloaded method 'filter'

Comment: It would be better to add the code to the description; I don't see any filter in the picture. Once you have done that, try removing the quote at the end of the expression in the filter.
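The `ScalaCaseClassSerializer` error in the question usually means the build mixes incompatible Flink artifact versions. A minimal `build.sbt` sketch for Flink 1.9 with the old planner might look like the following; the Scala version and the exact dependency set are assumptions about the asker's setup, not taken from the question:

```scala
// build.sbt — a sketch, assuming Scala 2.12 and the pre-Blink planner.
// Keeping every Flink artifact on the same version avoids the
// "compiled against an incompatible version" classpath error.
ThisBuild / scalaVersion := "2.12.8"

val flinkVersion = "1.9.0"

libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-streaming-scala"         % flinkVersion,
  "org.apache.flink" %% "flink-table-api-scala-bridge"  % flinkVersion,
  "org.apache.flink" %% "flink-table-planner"           % flinkVersion
)
```

`flink-table-api-scala-bridge` is the "Scala bridge module" the answer refers to; it supplies `StreamTableEnvironment` and the Table-to-DataStream conversions used in the question's code.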
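Independent of Flink, the filtering that the table query expresses can be sketched with plain Scala collections; the `Events` case class, field names, and the `>= 50` threshold mirror the question's code (with `temp` as `Double`, matching the `Types.DOUBLE()` column):

```scala
// Plain-Scala sketch of the query's filter/select, independent of Flink.
case class Events(locationID: String, temp: Double)

object FilterSketch {
  // The same five rows as the question's CSV file.
  val rows: List[Events] = List(
    Events("1", 25), Events("2", 25), Events("3", 35),
    Events("4", 45), Events("5", 55)
  )

  // Equivalent of .where("temp >= 50") on the table.
  def hot: List[Events] = rows.filter(_.temp >= 50)

  def main(args: Array[String]): Unit =
    hot.foreach(println) // only the row with temp 55.0 survives the filter
}
```

In the Scala expression DSL the same predicate would be written with a symbol, `.filter('temp >= 50)`, while string expressions use the bare field name; mixing the two styles is a common source of the "cannot resolve overloaded method" error mentioned above.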