No Encoder found for nested Java class

Tags: java, scala, apache-spark, apache-spark-sql, spark-structured-streaming

I've created a Scala case class as follows:

case class MyObjectWithEventTime(value: MyObject, eventTime: Timestamp)
MyObject is a Java class.

I'm trying to use it in a Spark Structured Streaming job, like this:

implicit val myObjectEncoder: Encoder[MyObject] = Encoders.bean(classOf[MyObject])

val withEventTime = mystream
 .select(from_json(col("value").cast("string"), schema).alias("value"))
 .withColumn("eventTime", to_timestamp(col("value.timeArrived")))
 .as[MyObjectWithEventTime]
 .groupByKey(row => {... some code here
 })
 .mapGroupsWithState(GroupStateTimeout.ProcessingTimeTimeout())(updateAcrossEvents)
 .filter(col("id").isNotNull)
 .toJSON
 .writeStream
 .format("kafka")
 .option("checkpointLocation", "/tmp")
 .option("kafka.bootstrap.servers", "localhost:9092")
 .option("topic", conf.KafkaProperties.outputTopic)
 .outputMode("update")
 .start()
 .awaitTermination()
But I keep getting this error:

Exception in thread "main" java.lang.UnsupportedOperationException: No Encoder found for com.xxx.MyObject
- field (class: "com.xxx.MyObject", name: "value")
- root class: "com.xxx.MyObjectWithEventTime"

Try defining an encoder for MyObjectWithEventTime as well, using the Encoders.javaSerialization[T] method:

implicit val myObjectEncoder: Encoder[MyObject] = Encoders.javaSerialization[MyObject]
implicit val myObjectWithEventEncoder: Encoder[MyObjectWithEventTime] = Encoders.javaSerialization[MyObjectWithEventTime]

Keep in mind that your Java class MyObject should implement Serializable and have public getters and setters for all fields.
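You can check that requirement outside Spark entirely: Encoders.javaSerialization relies on the class surviving a plain Java-serialization round trip. Below is a minimal standalone sketch of that check; the MyObject class here is a hypothetical stand-in for the real Java bean (with a no-arg constructor and getters/setters via @BeanProperty), not the asker's actual class.

```scala
import java.io.{ByteArrayInputStream, ByteArrayOutputStream, ObjectInputStream, ObjectOutputStream}
import scala.beans.BeanProperty

// Stand-in for the real Java bean: Serializable, no-arg constructor, getter/setter.
class MyObject(@BeanProperty var id: String) extends Serializable {
  def this() = this(null)
}

// The wrapper case class from the question; case classes are Serializable by default.
case class MyObjectWithEventTime(value: MyObject, eventTime: java.sql.Timestamp)

object SerializationCheck {
  // Serialize to bytes and back, as Encoders.javaSerialization would.
  def roundTrip[T](obj: T): T = {
    val bytes = new ByteArrayOutputStream()
    val out   = new ObjectOutputStream(bytes)
    out.writeObject(obj)
    out.close()
    val in = new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray))
    in.readObject().asInstanceOf[T]
  }

  def main(args: Array[String]): Unit = {
    val original = MyObjectWithEventTime(new MyObject("abc"), new java.sql.Timestamp(0L))
    val copy     = roundTrip(original)
    println(copy.value.getId) // prints "abc"
  }
}
```

If this round trip throws NotSerializableException, the Spark encoder will fail for the same reason, so it is a quick way to isolate the problem from the streaming job.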

Comments:

Error:(55,69) type arguments [com.xxx.MyObject] do not conform to method product's type parameter bounds [T

@DilTeam I updated the answer, try using javaSerialization.

Not sure why, but Encoders.kryo[MyObject] works!!!
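Per the last comment, Kryo-based encoders succeeded where the bean encoder failed. As a sketch only (assumes a Spark dependency on the classpath and that MyObjectWithEventTime is the case class from the question), the two implicits from the answer would be swapped for Kryo ones:

```scala
import org.apache.spark.sql.{Encoder, Encoders}

// Kryo serialization does not require the bean conventions
// (no-arg constructor, getters/setters) that Encoders.bean expects.
implicit val myObjectEncoder: Encoder[MyObject] =
  Encoders.kryo[MyObject]
implicit val myObjectWithEventTimeEncoder: Encoder[MyObjectWithEventTime] =
  Encoders.kryo[MyObjectWithEventTime]
```

Note that Kryo (like javaSerialization) stores rows as opaque binary, so columns of the Dataset are no longer individually queryable the way they are with bean or product encoders.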