Apache Spark: ClassNotFoundException when recovering from a Spark Streaming Kafka checkpoint

I use Spark Streaming Kafka checkpoints to store the processed Kafka offsets in a folder in HDFS. After restarting the application (via spark-submit) to check recovery, I get a ClassNotFoundException for a class that belongs to the spark-streaming-kafka module, even though that class is packaged into my application's uber jar. It appears the class is not being looked up in my application jar.
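
For context, the setup follows the standard checkpoint-recovery pattern. Below is a minimal sketch, assuming the usual StreamingContext.getOrCreate plus KafkaUtils.createDirectStream wiring for Spark 1.5; the broker, topic, and checkpoint path are illustrative, not taken from the question:

    import kafka.serializer.StringDecoder
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils

    object CheckpointedApp {
      // Illustrative checkpoint directory; the real one lives on HDFS as in the log below.
      val checkpointDir = "hdfs:///user/checkpoint"

      def createContext(): StreamingContext = {
        val ssc = new StreamingContext(new SparkConf().setAppName("kafka-checkpoint-demo"), Seconds(30))
        ssc.checkpoint(checkpointDir)
        val kafkaParams = Map("metadata.broker.list" -> "broker:9092")
        // The direct stream tracks offsets via OffsetRange, the very class the
        // checkpoint reader later fails to find.
        val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
          ssc, kafkaParams, Set("my-topic"))
        stream.foreachRDD(rdd => rdd.foreach(println))
        ssc
      }

      def main(args: Array[String]): Unit = {
        // On restart this deserializes the checkpoint; that is where the
        // ClassNotFoundException below is thrown.
        val ssc = StreamingContext.getOrCreate(checkpointDir, createContext _)
        ssc.start()
        ssc.awaitTermination()
      }
    }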

Using Spark v1.5.1:

15/12/02 15:42:30 INFO streaming.CheckpointReader: Attempting to load checkpoint from file hdfs://ip-xxx-xx-xx-xx:8020/user/checkpoint-1449064500000
15/12/02 15:42:30 WARN streaming.CheckpointReader: Error reading checkpoint from file hdfs://ip-xxx-xx-xx-xx:8020/user/checkpoint-1449064500000
java.io.IOException: java.lang.ClassNotFoundException: org.apache.spark.streaming.kafka.OffsetRange
    at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1163)
    at org.apache.spark.streaming.DStreamGraph.readObject(DStreamGraph.scala:188)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
    at org.apache.spark.streaming.Checkpoint$$anonfun$deserialize$2.apply(Checkpoint.scala:151)
    at org.apache.spark.streaming.Checkpoint$$anonfun$deserialize$2.apply(Checkpoint.scala:141)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1206)
    at org.apache.spark.streaming.Checkpoint$.deserialize(Checkpoint.scala:154)
    at org.apache.spark.streaming.CheckpointReader$$anonfun$read$2.apply(Checkpoint.scala:329)
    at org.apache.spark.streaming.CheckpointReader$$anonfun$read$2.apply(Checkpoint.scala:325)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:35)
    at org.apache.spark.streaming.CheckpointReader$.read(Checkpoint.scala:325)
    at org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:852)
... 

Update: found that there is an open bug for this, SPARK-5569 (checkpoint deserialization does not resolve classes against the classloader that holds the user's application jar).


After applying the code changes from the commit suggested there and rebuilding the Spark assembly, checkpoint recovery now works.
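
For anyone who cannot patch and rebuild Spark, a workaround that is often suggested for this failure mode is to put the uber jar on the driver's JVM classpath explicitly, so its classes are already visible when the checkpoint is deserialized at startup. The class name and jar path here are illustrative:

    spark-submit \
      --class com.example.CheckpointedApp \
      --master yarn-cluster \
      --driver-class-path my-app-uber.jar \
      my-app-uber.jar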

Could you please share the code changes you made to recover from this? Many thanks.