Apache Spark: java.lang.LinkageError in Spark Streaming


I'm using Spark 2.2 on a CDH 5.10 cluster with Scala 2.11.8. Everything was working fine, but then I suddenly started getting this in the driver code:

    Exception in thread "main" java.lang.LinkageError: loader constraint violation: when resolving method
    "org.apache.spark.streaming.StreamingContext$.getOrCreate(Ljava/lang/String;Lscala/Function0;Lorg/apache/hadoop/conf/Configuration;Z)Lorg/apache/spark/streaming/StreamingContext;"
    the class loader (instance of org/apache/spark/util/ChildFirstURLClassLoader) of the current class, com/hp/hawkeye/driver/StreamingDriver$,
    and the class loader (instance of sun/misc/Launcher$AppClassLoader)
    for the method's defining class, org/apache/spark/streaming/StreamingContext$,
    have different Class objects for the type scala/Function0 used in the signature
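The last line of the error is the key: the JVM refuses to link the call to `getOrCreate` because two different class loaders have each defined their own copy of `scala.Function0`. A minimal, self-contained Java sketch (hypothetical class name, using a plain `URLClassLoader` rather than Spark's `ChildFirstURLClassLoader`) shows how the same class loaded by two parent-less loaders yields two distinct `Class` objects with the same name:

```java
import java.net.URL;
import java.net.URLClassLoader;

public class LoaderDemo {
    public static void main(String[] args) throws Exception {
        // Location of this class's own classpath entry (directory or jar).
        URL cp = LoaderDemo.class.getProtectionDomain().getCodeSource().getLocation();

        // Two loaders with a null parent: each defines its own classes
        // instead of delegating to the application class loader.
        ClassLoader a = new URLClassLoader(new URL[]{cp}, null);
        ClassLoader b = new URLClassLoader(new URL[]{cp}, null);

        Class<?> ca = a.loadClass("LoaderDemo");
        Class<?> cb = b.loadClass("LoaderDemo");

        // Same class name, but distinct Class objects -- exactly the
        // condition the LinkageError above is complaining about.
        System.out.println(ca == cb);                          // prints false
        System.out.println(ca.getName().equals(cb.getName())); // prints true
    }
}
```

In the Spark case, the driver class was loaded child-first from the manually placed jar while `StreamingContext$` came from the normal application class loader, so the two sides of the `getOrCreate` call disagreed on which `scala/Function0` the signature refers to.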

Is there any way to fix this?

Found the solution: there was a class loader conflict, caused by a dependency jar that had been placed on the cluster manually. These steps helped:

rm -rf ~/.sbt
rm -rf ~/.ivy2/cache
Then I restarted IDEA. spark-submit on the cluster worked fine after that. But placing an extra dependency jar in lib (spark-avro-assembly-4.0.0-snapshot) brought the problem back. Somehow that jar, which patches Spark 2.2 to work with spark-avro 3.2, was causing the issue.
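One more hedged suggestion, since the question doesn't show the submit command: Spark only wraps user jars in the `ChildFirstURLClassLoader` named in the error when `userClassPathFirst` is enabled, so if removing the extra jar is not an option, disabling child-first loading is another thing to try. A sketch (class and jar names are placeholders):

```shell
# Hypothetical submit command; --class and the jar name are placeholders.
# spark.driver.userClassPathFirst=true is what activates
# ChildFirstURLClassLoader; with it off, the manually placed jar no longer
# shadows the Scala library that Spark itself was linked against.
spark-submit \
  --class com.hp.hawkeye.driver.StreamingDriver \
  --conf spark.driver.userClassPathFirst=false \
  --conf spark.executor.userClassPathFirst=false \
  streaming-driver-assembly.jar
```

Both properties default to false, so this only matters if they were turned on somewhere (e.g. in spark-defaults.conf) to make the patched spark-avro jar win.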