Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$scope()Lscala/xml/TopScope$;


I am running a word count program in Spark, but I get the following error. I have added
scala-xml_2.11-1.0.2.jar.

    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    16/12/16 05:14:02 INFO SparkContext: Running Spark version 2.0.2
    16/12/16 05:14:03 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    16/12/16 05:14:03 WARN Utils: Your hostname, ubuntu resolves to a loopback address: 127.0.1.1; using 192.168.59.132 instead (on interface ens33) 
    16/12/16 05:14:03 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
    16/12/16 05:14:04 INFO SecurityManager: Changing view acls to: hadoopusr
    16/12/16 05:14:04 INFO SecurityManager: Changing modify acls to: hadoopusr
    16/12/16 05:14:04 INFO SecurityManager: Changing view acls groups to: 
    16/12/16 05:14:04 INFO SecurityManager: Changing modify acls groups to: 
    16/12/16 05:14:04 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(hadoopusr); groups with view permissions: Set(); users  with modify permissions: Set(hadoopusr); groups with modify permissions: Set()
    16/12/16 05:14:05 INFO Utils: Successfully started service 'sparkDriver' on port 40559.
    16/12/16 05:14:05 INFO SparkEnv: Registering MapOutputTracker
    16/12/16 05:14:05 INFO SparkEnv: Registering BlockManagerMaster
    16/12/16 05:14:05 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-0b830180-ae51-451f-9673-4f98dbaff520
    16/12/16 05:14:05 INFO MemoryStore: MemoryStore started with capacity 433.6 MB
    16/12/16 05:14:05 INFO SparkEnv: Registering OutputCommitCoordinator
    Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$scope()Lscala/xml/TopScope$;
        at org.apache.spark.ui.jobs.StagePage.<init>(StagePage.scala:44)
        at org.apache.spark.ui.jobs.StagesTab.<init>(StagesTab.scala:34)
        at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:62)
        at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:219)
        at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:161)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:440)
        at LearnScala.WordCount$.main(WordCount.scala:15)
        at LearnScala.WordCount.main(WordCount.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
    16/12/16 05:14:05 INFO DiskBlockManager: Shutdown hook called
    16/12/16 05:14:05 INFO ShutdownHookManager: Shutdown hook called
    16/12/16 05:14:05 INFO ShutdownHookManager: Deleting directory /tmp/spark-789e9a76-894f-468b-a39a-cf00da30e4ba/userFiles-3656d5f8-25ba-45c4-b2f6-9f654a049bb1
    16/12/16 05:14:05 INFO ShutdownHookManager: Deleting directory /tmp/spark-789e9a76-894f-468b-a39a-cf00da30e4ba
Spark version: 2.0.2


In build.sbt we can see:

    libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "2.0.2"
You are mixing Scala versions: spark-core_2.10 is built for Scala 2.10, while scala-xml_2.11-1.0.2.jar is built for Scala 2.11. Pick one ;) Either change the Scala XML version to 2.10 or change Spark to 2.11. As of Spark 2.0, Scala 2.11 is the recommended version.

By using %% in build.sbt, you get the matching Scala version automatically:

    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.2"
Secondly, there is no Scala XML dependency in build.sbt at all - you should add it.
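
For illustration, a minimal build.sbt sketch covering both points could look like the following. The scala-xml version 1.0.2 matches the jar mentioned in the question; the group ID org.scala-lang.modules and the exact scalaVersion are my assumptions, not taken from the original post:

    // Pin one Scala version for the whole project (2.11.8 assumed here)
    scalaVersion := "2.11.8"

    // %% resolves the matching _2.11 artifact from scalaVersion
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.2"

    // Scala XML module, same version as the jar added by hand in the question
    libraryDependencies += "org.scala-lang.modules" %% "scala-xml" % "1.0.2"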


Finally, you have to add all third-party jars to spark-submit via the --jars option, or build an uber jar - see this question.
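
As a sketch of the uber-jar route (the sbt-assembly plugin and its version are an assumption on my part; the answer does not name a tool), the plugin would be wired in through project/plugins.sbt:

    // project/plugins.sbt - sketch, assuming the sbt-assembly plugin
    addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

Running `sbt assembly` then produces a single jar with scala-xml bundled in. The lighter alternative is to pass the jar explicitly, e.g. `spark-submit --class LearnScala.WordCount --jars scala-xml_2.11-1.0.2.jar <application-jar>`, where the main class comes from the stack trace above and <application-jar> is a placeholder for your own build output.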

I don't understand why this answer was not marked as accepted, given that it provides exactly the solution that was needed.