Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps


I am new to Scala. The code below produces an error in IntelliJ; can anyone help me fix it?

    import org.apache.spark.{SparkConf, SparkContext}

    object wordcount {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setMaster("local[*]")
          .setAppName("TestSpark")
          .set("spark.executor.memory", "2g")

        val sc = new SparkContext(conf)
        val a = sc.parallelize(Seq(
          "This is the first line",
          "This is the second line",
          "This is the third line"))
        val count  = a.flatMap(x => x.split(" "))
        val counts = count.map(word => (word, 1)).reduceByKey((x, y) => x + y)
        counts.foreach(println)
      }
    }
I get the following error:

    Exception in thread "main" java.lang.NoSuchMethodError:
    scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
        at org.apache.spark.util.Utils$.getCallSite(Utils.scala:1342)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:81)
        at wordcount$.main(wordcount.scala:12)
        at wordcount.main(wordcount.scala)
    Using Spark's default log4j profile: org/apache/spark/log4j-

You should use Scala 2.11 to work with spark-core_2.11. That is, use:

scalaVersion := "2.11.8"

AFAIK Spark does not work with Scala 2.12 yet.
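A minimal `build.sbt` along these lines keeps the Scala version and the Spark artifact suffix aligned (the Spark version number shown here is an assumption; use whichever 2.x release matches your cluster):

```scala
// build.sbt — minimal sketch; the Spark version below is an assumption
name := "TestSpark"

// Must match the _2.11 suffix of the Spark artifact you depend on
scalaVersion := "2.11.8"

// %% appends the Scala binary version, resolving to spark-core_2.11
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0"
```

Using `%%` instead of `%` with a hard-coded `spark-core_2.11` means sbt picks the artifact matching `scalaVersion`, so the two cannot silently drift apart.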


Thanks, this worked. – Manoj

Could you mark it as the accepted answer? @Manoj4068, can you accept this answer?