Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps

Tags: scala, apache-spark, sbt

When I run this in the terminal:

sudo spark-submit --master local --class xxx.xxxx.xxx.xxxx.xxxxxxxxxxxxJob --conf 'spark.driver.extraJavaOptions=-Dconfig.resource=xxx.conf' /home/xxxxx/workspace/prueba/pruebas/target/scala-2.11/MiPrueba.jar

Configuration dependencies:

  val configuration = Seq(
    "com.github.pureconfig" %% "pureconfig" % "0.9.2",
    "com.typesafe" % "config" % "1.3.1",
    "org.lz4" % "lz4-java" % "1.4.1"
  )
Spark dependencies:

  val spark = Seq(
    "org.apache.spark" %% "spark-core" % Versions.spark % "provided" exclude("javax.jms", "jms"),
    "org.apache.spark" %% "spark-sql" % Versions.spark % "provided",
    "com.databricks" %% "spark-xml" % "0.4.1"
    // https://mvnrepository.com/artifact/mrpowers/spark-daria
  )

Any ideas?

You are mixing Scala versions. Spark 2.4.2 does not support Scala 2.11 (its pre-built binaries are compiled for Scala 2.12), but your jar is built for Scala 2.11 (note the `target/scala-2.11` path). Either switch to Spark 2.4.0, or replace your libraries with their Scala 2.12 versions.


Note that support for Scala 2.11 is deprecated as of 2.4.1. As of 2.4.2, the pre-built convenience binaries are compiled for Scala 2.12. Spark is still cross-published for 2.11 and 2.12 in Maven Central, and can be built for 2.11 from source.
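One way to keep the versions aligned is to pin both the Scala version and the Spark version in `build.sbt`. The exact version numbers below are an illustrative sketch, assuming you follow the advice above and stay on Spark 2.4.0 with Scala 2.11:

```scala
// build.sbt -- illustrative sketch; versions assume a Spark 2.4.0 / Scala 2.11 setup
scalaVersion := "2.11.12"

// Must match the Scala line of the spark-submit installation you run against
val sparkVersion = "2.4.0"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided" exclude("javax.jms", "jms"),
  "org.apache.spark" %% "spark-sql"  % sparkVersion % "provided",
  "com.databricks"   %% "spark-xml"  % "0.4.1"
)
```

Because `%%` appends the `_2.11` suffix automatically from `scalaVersion`, every dependency above resolves against the same Scala binary version, which is exactly what prevents this kind of `NoSuchMethodError`.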

Can you run
spark-submit --version
? spark-submit --version --> 2.4.2
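As a further diagnostic (a hypothetical helper, not part of the original thread), you can print the Scala library version that is actually loaded on the driver's classpath. If it reports 2.12.x while the jar was compiled against 2.11, the `NoSuchMethodError` above is the expected outcome:

```scala
// Prints the version of the scala-library jar actually on the classpath.
object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    // versionNumberString yields e.g. "2.11.12" or "2.12.8"
    println(scala.util.Properties.versionNumberString)
  }
}
```

Running this via `spark-submit` (rather than plain `scala`) shows which Scala runtime Spark itself is providing, independent of what sbt compiled against.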