
Apache Spark: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/SQLContext

Tags: apache-spark, apache-spark-sql, noclassdeffounderror, apache-spark-1.6

I am using IntelliJ IDEA version 2016.3.

import sbt.Keys._
import sbt._

object ApplicationBuild extends Build {

  object Versions {
    val spark = "1.6.3"
  }

  val projectName = "example-spark"

  val common = Seq(
    version := "1.0",
    scalaVersion := "2.11.7"
  )

  val customLibraryDependencies = Seq(
    "org.apache.spark" %% "spark-core" % Versions.spark % "provided",
    "org.apache.spark" %% "spark-sql" % Versions.spark % "provided",
    "org.apache.spark" %% "spark-hive" % Versions.spark % "provided",
    "org.apache.spark" %% "spark-streaming" % Versions.spark % "provided",

    "org.apache.spark" %% "spark-streaming-kafka" % Versions.spark
      exclude("log4j", "log4j")
      exclude("org.spark-project.spark", "unused"),

    "com.typesafe.scala-logging" %% "scala-logging" % "3.1.0",

    "org.slf4j" % "slf4j-api" % "1.7.10",

    "org.slf4j" % "slf4j-log4j12" % "1.7.10"
      exclude("log4j", "log4j"),

    "log4j" % "log4j" % "1.2.17" % "provided",

    "org.scalatest" %% "scalatest" % "2.2.4" % "test"
  )

  // Project wiring (not shown in the post): apply the settings above to the root project.
  lazy val main = Project(projectName, file("."))
    .settings(common: _*)
    .settings(libraryDependencies ++= customLibraryDependencies)
}
I am getting the runtime exception below, even though I have declared all the dependencies correctly, as shown above:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/SQLContext
    at example.SparkSqlExample.main(SparkSqlExample.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.SQLContext
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 6 more

I investigated this further on this site and found that it is usually caused by a missing or incorrect entry in build.sbt, or by a version mismatch. But in my case everything looks fine, as shown above. Where am I going wrong?
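For context, the stack trace above points at example.SparkSqlExample, which the post never shows. A minimal sketch of a Spark 1.6 main class exercising SQLContext (everything here beyond the class name is an assumption) might look like this:

package example

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Minimal sketch; the original SparkSqlExample is not shown in the post.
object SparkSqlExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("example-spark").setMaster("local[*]")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc) // the class the JVM fails to load at runtime

    import sqlContext.implicits._
    // Touch the spark-sql code path by building a small DataFrame.
    val df = sc.parallelize(Seq(("a", 1), ("b", 2))).toDF("key", "value")
    df.show()

    sc.stop()
  }
}

Launched from IntelliJ, a class like this fails with exactly the NoClassDefFoundError above whenever the "provided" Spark jars are missing from the runtime classpath.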

I suppose it is because you marked the dependencies as "provided", but apparently you (or IDEA) do not actually provide them.


Try removing the "provided" option, or (my preferred way) move the class with the main method to src/test/scala.
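A further option, not from the answer above but a common workaround for sbt 0.13-era builds like this one: keep the dependencies "provided" for packaging, but put the full compile classpath back for sbt run. A sketch to adapt:

// Make `sbt run` use the full compile classpath, including
// "provided" dependencies (sbt 0.13 syntax).
run in Compile := Defaults.runTask(
  fullClasspath in Compile,
  mainClass in (Compile, run),
  runner in (Compile, run)
).evaluated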

Shouldn't you be using something like spark-sql_2.11 instead?

@philantrover Since we use %% while declaring the dependency, sbt is smart enough to append the underscore and the Scala binary version. As we declared scalaVersion := "2.11.7", sbt picks this up as 2.11 and ultimately resolves the dependency as spark-sql_2.11.
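To make the %% mechanics concrete, with scalaVersion := "2.11.7" these two declarations resolve to the same artifact (versions taken from the build above):

// %% appends the Scala binary version to the artifact name,
"org.apache.spark" %% "spark-sql" % "1.6.3" % "provided"
// so it is equivalent to spelling the suffix out with a plain %:
"org.apache.spark" % "spark-sql_2.11" % "1.6.3" % "provided"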