Apache Spark: fat JAR built with sbt-assembly throws java.lang.NoClassDefFoundError

Tags: apache-spark, sbt, runtime-error, sbt-assembly, deeplearning4j



I am trying to run a fat JAR, built with sbt-assembly, that uses Spark and DeepLearning4J. Unfortunately, during execution I hit Exception in thread "main" java.lang.NoClassDefFoundError for many jars. I tried adding the jars with the --jars option of spark-submit, but each time I add one, I get the same error for another class coming from another dependency.

If I understand correctly, the fat JAR produced by sbt-assembly should prevent exactly this kind of problem, since it bundles all the required jars.

My Scala files are in myproject/src/main/scala/xxx/spark/yyy/.

Maybe it is caused by the merge strategy?

I am including my build.sbt file below, in case it helps.

Thank you in advance.
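Before blaming the merge strategy, it can help to confirm whether the missing class actually made it into the assembly. A minimal diagnostic sketch (the jar path below is an assumption derived from the name, version, and scalaVersion in the build file, and the class name is just an example):

```scala
// Sketch: check whether a given class file is present inside the fat jar.
import java.util.jar.JarFile
import scala.collection.JavaConverters._

object JarCheck extends App {
  // Assumed path: target/scala-2.10/<name>-assembly-<version>.jar
  val jar = new JarFile("target/scala-2.10/myproject-assembly-1.0.jar")
  // Example entry; substitute the class reported by the NoClassDefFoundError.
  val wanted = "org/joda/time/base/BaseDateTime.class"
  val present = jar.entries.asScala.exists(_.getName == wanted)
  println(s"$wanted present in fat jar: $present")
}
```

If the class is present in the jar but still not found at runtime, the problem is usually not a missing dependency but a file (e.g. a service-registration file under META-INF) that the merge strategy mangled.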

name := "myproject"

version := "1.0"

scalaVersion := "2.10.4"

val sparkVersion = "1.6.2"

mainClass in assembly := Some("xxx.spark.yyy.Main")

resolvers += Resolver.sonatypeRepo("releases")

resolvers += "Spark Packages Repo" at "https://dl.bintray.com/spark-packages/maven"

resolvers += "Akka Snapshot Repository" at "http://repo.akka.io/snapshots/"

resolvers += "Artifactory" at "http://artifacts.kameleoon.net:8081/artifactory/sbt/"

resolvers += "Sbt plugins" at "https://dl.bintray.com/sbt/sbt-plugin-releases"

resolvers += "Sonatype Releases" at "https://oss.sonatype.org/content/repositories/releases/"

resolvers += Resolver.url("artifactory", url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)


libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % sparkVersion % "provided",
"org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
"com.datastax.spark" %% "spark-cassandra-connector" % "1.6.0",
"org.apache.spark"  %% "spark-mllib"  % sparkVersion % "provided",
"org.hibernate" % "hibernate-core" % "4.3.11.Final",
"org.hibernate" % "hibernate-entitymanager" % "4.3.11.Final",
compilerPlugin("org.scalamacros" % "paradise" % "2.1.0" cross CrossVersion.full),
"org.json" % "json" % "20160810",
"org.joda" % "joda-convert" % "1.2",
"jfree" % "jfreechart" % "1.0.13",
"commons-io" % "commons-io" % "2.4",
"com.google.guava" % "guava" % "20.0",
"jfree" % "jfreechart" % "1.0.13",
"org.bytedeco" % "javacv" % "1.2",
"org.datavec" % "datavec-data-codec" % "0.7.2",
"org.datavec" % "datavec-spark_2.10" % "0.7.2",
"org.datavec" % "datavec-api" % "0.7.2",
"org.deeplearning4j" % "deeplearning4j-core" % "0.7.2",
"org.deeplearning4j" % "deeplearning4j-nn" % "0.7.2",
"org.deeplearning4j" % "dl4j-spark_2.10" % "0.7.2",
"org.jblas" % "jblas" % "1.2.4"
)


assemblyMergeStrategy in assembly := {
    case PathList("org", "joda", "time", "base", "BaseDateTime.class") => MergeStrategy.first
    case PathList("com", "esotericsoftware", "minlog", "Log.class") => MergeStrategy.first
    case PathList("org", "apache", xs @ _*) => MergeStrategy.last
    case PathList("com", "google", xs @ _*) => MergeStrategy.last
    case PathList("META-INF", xs @ _*) => MergeStrategy.rename
    case "about.html" => MergeStrategy.rename
    case x =>
        val oldStrategy = (assemblyMergeStrategy in assembly).value
        oldStrategy(x)
}
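A common cause of NoClassDefFoundError from an otherwise complete fat jar is the merge strategy itself: MergeStrategy.rename on META-INF renames the service-registration files under META-INF/services, which breaks ServiceLoader lookups, and MergeStrategy.last for org/apache can pick a class file from an incompatible jar. A hedged alternative, as a sketch only (not a guaranteed fix for this particular build):

```scala
// build.sbt (sketch): keep ServiceLoader registrations and reference.conf intact.
assemblyMergeStrategy in assembly := {
    // Concatenate service-provider files so ServiceLoader still finds implementations.
    case PathList("META-INF", "services", xs @ _*) => MergeStrategy.concat
    // Per-jar signatures and manifests are not needed in the fat jar.
    case PathList("META-INF", xs @ _*) => MergeStrategy.discard
    // Akka/Typesafe Config requires all reference.conf files merged, not replaced.
    case "reference.conf" => MergeStrategy.concat
    case _ => MergeStrategy.first
}
```

The design idea is to special-case only the files whose contents must be merged (service registrations, reference.conf) and fall back to MergeStrategy.first for ordinary class-file conflicts.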

Do you have the complete stack trace? I had the same problem, and this solved it: