
Scala: How to make a multi-project fat JAR with sbt-assembly


I have a multi-project Scala build that uses Spark, and I'm trying to make a fat JAR with the sbt plugin sbt-assembly 0.14.3. My build.sbt looks like this:

lazy val commonSettings = Seq(
  organization := "blabla",
  version := "0.1.0",
  scalaVersion := "2.11.8"
)


lazy val core = (project in file("."))
  .settings(commonSettings: _*)
  .settings(libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "1.6.1" % "provided",
    "org.apache.spark" %% "spark-sql" % "1.6.1" % "provided",
    "org.apache.spark" %% "spark-mllib" % "1.6.1" % "provided",
    ...))


lazy val sub_project = project
  .settings(commonSettings: _*)
  .aggregate(core)
  .dependsOn(core)
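
For reference, sbt-assembly 0.14.3 is normally enabled through project/plugins.sbt. The question doesn't show that file, so the following is the standard setup one would assume:

// project/plugins.sbt -- enables the assembly task (assumed setup, not shown in the question)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")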
I want to create a fat JAR for sub_project, so that the fat JAR contains all the libraries and code of the core project. I tried the following:

sbt
project sub_project
assembly
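
The same steps can also be run non-interactively by passing the commands to sbt in a single invocation:

sbt "project sub_project" assembly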
I get the following error:

[error] missing or invalid dependency detected while loading class file 'blabla.class'.
[error] Could not access term spark in package org.apache,
[error] because it (or its dependencies) are missing. Check your build definition for
[error] missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
[error] A full rebuild may help if 'blabla.class' was compiled against an incompatible version of org.apache.
[error] one error found

However, when I run "assembly" on the core project, I do get my fat JAR.

Your build indicates that the Spark library dependencies are not on sub_project's classpath (regardless of the "provided" qualifier), and the error message you get matches this. You probably want to add these dependencies to the common settings.
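
A minimal sketch of that fix, moving the Spark dependencies into commonSettings so that both core and sub_project see them at compile time (the sparkVersion value is introduced here for illustration; versions are taken from the question):

lazy val sparkVersion = "1.6.1"  // Spark version from the question

lazy val commonSettings = Seq(
  organization := "blabla",
  version := "0.1.0",
  scalaVersion := "2.11.8",
  // Spark is on every project's compile classpath, but "provided"
  // keeps it out of the fat JAR, as is usual for Spark deployments.
  libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
    "org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
    "org.apache.spark" %% "spark-mllib" % sparkVersion % "provided"
  )
)

lazy val core = (project in file("."))
  .settings(commonSettings: _*)

lazy val sub_project = project
  .settings(commonSettings: _*)
  .aggregate(core)
  .dependsOn(core)

With this layout, "project sub_project" followed by "assembly" should compile against Spark and still produce a fat JAR containing only your own code plus non-provided dependencies.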

Does the Scala version you specified match the one your Spark dependency was compiled against?

Yes, but this is only about packaging; I haven't tried to run it yet, so even if the versions differed, packaging shouldn't be the problem.