Scala Apache Spark installation fails


I am trying to install standalone Apache Spark on Ubuntu. When I run the "sbt/sbt assembly" command, I get the following error:

java.lang.RuntimeException: Could not create directory /opt/spark-1.5.1/external/zeromq/target/streams/compile/$global/$global/discoveredMainClasses
        at scala.sys.package$.error(package.scala:27)
        at sbt.IO$.createDirectory(IO.scala:166)
        at sbt.IO$.touch(IO.scala:142)
        at sbt.std.Streams$$anon$3$$anon$2.make(Streams.scala:129)
        at sbt.std.Streams$$anon$3$$anon$2.binary(Streams.scala:116)
        at sbt.SessionVar$$anonfun$persist$1.apply(SessionVar.scala:27)
        at sbt.SessionVar$$anonfun$persist$1.apply(SessionVar.scala:26)
        at sbt.std.Streams$class.use(Streams.scala:75)
        at sbt.std.Streams$$anon$3.use(Streams.scala:100)
        at sbt.SessionVar$.persist(SessionVar.scala:26)
        at sbt.SessionVar$.persistAndSet(SessionVar.scala:21)
        at sbt.Project$RichTaskSessionVar$$anonfun$storeAs$1$$anonfun$apply$5.apply(Project.scala:556)
        at sbt.Project$RichTaskSessionVar$$anonfun$storeAs$1$$anonfun$apply$5.apply(Project.scala:556)
        at sbt.SessionVar$$anonfun$1$$anonfun$apply$1.apply(SessionVar.scala:40)
        at sbt.SessionVar$$anonfun$1$$anonfun$apply$1.apply(SessionVar.scala:40)
        at scala.Function$$anonfun$chain$1$$anonfun$apply$1.apply(Function.scala:24)
        at scala.Function$$anonfun$chain$1$$anonfun$apply$1.apply(Function.scala:24)
        at scala.collection.IndexedSeqOptimized$class.foldl(IndexedSeqOptimized.scala:51)
        at scala.collection.IndexedSeqOptimized$class.foldLeft(IndexedSeqOptimized.scala:60)
        at scala.collection.mutable.ArrayBuffer.foldLeft(ArrayBuffer.scala:47)
        at scala.collection.TraversableOnce$class.$div$colon(TraversableOnce.scala:138)
        at scala.collection.AbstractTraversable.$div$colon(Traversable.scala:105)
        at scala.Function$$anonfun$chain$1.apply(Function.scala:24)
        at sbt.EvaluateTask$.applyResults(EvaluateTask.scala:370)
        at sbt.EvaluateTask$.liftedTree1$1(EvaluateTask.scala:344)
        at sbt.EvaluateTask$.run$1(EvaluateTask.scala:341)
        at sbt.EvaluateTask$.runTask(EvaluateTask.scala:361)
        at sbt.Aggregation$$anonfun$3.apply(Aggregation.scala:64)
        at sbt.Aggregation$$anonfun$3.apply(Aggregation.scala:62)
        at sbt.EvaluateTask$.withStreams(EvaluateTask.scala:293)
        at sbt.Aggregation$.timedRun(Aggregation.scala:62)
        at sbt.Aggregation$.runTasks(Aggregation.scala:71)
        at sbt.Aggregation$$anonfun$applyTasks$1.apply(Aggregation.scala:32)
        at sbt.Aggregation$$anonfun$applyTasks$1.apply(Aggregation.scala:31)
        at sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:60)
        at sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:60)
        at sbt.Aggregation$$anonfun$evaluatingParser$4$$anonfun$apply$5.apply(Aggregation.scala:153)
        at sbt.Aggregation$$anonfun$evaluatingParser$4$$anonfun$apply$5.apply(Aggregation.scala:152)
        at sbt.Act$$anonfun$sbt$Act$$actParser0$1$$anonfun$sbt$Act$$anonfun$$evaluate$1$1$$anonfun$apply$10.apply(Act.scala:244)
        at sbt.Act$$anonfun$sbt$Act$$actParser0$1$$anonfun$sbt$Act$$anonfun$$evaluate$1$1$$anonfun$apply$10.apply(Act.scala:241)
        at sbt.Command$.process(Command.scala:92)
        at sbt.MainLoop$$anonfun$1$$anonfun$apply$1.apply(MainLoop.scala:98)
        at sbt.MainLoop$$anonfun$1$$anonfun$apply$1.apply(MainLoop.scala:98)
        at sbt.State$$anon$1.process(State.scala:184)
        at sbt.MainLoop$$anonfun$1.apply(MainLoop.scala:98)
        at sbt.MainLoop$$anonfun$1.apply(MainLoop.scala:98)
        at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
        at sbt.MainLoop$.next(MainLoop.scala:98)
        at sbt.MainLoop$.run(MainLoop.scala:91)
        at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(MainLoop.scala:70)
        at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(MainLoop.scala:65)
        at sbt.Using.apply(Using.scala:24)
        at sbt.MainLoop$.runWithNewLog(MainLoop.scala:65)
        at sbt.MainLoop$.runAndClearLast(MainLoop.scala:48)
        at sbt.MainLoop$.runLoggedLoop(MainLoop.scala:32)
        at sbt.MainLoop$.runLogged(MainLoop.scala:24)
        at sbt.StandardMain$.runManaged(Main.scala:53)
        at sbt.xMain.run(Main.scala:28)
        at xsbt.boot.Launch$$anonfun$run$1.apply(Launch.scala:109)
        at xsbt.boot.Launch$.withContextLoader(Launch.scala:128)
        at xsbt.boot.Launch$.run(Launch.scala:109)
        at xsbt.boot.Launch$$anonfun$apply$1.apply(Launch.scala:35)
        at xsbt.boot.Launch$.launch(Launch.scala:117)
        at xsbt.boot.Launch$.apply(Launch.scala:18)
        at xsbt.boot.Boot$.runImpl(Boot.scala:41)
        at xsbt.boot.Boot$.main(Boot.scala:17)
        at xsbt.boot.Boot.main(Boot.scala)
[error] Could not create directory /opt/spark-1.5.1/external/zeromq/target/streams/compile/$global/$global/discoveredMainClasses
[error] Use 'last' for the full log.
Has anyone else run into this problem?

java version "1.8.0_65"


Scala code runner version 2.11.7 -- Copyright 2002-2013, LAMP/EPFL

The error says you do not have write permission on the /opt directory:

Could not create directory /opt/spark-1.5.1/external/zeromq/target/streams/compile/$global/$global/discoveredMainClasses
You need root access to write to this folder. Alternatively, you can:

  • download Apache Spark and build it in your home folder, then move it to
    /opt
  • run
    sudo sbt/sbt assembly
    to build Spark with root access (building as root is considered unsafe)
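Both options can be sketched in shell. This is illustrative only, not verified against the Spark 1.5.1 build: the paths come from the error message above, and the actual build commands are left commented out because they need the Spark sources (and, for the move, root access):

```shell
# Option 1 (safer): build in a directory you own, then move the result.
# cd "$HOME/spark-1.5.1"
# sbt/sbt assembly
# sudo mv "$HOME/spark-1.5.1" /opt/

# Variant of option 2: instead of building as root, hand the existing
# tree to your own user once, then build normally.
# sudo chown -R "$(id -un):$(id -gn)" /opt/spark-1.5.1
# sbt/sbt assembly

# Either way, the build itself should run as your own user, not root:
echo "building as: $(id -un)"
```

The chown variant keeps the tree under /opt while still avoiding a root-owned build, which addresses the "compiling as root is unsafe" caveat.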

You must have root privileges to add and modify files under /opt/. Your Spark setup is misconfigured. I suggest following these steps to install Spark and Scala, then try running sbt again. Best of luck.

We were getting this error intermittently. It may be caused by an internal SBT bug:

"There appears to be a race condition in SBT that is only triggered by plugins that cause multiple compile processes to run in parallel."

For more information, see here:


See if you can disable some plugins and rerun.

Usually
/opt
belongs to the
root
user. The user you are running as here most likely does not have sufficient permission to write there.

Forgot to mention that "sbt/sbt assembly" was executed as the root user: root@server:/opt/spark-1.5.1# sbt/sbt assembly

Can you provide any other debugging information? I don't think this is Spark-specific. Did you install sbt on Ubuntu?

SBT is bundled with Spark; I did not need to install it separately. I followed these instructions, just with the latest Spark version.
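Since the asker reports having run the build as root, it is worth verifying the ownership diagnosis directly before re-running. A small sandbox-safe sketch of that check follows; it uses a temporary directory as a stand-in for /opt/spark-1.5.1 (which may not exist on every machine) and assumes GNU stat, as shipped on Ubuntu:

```shell
# Check whether the current user owns, or can at least write to, the
# directory sbt needs. A temp dir stands in for /opt/spark-1.5.1.
dir=$(mktemp -d)

owner=$(stat -c '%U' "$dir")   # GNU stat, as on Ubuntu
me=$(id -un)

if [ "$owner" = "$me" ] || [ -w "$dir" ]; then
    echo "OK: $me can write to $dir"
else
    echo "fix: sudo chown -R $me $dir   # or build somewhere you own"
fi

rm -rf "$dir"
```

Run against the real /opt/spark-1.5.1: if the check passes yet the build still fails, the problem is likely not permissions at all (e.g. the SBT race condition mentioned above).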