
Scala: Why does sbt report an unresolved dependency for the Spark 1.3.0-SNAPSHOT JAR?


My sbt build file contains the following:

name := "Simple Project" 
version := "1.3.0-SNAPSHOT" 
scalaVersion := "2.10.4" 
libraryDependencies += "org.apache.spark" % "spark-core" % "1.3.0-SNAPSHOT" 
When I run the project with sbt package, I get the following error:

[info] Set current project to Simple Project (in build file:/home/roott/SparkProjects/checkProject/) 
[info] Updating {file:/home/roott/SparkProjects/checkProject/}default-9d4332... 
[info] Resolving org.scala-lang#scala-library;2.10.4 ... 
[info] Resolving org.apache.spark#spark-core;1.3.0-SNAPSHOT ... 
[warn]  module not found: org.apache.spark#spark-core;1.3.0-SNAPSHOT 
[warn] ==== local: tried 
[warn]   /home/roott/.ivy2/local/org.apache.spark/spark-core/1.3.0-SNAPSHOT/ivys/ivy.xml 
[warn] ==== public: tried 
[warn]   http://repo1.maven.org/maven2/org/apache/spark/spark-core/1.3.0-SNAPSHOT/spark-core-1.3.0-SNAPSHOT.pom
[warn]  :::::::::::::::::::::::::::::::::::::::::::::: 
[warn]  ::          UNRESOLVED DEPENDENCIES         :: 
[warn]  :::::::::::::::::::::::::::::::::::::::::::::: 
[warn]  :: org.apache.spark#spark-core;1.3.0-SNAPSHOT: not found 
[warn]  :::::::::::::::::::::::::::::::::::::::::::::: 
[error] {file:/home/roott/SparkProjects/checkProject/}default-9d4332/*:update: sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core;1.3.0-SNAPSHOT: not found 
[error] Total time: 2 s, completed 28-Dec-2014 16:49:50 

What does this error mean, and how do I fix it?

I don't think the Spark project publishes 1.3.0-SNAPSHOT binaries anywhere, so you should build Spark locally and reference the locally built artifacts in your project.


Build Spark using Apache Maven as the build tool. After that, add the local Maven repository to your sbt build with

resolvers += Resolver.mavenLocal

so sbt can find the locally installed artifacts. See the official sbt documentation on resolvers for details.
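Putting it together, a sketch of the corrected build file might look like the following. It assumes you have already built and installed Spark from source with Maven (e.g. mvn -DskipTests clean install, which copies the artifacts into the local Maven repository at ~/.m2/repository). Note the double %% operator: Spark publishes cross-built artifacts such as spark-core_2.10, and %% appends the Scala binary version to the artifact name automatically, whereas the single % in the original build file looked for a bare spark-core artifact that does not exist.

```scala
name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"

// Resolve against the local Maven repository (~/.m2/repository),
// where "mvn install" placed the locally built Spark artifacts.
resolvers += Resolver.mavenLocal

// %% appends the Scala binary version, resolving to spark-core_2.10
// rather than the non-existent bare spark-core artifact.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0-SNAPSHOT"
```

With this in place, sbt tries ~/.m2/repository in addition to the local Ivy cache and Maven Central when resolving the dependency.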

There is nothing for this snapshot version on search.maven.org.
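Alternatively, if building Spark from source is more than you need, you can depend on a released version that is actually published to Maven Central (1.2.0 was the latest release as of December 2014):

```scala
// Released Spark artifacts are cross-built, so use %%
// (resolves to spark-core_2.10 for scalaVersion 2.10.4).
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"
```

This resolves without any extra resolvers, since the released artifacts are on Maven Central.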