sbt Spark unresolved Hadoop dependencies


I am trying to build my project, but when I run
sbt package I get the following:

[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.apache.hadoop#hadoop-yarn-common;1.0.4: not found
[warn]  :: org.apache.hadoop#hadoop-yarn-client;1.0.4: not found
[warn]  :: org.apache.hadoop#hadoop-yarn-api;1.0.4: not found
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[error] {file:/home/niko/workspace/Spark/recommender/}default-3ebb80/*:update: sbt.ResolveException: unresolved dependency: org.apache.hadoop#hadoop-yarn-common;1.0.4: not found
[error] unresolved dependency: org.apache.hadoop#hadoop-yarn-client;1.0.4: not found
[error] unresolved dependency: org.apache.hadoop#hadoop-yarn-api;1.0.4: not found
Does anyone know what needs to be configured in order to run the application successfully (if possible, without having Hadoop installed)?


Thanks

You have enabled the YARN profile but have not set
hadoop.version
. The default Hadoop version is 1.0.4, and no YARN artifacts exist for that version (YARN did not exist in the Hadoop 1.x line). In general, you want to specify hadoop.version explicitly no matter what.
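
If you are building your own application against Spark, one way to sidestep the bad 1.0.4 default is to pin the Spark and Hadoop artifacts explicitly in your build definition. A minimal build.sbt sketch; the version numbers below are assumptions and should be adjusted to match your cluster:

// build.sbt -- minimal sketch; version numbers are assumptions, adjust to your environment
name := "recommender"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  // Spark core, published for Scala 2.10
  "org.apache.spark" %% "spark-core" % "1.0.0",
  // depend on a Hadoop 2.x client explicitly instead of relying on the 1.0.4 default
  "org.apache.hadoop" % "hadoop-client" % "2.2.0"
)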

I ran into exactly the same problem when trying to build a Spark program, and found that my sbt version was the cause. I recommend removing sbt completely, then downloading a fresh copy: get the .tgz, extract it into your home folder, and add its bin directory to your PATH.

The problem is that your sbt cannot retrieve the artifacts because the repository URLs it is using are unreachable.

Download the latest sbt version and add the following to
~/.sbt/repositories

[repositories]
  local
  sbt-releases-repo: http://repo.typesafe.com/typesafe/ivy-releases/, [organization]/[module]/(scala_[scalaVersion]/)(sbt_[sbtVersion]/)[revision]/[type]s/[artifact](-[classifier]).[ext]
  sbt-plugins-repo: http://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/, [organization]/[module]/(scala_[scalaVersion]/)(sbt_[sbtVersion]/)[revision]/[type]s/[artifact](-[classifier]).[ext]
  maven-central: http://repo1.maven.org/maven2/
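
If you would rather not change the global configuration, roughly the same effect can be had per project with sbt's standard Resolver API (a sketch; the repository name strings are arbitrary labels):

// build.sbt -- per-project alternative to the global repositories file
resolvers ++= Seq(
  // Ivy-layout repository, so it needs explicit Ivy patterns
  Resolver.url(
    "typesafe-ivy-releases",
    url("http://repo.typesafe.com/typesafe/ivy-releases/")
  )(Resolver.ivyStylePatterns),
  // plain Maven repository
  "maven-central" at "http://repo1.maven.org/maven2/"
)

With recent sbt launchers you can also pass -Dsbt.override.build.repos=true so that the ~/.sbt/repositories file takes precedence over whatever resolvers the build itself declares.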

I solved this problem by enabling the auto-import option and then compiling again.