Scala: how to reference the jar files after an sbt local publish


The Spark JARs have been published successfully to the local repository:

sbt publish-local
Here is an excerpt for spark-core; things look healthy:

[info]  published spark-core_2.10 to C:\Users\s80035683\.m2\repository\org\apache\spark\spark-core_2.10\1.1.0-SNAPSHOT\spark-core_2.10-1.1.0-SNAPSHOT-javadoc.jar
[info]  published spark-core_2.10 to C:\Users\s80035683\.ivy2\local\org.apache.spark\spark-core_2.10\1.1.0-SNAPSHOT\poms\spark-core_2.10.pom
[info]  published spark-core_2.10 to C:\Users\s80035683\.ivy2\local\org.apache.spark\spark-core_2.10\1.1.0-SNAPSHOT\jars\spark-core_2.10.jar
[info]  published spark-core_2.10 to C:\Users\s80035683\.ivy2\local\org.apache.spark\spark-core_2.10\1.1.0-SNAPSHOT\srcs\spark-core_2.10-sources.jar
[info]  published spark-core_2.10 to C:\Users\s80035683\.ivy2\local\org.apache.spark\spark-core_2.10\1.1.0-SNAPSHOT\docs\spark-core_2.10-javadoc.jar
[info]  published ivy to C:\Users\s80035683\.ivy2\local\org.apache.spark\spark-core_2.10\1.1.0-SNAPSHOT\ivys\ivy.xml

In particular, the file is present in .m2:

C:\Users\s80035683\.m2\repository\org\apache\spark\spark-core_2.10\1.1.0-SNAPSHOT>dir

 Directory of C:\Users\s80035683\.m2\repository\org\apache\spark\spark-core_2.10\1.1.0-SNAPSHOT

06/26/2014  04:25 PM    <DIR>          .
06/26/2014  04:25 PM    <DIR>          ..
06/26/2014  04:25 PM         1,180,476 spark-core_2.10-1.1.0-SNAPSHOT-javadoc.jar
06/26/2014  04:24 PM           808,815 spark-core_2.10-1.1.0-SNAPSHOT-sources.jar
06/26/2014  02:27 PM         5,781,917 spark-core_2.10-1.1.0-SNAPSHOT.jar
06/26/2014  05:03 PM            13,436 spark-core_2.10-1.1.0-SNAPSHOT.pom
So we have:

  • A good local repo
  • A build.sbt that references the local repo (see the sketch below)
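
For reference, here is a minimal sketch of what such a build.sbt might look like. The project name, the sparkVersion value, and the local-Maven resolver are assumptions, since the actual build file is not shown in the question:

// build.sbt (hypothetical reconstruction; the question does not show the real file)
name := "hspark"

scalaVersion := "2.10.4"

// Assumed value matching the locally published snapshot
val sparkVersion = "1.1.0-SNAPSHOT"

// sbt publish-local writes to ~/.ivy2/local, which sbt consults by default.
// This resolver additionally points at the local Maven repository (~/.m2).
resolvers += "Local Maven Repository" at "file://" + Path.userHome.absolutePath + "/.m2/repository"

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10" % sparkVersion,
  "org.apache.spark" % "spark-sql_2.10"  % sparkVersion
)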
But when we do:

sbt package
we get unresolved dependencies on the very same Spark artifacts that were just published:

[info] Loading project definition from C:\apps\hspark\project
[info] Set current project to hspark (in build file:/C:/apps/hspark/)
[info] Updating {file:/C:/apps/hspark/}hspark...
[info] Resolving org.scala-lang#scala-library;2.10.4 ...
  [info] Resolving org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT ...
  [info] Resolving org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT ...
  [info] Resolving org.scala-lang#scala-compiler;2.10.4 ...
  [info] Resolving org.scala-lang#scala-reflect;2.10.4 ...
  [info] Resolving org.scala-lang#jline;2.10.4 ...
  [info] Resolving org.fusesource.jansi#jansi;1.4 ...
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT: configuration not found in org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT: 'default'. It was required from default#hspark_2.10;0.1.0-SNAPSHOT compile
[warn]  :: org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT: configuration not found in org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT: 'default'. It was required from default#hspark_2.10;0.1.0-SNAPSHOT compile
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT: configuration not found in org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT: 'default'. It was required from default#hspark_2.10;0.1.0-SNAPSHOT compile
unresolved dependency: org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT: configuration not found in org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT: 'default'. It was required from default#hspark_2.10;0.1.0-SNAPSHOT compile
        at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:217)
        at sbt.IvyActions$$anonfun$update$1.apply(IvyActions.scala:126)
..
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:744)
[error] (*:update) sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT: configuration not found in org.apache.spark#spark-core_2.10;1.1.0-SNAPSHOT: 'default'. It was required from default#hspark_2.10;0.1.0-SNAPSHOT compile
[error] unresolved dependency: org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT: configuration not found in org.apache.spark#spark-sql_2.10;1.1.0-SNAPSHOT: 'default'. It was required from default#hspark_2.10;0.1.0-SNAPSHOT compile

UPDATE: per @lpiepiora's answer, removing compile->default does (surprisingly) seem to do the trick. Here is the evidence so far

(using the dependency-graph plugin):

Done updating.
[info] default:hspark_2.10:0.1.0-SNAPSHOT [S]
[info]   +-spark:spark-core_2.10:1.1.0-SNAPSHOT [S]
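
For reference, the tree above came from the sbt-dependency-graph plugin. A rough sketch of enabling it for an sbt 0.13-era build follows; the plugin version shown is an assumption:

// project/plugins.sbt
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.4")

// build.sbt (pre-0.8 versions of the plugin also needed their settings added explicitly)
net.virtualvoid.sbt.graph.Plugin.graphSettings

After reloading, running `sbt dependency-tree` should print a tree like the one shown above.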


Try removing the mapping compile->default from your dependencies. It is redundant anyway, as the documentation says:

A configuration without a mapping (no "->") is mapped to "default" or "compile". The -> is only needed when mapping to a configuration other than those.
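
In other words, the failing build presumably declared the dependencies with an explicit mapping along these lines (a hypothetical reconstruction; the error above suggests the descriptor resolved for the locally published snapshot declares no 'default' configuration):

// Presumed problematic declarations (not shown in the question):
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10" % sparkVersion % "compile->default",
  "org.apache.spark" % "spark-sql_2.10"  % sparkVersion % "compile->default"
)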

Therefore, declare your dependencies as follows:

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10" % sparkVersion withSources(),
  "org.apache.spark" % "spark-sql_2.10" % sparkVersion  withSources()
)

and they should resolve correctly.

This seems to have worked; I am doing more verification and expect to accept soon. Actually, this does not seem to work. Have you tested it? Yes, I tested it: I published locally and had another project use it. I will test it once more and put it up somewhere, so perhaps you can try it against your configuration. "sparkVersion withSources()" does not work; I had to remove withSources() for it to work for me. @javadba - what error did you get when using withSources()? I am using sbt 0.13.5.
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10" % sparkVersion withSources(),
  "org.apache.spark" % "spark-sql_2.10" % sparkVersion  withSources()
)