IntelliJ IDEA: why does building the Spark sources produce "object sbt is not a member of package com.typesafe"?


I tried to compile the project on Windows using IntelliJ IDEA with the sbt plugin.

I ran into an error related to sbt. Since I'm not familiar with sbt, I don't know how to fix it.

The error message looks like this:

[info] Loading project definition from F:\codeReading\sbtt\spark-master\project
[info] Compiling 3 Scala sources to F:\codeReading\sbtt\spark-master\project\target\scala-2.10\sbt-0.13\classes...
[error] F:\codeReading\sbtt\spark-master\project\SparkBuild.scala:26: object sbt is not a member of package com.typesafe
[error] import com.typesafe.sbt.pom.{PomBuild, SbtPomKeys}
[error]                     ^
[error] F:\codeReading\sbtt\spark-master\project\SparkBuild.scala:51: not found: type PomBuild
[error] object SparkBuild extends PomBuild {
[error]                           ^
[error] F:\codeReading\sbtt\spark-master\project\SparkBuild.scala:118: not found: value SbtPomKeys
[error]     otherResolvers <<= SbtPomKeys.mvnLocalRepository(dotM2 => Seq(Resolver.file("dotM2", dotM2))),
[error]                        ^
[error] F:\codeReading\sbtt\spark-master\project\SparkBuild.scala:178: value projectDefinitions is not a member of AnyRef
[error]     super.projectDefinitions(baseDirectory).map { x =>
[error]           ^
[error] four errors found
[error] (plugins/compile:compile) Compilation failed


It seems IDEA doesn't like the project reference to the sbt-pom-reader git project that the Spark build definition uses.

When I run `sbt` inside a cloned copy of the project, it comes up fine:

➜  spark git:(master) ✗ xsbt
[info] Loading global plugins from /Users/jacek/.sbt/0.13/plugins
[info] Loading project definition from /Users/jacek/oss/spark/project/project
[info] Loading project definition from /Users/jacek/.sbt/0.13/staging/ec3aa8f39111944cc5f2/sbt-pom-reader/project
[warn] Multiple resolvers having different access mechanism configured with same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate project resolvers (`resolvers`) or rename publishing resolver (`publishTo`).
[info] Loading project definition from /Users/jacek/oss/spark/project
[info] Set current project to spark-parent (in build file:/Users/jacek/oss/spark/)
>
You can see the project reference to the sbt-pom-reader git project when you enter the `plugins` project, where the reference is defined:

> reload plugins
[info] Loading global plugins from /Users/jacek/.sbt/0.13/plugins
[info] Loading project definition from /Users/jacek/oss/spark/project/project
[info] Updating {file:/Users/jacek/oss/spark/project/project/}project-build...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Loading project definition from /Users/jacek/.sbt/0.13/staging/ec3aa8f39111944cc5f2/sbt-pom-reader/project
[warn] Multiple resolvers having different access mechanism configured with same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate project resolvers (`resolvers`) or rename publishing resolver (`publishTo`).
[info] Updating {file:/Users/jacek/.sbt/0.13/staging/ec3aa8f39111944cc5f2/sbt-pom-reader/project/}sbt-pom-reader-build...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Loading project definition from /Users/jacek/oss/spark/project
> projects
[info] In file:/Users/jacek/oss/spark/project/
[info]   * plugins
[info]     spark-style
[info] In https://github.com/ScrapCodes/sbt-pom-reader.git
[info]     sbt-pom-reader
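For reference, a git-based project reference like the one listed above is declared in the build definition roughly as follows. This is a minimal sketch in sbt 0.13 style; the object and val names are illustrative, but Spark's `project/project/` build wires in sbt-pom-reader in a similar way:

```scala
import sbt._

// Sketch of an sbt 0.13 build definition with a git project reference.
object PluginBuild extends Build {
  // Referencing a build by git URI makes sbt clone the repository into
  // ~/.sbt/0.13/staging and compile it from source — exactly the staging
  // path that shows up in the log above.
  lazy val sbtPomReader =
    uri("https://github.com/ScrapCodes/sbt-pom-reader.git")

  // The plugins project depends on the cloned build, which is what puts
  // the com.typesafe.sbt.pom package on the build's classpath.
  lazy val plugins = Project("plugins", file(".")).dependsOn(sbtPomReader)
}
```

If an IDE imports the build without resolving that git reference, the `com.typesafe.sbt.pom` import in `SparkBuild.scala` has nothing to resolve against, and you get exactly the "object sbt is not a member of package com.typesafe" cascade from the question.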

One solution could be to execute `sbt gen-idea` to generate the project files for IDEA. That's just a guess, though.
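For the record, `gen-idea` comes from the sbt-idea plugin, which has to be on the build's plugin classpath first, e.g. via the global plugin definition (the version below is an assumption; use whatever is current):

```scala
// ~/.sbt/0.13/plugins/plugins.sbt
addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.6.0")
```

With the plugin available, running `sbt gen-idea` writes the `.idea` project files from within sbt itself, so IDEA can open the project directly instead of importing the sbt build and tripping over the git project reference.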

Spark is built with Maven. The sbt build is just a convenience. You will have much better results importing it as a Maven project.
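Assuming a checkout of master, the Maven route of that era looked roughly like this (the heap/permgen settings follow Spark's build documentation; adjust to taste):

```shell
# The Spark build needs extra memory for Maven
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512m"
mvn -DskipTests clean package
```

After a successful `mvn` build, importing the project into IDEA as a Maven project (File -> Import Project -> select the `pom.xml`) sidesteps the sbt plugin entirely.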

How did you execute the compilation in IDEA? Was that while importing the project into IDEA?

I downloaded spark-master.zip from github/apache/spark, unzipped it, and imported the unzipped project with sbt rather than Maven. What I did in IntelliJ IDEA (on Windows XP) was: File -> Import Project -> select the spark-master directory -> select "Import project from external model" -> select "SBT project" -> Next -> Finish. So yes, I imported the project with IDEA.

You were right. I tried compiling Spark with Maven, and it succeeded.

Ouch, I missed that switch. When did they move to Maven? I remember 1.0.0 still used sbt as well as Maven. Thanks.

The version I imported with Maven was 1.0.0. With sbt I failed on both 1.0.0 and 1.0.1.

Remember that there is also an sbt build exported from Maven. I think that may have come after 1.0, but it has been in master for a few weeks now. I assumed the OP was building from master.