Sbt Spark: a signature in package.class refers to type compileTimeOnly


While trying to build the MLlib examples from Spark 1.2.1 with SBT, I ran into a whole bunch of strange compile errors. The same code builds fine against Spark 1.1.0. For Spark 1.2.1 I use the following SBT build file:

name := "Test"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" % "spark-mllib_2.11" % "1.2.1" % "provided"

With this, I get the following set of strange errors:

[info] Compiling 1 Scala source to /home/test/target/scala-2.10/classes...
[error] bad symbolic reference. A signature in package.class refers to type compileTimeOnly
[error] in package scala.annotation which is not available.
[error] It may be completely missing from the current classpath, or the version on
[error] the classpath might be incompatible with the version used when compiling package.class.
[error] /home/test/src/main/scala/Test.scala:16: Reference to method augmentString in object Predef should not have survived past type checking,
[error] it should have been processed and eliminated during expansion of an enclosing macro.
[error] val parsedData = data.map(s => Vectors.dense(s.split(' ').map(_.toDouble))).cache()
[error] /home/test/src/main/scala/Test.scala:16: Reference to method augmentString in object Predef should not have survived past type checking,
[error] it should have been processed and eliminated during expansion of an enclosing macro.
[error] val parsedData = data.map(s => Vectors.dense(s.split(' ').map(_.toDouble))).cache()
[error]                                                               ^
[error] three errors found
[error] (compile:compile) Compilation failed
[error] Total time: 21 s, completed 26.02.2015 17:47:29

How can I fix this? It would be great if someone could post a generic SBT file for building Spark 1.2.1 + MLlib code. Thanks!

Try changing the libraryDependencies line to the following:

libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.2.1" % "provided"

You are using Scala 2.10.4, but you are trying to pull in the Spark library built for Scala 2.11.x. The %% operator automatically selects the library artifact matching your project's Scala version for you.
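To make the difference concrete, here is a sketch of what %% does. With scalaVersion set to 2.10.4, sbt appends the Scala binary version suffix to the artifact name automatically:

```scala
// With %%, sbt derives the artifact suffix from scalaVersion (2.10.4 -> _2.10):
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.2.1" % "provided"

// which resolves to the same artifact as writing the suffix by hand:
libraryDependencies += "org.apache.spark" % "spark-mllib_2.10" % "1.2.1" % "provided"
```

The original build file used a single % with the hard-coded suffix _2.11, which pulled in a library compiled against an incompatible Scala version and produced the "bad symbolic reference" errors.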

I am compiling Spark 1.6.0 code with IntelliJ and hit the same error:

[error] Error: bad symbolic reference. A signature in package.class refers to type compileTimeOnly

I solved it by adding the Scala language dependencies to the project. Perhaps Maven cannot pick up the Scala SDK configured in IntelliJ, so the Scala dependencies should be specified explicitly:

<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-reflect</artifactId>
    <version>2.10.6</version>
</dependency>


<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.10.6</version>
</dependency>

<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-compiler</artifactId>
    <version>2.10.6</version>
</dependency>


Not sure if this helps, but I ran into a similar error. My problem was that I had added both spark-core and spark-mllib as Maven dependencies, and that caused some problems.
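When you do need both modules, the usual way to avoid such conflicts is to keep every Spark artifact on the same version and the same Scala cross-version. A minimal build.sbt sketch, assuming the Spark 1.2.1 / Scala 2.10.4 setup from the question:

```scala
name := "Test"

version := "1.0"

scalaVersion := "2.10.4"

// Declare all Spark modules together with one shared version, and let %%
// pick the matching _2.10 artifacts, so core and mllib can never diverge.
val sparkVersion = "1.2.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % sparkVersion % "provided",
  "org.apache.spark" %% "spark-mllib" % sparkVersion % "provided"
)
```

Note that spark-mllib already depends on spark-core, so listing spark-core explicitly is optional; if you do list it, keeping both on a single shared version variable prevents the mismatch described above.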