Errors building Spark 1.3.0 with JDK 1.6.0_45, Maven 3.0.5 on CentOS 6


When I try to build Spark 1.3.0 after adding dependencies to the package, I get errors related to class mismatches:

```
[warn] /u01/spark/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala:23: imported `Clock' is permanently hidden by definition of trait Clock in package spark
[warn] import org.apache.spark.util.{SystemClock, Clock}
[warn]                                            ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala:127: type mismatch;
[error]  found   : org.apache.spark.util.SystemClock
[error]  required: org.apache.spark.Clock
[error]   private var clock: Clock = new SystemClock()
[error]                              ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala:66: reference to Clock is ambiguous;
[error] it is imported twice in the same scope by
[error] import org.apache.spark.util._
[error] and import org.apache.spark._
[error]     clock: Clock = new SystemClock())
[error]            ^
[warn] /u01/spark/core/src/main/scala/org/apache/spark/deploy/worker/DriverRunner.scala:34: imported `Clock' is permanently hidden by definition of trait Clock in package worker
[warn] import org.apache.spark.util.{Clock, SystemClock}
[warn]                               ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/deploy/worker/DriverRunner.scala:61: type mismatch;
[error]  found   : org.apache.spark.util.SystemClock
[error]  required: org.apache.spark.deploy.worker.Clock
[error]   private var clock: Clock = new SystemClock()
[error]                              ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/deploy/worker/DriverRunner.scala:190: value getTimeMillis is not a member of org.apache.spark.deploy.worker.Clock
[error]       val processStart = clock.getTimeMillis()
[error]                                ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/deploy/worker/DriverRunner.scala:192: value getTimeMillis is not a member of org.apache.spark.deploy.worker.Clock
[error]       if (clock.getTimeMillis() - processStart > successfulRunDuration * 1000) {
[error]                 ^
[warn] /u01/spark/core/src/main/scala/org/apache/spark/executor/Executor.scala:37: imported `MutableURLClassLoader' is permanently hidden by definition of trait MutableURLClassLoader in package executor
[warn] import org.apache.spark.util.{ChildFirstURLClassLoader, MutableURLClassLoader,
[warn]                                                         ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/executor/Executor.scala:319: type mismatch;
[error]  found   : org.apache.spark.util.ChildFirstURLClassLoader
[error]  required: org.apache.spark.executor.MutableURLClassLoader
[error]       new ChildFirstURLClassLoader(urls, currentLoader)
[error]       ^
[error] /u01/spark/core/src/main/scala/org/apache/spark/executor/Executor.scala:321: trait MutableURLClassLoader is abstract; cannot be instantiated
[error]       new MutableURLClassLoader(urls, currentLoader)
[error]       ^
[warn] /u01/spark/core/src/main/scala/org/apache/spark/scheduler/local/LocalBackend.scala:89: postfix operator millis should be enabled
[warn] by making the implicit value scala.language.postfixOps visible.
[warn] This can be achieved by adding the import clause 'import scala.language.postfixOps'
[warn] or by setting the compiler option -language:postfixOps.
[warn] See the Scala docs for value scala.language.postfixOps for a discussion
[warn] why the feature should be explicitly enabled.
[warn]       context.system.scheduler.scheduleOnce(1000 millis, self, ReviveOffers)
[warn]                                                  ^
[warn] /u01/spark/core/src/main/scala/org/apache/spark/util/MutableURLClassLoader.scala:26: imported `ParentClassLoader' is permanently hidden by definition of class ParentClassLoader in package util
[warn] import org.apache.spark.util.ParentClassLoader
[warn]                              ^
[warn] 5 warnings found
[error] 7 errors found
```
I hit the same errors when trying to build with the bundled Maven and JDK 1.7.

The full build output is on pastebin (id i9PFEVJ8), and the full pom.xml on pastebin (id 8gEgT5EE).
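One common cause of "permanently hidden by definition of trait Clock" and the related type mismatches is stale compiled classes from an earlier Spark checkout leaking into an incremental build (the `Clock` trait moved into `org.apache.spark.util` around this release). A clean rebuild with the Maven settings the Spark 1.3 build documentation recommends is worth trying first; this is a sketch, and the exact profile flags depend on your Hadoop version:

```shell
# Give Maven enough heap and PermGen, as the Spark 1.3 build docs advise
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M"

# "clean" first so no classes from an older source tree linger,
# then compile the whole project, skipping tests
mvn -DskipTests clean package
```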

[Update]

I have changed the Spark versions to match 1.3.0, and now I get a cyclic dependency error. The dependencies in my pom.xml are:

```
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-2.10</artifactId>
    <version>1.3.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.10</artifactId>
    <version>1.3.0</version>
</dependency>
```

I realize now that the Kafka streaming module ships with the pre-built Spark 1.3.0 for MapR 3.x, and that these modules and dependencies are needed if the application is to consume streams.

Why are these dependencies not also version 1.3.0?

Thanks, I changed them to 1.3.0, but now I get another error: `[ERROR] The projects in the reactor contain a cyclic reference: Edge between 'Vertex{label='org.apache.spark:spark-parent_2.10:1.3.0'}' and 'Vertex{label='org.apache.spark:spark-streaming-kafka_2.10:1.3.0'}' introduces to cycle in the graph org.apache.spark:spark-streaming-kafka_2.10:1.3.0 -> org.apache.spark:spark-parent_2.10:1.3.0 -> org.apache.spark:spark-streaming-kafka_2.10:1.3.0 -> [Help 1]`

Sorry, I misread something in my original answer. You need both dependencies.

@MylesBaker I added both dependencies, but I still get the cyclic dependency error after updating the versions to 1.3.0.
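The cycle in the reactor suggests the build is resolving `spark-parent_2.10` as part of the project itself (for example by using it as the `<parent>` of the pom, or by building inside the Spark source tree) while also depending on `spark-streaming-kafka_2.10`, whose own parent POM is `spark-parent_2.10` — Maven then sees parent and child in the same reactor graph. For an application that only consumes Spark, a standalone pom depending on the published 1.3.0 artifacts avoids this entirely. A sketch, assuming the standard published artifact ids (`spark-core_2.10`, `spark-streaming-kafka_2.10`) and hypothetical application coordinates:

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>

  <!-- Hypothetical coordinates for the example application -->
  <groupId>com.example</groupId>
  <artifactId>spark-kafka-app</artifactId>
  <version>1.0-SNAPSHOT</version>

  <dependencies>
    <!-- Core Spark API, published for Scala 2.10 -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.3.0</version>
    </dependency>
    <!-- Kafka receiver for Spark Streaming -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming-kafka_2.10</artifactId>
      <version>1.3.0</version>
    </dependency>
  </dependencies>
</project>
```

With this layout there is no `<parent>` pointing at `spark-parent_2.10`, so nothing circular enters the reactor; Maven simply downloads the pre-built Spark jars from the repository.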