Java compatibility error when installing Spark on CentOS


I am trying to install Spark on CentOS. When I build Spark with the
sbt/sbt assembly
command, the following errors appear:

[warn] /root/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/SparkHadoopWriter.scala:129: method cleanupJob in class OutputCommitter is deprecated: see corresponding Javadoc for more information.
[warn]     getOutputCommitter().cleanupJob(getJobContext())
[warn]                          ^
[warn] /root/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala:592: method cleanupJob in class OutputCommitter is deprecated: see corresponding Javadoc for more information.
[warn]     jobCommitter.cleanupJob(jobTaskContext)
[warn]                  ^
[warn] two warnings found
[error] ----------
[error] 1. WARNING in /root/spark-0.8.0-incubating/core/src/main/java/org/apache/spark/network/netty/FileClient.java (at line 22)
[error]         import io.netty.channel.ChannelFuture;
[error]                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[error] The import io.netty.channel.ChannelFuture is never used
[error] ----------
[error] 2. WARNING in /root/spark-0.8.0-incubating/core/src/main/java/org/apache/spark/network/netty/FileClient.java (at line 23)
[error]         import io.netty.channel.ChannelFutureListener;
[error]                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[error] The import io.netty.channel.ChannelFutureListener is never used
[error] ----------
[error] ----------
[error] 3. WARNING in /root/spark-0.8.0-incubating/core/src/main/java/org/apache/spark/network/netty/FileServer.java (at line 23)
[error]         import io.netty.channel.Channel;
[error]                ^^^^^^^^^^^^^^^^^^^^^^^^
[error] The import io.netty.channel.Channel is never used
[error] ----------
[error] ----------
[error] 4. WARNING in /root/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/api/java/JavaSparkContextVarargsWorkaround.java (at line 20)
[error]         import java.util.Arrays;
[error]                ^^^^^^^^^^^^^^^^
[error] The import java.util.Arrays is never used
[error] ----------
[error] ----------
[error] 5. ERROR in /root/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/api/java/function/DoubleFlatMapFunction.java (at line 36)
[error]         public final Iterable<Double> apply(T t) { return call(t); }
[error]                                       ^^^^^^^^^^
[error] The method apply(T) of type DoubleFlatMapFunction<T> must override a superclass method
[error] ----------
[error] 5 problems (1 error, 4 warnings)
[error] (core/compile:compile) javac returned nonzero exit code
[error] Total time: 431 s, completed Oct 24, 2013 7:42:21 AM
The Java version installed on my machine is 1.7.0_45.
Earlier I used JDK 1.6.0_35, and it produced the same set of errors.
I also tried Java 1.4, which gave a different kind of error. Which version of Java should I use, or is the problem something else?

This looks a lot like sbt picking up an old version of Java. Spark needs at least Java 1.6, but somehow sbt is pulling in an older one.
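A quick way to confirm this is to check which JDK is actually first on the PATH, since that is what sbt invokes. The helper below is a sketch (not from the original post): `major_of` is a hypothetical function that parses the major version out of a pre-Java-9 style `java -version` banner, so a wrapper script could refuse to build with too old a JDK:

```shell
# Show which java/javac sbt would pick up (sbt simply uses the PATH):
#   which java javac
#   java -version 2>&1 | head -n 1

# Hypothetical helper: extract the major version number from a
# banner line such as:  java version "1.7.0_45"  ->  7
major_of() {
  printf '%s\n' "$1" | sed -n 's/.*"1\.\([0-9][0-9]*\)\..*/\1/p'
}

banner='java version "1.7.0_45"'
major=$(major_of "$banner")
echo "major=$major"

# Spark 0.8.0 needs at least Java 1.6
if [ "$major" -lt 6 ]; then
  echo "JDK too old for Spark (needs >= 1.6)"
fi
```

In practice you would feed `java -version 2>&1 | head -n 1` into `major_of` instead of the canned banner string.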

I installed Java 6 on CentOS 5 by following the link below and removed the other JDKs, and now it works.
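On CentOS, switching between installed JDKs is usually done with `alternatives` rather than by deleting them. A sketch of the sequence (the `JAVA_HOME` path is illustrative; use wherever your JDK actually landed):

```shell
# List the registered JDKs and interactively select the Java 6 one
sudo alternatives --config java
sudo alternatives --config javac

# Point JAVA_HOME at the chosen JDK before building
# (illustrative path; substitute your actual install location)
export JAVA_HOME=/usr/lib/jvm/java-1.6.0
export PATH="$JAVA_HOME/bin:$PATH"

# Verify the active version, then rebuild Spark
java -version
sbt/sbt assembly
```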


For reference, this question is also