java.lang.UnsupportedClassVersionError: Unsupported major.minor version 51.0 (rhdfs)

Tags: java, hadoop, rhadoop

I know this has to do with a mismatch between the compile-time and runtime Java versions, but I believe I have set all the environment variables correctly, so I don't see what could still be causing the problem:

$ java -version
java version "1.7.0_79"
Java(TM) SE Runtime Environment (build 1.7.0_79-b15)
Java HotSpot(TM) 64-Bit Server VM (build 24.79-b02, mixed mode)
$ javac -version
java 1.7.0_79
$ echo $JAVA_HOME
/Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home
$ hadoop version
Hadoop 2.7.1
In RStudio, I have:

> Sys.getenv("JAVA_HOME")
[1] "/Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home"
> library(rhdfs)
Loading required package: rJava

HADOOP_CMD=/usr/local/Cellar/hadoop/2.7.1/bin/hadoop

Be sure to run hdfs.init()
Warning message:
package ‘rJava’ was built under R version 3.1.3 
> hdfs.init()
Error in .jnew("org/apache/hadoop/conf/Configuration") : 
  java.lang.UnsupportedClassVersionError: org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
In addition, I have set $JAVA_HOME in Hadoop's hadoop-env.sh to 1.7.0 as well:

export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home
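
One cross-check worth making (my own suggestion, not part of the original post): ask the JVM that is actually running, which in the rJava case is the one embedded in the R process and may differ from the shell's `java`, which class-file version it accepts. A minimal Java sketch (the class name `RuntimeVersion` is mine):

```java
// Prints the runtime's Java version and the highest class-file
// version it accepts (the "major.minor" in the error message).
public class RuntimeVersion {
    public static void main(String[] args) {
        System.out.println("java.version       = " + System.getProperty("java.version"));
        System.out.println("java.class.version = " + System.getProperty("java.class.version"));
        // A Java 7 JVM reports java.class.version "51.0". If the JVM that
        // rJava embeds is actually Java 6 (class version 50.0), then loading
        // a class compiled for 51.0 fails with exactly the error above.
    }
}
```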

I would appreciate it if someone could point out what is going on here.

No doubt you have already searched around and found that Java major version 51 corresponds to Java 7, so you are close.

The only way to know for sure, as I see it, is to inspect the class file that is being checked -- org.apache.hadoop.conf.Configuration. Below is the start of the class file format definition. Note that minor_version and major_version are the 2nd and 3rd fields respectively. They tell you what the class was compiled for, and therefore the minimum runtime you need:

struct Class_File_Format {
   u4 magic_number;

   u2 minor_version;   
   u2 major_version;

   u2 constant_pool_count;   

   cp_info constant_pool[constant_pool_count - 1];

   u2 access_flags;
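
The check the answer describes, reading `magic_number`, `minor_version`, and `major_version` straight off the front of a compiled class file, can be sketched as follows. This is my own illustration (class and method names are mine), not code from the original answer:

```java
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

// Reads the first three fields of the class file format shown above:
// magic_number (u4), then minor_version (u2), then major_version (u2).
public class ClassVersionCheck {

    // Returns the major version of the class file on the given stream.
    static int readMajorVersion(InputStream raw) throws IOException {
        DataInputStream in = new DataInputStream(raw);
        int magic = in.readInt();            // 0xCAFEBABE for a valid class file
        if (magic != 0xCAFEBABE) {
            throw new IOException("not a class file");
        }
        int minor = in.readUnsignedShort();  // 2nd field: minor_version
        int major = in.readUnsignedShort();  // 3rd field: major_version
        System.out.printf("minor=%d major=%d%n", minor, major);
        return major;
    }

    public static void main(String[] args) throws IOException {
        try (InputStream in = new java.io.FileInputStream(args[0])) {
            int major = readMajorVersion(in);
            // major 50 = Java 6, 51 = Java 7, 52 = Java 8 (major - 44 in general)
            System.out.println("needs at least Java " + (major - 44));
        }
    }
}
```

To apply it here, you would extract org/apache/hadoop/conf/Configuration.class from the Hadoop jar (e.g. with `unzip` or `jar xf`) and point this program at it; a major version of 51 confirms the class needs at least a Java 7 runtime.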