Hadoop single-node startup problem - Hadoop - Fatal编程技术网


I am trying to start HDFS by executing the start-dfs.sh script, but I get the following errors:

Starting namenodes on [ip-xxx-xx-xxx-xx]
ip-xxx-xx-xxx-xx: Permission denied (publickey).
Starting datanodes
localhost: Permission denied (publickey).
Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/hdfs/tools/GetConf : Unsupported major.minor version 52.0
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:808)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:442)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:64)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:354)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:348)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:347)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:430)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:323)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:363)
        at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:482)
The installed Java version is javac 1.7.0_181, and Hadoop is 3.0.3.
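As a side note on reading that stack trace: the "major.minor version 52.0" in the UnsupportedClassVersionError maps to a Java release as major minus 44, so 52 means the Hadoop classes were compiled for Java 8, while a Java 7 JVM (class version 51) is trying to load them. A minimal sketch to check this against the JVM actually on the PATH (the arithmetic is standard; nothing here is specific to the question's machine):

```shell
# Class-file major version N corresponds to Java release N - 44:
# 52 -> Java 8, 51 -> Java 7. Hadoop 3.x is compiled for Java 8,
# so a Java 7 JVM cannot load its classes.
class_version=52
echo "class major version ${class_version} needs Java $((class_version - 44)) or newer"

# Compare with the JVM that is actually first on the PATH:
command -v java >/dev/null && java -version 2>&1 | head -n 1 || echo "java not found on PATH"
```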

Below are the PATH settings in my profile:

export JAVA_HOME=/usr
export PATH=$PATH:$JAVA_HOME/bin

export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin

export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
#export PATH=$PATH:$HADOOP_CONF_DIR

export SCALA_HOME=/usr/local/scala
export PATH=$PATH:$SCALA_HOME/bin
What is the problem? Am I missing something?

Thanks

1. Run ssh-keygen.

2. It will ask for the file in which to save the key; I entered /home/hadoop/.ssh/id_rsa.

3. It will ask for a passphrase; leave it empty for simplicity.

4. Copy the newly generated public key into the authorized_keys file in the user's home/.ssh directory:

cat /home/hadoop/.ssh/id_rsa.pub >> /home/hadoop/.ssh/authorized_keys

After that, ssh localhost should not ask for a password, and start-dfs.sh should work.
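The steps above can be sanity-checked before rerunning start-dfs.sh. A minimal sketch, assuming sshd is already running on localhost (sshd silently ignores keys if the .ssh directory or authorized_keys file is too permissive, which is a common cause of "Permission denied (publickey)" even after the keys are in place):

```shell
# Key-based auth is picky about permissions; tighten them first:
mkdir -p "$HOME/.ssh"
chmod 700 "$HOME/.ssh"
touch "$HOME/.ssh/authorized_keys"
chmod 600 "$HOME/.ssh/authorized_keys"

# BatchMode=yes disables password prompts entirely, so this either
# succeeds via the key or fails fast instead of hanging on a prompt:
if ssh -o BatchMode=yes -o StrictHostKeyChecking=no localhost true 2>/dev/null; then
    echo "passwordless SSH OK"
else
    echo "key auth still failing; re-check authorized_keys"
fi
```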

I did as suggested, but now I get a different error: mkdir: cannot create directory '/usr/local/hadoop/logs': Permission denied. Any suggestions?

Have you tried creating that directory yourself, or setting HADOOP_LOG_DIR to a directory you can write to?

I believe Hadoop 3 has dropped support for Java 7. But why don't you just use EMR?
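Both remaining problems can be addressed in one pass. A sketch, assuming the daemons run as a user named hadoop and the layout from the question; the JDK 8 path shown is only an example and varies by distribution:

```shell
# Assumption: HADOOP_HOME as in the question's profile.
HADOOP_HOME="${HADOOP_HOME:-/usr/local/hadoop}"

# 1. "Permission denied" on $HADOOP_HOME/logs: either pre-create it
# owned by the daemon user (run these with sudo if needed):
#   mkdir -p "$HADOOP_HOME/logs"
#   chown -R hadoop:hadoop "$HADOOP_HOME/logs"
# ...or point Hadoop at a directory you already own, e.g. via
# $HADOOP_HOME/etc/hadoop/hadoop-env.sh:
export HADOOP_LOG_DIR="$HOME/hadoop-logs"
mkdir -p "$HADOOP_LOG_DIR"

# 2. Hadoop 3 requires Java 8+, so JAVA_HOME must name a JDK 8 install
# rather than the Java 7 one currently first on the PATH. The exact
# path is distribution-specific; this one is only illustrative:
#   export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
#   export PATH=$JAVA_HOME/bin:$PATH
echo "logs will go to $HADOOP_LOG_DIR"
```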