Java - Hadoop with OpenJDK: error at start-dfs.sh (SSH?)

I'm running into a problem while setting up a 4-node Hadoop cluster. I have the following four (virtualized) machines:

  • master node
  • node1
  • node2
  • node3
I set up all the conf files on the master node and pushed them to the other machines with scp. The master node can reach the slave nodes over ssh. I set JAVA_HOME in .bashrc on every machine. Yet this is what I get:

hadoop@master-node:~$ start-dfs.sh
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/home/hadoop/hadoop/share/hadoop/common/lib/hadoop-auth-2.8.4.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Starting namenodes on [node-master]
node-master: ssh: connect to host node-master port 22: Connection timed out
node1: Error: JAVA_HOME is not set and could not be found.
node2: Error: JAVA_HOME is not set and could not be found.
node3: Error: JAVA_HOME is not set and could not be found.
Starting secondary namenodes [0.0.0.0]
hadoop@0.0.0.0's password: 
0.0.0.0: Error: JAVA_HOME is not set and could not be found.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/home/hadoop/hadoop/share/hadoop/common/lib/hadoop-auth-2.8.4.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
[3 possibilities] There seems to be an issue with OpenJDK 11, though I'm not sure it's what is causing this mess. The errors suggest an ssh problem, but i) I pushed the conf files over without any trouble, and ii) I can reach every node from the master. Could it be related to the way the JAVA_HOME path is set? Here is the end of my .bashrc:

export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
export PATH=PATH:$PATH/bin
Thanks in advance for any clue (I don't use Java much and feel a bit lost here).

[EDIT] Same with Oracle JDK 8:

hadoop@master-node:~$  readlink -f /usr/bin/java
/usr/lib/jvm/java-8-oracle/jre/bin/java
hadoop@master-node:~$ export JAVA_HOME=/usr/lib/jvm/java-8-oracle/jre
hadoop@master-node:~$ start-dfs.sh
Starting namenodes on [node-master]
node-master: ssh: connect to host node-master port 22: Connection timed out
node1: Error: JAVA_HOME is not set and could not be found.
node3: Error: JAVA_HOME is not set and could not be found.
node2: Error: JAVA_HOME is not set and could not be found.
Starting secondary namenodes [0.0.0.0]
hadoop@0.0.0.0's password: 

0.0.0.0: Error: JAVA_HOME is not set and could not be found.

Can you export the paths like this:

export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
export PATH=$PATH:$JAVA_HOME/bin
Then you have to make sure your PATH actually picks up the JAVA_HOME variable. After adding the JAVA and PATH variables to your .bashrc file, run:

source ~/.bashrc
Then check with echo $PATH;
if the output contains the JAVA_HOME value, it should work.
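Note that this check only covers the interactive shell on the master. Hadoop's start scripts run their remote commands in fresh, non-interactive shells, which do not read ~/.bashrc. A minimal local simulation of that effect (using `env -i` as a stand-in for the clean environment a remote command starts with; the JDK path is the one from the question):

```shell
# JAVA_HOME set in the current shell, as sourcing .bashrc would do.
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
echo "current shell: ${JAVA_HOME:-<unset>}"

# A fresh shell with a clean environment -- roughly what running a
# command via `ssh node1 ...` gives the Hadoop scripts -- never sees it.
env -i sh -c 'echo "fresh shell:   ${JAVA_HOME:-<unset>}"'
```

The first line prints the JDK path; the second prints `<unset>`, which is exactly the situation the workers are in when start-dfs.sh reaches them.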

Found it!!!!! It turns out that JAVA_HOME gets lost over the ssh connection (why, I don't know).

To get around this,

export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
should also be added to

hadoop/etc/hadoop/hadoop-env.sh
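A sketch of applying that fix idempotently (the hadoop-env.sh path follows the layout in the question; `append_java_home` is a helper name invented here, and the demo runs against a scratch file rather than the real config):

```shell
#!/bin/sh
# hadoop-env.sh is sourced by Hadoop's own scripts, so a JAVA_HOME set
# there survives the non-interactive ssh sessions that skip ~/.bashrc.

append_java_home() {
  env_file="$1"; jdk="$2"
  # Append only if no JAVA_HOME export exists yet, so reruns are safe.
  grep -q '^export JAVA_HOME=' "$env_file" || \
    printf 'export JAVA_HOME=%s\n' "$jdk" >> "$env_file"
}

# Demo on a scratch file; the real target would be
# ~/hadoop/etc/hadoop/hadoop-env.sh on every node.
f=$(mktemp)
append_java_home "$f" /usr/lib/jvm/java-11-openjdk-amd64
append_java_home "$f" /usr/lib/jvm/java-11-openjdk-amd64  # no-op on rerun
grep '^export JAVA_HOME=' "$f"
```

Since the errors come from the workers, the edited file has to land on every machine, e.g. with the same scp step used for the other conf files.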

Same with Oracle JDK 8? Unfortunately, it isn't (just tried): exact same behavior. By the way, it looks more and more like an access-rights issue; Hadoop doesn't officially support Java 9 and above.