
Hadoop | local host and destination host differ


I'm trying to install Hadoop on my local machine, but I'm stuck on this:

▶ hadoop fs -mkdir /home/hadoop
mkdir: Failed on local exception: org.apache.hadoop.ipc.RpcException: RPC response exceeds maximum data length; Host Details : local host is: "aleph-pc.local/192.168.1.129"; destination host is: "aleph-pc":8020;
I think this is related to how I configured sshd, or to the value of fs.default.name, since the two seem closely connected. Where can I go from here? Any help is much appreciated.

▶ cat /etc/hosts
127.0.0.1 localhost
::1 localhost
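
Worth noting: /etc/hosts above has no entry for aleph-pc, the hostname the error reports as the destination. A minimal sketch of a loopback mapping that lets that name resolve on a single machine (an illustration only, not something the original post tried):

127.0.0.1   localhost aleph-pc
::1         localhost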

▶ hadoop version
Hadoop 2.9.2
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r 826afbeae31ca687bc2f8471dc841b66ed2c6704
Compiled by ajisaka on 2018-11-13T12:42Z
Compiled with protoc 2.5.0
From source with checksum 3a9939967262218aa556c684d107985
This command was run using /home/aleph/Documents/Projects/Hadoop/hadoop-2.9.2/share/hadoop/common/hadoop-common-2.9.2.jar

▶ tail hadoop-2.9.2/etc/hadoop/core-site.xml 
-->

<!-- Put site-specific property overrides in this file. -->

<configuration>
    <property>                                                                                    
        <name>fs.default.name</name>                                                              
        <value>hdfs://aleph-pc/127.0.0.1</value>
    </property>      


</configuration>

▶ jps
17619 Main
32133 NodeManager
32037 ResourceManager
31879 SecondaryNameNode
17816 RemoteMavenServer36
1020 Jps
31678 DataNode
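
For reference, a quick way to confirm which endpoint the client will actually dial is Hadoop's stock getconf tool (fs.defaultFS is the current name of the deprecated fs.default.name key; with the configuration above it should print the malformed hdfs://aleph-pc/127.0.0.1):

▶ hdfs getconf -confKey fs.defaultFS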

hdfs://aleph-pc/127.0.0.1 should be hdfs://aleph-pc.
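
A minimal sketch of the corrected core-site.xml. fs.defaultFS is the non-deprecated name for fs.default.name in Hadoop 2.x (keeping the old key would also work); the essential change is dropping the stray /127.0.0.1 path from the URI:

<configuration>
    <property>
        <!-- current name of the deprecated fs.default.name key -->
        <name>fs.defaultFS</name>
        <!-- the URI is just scheme://host, with no path component -->
        <value>hdfs://aleph-pc</value>
    </property>
</configuration>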
Also, the NameNode seems to have never started (it does not appear in the jps output above), so you should check its log files.

Stack Overflow is a site for programming and development questions; for this one you should probably use another site on the network. See also the Help Center.

There may be a problem with the port number you are using. Try this:
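The rest of that comment did not survive; as a guess at what making the port explicit might look like (8020 is HDFS's default NameNode RPC port, an assumption here, not taken from the original):

<property>
    <name>fs.defaultFS</name>
    <!-- hypothetical: same host, default HDFS RPC port spelled out -->
    <value>hdfs://aleph-pc:8020</value>
</property>

To check whether the NameNode died at startup, its log sits under the Hadoop logs directory by default; a sketch (the exact file name depends on your user name and hostname):

▶ tail -n 50 hadoop-2.9.2/logs/hadoop-*-namenode-*.log

After fixing core-site.xml, restart the daemons with stop-dfs.sh and start-dfs.sh, and confirm that NameNode now shows up in jps before retrying the mkdir.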