Hadoop: problems when running start-dfs.sh


I created a 4-node cluster using this link: , but once I got to the part about starting the Hadoop cluster, I saw errors like the following:

$HADOOP_HOME/sbin/start-dfs.sh

Starting namenodes on [namenode_dns]
namenode_dns: mkdir: cannot create 
directory ‘/usr/local/hadoop/logs’: Permission denied
namenode_dns: chown: cannot access 
'/usr/local/hadoop/logs': No such file or directory
namenode_dns: starting namenode, logging 
to /usr/local/hadoop/logs/hadoop-ubuntu-namenode-ip-172-31-2-168.out
namenode_dns: 
/usr/local/hadoop/sbin/hadoop-daemon.sh: line 159: 
/usr/local/hadoop/logs/hadoop-ubuntu-namenode-ip-172-31-2-168.out: No 
such file or directory
namenode_dns: head: cannot open 
'/usr/local/hadoop/logs/hadoop-ubuntu-namenode-ip-172-31-2-168.out' 
for reading: No such file or directory
namenode_dns: 
/usr/local/hadoop/sbin/hadoop-daemon.sh: line 177: 
/usr/local/hadoop/logs/hadoop-ubuntu-namenode-ip-172-31-2-168.out: No 
such file or directory
namenode_dns: 
/usr/local/hadoop/sbin/hadoop-daemon.sh: line 178: 
/usr/local/hadoop/logs/hadoop-ubuntu-namenode-ip-172-31-2-168.out: No 
such file or directory
ip-172-31-1-82: starting datanode, logging to 
/usr/local/hadoop/logs/hadoop-ubuntu-datanode-ip-172-31-1-82.out
ip-172-31-7-221: starting datanode, logging to 
/usr/local/hadoop/logs/hadoop-ubuntu-datanode-ip-172-31-7-221.out
ip-172-31-14-230: starting datanode, logging to 
/usr/local/hadoop/logs/hadoop-ubuntu-datanode-ip-172-31-14-230.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: mkdir: cannot create directory ‘/usr/local/hadoop/logs’: 
Permission denied
0.0.0.0: chown: cannot access '/usr/local/hadoop/logs': No such file 
or directory
0.0.0.0: starting secondarynamenode, logging to 
/usr/local/hadoop/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-2-
168.out
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 159: 
/usr/local/hadoop/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-2-
168.out: No such file or directory
0.0.0.0: head: cannot open '/usr/local/hadoop/logs/hadoop-ubuntu-
secondarynamenode-ip-172-31-2-168.out' for reading: No such file or 
directory
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 177: 
/usr/local/hadoop/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-2-
168.out: No such file or directory
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 178: 
/usr/local/hadoop/logs/hadoop-ubuntu-secondarynamenode-ip-172-31-2-
168.out: No such file or directory
Here is what happens when I run jps:

20688 Jps

I am not sure where I went wrong in the configuration. I am new to Hadoop and MapReduce, so please keep it simple.
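A jps output showing only the Jps process itself means none of the daemons stayed up. A quick way to see why is to compare who owns the Hadoop directory with who is launching the scripts. A minimal diagnostic sketch for a POSIX shell; the /usr/local/hadoop path is taken from the error log above and may differ on your machines:

```shell
# Diagnostic sketch: report whether the current user owns a directory.
# If the owners differ, start-dfs.sh cannot create logs/ inside it.
owns_dir() {
  dir="$1"
  if [ ! -d "$dir" ]; then
    echo "MISSING: $dir does not exist"
    return 0
  fi
  # Third column of `ls -ld` is the owning user.
  owner=$(ls -ld "$dir" | awk '{print $3}')
  if [ "$owner" = "$(id -un)" ]; then
    echo "OK: $(id -un) owns $dir"
  else
    echo "PROBLEM: $dir is owned by $owner, but you are $(id -un)"
  fi
}

owns_dir /usr/local/hadoop
```

Run this as the same user that runs start-dfs.sh, on every node, since each node writes its own logs locally.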

This is a permissions issue. It looks like the user you are starting the Hadoop services with (presumably ubuntu) has no write permission on the Hadoop home directory (/usr/local/hadoop), most likely because you copied the Hadoop files as sudo/root. Try recursively changing the ownership of the Hadoop home directory, or grant write access to the /usr/local/hadoop/logs directory:

chown -R ubuntu:ubuntu /usr/local/hadoop

or

chmod 777 /usr/local/hadoop/logs

(note that 777 opens the directory to all users)

That did it. Thanks. I would +1 if I had the reputation. Cool!! No worries, keep contributing to the community to earn enough reputation.
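The fix can be wrapped in a small script that creates the log directory, hands ownership to the service user, and then verifies the directory is actually writable. This is a hedged sketch: the ubuntu user and /usr/local/hadoop path come from the question, and the mkdir/chown lines need root, so they are shown commented out:

```shell
# Sketch of the fix: make the Hadoop tree writable by the service user.
# HADOOP_HOME and the ubuntu:ubuntu owner are assumptions from the
# question; adjust them to your layout.
HADOOP_HOME="${HADOOP_HOME:-/usr/local/hadoop}"

# Run these two as root (or via sudo):
# mkdir -p "$HADOOP_HOME/logs"
# chown -R ubuntu:ubuntu "$HADOOP_HOME"

# Verification any user can run: try to create and remove a file in logs/.
logs_writable() {
  dir="$1/logs"
  if touch "$dir/.write_test" 2>/dev/null; then
    rm -f "$dir/.write_test"
    echo "writable"
  else
    echo "not writable"
  fi
}

logs_writable "$HADOOP_HOME"
```

Once the check prints "writable" on every node, rerun $HADOOP_HOME/sbin/start-dfs.sh and jps should list NameNode, DataNode, and SecondaryNameNode processes instead of only Jps.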