Hadoop installation problem:


I installed Hadoop by following a tutorial. Unfortunately, when I run the start-all.sh script, the following errors are printed on the console:

hduser@dennis-HP:/usr/local/hadoop/sbin$ start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
hadoop config script is run...
hdfs script is run...
Config parameter : 1
16/04/10 23:45:40 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: mkdir: cannot create directory ‘/usr/local/hadoop/logs’: Permission denied
localhost: chown: cannot access ‘/usr/local/hadoop/logs’: No such file or directory
localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-hduser-namenode-dennis-HP.out
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop/logs/hadoop-hduser-namenode-dennis-HP.out: No such file or directory
localhost: head: cannot open ‘/usr/local/hadoop/logs/hadoop-hduser-namenode-dennis-HP.out’ for reading: No such file or directory
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop/logs/hadoop-hduser-namenode-dennis-HP.out: No such file or directory
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop/logs/hadoop-hduser-namenode-dennis-HP.out: No such file or directory
localhost: mkdir: cannot create directory ‘/usr/local/hadoop/logs’: Permission denied
localhost: chown: cannot access ‘/usr/local/hadoop/logs’: No such file or directory
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-hduser-datanode-dennis-HP.out
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop/logs/hadoop-hduser-datanode-dennis-HP.out: No such file or directory
localhost: head: cannot open ‘/usr/local/hadoop/logs/hadoop-hduser-datanode-dennis-HP.out’ for reading: No such file or directory
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop/logs/hadoop-hduser-datanode-dennis-HP.out: No such file or directory
localhost: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop/logs/hadoop-hduser-datanode-dennis-HP.out: No such file or directory
Starting secondary namenodes [0.0.0.0]
0.0.0.0: mkdir: cannot create directory ‘/usr/local/hadoop/logs’: Permission denied
0.0.0.0: chown: cannot access ‘/usr/local/hadoop/logs’: No such file or directory
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-dennis-HP.out
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 159: /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-dennis-HP.out: No such file or directory
0.0.0.0: head: cannot open ‘/usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-dennis-HP.out’ for reading: No such file or directory
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 177: /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-dennis-HP.out: No such file or directory
0.0.0.0: /usr/local/hadoop/sbin/hadoop-daemon.sh: line 178: /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-dennis-HP.out: No such file or directory
16/04/10 23:45:55 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
yarn script is run...
starting yarn daemons
mkdir: cannot create directory ‘/usr/local/hadoop/logs’: Permission denied
chown: cannot access ‘/usr/local/hadoop/logs’: No such file or directory
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn-hduser-resourcemanager-dennis-HP.out
/usr/local/hadoop/sbin/yarn-daemon.sh: line 124: /usr/local/hadoop/logs/yarn-hduser-resourcemanager-dennis-HP.out: No such file or directory
head: cannot open ‘/usr/local/hadoop/logs/yarn-hduser-resourcemanager-dennis-HP.out’ for reading: No such file or directory
/usr/local/hadoop/sbin/yarn-daemon.sh: line 129: /usr/local/hadoop/logs/yarn-hduser-resourcemanager-dennis-HP.out: No such file or directory
/usr/local/hadoop/sbin/yarn-daemon.sh: line 130: /usr/local/hadoop/logs/yarn-hduser-resourcemanager-dennis-HP.out: No such file or directory
localhost: mkdir: cannot create directory ‘/usr/local/hadoop/logs’: Permission denied
localhost: chown: cannot access ‘/usr/local/hadoop/logs’: No such file or directory
localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-hduser-nodemanager-dennis-HP.out
localhost: /usr/local/hadoop/sbin/yarn-daemon.sh: line 124: /usr/local/hadoop/logs/yarn-hduser-nodemanager-dennis-HP.out: No such file or directory
localhost: head: cannot open ‘/usr/local/hadoop/logs/yarn-hduser-nodemanager-dennis-HP.out’ for reading: No such file or directory
localhost: /usr/local/hadoop/sbin/yarn-daemon.sh: line 129: /usr/local/hadoop/logs/yarn-hduser-nodemanager-dennis-HP.out: No such file or directory
localhost: /usr/local/hadoop/sbin/yarn-daemon.sh: line 130: /usr/local/hadoop/logs/yarn-hduser-nodemanager-dennis-HP.out: No such file or directory
And when I execute the jps command, I only get the following output:

hduser@dennis-HP:/usr/local/hadoop/sbin$ jps
3802 Jps
I am new to Hadoop, so please point me to an article that will help me install Hadoop without running into these problems.


Or, if possible (preferably), help me fix the issue at hand: let me know what went wrong and how to resolve it.

Check the permissions on
/usr/local/hadoop/logs
If the directory is not owned by hduser, change the ownership:


sudo chown -R username:group directory
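A minimal sketch of that fix for this setup (the `hduser` user and `hadoop` group names are assumptions taken from the question; substitute your own):

```shell
# Create the missing logs directory and hand it to the Hadoop user,
# so the daemon scripts can write their .out files there.
# 'hduser' and the 'hadoop' group are assumptions -- use your own names.
sudo mkdir -p /usr/local/hadoop/logs
sudo chown -R hduser:hadoop /usr/local/hadoop/logs
ls -ld /usr/local/hadoop/logs   # the owner column should now show hduser
```

After this, the `mkdir: cannot create directory` and `chown: cannot access` errors from the daemon scripts should go away, because the directory already exists and is writable.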
Have you tried start-dfs.sh?
Try the commands below and see what happens:

hdfs namenode -format
start-dfs.sh
start-yarn.sh

Honestly, I don't know why I was getting that error.. but I tore down my entire installation following the instructions provided, and reinstalled it using the installation method described on the official website -


But you are right @Krishna, the logs directory was created automatically after the reinstall. My guess is that the configuration details of my earlier installation were stale and most likely interfered with Hadoop.

Your current user has limited permissions on /usr/local/hadoop. Try changing the permissions:

sudo chmod -R 777 /usr/local/hadoop/


Please check whether the permissions on the folder are set correctly, using the chmod or chown command.

Hadoop provides scripts to start and stop the services of a single node individually, i.e.: hadoop-daemon.sh start [node]

Similarly, there are scripts to start/stop YARN. The following article describes in detail how to install Apache Hadoop.
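For example, assuming `$HADOOP_HOME/sbin` is on the PATH (as in the Hadoop 2.x layout this question uses), each daemon can be started on its own, so a failure points directly at the daemon that cannot come up:

```shell
# Start each HDFS daemon individually instead of using start-all.sh
for svc in namenode datanode secondarynamenode; do
  hadoop-daemon.sh start "$svc"
done
# YARN daemons have a separate script
for svc in resourcemanager nodemanager; do
  yarn-daemon.sh start "$svc"
done
```

The same scripts accept `stop` in place of `start` to shut a single daemon down.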

I ran into a similar problem and found that the HADOOP_PREFIX path in hadoop-env.sh was incomplete. Instead of pointing to my installation directory, it pointed to the root directory. Fixing that resolved the problem and everything works.

Right way:

export HADOOP_PREFIX=/home/karan/hadoop-install/hadoop-3.2.1
Wrong way:

export HADOOP_PREFIX=/hadoop-install/hadoop-3.2.1
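A quick sanity check catches this kind of mistake before the daemons are started: HADOOP_PREFIX must point at the directory that actually contains bin/ and sbin/. (The path below is the answerer's; use your own.)

```shell
export HADOOP_PREFIX=/home/karan/hadoop-install/hadoop-3.2.1
# A valid prefix contains the sbin/ directory with the start scripts
if [ -d "$HADOOP_PREFIX/sbin" ]; then
  echo "HADOOP_PREFIX looks valid"
else
  echo "HADOOP_PREFIX is wrong: $HADOOP_PREFIX" >&2
fi
```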
I had the same problem.

I deleted this file:

hadoop-hadoop_amine-datanode-amine.out, located in: usr/local/hadoopy/logs

Then I ran start-dfs.sh and start-yarn.sh again.

jps:

14048 Jps
32226 ResourceManager
32403 NodeManager
6164 SecondaryNameNode
13548 DataNode
5806 NameNode

There is actually no logs directory under /usr/local/hadoop. Am I missing some configuration?

If you look inside the hadoop-env.sh file, you will find the location where log files are stored: $HADOOP_HOME/logs by default (# export HADOOP_LOG_DIR=${HADOOP_LOG_DIR}/$USER). If the logs directory does not exist, it means the log files would be created automatically, but they have not been created yet.

I followed the tutorial exactly. Did that mess up the installation?
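For reference, the log-directory lines in etc/hadoop/hadoop-env.sh look like this in Hadoop 2.x; the /var/log/hadoop path below is only an illustration of where you might redirect logs, not a required value:

```shell
# Where log files are stored. $HADOOP_HOME/logs by default.
# export HADOOP_LOG_DIR=${HADOOP_LOG_DIR}/$USER

# Uncomment and point HADOOP_LOG_DIR somewhere the Hadoop user can
# write, if it cannot create $HADOOP_HOME/logs (path is an example):
# export HADOOP_LOG_DIR=/var/log/hadoop/$USER
```

Redirecting HADOOP_LOG_DIR to a user-writable location is an alternative to chown-ing /usr/local/hadoop/logs.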