Can't find start-all.sh in my Hadoop installation
I am trying to install Hadoop on my local machine, following a tutorial. I have also set up my Hadoop home directory. This is the command I am trying to run:
hduser@ubuntu:~$ /usr/local/hadoop/bin/start-all.sh
This is the error I get:
-su: /usr/local/hadoop/bin/start-all.sh: No such file or directory
This is what I added to my $HOME/.bashrc file:
# Set Hadoop-related environment variables
export HADOOP_HOME=/usr/local/hadoop
# Set JAVA_HOME (we will also configure JAVA_HOME directly for Hadoop later on)
export JAVA_HOME=/usr/lib/jvm/java-8-oracle
# Some convenient aliases and functions for running Hadoop-related commands
unalias fs &> /dev/null
alias fs="hadoop fs"
unalias hls &> /dev/null
alias hls="fs -ls"
# If you have LZO compression enabled in your Hadoop cluster and
# compress job outputs with LZOP (not covered in this tutorial):
# Conveniently inspect an LZOP compressed file from the command
# line; run via:
#
# $ lzohead /hdfs/path/to/lzop/compressed/file.lzo
#
# Requires installed 'lzop' command.
#
lzohead () {
    hadoop fs -cat "$1" | lzop -dc | head -1000 | less
}
# Add Hadoop bin/ directory to PATH
export PATH=$PATH:$HADOOP_HOME/bin
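The "No such file or directory" error above suggests the script simply is not at the path being invoked. As a quick sketch (the /usr/local/hadoop path comes from the question; the helper name find_start_script is made up here), you can probe both bin/ and sbin/ to see where the script actually lives:

```shell
#!/bin/sh
# Hypothetical helper: report which directory under a Hadoop install
# actually contains start-all.sh (newer releases keep it in sbin/, not bin/).
find_start_script() {
  hadoop_home="$1"
  for d in sbin bin; do
    if [ -e "$hadoop_home/$d/start-all.sh" ]; then
      echo "$hadoop_home/$d/start-all.sh"
      return 0
    fi
  done
  echo "start-all.sh not found under $hadoop_home" >&2
  return 1
}

# Default to the install path used in the question.
find_start_script "${HADOOP_HOME:-/usr/local/hadoop}" || true
```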
EDIT: After trying the solution given by mahendra, I get the following output:
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-hduser-namenode-mmt-HP-ProBook-430-G3.out
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-hduser-datanode-mmt-HP-ProBook-430-G3.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-hduser-secondarynamenode-mmt-HP-ProBook-430-G3.out
starting yarn daemons
starting resourcemanager, logging to /usr/local/hadoop/logs/yarn-hduser-resourcemanager-mmt-HP-ProBook-430-G3.out
localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-hduser-nodemanager-mmt-HP-ProBook-430-G3.out
Try running:

hduser@ubuntu:~$ /usr/local/hadoop/sbin/start-all.sh

since start-all.sh and stop-all.sh are located in the sbin directory, while the hadoop binary is located in the bin directory.

Also update your .bashrc:

export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

so that you can access start-all.sh directly.
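To confirm the PATH change took effect, here is a small sketch (assuming the /usr/local/hadoop layout from the question; adjust HADOOP_HOME if yours differs):

```shell
#!/bin/sh
# Sketch: extend PATH with Hadoop's bin/ and sbin/ and check whether
# start-all.sh now resolves without typing its full path.
export HADOOP_HOME="${HADOOP_HOME:-/usr/local/hadoop}"
export PATH="$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin"

# 'command -v' prints the full path if the script is now reachable.
command -v start-all.sh || echo "start-all.sh still not on PATH"
```

Remember to re-source .bashrc (source ~/.bashrc) or open a new shell for the change to apply.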
Hi, it has been a long time since I asked this question. Your command above worked and gave the output I posted in the edit section. Is that the correct output? See the accepted answer of the linked question. In local mode you can also check the Java processes with the $JAVA_HOME/bin/jps
command.
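The jps check mentioned above can be scripted. A hedged sketch follows (the daemon names are what a default pseudo-distributed start-dfs.sh + start-yarn.sh run typically launches; check_daemons is a made-up helper):

```shell
#!/bin/sh
# Sketch: scan jps output (jps ships with the JDK) and flag which of the
# expected Hadoop daemons appear to be running.
check_daemons() {
  procs="$1"   # jps output, one "pid Name" per line
  for daemon in NameNode DataNode SecondaryNameNode ResourceManager NodeManager; do
    # -w keeps "NameNode" from also matching "SecondaryNameNode".
    if printf '%s\n' "$procs" | grep -qw "$daemon"; then
      echo "$daemon: running"
    else
      echo "$daemon: NOT running"
    fi
  done
}

# Only run the live check when jps is actually installed.
if command -v jps >/dev/null 2>&1; then
  check_daemons "$(jps)"
fi
```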