Hadoop: error when running start-all.sh


When running start-all.sh, the following error appears:

yunweiguo@172.16.192.134's password: 
172.16.192.135: bash: line 0: cd: /Users/yunweiguo/hadoop/hadoop-1.2.1/libexec/..: No such file or directory
172.16.192.135: bash: /Users/yunweiguo/hadoop/hadoop-1.2.1/bin/hadoop-daemon.sh: No such file or directory

I suspect you haven't set up your .bashrc correctly. Follow these steps:

Open $HOME/.bashrc with vi and add the following lines at the end of the file (change HADOOP_HOME to your own installation path):

 # Set Hadoop-related environment variables
 export HADOOP_HOME=/usr/local/hadoop

 # Set JAVA_HOME (we will also configure JAVA_HOME directly for Hadoop later on)
 export JAVA_HOME=/usr/lib/jvm/java-6-sun

 # Some convenient aliases and functions for running Hadoop-related commands
 unalias fs &> /dev/null
 alias fs="hadoop fs"
 unalias hls &> /dev/null
 alias hls="fs -ls"

 # If you have LZO compression enabled in your Hadoop cluster and
 # compress job outputs with LZOP (not covered in this tutorial):
 # Conveniently inspect an LZOP-compressed file from the command
 # line; run via:
 #
 # $ lzohead /hdfs/path/to/lzop/compressed/file.lzo
 #
 # Requires the 'lzop' command to be installed.
 lzohead () {
     hadoop fs -cat "$1" | lzop -dc | head -1000 | less
 }

 # Add Hadoop bin/ directory to PATH
 export PATH=$PATH:$HADOOP_HOME/bin
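
After saving, you can check that the settings took effect with a sketch like the one below (assuming the same variable names as above; `hadoop` will only resolve if HADOOP_HOME really points at your installation):

```shell
# Reload .bashrc so the new variables take effect in the current shell
source "$HOME/.bashrc"

# Both should print non-empty paths
echo "HADOOP_HOME=$HADOOP_HOME"
echo "JAVA_HOME=$JAVA_HOME"

# Confirm hadoop is now resolvable via PATH; if not, HADOOP_HOME is wrong
command -v hadoop || echo "hadoop not found on PATH - check HADOOP_HOME"
```

Note that start-all.sh launches daemons over SSH on every node listed in conf/slaves, so each node needs the same setup: the error above shows that 172.16.192.135 has no Hadoop at the path the master expects.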