Missing Hadoop 2.2.0 jar files


I have installed hadoop-2.2.0 and am trying to run the mapreduce example code that ships with it. However, it fails every time with a ClassNotFoundException, and the cause I found is what gets set in the hadoop.sh file. Below is what is in that sh file; none of these class files come bundled with the installation, although I do see that they exist in the source.

if [ "$COMMAND" = "fs" ] ; then
  CLASS=org.apache.hadoop.fs.FsShell
elif [ "$COMMAND" = "version" ] ; then
  CLASS=org.apache.hadoop.util.VersionInfo
elif [ "$COMMAND" = "jar" ] ; then
  CLASS=org.apache.hadoop.util.RunJar
elif [ "$COMMAND" = "checknative" ] ; then
  CLASS=org.apache.hadoop.util.NativeLibraryChecker
elif [ "$COMMAND" = "distcp" ] ; then
  CLASS=org.apache.hadoop.tools.DistCp
  CLASSPATH=${CLASSPATH}:${TOOL_PATH}
elif [ "$COMMAND" = "daemonlog" ] ; then
  CLASS=org.apache.hadoop.log.LogLevel
elif [ "$COMMAND" = "archive" ] ; then
  CLASS=org.apache.hadoop.tools.HadoopArchives
  CLASSPATH=${CLASSPATH}:${TOOL_PATH}
elif [[ "$COMMAND" = -*  ]] ; then
    # class and package names cannot begin with a -
    echo "Error: No command named \`$COMMAND' was found. Perhaps you meant \`hadoop ${COMMAND#-}'"
    exit 1
else
  CLASS=$COMMAND
fi
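
A quick way to sanity-check this (just a sketch, assuming the standard 2.2.0 binary tarball layout under /usr/local/hadoop-2.2.0 and a regular JDK jar tool on the PATH) is:

cd /usr/local/hadoop-2.2.0

# Print the classpath the wrapper script will pass to java
bin/hadoop classpath

# RunJar normally ships in the common jar of the binary distribution;
# list the jar contents and look for the class
jar tf share/hadoop/common/hadoop-common-2.2.0.jar | grep util/RunJar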
Here is the error:

Exception in thread "main" java.lang.NoClassDefFoundError: org.apache.hadoop.util.RunJar
   at gnu.java.lang.MainThread.run(libgcj.so.13)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.RunJar not found in gnu.gcj.runtime.SystemClassLoader{urls=[file:/usr/local/hadoop-2.2.0/etc/hadoop/,file:/usr/local/hadoop-2.2.0/share/hadoop/hdfs/], parent=gnu.gcj.runtime.ExtensionClassLoader{urls=[], parent=null}}
   at java.net.URLClassLoader.findClass(libgcj.so.13)
   at gnu.gcj.runtime.SystemClassLoader.findClass(libgcj.so.13)
   at java.lang.ClassLoader.loadClass(libgcj.so.13)
   at java.lang.ClassLoader.loadClass(libgcj.so.13)
   at gnu.java.lang.MainThread.run(libgcj.so.13)

I finally resolved this. The YARN (for mapreduce) and DFS processes need to be running in the background before any hadoop job can run; as a Hadoop n00b I had missed that. To start the two of them, run start-dfs.sh and start-yarn.sh from a command window. Each of them spins up a couple of consoles, and each prints plenty of diagnostic information.
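
A minimal sketch of that sequence on Linux, assuming the standard 2.2.0 layout under /usr/local/hadoop-2.2.0, a regular JDK on the PATH (so jps is available), and the example jar name shipped with the binary distribution:

cd /usr/local/hadoop-2.2.0

# Bring up the HDFS and YARN daemons (the scripts live under sbin/ in 2.2.0)
sbin/start-dfs.sh
sbin/start-yarn.sh

# Verify the daemons are running; expect NameNode, DataNode,
# SecondaryNameNode, ResourceManager and NodeManager in the listing
jps

# Re-run one of the bundled examples, e.g. the pi estimator
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar pi 2 5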

Could you list the complete command you ran and the error message you received?