Error running Hadoopizer with Java

I am trying to run Hadoopizer with the following command:

hadoop jar hadd.jar -c config.xml -w /home/salma/hdfs/tmp
where '/home/salma/hdfs/tmp' specifies a directory on my HDFS file system in which Hadoopizer will write some temporary data. The run fails with:

INFO: Adding file 'file:/home/salma/Desktop/data_rna_seq/chr22_ERCC92.fa' to distributed cache (/home/salma/hdfs/tmp/static_data/db/chr22_ERCC92.fa)
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/filecache/DistributedCache
    at org.genouest.hadoopizer.Hadoopizer.addToDistributedCache(Hadoopizer.java:469)
    at org.genouest.hadoopizer.Hadoopizer.prepareJob(Hadoopizer.java:282)
    at org.genouest.hadoopizer.Hadoopizer.main(Hadoopizer.java:71)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.filecache.DistributedCache
Here is the Java code of the addToDistributedCache method:

private void addToDistributedCache(String fileId, URI uri, Path hdfsBasePath) throws IOException {

    FileSystem fs = hdfsBasePath.getFileSystem(jobConf);
    Path localPath = new Path(uri);
    Path hdfsPath = new Path(hdfsBasePath.toString() + Path.SEPARATOR + localPath.getName());

    if (uri.getScheme().equalsIgnoreCase("file")) {
        logger.info("Adding file '" + uri + "' to distributed cache (" + hdfsPath + ")");
        fs.copyFromLocalFile(false, true, localPath, hdfsPath);
    }
    else if (uri.getScheme().equalsIgnoreCase("hdfs")) {
        logger.info("Adding file '" + uri + "' to distributed cache");
        hdfsPath = localPath;
    }
    else {
        // TODO support other protocols (s3? ssh? http? ftp?)
        System.err.println("Unsupported URI scheme: " + uri.getScheme() + " (in " + uri + ")");
        System.exit(1);
    }

    // Add a fragment to the URI: Hadoop will automatically create a symlink in the work dir pointing to this file.
    // Don't add the fragment to hdfsPath because it would be encoded in a strange way.
    URI hdfsUri = URI.create(hdfsPath.toString() + "#" + jobConf.get("hadoopizer.static.data.link.prefix") + fileId + "__" + localPath.getName());
    DistributedCache.addCacheFile(hdfsUri, jobConf);
}
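
Note that in Hadoop 2.x the org.apache.hadoop.filecache.DistributedCache class is deprecated and its cache-file methods are exposed directly on the Job API. Below is a minimal sketch of the non-deprecated equivalent; the job name, the "#mylink" symlink name, and the standalone class are placeholders for illustration, and the file path is taken from the log above:

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class CacheFileExample {

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "cache-example"); // job name is a placeholder

        // Same "#fragment" trick as in addToDistributedCache above: Hadoop
        // creates a symlink named "mylink" in the task working directory
        // pointing to the cached HDFS file.
        URI cacheUri = URI.create("/home/salma/hdfs/tmp/static_data/db/chr22_ERCC92.fa#mylink");
        job.addCacheFile(cacheUri);

        // ... configure mapper/reducer and input/output, then submit the job.
    }
}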
Can anyone explain this error to me?

I am using Hadoop 2.7.3.

What is Hadoopizer? @vefthym Hadoopizer is a framework built on Hadoop for analyzing bioinformatics data.