Java Hadoop MapFile reader doesn't detect files in the distributed cache

Tags: java, hadoop, mapreduce, distributed-cache, map-files

I need help with the MapFile reader.

I add the file to the cache via the -files option:

jar HadoopProjects.jar rsProject.driver -files hdfs://localhost:8020/data/mapFileTestFolder.tar.gz....

Then I read it here:

@SuppressWarnings("deprecation")
@Override
protected void setup(Context context) {
    try {
        // DistributedCache is deprecated but still functional on Hadoop 2.x
        Path[] cacheLocalFiles = DistributedCache.getLocalCacheFiles(context.getConfiguration());
        logF.info("reducer started setup");

        for (Path path : cacheLocalFiles) {
            logF.info("reducer setup " + path.getName());
            if (path.getName().contains("mapFileTestFolder.tar.gz")) {
                URI mapUri = new File(path.toString() + "/mapFileTestFolder").toURI();
                logF.info("depReader init begins URI = " + mapUri.toString());
                depReader = new MapFile.Reader(FileSystem.get(context.getConfiguration()),
                        mapUri.toString(), context.getConfiguration());
                logF.info("depReader init ends");
            }
        }
    } catch (IOException e) {
        e.printStackTrace();
        logF.info("depReader init error - " + e);
    }
    // some other lines
}

Here is what I see in the logs:

2014-03-11 08:31:09,305 INFO [main] rsProject.myReducer: depReader init begins URI = file:/home/hadoop/Training/hadoop_work/mapred/nodemanager/usercache/hadoop/appcache/application_1394318775013_0079/container_1394318775013_0079_01_000005/mapFileTestFolder.tar.gz/mapFileTestFolder
2014-03-11 08:31:09,345 INFO [main] rsProject.myReducer: depReader init error - java.io.FileNotFoundException: File file:/home/hadoop/Training/hadoop_work/mapred/nodemanager/usercache/hadoop/appcache/application_1394318775013_0079/container_1394318775013_0079_01_000005/mapFileTestFolder.tar.gz/mapFileTestFolder/data does not exist
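A MapFile is a directory containing exactly two files, `data` and `index`, and the stack trace shows the reader looking for `.../mapFileTestFolder/data` at a path that is still the packed archive. As a sanity check (a sketch with hypothetical local paths, not the cluster paths above), you can recreate the expected layout and confirm it survives a tar round-trip:

```shell
# A MapFile directory must contain 'data' and 'index' files.
# Simulate the expected layout locally and list the archive contents
# to verify both entries are present under mapFileTestFolder/.
mkdir -p mapFileTestFolder
touch mapFileTestFolder/data mapFileTestFolder/index
tar -czf mapFileTestFolder.tar.gz mapFileTestFolder
tar -tzf mapFileTestFolder.tar.gz
```

If `mapFileTestFolder/data` is listed here but the job still throws FileNotFoundException, the archive was localized without being unpacked on the node.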

mapFileTestFolder.tar.gz is a compressed MapFile (it contains the index and data files).

I assume the file exists in the distributed cache, because the program enters the if branch when the name matches. So why does this happen? =/

Any help is appreciated.

Thanks!

Problem solved. My silly mistake =/ I should have added it to the distributed cache as an archive (the -archives option), not as a file (-files).
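For reference, a sketch of the corrected invocation, assuming the standard GenericOptionsParser options: -archives tells the framework to localize and unpack the archive on each node, after which the mapFileTestFolder directory (with its index and data files) is visible to the reader.

```shell
# Sketch only: same driver, but the archive is passed via -archives instead
# of -files, so YARN unpacks mapFileTestFolder.tar.gz in the container's
# working directory rather than localizing it as an opaque file.
hadoop jar HadoopProjects.jar rsProject.driver \
    -archives hdfs://localhost:8020/data/mapFileTestFolder.tar.gz ...
```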