Java: using the distributed cache with the new API

I am trying to run a Hadoop program using the new API. Can anyone guide me on how to use the setup() method with the new API?

Mapper class

// Reader opened in setup(); kept as a field so map() can use it.
private BufferedReader br;

protected void setup(Context context)
        throws IOException, InterruptedException {
    Configuration conf = context.getConfiguration();
    URI[] cacheFiles = context.getCacheFiles();
    br = new BufferedReader(new FileReader(cacheFiles[0].toString()));
}

public void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
    // ....?   How should I access the above cached file here?
    String line = br.readLine();
}
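For reference, here is a minimal sketch (not from the original post) of one way to read the cached file with the new mapreduce API: open it once in setup(), keep its contents in a field, and use that field in map(). The class name WordMapper, the Set field, and the filtering logic in map() are illustrative assumptions, and it assumes the job runs on a cluster where cached files are symlinked into the task working directory.

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.net.URI;
import java.util.HashSet;
import java.util.Set;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class WordMapper extends Mapper<LongWritable, Text, Text, LongWritable> {

    // Contents of the cached file, loaded once per task in setup().
    private final Set<String> cachedWords = new HashSet<String>();

    @Override
    protected void setup(Context context) throws IOException, InterruptedException {
        URI[] cacheFiles = context.getCacheFiles();
        if (cacheFiles != null && cacheFiles.length > 0) {
            // Cached files are symlinked into the task's working directory.
            // The link name is the URI fragment (the part after '#'), or the
            // file's base name when no fragment was given.
            URI cacheFile = cacheFiles[0];
            String linkName = cacheFile.getFragment() != null
                    ? cacheFile.getFragment()
                    : new Path(cacheFile.getPath()).getName();
            BufferedReader br = new BufferedReader(new FileReader(linkName));
            try {
                String line;
                while ((line = br.readLine()) != null) {
                    cachedWords.add(line.trim());
                }
            } finally {
                br.close();
            }
        }
    }

    @Override
    public void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // The cached data is available here through the cachedWords field;
        // as an example, emit only values that appear in the cached file.
        String word = value.toString().trim();
        if (cachedWords.contains(word)) {
            context.write(new Text(word), new LongWritable(1));
        }
    }
}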
Driver class

Job job = Job.getInstance(getConf());
job.setJobName("wordcount");
job.setJarByClass(driver.class);

Configuration conf = new Configuration();   // note: unused; the job already uses getConf()

job.addCacheFile(new Path("hdfs://master:54310/usr/local/hadoop/input/normal_small").toUri());

This is what I have after searching, but I am not sure how to access the cached file inside the map() method of the mapper class. Could someone explain how to do that?
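For comparison, here is a hedged sketch of a complete driver using the new API. It assumes a Tool/ToolRunner setup, the WordMapper class sketched above, input/output paths passed as arguments, and that the file exists at the given HDFS path; the '#inp' fragment names the local symlink the tasks can open.

import java.net.URI;

import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class Driver extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        Job job = Job.getInstance(getConf());
        job.setJobName("wordcount");
        job.setJarByClass(Driver.class);

        job.setMapperClass(WordMapper.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(LongWritable.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        // Register the file with the distributed cache (new API); the '#inp'
        // fragment is the name of the local symlink the tasks will see.
        job.addCacheFile(new URI("hdfs://master:54310/usr/local/hadoop/input/normal_small#inp"));

        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new Driver(), args));
    }
}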

Comments:
Take a look at this answer.
@Ashrith: I am getting the error "File does not exist: hdfs://master:54310/usr/local/hadoop/input/normal_small#inp"
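Regarding the "File does not exist ... #inp" error in the comment above: one common cause (an assumption here, since that driver code is not shown) is passing the symlink fragment through new Path(...).toUri(), which keeps '#inp' as part of the file name instead of treating it as a URI fragment. Building the java.net.URI directly avoids this:

// Assumed fix: construct the URI directly so '#inp' is parsed as a fragment
// (the local symlink name) rather than as part of the HDFS file name.
job.addCacheFile(new URI("hdfs://master:54310/usr/local/hadoop/input/normal_small#inp"));

// By contrast, new Path("hdfs://...#inp").toUri() leaves '#inp' inside the
// path, so Hadoop looks for a file whose name literally contains '#inp'
// and reports "File does not exist".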