
Java: Conflicting API when trying to run the MRUnit example


I have been playing around with MRUnit and trying to run it against the Hadoop wordcount example, following the tutorials.

Although I am not a fan of it, I have been using Eclipse to run the code, and I keep getting an error on the setMapper call:

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;


import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.apache.hadoop.mrunit.mapreduce.MapReduceDriver;
import org.apache.hadoop.mrunit.mapreduce.ReduceDriver;

import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

import org.junit.Before;
import org.junit.Test;

public class TestWordCount {
  MapReduceDriver<LongWritable, Text, Text, IntWritable, Text, IntWritable> mapReduceDriver;
  MapDriver<LongWritable, Text, Text, IntWritable> mapDriver;
  ReduceDriver<Text, IntWritable, Text, IntWritable> reduceDriver;

  @Before
  public void setUp() throws IOException
  {
      WordCountMapper mapper = new WordCountMapper();
      mapDriver = new MapDriver<LongWritable, Text, Text, IntWritable>();
      mapDriver.setMapper(mapper);  //<--Issue here

      WordCountReducer reducer = new WordCountReducer();
      reduceDriver = new ReduceDriver<Text, IntWritable, Text, IntWritable>();
      reduceDriver.setReducer(reducer);

      mapReduceDriver = new MapReduceDriver<LongWritable, Text, Text, IntWritable, Text, IntWritable>();
      mapReduceDriver.setMapper(mapper); //<--Issue here
      mapReduceDriver.setReducer(reducer);
  }
}

The problem seems to be with the generics. I also needed to add the mockito library to my user-defined library.

Here is your problem:

public class WordCountMapper extends Mapper<Object, Text, Text, IntWritable>
....
MapDriver<LongWritable, Text, Text, IntWritable> mapDriver;
You probably also want to change the map method parameter from Object key to LongWritable key.
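
A minimal sketch of what the corrected mapper could look like; the tokenizing logic is only an assumption to make the example self-contained, the point is that the input key type is LongWritable so it matches the MapDriver:

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Input key changed from Object to LongWritable so the generics line up with
// MapDriver<LongWritable, Text, Text, IntWritable>.
public class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Emit (token, 1) for every whitespace-separated token in the line.
        StringTokenizer tokenizer = new StringTokenizer(value.toString());
        while (tokenizer.hasMoreTokens()) {
            word.set(tokenizer.nextToken());
            context.write(word, ONE);
        }
    }
}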


Make sure you have imported the correct class. I faced an error different from the one above: my program had the correct parameters in both the Reducer and the reduce test, but because the wrong class was imported I got the same error message reported above.

Wrongly imported class:

import org.apache.hadoop.mrunit.ReduceDriver;

Correct class:

import org.apache.hadoop.mrunit.mapreduce.ReduceDriver;



The same solution applies to the mapper test: if you are sure the parameters in your mapper class and mapper test are the same, make sure you have imported the correct class there as well.
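
As a quick reference, this is roughly what the import section of the test should look like when the code under test uses the new org.apache.hadoop.mapreduce API; the commented-out line is the old-API driver that triggers the error:

// Old-API driver, pairs with org.apache.hadoop.mapred.* classes -- wrong here:
// import org.apache.hadoop.mrunit.ReduceDriver;

// New-API drivers, pair with org.apache.hadoop.mapreduce.* classes:
import org.apache.hadoop.mrunit.mapreduce.MapDriver;
import org.apache.hadoop.mrunit.mapreduce.MapReduceDriver;
import org.apache.hadoop.mrunit.mapreduce.ReduceDriver;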

In short, changing the mapper declaration to

class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable>

makes its type parameters match MapDriver<LongWritable, Text, Text, IntWritable>, and the setMapper calls compile.
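
A minimal sketch of a test method that should compile and pass once the generics match; the sample input line and expected counts are made up for illustration:

@Test
public void testMapper() throws IOException {
    mapDriver.withInput(new LongWritable(1), new Text("cat cat dog"))
             .withOutput(new Text("cat"), new IntWritable(1))
             .withOutput(new Text("cat"), new IntWritable(1))
             .withOutput(new Text("dog"), new IntWritable(1))
             .runTest();
}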