Wrong key class: class org.apache.hadoop.io.IntWritable is not class org.apache.hadoop.io.Text


I have one mapper and one reducer. The whole thing is adapted from the WordCount example, with the input and output types changed to fit my needs.

The error seems to be caused by mismatched input/output types, but I can't tell what is wrong.

public static class TokenizerMapper
            extends Mapper<Object, Text, Text, Text>{

        private Text word = new Text();


        public void map(Object key, Text value, Context context
        ) throws IOException, InterruptedException {
            //blah blah
        }
    }

    public static class IntSumReducer
            extends Reducer<Text,Text,IntWritable,Text> {
        private IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<Text> values,
                           Context context
        ) throws IOException, InterruptedException {
            //blah blah
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(IntWritable.class);
        job.setOutputValueClass(Text.class);
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(Text.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileInputFormat.addInputPath(job, new Path(args[1]));
        FileOutputFormat.setOutputPath(job, new Path(args[2]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
Found the answer here:
java.io.IOException: wrong key class: class org.apache.hadoop.io.IntWritable is not class org.apache.hadoop.io.Text
    at org.apache.hadoop.mapred.IFile$Writer.append(IFile.java:191)
    at org.apache.hadoop.mapred.Task$CombineOutputCollector.collect(Task.java:1574)
    at org.apache.hadoop.mapred.Task$NewCombinerRunner$OutputConverter.write(Task.java:1891)
    at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
    at org.apache.hadoop.mapreduce.lib.reduce.WrappedReducer$Context.write(WrappedReducer.java:105)
    at WordCount$IntSumReducer.reduce(WordCount.java:47)
    at WordCount$IntSumReducer.reduce(WordCount.java:35)
    at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:171)
    at org.apache.hadoop.mapred.Task$NewCombinerRunner.combine(Task.java:1912)
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpill(MapTask.java:1662)
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1505)
    at org.apache.hadoop.mapred.MapTask$NewOutputCollector.close(MapTask.java:735)
    at org.apache.hadoop.mapred.MapTask.closeQuietly(MapTask.java:2076)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:809)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
    at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:271)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)
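The stack trace points at the combiner (`Task$NewCombinerRunner`): a combiner runs map-side, and whatever it emits is fed back into the shuffle as if the mapper had produced it, so its output key/value types must match the map output types (`Text, Text` here). `IntSumReducer` is declared as `Reducer<Text,Text,IntWritable,Text>`, so when it runs as a combiner it writes an `IntWritable` key where a `Text` key is expected, which is exactly the reported error. A minimal sketch of the fix, assuming the classes from the question above, is simply to drop the `setCombinerClass` call (a combiner is an optimization, not a requirement):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    // No setCombinerClass here: IntSumReducer emits <IntWritable, Text>,
    // which does not match the map output <Text, Text>, so it cannot
    // legally run as a map-side combiner.
    job.setReducerClass(IntSumReducer.class);
    // Map output types (also the types a combiner would have to emit).
    job.setMapOutputKeyClass(Text.class);
    job.setMapOutputValueClass(Text.class);
    // Final (reducer) output types.
    job.setOutputKeyClass(IntWritable.class);
    job.setOutputValueClass(Text.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileInputFormat.addInputPath(job, new Path(args[1]));
    FileOutputFormat.setOutputPath(job, new Path(args[2]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
}
```

If you do want map-side combining, the combiner would need to be a separate `Reducer<Text,Text,Text,Text>` whose output types mirror the mapper's, rather than reusing the final reducer.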