
Hadoop reducer not being called


This is code for an Ebola dataset. The reducer is never called at all; only the mapper output is printed.

Driver class:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.*;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.KeyValueTextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
public class Ebola {
        public static void main(String[] args) throws Exception, ArrayIndexOutOfBoundsException {

                Configuration con1 = new Configuration();
                con1.set("mapreduce.input.keyvaluelinerecordreader.key.value.separator", " "); 
                Job job1 = new Job(con1, "Ebola");

                job1.setJarByClass(Ebola.class); 
                job1.setInputFormatClass(KeyValueTextInputFormat.class);
                job1.setOutputFormatClass(TextOutputFormat.class);        
                job1.setOutputKeyClass(Text.class);
                job1.setOutputValueClass(Text.class);
                job1.setMapperClass(EbolaMapper.class);      
                job1.setReducerClass(EbolReducer.class);

                FileInputFormat.addInputPath(job1, new Path(args[0]));        
                FileOutputFormat.setOutputPath(job1, new Path(args[1]));
                job1.waitForCompletion(true);
        }
}
This is the mapper:

import java.io.IOException;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.Mapper;
public class EbolaMapper extends Mapper <Text, Text, Text, Text> {
        public void map(Text key, Text value, Context con) throws IOException, InterruptedException {
                Text cumValues = new Text();
                String record = value.toString();

                String p[] = record.split(" ",2);

                String cases = p[0];
                String death = p[1];

                String cValues =  death + "->" + cases;

                cumValues.set(cValues);

                con.write(key, cumValues);                  
        }
}
Finally, the reducer:

import java.io.IOException;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;
public class EbolReducer extends Reducer<Text, Text, Text, Text> {
        public void reduce(Text key, Text value, Context con) throws IOException{
                Text cumulValues = new Text();                  
                String cumVal = value.toString();
                String[] p = cumVal.split("->",2);
                String death = p[0];
                String cases = p[1];
                Float d = Float.parseFloat(death);
                Float c = Float.parseFloat(cases);
                Float perc = (d/c)*100;
                String percent = String.valueOf(perc);
                cumulValues.set(percent);
                con.write(key,cumulValues);
        }
}
The output is just the mapper output; the reducer is never invoked. Any help would be appreciated.

Thanks.

Instead of

public void reduce(Text key, Text value, Context con)

you need to use an Iterable:

public void reduce(Text key, Iterable<Text> values, Context con)
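Applying that fix to the reducer from the question gives roughly the sketch below. It keeps the question's per-record percentage logic, loops over the Iterable, and writes one output line per incoming "death->cases" value. Note that throws InterruptedException is added too: Context.write declares it, so a true override must declare it as well.

import java.io.IOException;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;
public class EbolReducer extends Reducer<Text, Text, Text, Text> {
        @Override // now genuinely overrides Reducer.reduce, so the framework calls it
        public void reduce(Text key, Iterable<Text> values, Context con)
                        throws IOException, InterruptedException {
                Text cumulValues = new Text();
                for (Text value : values) {
                        // Each value carries "death->cases" as emitted by the mapper.
                        String[] p = value.toString().split("->", 2);
                        float death = Float.parseFloat(p[0]);
                        float cases = Float.parseFloat(p[1]);
                        float perc = (death / cases) * 100;
                        cumulValues.set(String.valueOf(perc));
                        con.write(key, cumulValues);
                }
        }
}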

Please add some sample data.. also, how is your EbolReducer even compiling without proper exception handling?

You are right. They overloaded the reduce method; that is why it compiles without errors.
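For anyone wondering why the wrong signature silently produces mapper output: the base org.apache.hadoop.mapreduce.Reducer ships a default reduce that is an identity pass-through, roughly as sketched below. Since reduce(Text, Text, Context) only overloads it, that default is what the framework actually runs. Adding @Override (as in the corrected reducer above) would have made the compiler reject the mismatched signature at build time.

// Roughly the default implementation inherited from
// org.apache.hadoop.mapreduce.Reducer: an identity pass-through.
// Because EbolReducer.reduce(Text, Text, Context) has the wrong
// second parameter type, it never overrides this method, so this
// is what runs; hence the job emits the raw mapper output.
protected void reduce(KEYIN key, Iterable<VALUEIN> values, Context context)
                throws IOException, InterruptedException {
        for (VALUEIN value : values) {
                context.write((KEYOUT) key, (VALUEOUT) value);
        }
}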