Java Hadoop custom Writable does not produce the expected output
I have a set of inputs going from the mapper to the reducer:
(1939, [121, 79, 83, 28])
(1980, [0, 211, -113])
I expect output like the following:
1939 max:121 min:28 avg: 77.75
I can get it if I do not use a custom Writable in my reducer class, like this:
public static class MaxTemperatureReducer
        extends Reducer<Text, IntWritable, Text, Text> {

    Text yearlyValue = new Text();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        int sum = 0;
        int counterForAvg = 0;
        int minValue = Integer.MAX_VALUE;
        int maxValue = Integer.MIN_VALUE;
        float avg;
        for (IntWritable val : values) {
            int currentValue = val.get();
            sum += currentValue;
            counterForAvg++;
            minValue = Math.min(minValue, currentValue);
            maxValue = Math.max(maxValue, currentValue);
        }
        avg = (float) sum / counterForAvg; // cast first, otherwise integer division truncates the average
        String requiredValue = "max temp:" + maxValue + "\t" + "avg temp: " + avg + "\t" + "min temp: " + minValue;
        yearlyValue.set(requiredValue);
        context.write(key, yearlyValue);
    }
}
Here is how I implemented the custom class and the reducer. I send the iterables to the custom class and do the calculation there. I don't know what I'm doing wrong here; I have zero experience in Java.
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.io.WritableUtils;

public class CompositeWritable implements Writable {

    String data = "";

    public CompositeWritable() {
    }

    public CompositeWritable(String data) {
        this.data = data;
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        data = WritableUtils.readString(in);
    }

    @Override
    public void write(DataOutput out) throws IOException {
        WritableUtils.writeString(out, data);
    }

    public void merge(Iterable<IntWritable> values) {
        int sum = 0;
        int counterForAvg = 0;
        int minValue = Integer.MAX_VALUE;
        int maxValue = Integer.MIN_VALUE;
        float avg;
        for (IntWritable val : values) {
            int currentValue = val.get();
            sum += currentValue;
            counterForAvg++;
            minValue = Math.min(minValue, currentValue);
            maxValue = Math.max(maxValue, currentValue);
        }
        avg = (float) sum / counterForAvg; // cast avoids integer division
        data = "max temp:" + maxValue + "\t" + "avg temp: " + avg + "\t" + "min temp: " + minValue;
    }

    @Override
    public String toString() {
        return data;
    }
}
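As an aside, the write/readFields pair above is just a length-prefixed string round-trip. A minimal, Hadoop-free sketch of the same idea, using plain java.io with writeUTF/readUTF as stand-ins for WritableUtils.writeString/readString (which need Hadoop on the classpath):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class WritableRoundTrip {
    public static void main(String[] args) throws IOException {
        // The write(DataOutput) side: serialize the string into a byte buffer
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        new DataOutputStream(buf).writeUTF("max temp:121\tavg temp: 77.75\tmin temp: 28");

        // The readFields(DataInput) side: deserialize it back
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(buf.toByteArray()));
        String data = in.readUTF();
        System.out.println(data); // prints the original string unchanged
    }
}
```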
Shouldn't the call to merge work those values out for me?
Sure, but you are not using it correctly: out is never initialized.
CompositeWritable out; // null here
Text textYearlyValue = new Text();

public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
    out.merge(values); // still null, should throw an exception
If you implemented this correctly, out.merge(values) should throw a NullPointerException, because out is never initialized.
@cricket_007 It does not throw a NullPointerException; the reducer runs successfully.
Besides, this will not work if you are trying to output Text, and your CompositeWritable is almost identical to Text, so it is not clear why you need it at all.
@cricket_007 The same configuration works if I do not use CompositeWritable. Do I need to set the output value class to CompositeWritable? I am trying to get a custom Writable class to write the three values for a particular date.
Note: this kind of calculation would take far less code in Spark, Pig, or Hive.
Thanks for pointing me in the right direction. I did look at the article from Yahoo, but only now do I understand what it meant.
I am not sure which article you are referring to.
public static class MaxTemperatureReducer
        extends Reducer<Text, CompositeWritable, Text, Text> {

    CompositeWritable out; // never initialized
    Text textYearlyValue = new Text();

    // Note: Iterable<IntWritable> does not match the declared value type
    // CompositeWritable, so this method never overrides Reducer.reduce
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        out.merge(values);
        String requiredOutput = out.toString();
        textYearlyValue.set(requiredOutput);
        context.write(key, textYearlyValue);
    }
}
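The reason no NullPointerException ever surfaced is worth spelling out: with the reducer's value type declared as CompositeWritable, a reduce method taking Iterable<IntWritable> is an overload, not an override, so Hadoop silently calls the inherited identity reduce and the custom method never runs. A minimal, Hadoop-free sketch of the same pitfall, where Base and Broken are hypothetical stand-ins for Reducer and the reducer above:

```java
import java.util.List;

// Base plays the role of Hadoop's Reducer<KEYIN, VALUEIN, ...>
class Base<V> {
    protected String reduce(Iterable<V> values) {
        return "default"; // stands in for Reducer's inherited identity reduce
    }
}

// Broken plays the role of the reducer above
class Broken extends Base<String> {
    // Iterable<Integer> != Iterable<String>: this is an overload, not an
    // override. Annotating it with @Override would be a compile error,
    // which is exactly how the real bug could have been caught.
    protected String reduce(Iterable<Integer> values) {
        return "custom";
    }
}

public class OverrideMismatchDemo {
    public static void main(String[] args) {
        Base<String> r = new Broken();
        // The caller only knows the Base signature, so dispatch goes to the
        // inherited method; the "custom" one is never invoked.
        System.out.println(r.reduce(List.of("a", "b"))); // prints: default
    }
}
```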
Job job = Job.getInstance(getConf(), "MaxAvgMinTemp");
job.setJarByClass(this.getClass());
FileInputFormat.addInputPath(job, new Path(args[0]));
FileOutputFormat.setOutputPath(job, new Path(args[1]));
job.setMapperClass(MaxTemperatureMapper.class);
job.setReducerClass(MaxTemperatureReducer.class);
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(IntWritable.class);
return job.waitForCompletion(true) ? 0 : 1;
A working approach is to make the custom Writable the reducer's output value class. The implementation below produces:
1939 MinMaxAvgWritable{min=28, max=121, avg=77.75}
1980 MinMaxAvgWritable{min=-113, max=211, avg=32.67}
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import java.text.DecimalFormat;
import java.util.Objects;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Writable;

public class MinMaxAvgWritable implements Writable {

    private int min, max;
    private double avg;
    private DecimalFormat df = new DecimalFormat("#.00");

    @Override
    public String toString() {
        return "MinMaxAvgWritable{" +
                "min=" + min +
                ", max=" + max +
                ", avg=" + df.format(avg) +
                '}';
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (o == null || getClass() != o.getClass()) return false;
        MinMaxAvgWritable that = (MinMaxAvgWritable) o;
        return min == that.min &&
                max == that.max &&
                avg == that.avg;
    }

    @Override
    public int hashCode() {
        return Objects.hash(min, max, avg);
    }

    @Override
    public void write(DataOutput dataOutput) throws IOException {
        dataOutput.writeInt(min);
        dataOutput.writeInt(max);
        dataOutput.writeDouble(avg);
    }

    @Override
    public void readFields(DataInput dataInput) throws IOException {
        this.min = dataInput.readInt();
        this.max = dataInput.readInt();
        this.avg = dataInput.readDouble();
    }

    public void set(int min, int max, double avg) {
        this.min = min;
        this.max = max;
        this.avg = avg;
    }

    public void set(Iterable<IntWritable> values) {
        this.min = Integer.MAX_VALUE;
        this.max = Integer.MIN_VALUE;
        int sum = 0;
        int count = 0;
        for (IntWritable iw : values) {
            int i = iw.get();
            if (i < this.min) this.min = i;
            if (i > max) this.max = i;
            sum += i;
            count++;
        }
        this.avg = count < 1 ? sum : (sum / (1.0 * count));
    }
}
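To sanity-check the aggregation in set(Iterable<IntWritable>) against the 1939 record from the question, the same loop can be run over plain ints, so no Hadoop jars are needed:

```java
import java.text.DecimalFormat;
import java.util.List;

// Standalone check of the min/max/avg aggregation logic, mirroring
// MinMaxAvgWritable.set but using plain ints instead of IntWritable.
public class MinMaxAvgCheck {
    public static void main(String[] args) {
        List<Integer> values = List.of(121, 79, 83, 28); // the 1939 record
        int min = Integer.MAX_VALUE, max = Integer.MIN_VALUE, sum = 0, count = 0;
        for (int i : values) {
            if (i < min) min = i;
            if (i > max) max = i;
            sum += i;
            count++;
        }
        // multiply by 1.0 so the division is done in double, not int
        double avg = count < 1 ? sum : sum / (1.0 * count);
        DecimalFormat df = new DecimalFormat("#.00");
        System.out.println("min=" + min + ", max=" + max + ", avg=" + df.format(avg));
        // prints: min=28, max=121, avg=77.75
    }
}
```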
public class CompositeReducer extends Reducer<Text, IntWritable, Text, MinMaxAvgWritable> {

    private final MinMaxAvgWritable output = new MinMaxAvgWritable();

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        // This 'set/merge' method could just as easily be defined here, and
        // return a String to be set on a Text object
        output.set(values);
        context.write(key, output);
    }
}
// outputs for mapper and reducer
job.setOutputKeyClass(Text.class);
// setup mapper
job.setMapperClass(TokenizerMapper.class); // Replace with your mapper
job.setMapOutputValueClass(IntWritable.class);
// setup reducer
job.setReducerClass(CompositeReducer.class);
job.setOutputValueClass(MinMaxAvgWritable.class); // notice custom writable
FileInputFormat.addInputPath(job, new Path(args[0]));
FileOutputFormat.setOutputPath(job, new Path(args[1]));
return job.waitForCompletion(true) ? 0 : 1;