Hadoop: custom object passed to the reducer arrives with nothing but null/default values
I'm new to MapReduce. My mapper outputs a DbWritable object, but in the reducer I cannot read any values from the passed object; maybe it is never passed at all? Here is my code.

The Writable class:
public class StWritable implements DBWritable, Writable {

    private String stockName;
    private double q1, q2, q3, q4; // quarters of the year

    ...

    @Override
    public void readFields(DataInput input) throws IOException {
    }

    @Override
    public void write(DataOutput output) throws IOException {
    }

    @Override
    public void readFields(ResultSet rs) throws SQLException {
        stockName = rs.getString("STOCK_NAME");
        // year = rs.getInt("RECORDING_YEAR");
        q1 = rs.getDouble("Q1");
        q2 = rs.getDouble("Q2");
        q3 = rs.getDouble("Q3");
        q4 = rs.getDouble("Q4");
    }

    @Override
    public void write(PreparedStatement pstmt) throws SQLException {
        pstmt.setString(1, stockName);
        pstmt.setDouble(2, q1);
        pstmt.setDouble(3, q2);
        pstmt.setDouble(4, q3);
        pstmt.setDouble(5, q4);
    }

    @Override
    public String toString() {
        return q1 + "," + q2 + "," + q3 + "," + q4;
    }
}
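(Note from editing: the empty `readFields(DataInput)` and `write(DataOutput)` bodies above are the likely cause of the problem. Hadoop uses exactly these two methods to serialize the map output value during the shuffle and to deserialize it on the reduce side, so with empty bodies every field arrives uninitialized. A minimal sketch of what they could look like for the five fields above; `StWritableSketch` and `roundTrip` are illustrative names, not part of the original code:)

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInput;
import java.io.DataInputStream;
import java.io.DataOutput;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;

// Sketch of Writable-style serialization for the five fields used above.
// Hadoop calls write() on the map side and readFields() on the reduce
// side; fields must be written and read back in exactly the same order.
public class StWritableSketch {
    // public here only to keep the sketch short; the real class keeps
    // these private behind getters/setters
    public String stockName = "";
    public double q1, q2, q3, q4;

    public void readFields(DataInput input) throws IOException {
        stockName = input.readUTF();
        q1 = input.readDouble();
        q2 = input.readDouble();
        q3 = input.readDouble();
        q4 = input.readDouble();
    }

    public void write(DataOutput output) throws IOException {
        output.writeUTF(stockName);
        output.writeDouble(q1);
        output.writeDouble(q2);
        output.writeDouble(q3);
        output.writeDouble(q4);
    }

    // tiny in-memory self-check: serialize, then deserialize a fresh copy
    public static StWritableSketch roundTrip(StWritableSketch in) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            in.write(new DataOutputStream(bos));
            StWritableSketch out = new StWritableSketch();
            out.readFields(new DataInputStream(
                    new ByteArrayInputStream(bos.toByteArray())));
            return out;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

With bodies like these in `StWritable`, the values logged in the mapper should survive the shuffle and appear unchanged in the reducer.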
The mapper:
public static class StockMapper extends Mapper<NullWritable, StWritable, Text, StWritable> {

    private Text stock = new Text();
    private Text value = new Text();
    private StWritable stockq = new StWritable();

    public static final Log log = LogFactory.getLog(StockMapper.class);

    @Override
    protected void map(NullWritable key, StWritable stockqWritable, Context context) throws IOException, InterruptedException {
        final String stockName = stockqWritable.getStockName();
        final Double q1 = stockqWritable.getQ1();
        final Double q2 = stockqWritable.getQ2();
        final Double q3 = stockqWritable.getQ3();
        final Double q4 = stockqWritable.getQ4();
        stock.set(stockName);
        stockq.setStockName(stockName);
        stockq.setQ1(q1);
        stockq.setQ2(q2);
        stockq.setQ3(q3);
        stockq.setQ4(q4);
        // value.set(stockq.toString());
        log.info("map the stockq value is " + stockqWritable.toString());
        context.write(stock, stockq);
    }
}
Some log output:
INFO StockMr$StockMapper: map the stockq value is 86.29,86.58,81.9,83.8
INFO StockMr$StockMapper: map the stockq value is 199.27,200.26,192.55,194.84
INFO StockMr$StockReducer: reduce the stockq value is 4.9E-324,4.9E-324,4.9E-324,4.9E-324
INFO StockMr$StockReducer: reduce the stockq value is 4.9E-324,4.9E-324,4.9E-324,4.9E-324
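(Note from editing: the 4.9E-324 in the reducer log is exactly how Java prints `Double.MIN_VALUE`, i.e. the smallest positive double. The fields the reducer sees were never filled with the real map-side data, which points at the serialization step between map and reduce. A one-line check, with an illustrative class name:)

```java
// 4.9E-324 is the string form of Double.MIN_VALUE, the smallest
// positive double value representable in Java.
public class MinValueCheck {
    public static String show() {
        return Double.toString(Double.MIN_VALUE);
    }
}
```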
Thanks.
EDIT:
I have found two workarounds: one is to have the mapper send the combined value to the reducer as Text and then split it in the reducer with Text.toString().split(",");
the other is to have the mapper send a MapWritable to the reducer.
I don't know which one is better, or whether both are bad.

The job setup:
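(Note from editing: the Text-based workaround above can be sketched roughly like this; `QuarterCodec`, `join` and `parse` are illustrative names, not part of the original code:)

```java
// Sketch of the Text-based workaround: the mapper joins the four
// quarters into one comma-separated string and emits it as Text;
// the reducer parses the quarters back with split(",").
public class QuarterCodec {
    // map side: context.write(stock, new Text(join(q1, q2, q3, q4)))
    public static String join(double q1, double q2, double q3, double q4) {
        return q1 + "," + q2 + "," + q3 + "," + q4;
    }

    // reduce side: double[] qs = parse(value.toString())
    public static double[] parse(String text) {
        String[] parts = text.split(",");
        double[] qs = new double[parts.length];
        for (int i = 0; i < parts.length; i++) {
            qs[i] = Double.parseDouble(parts[i]);
        }
        return qs;
    }
}
```

Both workarounds avoid, rather than fix, the custom Writable's serialization; implementing `readFields(DataInput)` and `write(DataOutput)` properly lets the original `StWritable` be sent directly.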
configuration.set("hbase.zookeeper.quorum", "backup103");
final String selectQuery = "SELECT STOCK_NAME,Q1,Q2,Q3,Q4 FROM STOCKQ";
final Job job = Job.getInstance(configuration, "phoenix-mr-job");
job.setJarByClass(StockMr.class);
PhoenixMapReduceUtil.setInput(job, StWritable.class, "STOCKQ", selectQuery);
PhoenixMapReduceUtil.setOutput(job, "STOCKQ_SUM", "STOCK_NAME,Q1,Q2,Q3,Q4");
job.setMapperClass(StockMapper.class);
//job.setCombinerClass(StockCombiner.class);
job.setReducerClass(StockReducer.class);
job.setOutputFormatClass(PhoenixOutputFormat.class);
job.setMapOutputKeyClass(Text.class);
job.setMapOutputValueClass(StWritable.class);
job.setOutputKeyClass(NullWritable.class);
job.setOutputValueClass(StWritable.class);
job.setNumReduceTasks(1);
TableMapReduceUtil.addDependencyJars(job);
job.waitForCompletion(true);