Java: NullPointerException when running a MapReduce program (Java / Hadoop / MapReduce / YARN / Bigdata)


I'm getting a NullPointerException when running my MapReduce program. Please help me understand why this error occurs.

public class AvgDriver extends Configured implements Tool{

    @Override
    public int run(String[] arg0) throws Exception {

        Job job=Job.getInstance();
        job.setJar("AvgSalary.jar");

        job.setMapperClass(AvgMapper.class);
        job.setMapOutputKeyClass(NullWritable.class);
        job.setMapOutputValueClass(DoubleWritable.class);



        //job.setInputFormatClass(TextInputFormat.class);

        job.setReducerClass(AvgReducer.class);
        job.setOutputKeyClass(NullWritable.class);
        job.setOutputValueClass(DoubleWritable.class);

        FileInputFormat.setInputPaths(job, new Path(arg0[0]));
        FileOutputFormat.setOutputPath(job, new Path(arg0[1]));

        return job.waitForCompletion(true)?0:1;
    }

    public void main(String [] args) throws Exception
    {

        System.exit(ToolRunner.run(new AvgDriver(), args));
    }
}




public class AvgMapper extends Mapper<LongWritable, Text, NullWritable, DoubleWritable> {

    public void map(LongWritable key , Text value , Context context) throws IOException, InterruptedException
    {
        String values=value.toString();
        String [] val=values.split("\t");

        double convertVal=Double.parseDouble(val[2]);

        context.write(NullWritable.get(), new DoubleWritable(convertVal));
    }

} 


public class AvgReducer extends Reducer<NullWritable, DoubleWritable, NullWritable, DoubleWritable> {

    double total=0.0;
    int count=0;

    public void Reduce(NullWritable key , Iterator<DoubleWritable> value , Context context) throws IOException, InterruptedException
    {
        while (value.hasNext()) {
            total = total+ ((DoubleWritable) value.next()).get();
            count++;
        }

        total=total/count;

        context.write(key, new DoubleWritable(total));
    }
}

You are missing `static` on your main method. Update it as follows:

public static void main(String [] args) throws Exception
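
This also explains the stack trace: as the trace shows, Hadoop's RunJar locates the jar's main(String[]) via reflection and calls Method.invoke with a null receiver, which only works for a static method. The self-contained snippet below (no Hadoop required) reproduces the same NullPointerException; BadDriver is a stand-in class for illustration:

```java
import java.lang.reflect.Method;

public class ReflectionNpeDemo {

    // Stand-in for the question's driver: 'static' is missing on main.
    static class BadDriver {
        public void main(String[] args) { }
    }

    // Mimics what RunJar does: look up main(String[]) by reflection and
    // invoke it with a null receiver. For an instance method this throws
    // NullPointerException before any job code ever runs, which is why
    // the stack trace contains only reflection and RunJar frames.
    public static boolean reproduces() throws Exception {
        Method m = BadDriver.class.getMethod("main", String[].class);
        try {
            m.invoke(null, (Object) new String[0]);
            return false;
        } catch (NullPointerException e) {
            return true;
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(reproduces()
                ? "NullPointerException, as in the question"
                : "no exception");
    }
}
```

Once main is declared static, RunJar can invoke it normally and ToolRunner.run takes over, so the NullPointerException disappears.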

The stack trace must point to some line of code. Which one is it?

    [cloudera@quickstartPracticeNew]$ hadoop jar AvgSalary.jar com.ankur.practics.AvgDriver /user/hdfs/empSal.txt /user/hdfs/output15
    Exception in thread "main" java.lang.NullPointerException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

As you can see in the stack trace above, no class of mine is mentioned as the place where the exception occurs. I wrote another program in which Job job=Job.getInstance() works fine.

Did you try it? Did it solve the problem? Where you previously got the null pointer, the stack trace should now show the specific Java line; please post it.

I tried it, but I still get the same error:

    [cloudera@quickstartPracticeNew]$ hadoop jar AvgSalary.jar com.ankur.practics.AvgDriver /user/hdfs/empSal.txt /user/hdfs/output11
    Exception in thread "main" java.lang.NullPointerException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

    public int run(String[] arg0) throws Exception {
        Configuration conf = new Configuration();
        Job job = new Job(conf);
        job.setJar("AvgSalary.jar");
        job.setMapperClass(AvgMapper.class);
        job.setMapOutputKeyClass(NullWritable.class);
        job.setMapOutputValueClass(DoubleWritable.class);
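A side note on the posted code, separate from the NullPointerException: the reducer method is declared as Reduce (capital R) with an Iterator parameter, so it does not override Reducer.reduce and the default identity reducer runs instead, emitting every mapper value unchanged. A corrected sketch, assuming the new org.apache.hadoop.mapreduce API used in the rest of the code:

```java
import java.io.IOException;
import org.apache.hadoop.io.DoubleWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.mapreduce.Reducer;

public class AvgReducer
        extends Reducer<NullWritable, DoubleWritable, NullWritable, DoubleWritable> {

    // Lowercase 'reduce' with an Iterable (not Iterator) parameter matches
    // Reducer.reduce; @Override makes the compiler verify the signature.
    @Override
    protected void reduce(NullWritable key, Iterable<DoubleWritable> values, Context context)
            throws IOException, InterruptedException {
        double total = 0.0;
        int count = 0;
        for (DoubleWritable v : values) {
            total += v.get();
            count++;
        }
        // Emit the average of all salaries seen for this (single) key.
        context.write(key, new DoubleWritable(total / count));
    }
}
```

Declaring total and count as local variables rather than instance fields also avoids carrying state across reduce calls.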