Java MapReduce chained jobs never finish


I'm trying to chain two jobs with the org.apache.hadoop.mapred.jobcontrol.* library instead of the classic one, but when I run the .jar with hadoop it never terminates, even though it produces the correct output I expect.

I'd like to keep using this library, and to know how to stop execution once the second job finishes, just as if I had used job.waitForCompletion(true), or as if I were running a .jar with a single job.

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapred.*;
import org.apache.hadoop.mapred.jobcontrol.Job;
import org.apache.hadoop.mapred.jobcontrol.JobControl;

public class Interest {
    public static void main(String[] args) throws Exception {

        JobConf conf1 = new JobConf(Interest.class);
        conf1.setJobName("Interest");
        conf1.setMapperClass(InterestMapperA.class);
        conf1.setReducerClass(InterestReducerA.class);
        conf1.setMapOutputKeyClass(Text.class);
        conf1.setMapOutputValueClass(IntWritable.class);
        conf1.setOutputKeyClass(Text.class);
        conf1.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(conf1, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf1, new Path("temp"));

        JobConf conf2 = new JobConf(Interest.class);
        conf2.setJobName("Interest");
        conf2.setMapperClass(InterestMapperB.class);
        conf2.setReducerClass(InterestReducerB.class);
        conf2.setMapOutputKeyClass(IntWritable.class);
        conf2.setMapOutputValueClass(Text.class);
        conf2.setOutputKeyClass(Text.class);
        conf2.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(conf2, new Path("temp"));
        FileOutputFormat.setOutputPath(conf2, new Path(args[1]));

        // JobControl tracks the dependency: job2 is only submitted
        // after job1 has completed successfully.
        Job job1 = new Job(conf1);
        Job job2 = new Job(conf2);
        job2.addDependingJob(job1);
        JobControl jbcntrl = new JobControl("jbcntrl");
        jbcntrl.addJob(job1);
        jbcntrl.addJob(job2);
        // run() executes the control loop on the current thread and does not
        // return on its own once the jobs finish, so the program never exits.
        jbcntrl.run();
    }
}

I ended up solving this with the old library; in fact, the newer library does not expose the properties for managing job dependencies.
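For anyone hitting the same hang: JobControl.run() is an infinite polling loop, so the usual pattern is to run it in a separate thread, poll allFinished() from the driver, and then call stop(). The sketch below illustrates that pattern with a self-contained FakeJobControl stand-in (a hypothetical class written for this example, not part of Hadoop), since the real class needs a cluster to run against; with Hadoop on the classpath you would apply the same thread-and-poll structure to the actual org.apache.hadoop.mapred.jobcontrol.JobControl.

```java
import java.util.concurrent.atomic.AtomicBoolean;

// Stand-in for JobControl: run() keeps looping even after all jobs are done,
// until stop() is called -- which is why calling run() inline never returns.
class FakeJobControl implements Runnable {
    private final AtomicBoolean stopped = new AtomicBoolean(false);
    private volatile int remainingJobs;

    FakeJobControl(int jobs) { this.remainingJobs = jobs; }

    public boolean allFinished() { return remainingJobs == 0; }

    public void stop() { stopped.set(true); }

    @Override
    public void run() {
        // Simulated control loop: "completes" one job per iteration,
        // then keeps spinning until stop() flips the flag.
        while (!stopped.get()) {
            if (remainingJobs > 0) remainingJobs--;
            try { Thread.sleep(10); } catch (InterruptedException e) { return; }
        }
    }
}

public class JobControlDriver {
    public static void main(String[] args) throws InterruptedException {
        FakeJobControl jbcntrl = new FakeJobControl(2);

        // Run the control loop in its own thread instead of calling run() inline.
        Thread controller = new Thread(jbcntrl);
        controller.start();

        // Poll until every job has completed, then stop the loop explicitly.
        while (!jbcntrl.allFinished()) {
            Thread.sleep(50);
        }
        jbcntrl.stop();
        controller.join();
        System.out.println("all jobs finished, driver exiting");
    }
}
```

With the real JobControl the driver loop is the same shape: start the thread, busy-wait on allFinished(), then stop() so the JVM can exit.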