
Java Hadoop: how to start 2 Mappers and 2 Reducers


I am trying to develop a Hadoop application. I want to start 2 Mappers and 2 Reducers from my main method. However, I keep getting a cast error (ClassCastException), which is why I am asking how this can be done.

Mapper1:

import java.io.IOException;
import java.util.logging.Logger; // assuming java.util.logging; the original imports are not shown

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

@SuppressWarnings("javadoc")
public class IntervallMapper1 extends Mapper<LongWritable, Text, Text, LongWritable> {
    private static Logger logger = Logger.getLogger(IntervallMapper1.class.getName());

    // Category and Value are helper classes from the question's own project
    private static Category categoriy;
    private static Value value;

    private String[] values = new String[4];
    private final static LongWritable one = new LongWritable(1);

    @Override
    public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {

        if(!this.categoriy.valueIsMissing(value.toString())){ // air pressure and wind strength are present...
            this.logger.info("Key: " + values[0] + values[1]);
            values = this.value.getValues(value.toString());
            context.write(new Text(values[0] + values[1]), this.one); // station + date as key, value = 1
        }
    }
}
Error:

Error: java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.io.Text
    at ncdcW03.IntervallMapper2.map(IntervallMapper2.java:1)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
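
For context on the cast itself: with TextInputFormat (which this job uses), the framework always hands map() a LongWritable byte offset as the input key. IntervallMapper2 is not shown in the question, but if it were declared with Text as its input key type, the cast of that LongWritable key would fail with exactly this exception. A hypothetical sketch of such a mismatched declaration (everything about this class is an assumption; only the signature matters):

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Hypothetical sketch: an IntervallMapper2 whose INPUT key type is Text. Under
// TextInputFormat the input key is a LongWritable offset, so invoking map() fails with
// "LongWritable cannot be cast to Text" at runtime; the trace points at
// IntervallMapper2.java:1 because the cast happens in a compiler-generated bridge method.
public class IntervallMapper2 extends Mapper<Text, Text, Text, LongWritable> {
    @Override
    public void map(Text key, Text value, Context context) throws IOException, InterruptedException {
        context.write(key, new LongWritable(1)); // illustrative body only
    }
}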

Comments:

- I can't find the IntervallMapper2 class?? I think you used IntervallMapper1 twice instead of pasting IntervallMapper2.
- What exactly do you mean... I don't understand? Pasted it twice... how?
- Suppose I want to start 2 different Mappers and 2 different Reducers... how do I feed the output of the 1st Mapper into the input of the 1st Reducer, and use the output of the 1st Reducer as the input of the 2nd Reducer?
- In your exception the class IntervallMapper2 is mentioned, but no such class appears in the code above. Could you correct the code you provided?
- It looks like what you pasted here is the code of Mapper1, not Mapper2. The Mapper2 section should contain the IntervallMapper2 class, not IntervallMapper1.
Mapper2: per the comments above, the code the question posted here was identical to the Mapper1 code shown earlier.
Main method:

@SuppressWarnings("javadoc")
public static void main(String[] args) throws IOException, ClassNotFoundException, InterruptedException {
    Job job = Job.getInstance(new Configuration());

    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(LongWritable.class);

    job.setMapperClass(IntervallMapper1.class);
//  job.setCombinerClass(IntervallReducer1.class);
    job.setReducerClass(IntervallReducer1.class);
    job.setMapperClass(IntervallMapper2.class); // note: this second call replaces IntervallMapper1; a Job has exactly one mapper class

    job.setInputFormatClass(TextInputFormat.class);
    job.setOutputFormatClass(TextOutputFormat.class);

    FileInputFormat.setInputPaths(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));

    job.setJarByClass(IntervallStart.class);

    job.waitForCompletion(true);
}
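
The comment thread above asks how to feed the output of the first map/reduce stage into a second one. One common approach (not shown in the thread; only a minimal sketch under assumptions) is to run two separate Job instances, where the first stage writes to an intermediate HDFS path that the second stage then reads. The intermediate path and the IntervallMapper2/IntervallReducer2 class names and signatures are assumptions here:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;

public class IntervallStart {

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Path input = new Path(args[0]);
        Path intermediate = new Path(args[1] + "_stage1"); // assumed intermediate path
        Path output = new Path(args[1]);

        // Stage 1: IntervallMapper1 -> IntervallReducer1, written to the intermediate path
        Job job1 = Job.getInstance(conf, "intervall stage 1");
        job1.setJarByClass(IntervallStart.class);
        job1.setMapperClass(IntervallMapper1.class);
        job1.setReducerClass(IntervallReducer1.class);
        job1.setOutputKeyClass(Text.class);
        job1.setOutputValueClass(LongWritable.class);
        job1.setInputFormatClass(TextInputFormat.class);
        job1.setOutputFormatClass(TextOutputFormat.class);
        FileInputFormat.setInputPaths(job1, input);
        FileOutputFormat.setOutputPath(job1, intermediate);
        if (!job1.waitForCompletion(true)) {
            System.exit(1);
        }

        // Stage 2: a second mapper/reducer pair (assumed classes) reads the stage-1 output.
        // With TextInputFormat the stage-2 mapper receives <LongWritable offset, Text line>
        // and has to parse the tab-separated key/value written by stage 1 out of each line.
        Job job2 = Job.getInstance(conf, "intervall stage 2");
        job2.setJarByClass(IntervallStart.class);
        job2.setMapperClass(IntervallMapper2.class);   // assumption: declared Mapper<LongWritable, Text, ...>
        job2.setReducerClass(IntervallReducer2.class); // assumed class name
        job2.setOutputKeyClass(Text.class);
        job2.setOutputValueClass(LongWritable.class);
        job2.setInputFormatClass(TextInputFormat.class);
        job2.setOutputFormatClass(TextOutputFormat.class);
        FileInputFormat.setInputPaths(job2, intermediate);
        FileOutputFormat.setOutputPath(job2, output);
        System.exit(job2.waitForCompletion(true) ? 0 : 2);
    }
}

If the goal is several mappers inside a single job, org.apache.hadoop.mapreduce.lib.chain.ChainMapper and ChainReducer can chain mappers before and after one reduce phase, but a Job still runs at most one reducer class, so two full map/reduce rounds usually means two chained jobs as sketched above.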