
hadoop, java.lang.RuntimeException: java.lang.ClassNotFoundException error


I am reading the book Hadoop: The Definitive Guide.

While trying to run one of the book's examples against localhost, I ran into this error:

14/06/13 22:24:57 WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
****hdfs://localhost/usr/kim/input/ncdc
14/06/13 22:24:57 INFO input.FileInputFormat: Total input paths to process : 3
14/06/13 22:24:57 WARN snappy.LoadSnappy: Snappy native library is available
14/06/13 22:24:57 INFO util.NativeCodeLoader: Loaded the native-hadoop library
14/06/13 22:24:57 INFO snappy.LoadSnappy: Snappy native library loaded
14/06/13 22:24:57 INFO mapred.JobClient: Running job: job_201406132100_0011
14/06/13 22:24:58 INFO mapred.JobClient:  map 0% reduce 0%
14/06/13 22:25:11 INFO mapred.JobClient: Task Id : attempt_201406132100_0011_m_000000_0, Status : FAILED
java.lang.RuntimeException: java.lang.ClassNotFoundException: mapred.MaxTemperatureMapper_v1
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:867)
    at org.apache.hadoop.mapreduce.JobContext.getMapperClass(JobContext.java:199)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:719)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.ClassNotFoundException: mapred.MaxTemperatureMapper_v1
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:270)
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:820)
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:865)
    ... 8 more
I run the job from the command line:
hadoop jar MaxTemperatureDriver.jar mapred.MaxTemperatureDriver -conf hadoop_conf/hadoop-localhost.xml input/ncdc max-temp

The jar file contains two folders, META-INF and mapred. The mapred folder contains five classes (all in the mapred package):

  • MaxTemperatureReducer.class
  • MaxTemperatureDriver.class
  • MaxTemperatureMapper_v1.class
  • MaxTemperatureMapper_v1$Temperature.class
  • NcdcRecordParser.class

Here are the configuration file, MaxTemperatureDriver, and MaxTemperatureMapper_v1:

    <?xml version="1.0"?>
    <configuration>
        <property>
            <name>fs.default.name</name>
            <value>hdfs://localhost/</value>
        </property>
    
        <property>
            <name>mapred.job.tracker</name>
            <value>localhost:8021</value>
        </property>
    
        <property>
            <name>dfs.replication</name>    
            <value>1</value>
        </property>
    </configuration>
    
    

    public class MaxTemperatureDriver extends Configured implements Tool {
        @Override
        public int run(String[] args) throws Exception {
            if (args.length != 2) {
                System.err.printf("Usage: %s [generic options] <input> <output>\n", getClass().getSimpleName());
                ToolRunner.printGenericCommandUsage(System.err);
                return -1;
            }
            Job job = new Job(getConf(), "Max temperature");
            job.setJarByClass(getClass());
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            job.setReducerClass(MaxTemperatureReducer.class);
            job.setMapperClass(MaxTemperatureMapper_v1.class);
            job.setCombinerClass(MaxTemperatureReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            return job.waitForCompletion(true) ? 0 : 1;
        }
    }
    

    public class MaxTemperatureMapper_v1 extends Mapper<LongWritable, Text, Text, IntWritable>{
        enum Temperature{
            OVER_100
        }

        private NcdcRecordParser parser = new NcdcRecordParser();

        @Override
        public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException{
            parser.parse(value);    //Text.toString()
            if(parser.isValidTemperature()){
                int airTemperature = parser.getAirTemperature();
                if(airTemperature > 1000){
                    System.err.println("Temperature over 100 degrees for input: " + value);
                    context.setStatus("Detected possibly corrupt record: see logs.");
                    context.getCounter(Temperature.OVER_100).increment(1);
                }
                context.write(new Text(parser.getYear()), new IntWritable(parser.getAirTemperature()));
            }
        }
    }
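
The post does not show the driver's main() method. The -conf flag on the command line is a Hadoop generic option, and it is only parsed if main() hands the arguments to ToolRunner, which is the pattern the book's drivers use. A minimal sketch of what that entry point usually looks like (assumed, not taken from the poster's code):

    // Entry-point sketch: ToolRunner parses generic options such as -conf and -D,
    // then calls MaxTemperatureDriver.run() with the remaining arguments.
    public static void main(String[] args) throws Exception {
        int exitCode = ToolRunner.run(new MaxTemperatureDriver(), args);
        System.exit(exitCode);
    }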
    
    Setting the jar by the class name sometimes gives an error; also set the input format and output format. I hope this will help:

    job.setJarByClass(MaxTemperatureDriver.class);
    job.setInputFormatClass(TextInputFormat.class);
    job.setOutputFormatClass(TextOutputFormat.class);
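
    For completeness (not part of the original answer): the question's driver uses the new org.apache.hadoop.mapreduce API, so the format classes referred to here would come from the lib.input and lib.output packages. Assumed imports:

    // Assumed import paths for the new (mapreduce) API; the old mapred API has
    // TextInputFormat/TextOutputFormat classes in a different package.
    import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;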
    
    job.setJarByClass(MainClass.class)


    By "MainClass" I mean the class that has the main() method.

    I think you mean job.setJarByClass(MaxTemperatureMapper.class);
    public class MaxTemperatureMapper_v1 extends Mapper<LongWritable, Text, Text, IntWritable>{
        enum Temperature{
            OVER_100
        }
    
        private NcdcRecordParser parser = new NcdcRecordParser();
    
        @Override
        public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException{
            parser.parse(value);    //Text.toString()
            if(parser.isValidTemperature()){
                int airTemperature = parser.getAirTemperature();
                if(airTemperature > 1000){
                    System.err.println("Temperature over 100 degrees for input: " + value);
                    context.setStatus("Detected possibly corrupt record: see logs.");
                    context.getCounter(Temperature.OVER_100).increment(1);
                }
                context.write(new Text(parser.getYear()), new IntWritable(parser.getAirTemperature()));
            }
        }
    }
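
    setJarByClass() only needs a class that is packaged inside the job jar, so that Hadoop can locate that jar and ship it to the task nodes; either the mapper or the driver works. A one-line illustration using the class name as it actually appears in the question:

    job.setJarByClass(MaxTemperatureMapper_v1.class);   // any class from MaxTemperatureDriver.jar would do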
    