Cannot execute `put` in the map function with HBase and Hadoop

Tags: hadoop, mapreduce, hbase

Hi everyone. I am using MapReduce to process some log files stored on HDFS. I want to extract some information from the files and store it into HBase.

So I launch the job like this:

HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` ${HADOOP_HOME}/bin/hadoop jar crm_hbase-1.0.jar /datastream/music/useraction/2014-11-30/music_useraction_20141130-230003072+0800.24576015364769354.00018022.lzo
If I simply run the job as `hadoop jar xxxx`, it fails saying it cannot find `HBaseConfiguration`.

My code is quite simple:

  public int run(String[] strings) throws Exception {


    Configuration config = HBaseConfiguration.create();  

    String  kerbConfPrincipal = "ndir@HADOOP.HZ.NETEASE.COM";
    String  kerbKeytab = "/srv/zwj/ndir.keytab";



    UserGroupInformation.loginUserFromKeytab(kerbConfPrincipal, kerbKeytab);
    UserGroupInformation ugi = UserGroupInformation.getLoginUser();

    System.out.println(" auth: " + ugi.getAuthenticationMethod());
    System.out.println(" name: " + ugi.getUserName());
    System.out.println(" using keytab:" + ugi.isFromKeytab());

    HBaseAdmin.checkHBaseAvailable(config);

    //set job name
    Job job = new Job(config, "Import from file ");
    job.setJarByClass(LogRun.class);
    //set map class
    job.setMapperClass(LogMapper.class);

    //set output format and output table name
    job.setOutputFormatClass(TableOutputFormat.class);
    job.getConfiguration().set(TableOutputFormat.OUTPUT_TABLE, "crm_data");
    job.setOutputKeyClass(ImmutableBytesWritable.class);
    job.setOutputValueClass(Put.class);
    job.setNumReduceTasks(0);
    TableMapReduceUtil.addDependencyJars(job);
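
The question never shows `LogMapper` itself, but since the driver declares `Put` as the output value class, the mapper presumably looks roughly like the sketch below. The tab-separated layout, the `cf`/`action` column names, and using the first field as the rowkey are all assumptions for illustration; this targets the old-style HBase 0.9x API that the driver code (`new Job(config, ...)`) suggests.

```java
// Hypothetical sketch of LogMapper -- the log format, rowkey scheme,
// and column family/qualifier names are assumptions, not from the question.
public static class LogMapper
        extends Mapper<LongWritable, Text, ImmutableBytesWritable, Put> {

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String[] fields = value.toString().split("\t"); // assumed tab-separated log line
        if (fields.length < 2) {
            return; // skip malformed lines instead of emitting a broken Put
        }
        Put put = new Put(Bytes.toBytes(fields[0]));    // assumed: first field is the rowkey
        put.add(Bytes.toBytes("cf"), Bytes.toBytes("action"), Bytes.toBytes(fields[1]));
        // TableOutputFormat ignores the key, so context.write(null, put) is tolerated,
        // but wrapping the rowkey in an ImmutableBytesWritable is the conventional choice.
        context.write(new ImmutableBytesWritable(put.getRow()), put);
    }
}
```

If the map "stops" at `context.write`, it is worth checking first whether the `Put` ever gets any columns added: an empty `Put` makes `TableOutputFormat` throw on write.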
But when I try to run this MR job, `context.write(null, put)` never executes; the map seems to stop at that line.
I think it is related to the kerbKeytab. Does that mean I need to log in again while the map tasks are running, after adding TableMapReduceUtil?
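
One thing worth noting about the Kerberos suspicion: `loginUserFromKeytab` in the driver only authenticates the client JVM, and the map tasks run in separate JVMs on other nodes that do not inherit that login. The usual pattern on a secured cluster (a sketch, not verified against this particular setup) is to have the driver obtain an HBase delegation token and ship it with the job:

```java
// In the driver, after creating the Job and before submitting it:
// initCredentials uses the driver's current Kerberos login to obtain an
// HBase delegation token and stores it in the job's credentials, so the
// map tasks can talk to HBase without a keytab of their own.
TableMapReduceUtil.initCredentials(job);
```

Without a token like this, a mapper writing to a secure HBase typically hangs or fails on authentication rather than at the `context.write` call itself.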

    Job job = new Job(config, "Import from file ");
    job.setJarByClass(LogRun.class);
    //set map class
    job.setMapperClass(LogMapper.class);

    TableMapReduceUtil.initTableReducerJob(table, null, job);
    job.setNumReduceTasks(0);
    TableMapReduceUtil.addDependencyJars(job);   
    FileInputFormat.setInputPaths(job,input);
    //FileInputFormat.addInputPath(job, new Path(strings[0]));
    int ret = job.waitForCompletion(true) ? 0 : 1;

The problem may lie in `context.write(null, put)`, or in the `put` itself; check this post. The snippet above, which calls `TableMapReduceUtil.initTableReducerJob(table, null, job)` instead of setting the output format and table manually, is the suggested fix.