
java.io.IOException: Mkdirs failed to create, when running a MapReduce job


I am trying to run a simple MapReduce job to import data into HBase, but it fails to run. Below is the error stacktrace:

Exception in thread "main" java.io.IOException: Mkdirs failed to create /user/SOME_PATH/hbase-staging (exists=false, cwd=file:/Users/SOME_PATH/2ND_PATH/HFileIntoHBase)
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:440)
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:426)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:906)
    at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:1071)
    at org.apache.hadoop.io.SequenceFile$RecordCompressWriter.<init>(SequenceFile.java:1371)
    at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:272)
    at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:294)
    at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.writePartitions(HFileOutputFormat2.java:335)
    at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configurePartitioner(HFileOutputFormat2.java:596)
    at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configureIncrementalLoad(HFileOutputFormat2.java:440)
    at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configureIncrementalLoad(HFileOutputFormat2.java:405)
    at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configureIncrementalLoad(HFileOutputFormat2.java:367)
As suggested in several other similar posts, I have already verified that I have permission to `mkdir` in this directory.

My machine runs Mac OS X 10.11.6.

Please help.

Thanks.

public int run(String[] arg0) throws Exception {

        Configuration conf = new Configuration();
        conf.set(MAPRED_JOB_NAME, "steve_test");
        conf.set(HBASE_TABLE, "steve1");
        Job job = new Job(conf, conf.get(MAPRED_JOB_NAME));
        String output_table = conf.get(HBASE_TABLE);

        job.setJarByClass(PutUrlIntoHbase.class);
        job.setMapperClass(PutUrlIntoHbaseMapper.class);
        job.setReducerClass(PutSortReducer.class);

        job.setMapOutputKeyClass(ImmutableBytesWritable.class);
        job.setMapOutputValueClass(Put.class);

        HTable table = new HTable(conf, output_table);
        job.setOutputFormatClass(HFileOutputFormat2.class);
        HFileOutputFormat2.configureIncrementalLoad(job, table);

        if (job.waitForCompletion(true) && job.isSuccessful()) {
            return 0;
        }
        return -1;
    }
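One hedged observation on the code above: `run()` builds a fresh `new Configuration()` instead of using the `HBaseConfiguration` that `main()` hands to `ToolRunner`, which may be why the stack trace shows `cwd=file:...` (the local filesystem) rather than a cluster filesystem. Below is a minimal sketch of the same driver wired through `Configured.getConf()`. This is an untested guess at the cause, not a confirmed fix; `MAPRED_JOB_NAME` and `HBASE_TABLE` are assumed to be constants defined elsewhere in the original class, and the table/output wiring is elided:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class PutUrlIntoHbase extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        // Reuse the HBaseConfiguration injected by ToolRunner instead of
        // constructing a new Configuration(), so that hbase-site.xml and
        // core-site.xml settings (filesystem, staging dirs) actually apply.
        Configuration conf = getConf();
        conf.set(MAPRED_JOB_NAME, "steve_test");
        conf.set(HBASE_TABLE, "steve1");

        // Job.getInstance(...) is the non-deprecated replacement for new Job(...).
        Job job = Job.getInstance(conf, conf.get(MAPRED_JOB_NAME));
        job.setJarByClass(PutUrlIntoHbase.class);
        // ... mapper/reducer, output format, and
        //     HFileOutputFormat2.configureIncrementalLoad(...) as in the question ...

        return job.waitForCompletion(true) ? 0 : -1;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(HBaseConfiguration.create(),
                                   new PutUrlIntoHbase(), args));
    }
}
```

The key difference from the question's code is only the `getConf()` call; everything else is unchanged.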

    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        int res = ToolRunner.run(conf, new PutUrlIntoHbase(), args);
        System.exit(res);
    }