Hive ORC writer.close() throws a NullPointerException when Java runs on Windows


I am creating an ORC file and adding rows to it. It works on Linux, but on Windows writer.close() throws an NPE. Please find the code and stack trace below and help me fix this.

Code:

package com.testing;

import java.io.IOException;
import java.util.Arrays;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hive.ql.io.orc.OrcFile;
import org.apache.hadoop.hive.ql.io.orc.OrcFile.WriterOptions;
import org.apache.hadoop.hive.ql.io.orc.Writer;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.typeinfo.TypeInfo;
import org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils;


public class Typical {

    public static void main(String[] args) {
        String filePath = "C:/usr/tmp/EDMS_FILE_ARCHIVE_.orc";
        TypeInfo typeInfo = TypeInfoUtils.getTypeInfoFromTypeString("struct<a:string>");
        ObjectInspector inspector = TypeInfoUtils.getStandardJavaObjectInspectorFromTypeInfo(typeInfo);
        WriterOptions options = OrcFile.writerOptions(new Configuration()).inspector(inspector);
        // Path path = new Path(temporaryFolder.getRoot().getCanonicalPath(), "part-00000");
        Writer writer;
        try {
            writer = OrcFile.createWriter(new Path(filePath), options);
            writer.addRow(Arrays.asList("hello"));
            writer.close();
        } catch (IllegalArgumentException | IOException e) {
            e.printStackTrace();
        }
    }

}


Stack trace:

Exception in thread "main" java.lang.NullPointerException
    at java.lang.ProcessBuilder.start(Unknown Source)
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:482)
    at org.apache.hadoop.util.Shell.run(Shell.java:455)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:808)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:791)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:656)
    at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:490)
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:462)
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:428)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:908)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:889)
    at org.apache.hadoop.hive.ql.io.orc.WriterImpl.getStream(WriterImpl.java:1967)
    at org.apache.hadoop.hive.ql.io.orc.WriterImpl.flushStripe(WriterImpl.java:1984)
    at org.apache.hadoop.hive.ql.io.orc.WriterImpl.close(WriterImpl.java:2289)
    at com.testing.Typical.main(Typical.java:30)


Thanks in advance,
Hanuman

Using the ORC library on Windows and writing to a local file is a known problem:

java.lang.NullPointerException
at java.lang.ProcessBuilder.start(Unknown Source)
The code tries to execute the chmod command, which does not exist on Windows — this is where the NPE comes from. I found a workaround, although I don't like it:

  • download winutils.exe
  • call System.setProperty("hadoop.home.dir", "c:\\path\\to\\winutils"); the actual winutils binaries must live under c:\path\to\winutils\bin
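The workaround above can be sketched as follows. The winutils path is a placeholder for wherever you unpacked the binaries, and the property must be set before the first Hadoop class is loaded, because Hadoop's Shell class resolves HADOOP_HOME in a static initializer:

```java
public class OrcOnWindowsFix {

    public static void main(String[] args) {
        // Hypothetical location: hadoop.home.dir must point at the folder
        // that CONTAINS bin\winutils.exe, not at the bin folder itself.
        System.setProperty("hadoop.home.dir", "C:\\path\\to\\winutils");

        // Only AFTER setting the property, touch Hadoop/Hive classes and
        // build the ORC writer exactly as in the question, e.g.:
        //   WriterOptions options = OrcFile.writerOptions(new Configuration())
        //                                  .inspector(inspector);
        //   Writer writer = OrcFile.createWriter(new Path(filePath), options);

        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```

Setting the same value as the HADOOP_HOME environment variable before launching the JVM achieves the same effect without touching the code.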