
Java HDFS FSDataOutputStream write creates an empty file


I ran into a strange problem writing small files to Hadoop. Below is the sample program:

public void writeFile(Configuration conf, String message, String filename) throws Exception {
        FSDataOutputStream fsDataOutputStream = null;
        DistributedFileSystem fs = null;
        try {
            fs = (DistributedFileSystem) FileSystem.get(URI.create(properties.getHadoop().getRawLocation()), conf);
            Path hdfswritepath = new Path(properties.getHadoop().getRawLocation() + "/" + filename + ".json");
            fsDataOutputStream = fs.create(hdfswritepath);
            fsDataOutputStream.write(message.getBytes());
            fsDataOutputStream.hsync();   // flush to the DataNodes while the stream is still open
            fsDataOutputStream.close();   // calling hsync() after close() would fail on a closed stream
        } catch (IllegalArgumentException | IOException e) {
            System.out.println("Got Exception");
            e.printStackTrace();
            throw e;
        } finally {
            if (fs != null) {             // guard against an NPE when FileSystem.get() itself threw
                fs.close();
            }
            System.out.println("clean up done");
        }

    }
The code above creates an empty file at the Hadoop location. Here are some things I have tried:

  • There is no firewall between the client and the Hadoop server
  • Copying files from local to Hadoop works
  • The problem is that only a 0-byte file gets created.

    This is the exception I get for it:

    09:12:02,129 INFO  [org.apache.hadoop.hdfs.DFSClient] (Thread-118) Exception in createBlockOutputStream: java.net.ConnectException: Connection timed out: no further information
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
        at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1533)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1309)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1262)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:448)
    
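A timeout in `createBlockOutputStream` means the client could reach the NameNode (otherwise `fs.create()` itself would have failed, and no zero-byte file would appear) but could not open a data-transfer connection to a DataNode, so the file entry is created with no blocks written. One quick way to confirm this is to probe the DataNode's transfer port directly from the client machine. A minimal sketch; the host name is a placeholder, and 9866 is the default transfer port on Hadoop 3.x (50010 on 2.x):

```java
import java.net.InetSocketAddress;
import java.net.Socket;

public class DataNodeProbe {
    // Returns true when a TCP connection to host:port succeeds within timeoutMs.
    static boolean canConnect(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (Exception e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Hypothetical DataNode address; use the host the NameNode actually
        // reports (visible in the full DFSClient log output).
        String host = args.length > 0 ? args[0] : "datanode-1";
        int port = args.length > 1 ? Integer.parseInt(args[1]) : 9866;
        System.out.println(host + ":" + port + " reachable: " + canConnect(host, port, 5000));
    }
}
```

If the probe fails while `hdfs dfs` commands run fine from inside the cluster, the DataNode address the NameNode hands out is simply not routable from the client.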

    I was able to fix the issue with:

    conf.set("dfs.client.use.datanode.hostname", "true");
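This works because the NameNode normally returns DataNode addresses as IPs, which may be internal addresses that are unreachable from a client outside the cluster network (NAT, cloud, Docker). With `dfs.client.use.datanode.hostname` set to `true`, the DFS client connects to DataNodes by hostname instead, which the client can resolve to a routable address. The setting must be applied before the `FileSystem` client is built; a minimal sketch, where the NameNode URI is a placeholder:

```java
Configuration conf = new Configuration();
// Connect to DataNodes by hostname rather than the internal IP
// addresses the NameNode reports.
conf.set("dfs.client.use.datanode.hostname", "true");
// Must be set before FileSystem.get() constructs the client;
// "hdfs://namenode:8020" is a placeholder URI.
FileSystem fs = FileSystem.get(URI.create("hdfs://namenode:8020"), conf);
```

The client machine must then be able to resolve the DataNode hostnames, e.g. via DNS or `/etc/hosts` entries.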