Java: create a file from a local machine and write it to HDFS

I have two systems connected over a network. HDFS is running on one of them. I want to create a file on HDFS and write data to it from the other machine.

  package myorg;

  import java.io.BufferedWriter;
  import java.io.OutputStreamWriter;

  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.fs.FileSystem;
  import org.apache.hadoop.fs.Path;

  public class Write1 {
      public static void main(String[] args) throws Exception {
          try {
              System.out.println("Starting...");
              Path pt = new Path("hdfs://10.236.173.95:8020/user/jfor/out/gwmdfd");
              FileSystem fs = FileSystem.get(new Configuration());
              // fs.create(pt, true) overwrites the file if it already exists;
              // to append to an existing file, use fs.append(Path f) instead.
              BufferedWriter br = new BufferedWriter(new OutputStreamWriter(fs.create(pt, true)));
              String line = "Disha Dishu Daasha dfasdasdawqeqwe";
              System.out.println(line);
              br.write(line);
              br.close();
          } catch (Exception e) {
              System.out.println("File not found");
              e.printStackTrace(); // print the real exception instead of masking it
          }
      }
  }

I compile it with

   javac -classpath hadoop-0.20.1-dev-core.jar -d Write1/ Write1.java
package it with

  jar -cvf Write1.jar -C Write1/ .
and run it with

  hadoop jar Write1.jar myorg.Write1
When I run this, I get

 Starting...
 File not found
What could be the cause? If I run the program on the Hadoop machine itself, it works fine (with the IP replaced by localhost).

The error occurs at the BufferedWriter line. It says the file is not found. What does that mean? I used fs.create; if the file does not exist, it should be created, shouldn't it?

 java.lang.IllegalArgumentException: Wrong FS: hdfs://10.72.40.68:8020/user/jfor/..... expected localhost:8020                                                   
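This "Wrong FS" error happens because FileSystem.get(new Configuration()) returns the *default* filesystem named by the cluster configuration (here apparently localhost:8020), and Hadoop rejects any Path whose scheme or authority differ from that filesystem's. A minimal pure-Java sketch of that comparison, using the URIs from this thread (it does not need the Hadoop jars):

```java
import java.net.URI;

public class WrongFsDemo {
    public static void main(String[] args) {
        // The Path the program writes to names an explicit HDFS authority...
        URI pathUri = URI.create("hdfs://10.236.173.95:8020/user/jfor/out/gwmdfd");
        // ...but the default FileSystem is the one from the configuration,
        // e.g. hdfs://localhost:8020 on the cluster node.
        URI defaultFs = URI.create("hdfs://localhost:8020");
        // Hadoop compares scheme and authority; a mismatch raises
        // IllegalArgumentException("Wrong FS: ... expected ...").
        boolean sameFs = pathUri.getScheme().equals(defaultFs.getScheme())
                && pathUri.getAuthority().equals(defaultFs.getAuthority());
        System.out.println("schemes/authorities match: " + sameFs);
    }
}
```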

 So I modified the following line:

 FileSystem fs = FileSystem.get(new URI("hdfs://<ip>:8020"),new Configuration());

Now it says Connection refused. What could be the reason?
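One common cause of "Connection refused" in this situation: the NameNode is bound only to the loopback interface, so it accepts connections from the Hadoop machine itself but not from remote hosts. In Hadoop 0.20-era configurations this is governed by fs.default.name in core-site.xml. A sketch of the check, assuming that version's property name:

```xml
<!-- core-site.xml on the NameNode machine.
     If this says hdfs://localhost:8020, the NameNode listens only on
     127.0.0.1 and remote clients get "Connection refused". -->
<property>
  <name>fs.default.name</name>
  <!-- use the machine's real hostname or IP so remote clients can connect -->
  <value>hdfs://10.236.173.95:8020</value>
</property>
```

After changing this, the NameNode has to be restarted for the new binding to take effect.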

Comments:

- An unresolved IP, maybe?
- Ping to the IP works and traceroute to the IP works. What else should I do?
- The Hadoop nodes interact over ssh, and since localhost is presumably already authorized to access itself, it works fine there, whereas the remote machine from which you run this Java program is not authorized to access it.
- So do I need to enable passwordless ssh?
- No. It also doesn't work on the local host itself if I use the IP instead of localhost.
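Putting the thread together, a corrected sketch of the original program: it binds the FileSystem explicitly to the remote HDFS via FileSystem.get(URI, Configuration), so the "Wrong FS" mismatch cannot occur. This is only a sketch under the thread's assumptions: it needs the Hadoop client jars on the classpath and a NameNode actually reachable at the given address (see the Connection refused discussion above); the IP and output path are the ones from the question.

```java
package myorg;

import java.io.BufferedWriter;
import java.io.OutputStreamWriter;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class Write1 {
    public static void main(String[] args) throws Exception {
        // Ask for the remote HDFS explicitly instead of the default filesystem.
        FileSystem fs = FileSystem.get(
                new URI("hdfs://10.236.173.95:8020"), new Configuration());
        // The Path can now be relative to that filesystem's root.
        Path pt = new Path("/user/jfor/out/gwmdfd");
        BufferedWriter br = new BufferedWriter(
                new OutputStreamWriter(fs.create(pt, true)));
        br.write("Disha Dishu Daasha dfasdasdawqeqwe");
        br.close();
        fs.close();
    }
}
```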