The file is created in HDFS, but I can't write any content to it. I installed HDP 3.0.1 in VMware; the DataNode and NameNode are running, and uploading files to HDFS from the Ambari UI or a terminal works fine. The problem appears when I try to write data from code:
import java.io.*;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.*;
import org.apache.hadoop.io.IOUtils;

Configuration conf = new Configuration();
conf.set("fs.defaultFS", "hdfs://172.16.68.131:8020");
FileSystem fs = FileSystem.get(conf);
OutputStream os = fs.create(new Path("hdfs://172.16.68.131:8020/tmp/write.txt"));
InputStream is = new BufferedInputStream(new FileInputStream("/home/vq/hadoop/test.txt"));
IOUtils.copyBytes(is, os, conf); // closes both streams when finished
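One detail that may matter here (an assumption on my part, since the logs are not shown): `fs.create()` only talks to the NameNode, so an empty file can appear even when the client cannot reach any DataNode to write the actual bytes. In VM setups the NameNode often advertises DataNode addresses that are only reachable inside the VM's network. A hedged workaround is to ask for DataNode hostnames instead, in the client-side hdfs-site.xml (or the equivalent `conf.set` call), provided those hostnames resolve from the client machine, e.g. via /etc/hosts:

```xml
<!-- client-side hdfs-site.xml: have the NameNode return DataNode
     hostnames instead of possibly unreachable internal IPs -->
<property>
  <name>dfs.client.use.datanode.hostname</name>
  <value>true</value>
</property>
```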
It creates the file in HDFS, but the file is empty.

Log:

The same happens when I try to read data:
Configuration conf = new Configuration();
conf.set("fs.defaultFS", "hdfs://172.16.68.131:8020");
FileSystem fs = FileSystem.get(conf);
FSDataInputStream inputStream = fs.open(new Path("hdfs://172.16.68.131:8020/tmp/ui.txt"));
System.out.println(inputStream.available());
byte[] bs = new byte[inputStream.available()];
I can print the number of available bytes, but I can't read the file's contents.
Log:
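As an aside on the read snippet above: `available()` is documented as the number of bytes that can be read without blocking, not the length of the file, so sizing a buffer from it is unreliable (on network-backed streams such as Hadoop's FSDataInputStream it can report 0 or a partial count). A minimal plain-Java sketch of the loop-until-EOF pattern, which works the same way against an HDFS stream (no Hadoop dependency here, just standard I/O):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class ReadAllDemo {
    // Read an entire stream by looping until read() returns -1 (EOF),
    // instead of trusting available() to report the full length.
    static byte[] readAll(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for an HDFS stream: 10000 bytes read in 4096-byte chunks.
        byte[] data = new byte[10000];
        System.out.println(readAll(new ByteArrayInputStream(data)).length); // prints 10000
    }
}
```

The same loop applies unchanged to the `FSDataInputStream` returned by `fs.open(...)`.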
I have seen many answers on the internet, but none of them worked. (Comment:) Have you looked at the following links: