Java: copying a file from HDFS to the local machine

I have a problem when trying to "download" a file from the HDFS file system to my local system (the reverse operation works without any problem). *Note: the file exists on the HDFS file system at the specified path.

Here is the code snippet:

    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "${NAMENODE_URI}");
    FileSystem hdfsFileSystem = FileSystem.get(conf);

    String result = "";

    Path local = new Path("${SOME_LOCAL_PATH}");
    Path hdfs = new Path("${SOME_HDFS_PATH}");

    String fileName = hdfs.getName();

    if (hdfsFileSystem.exists(hdfs))
    {
        hdfsFileSystem.copyToLocalFile(hdfs, local);
        result = "File " + fileName + " copied to local machine on location: " + localPath;
    }
    else
    {
        result = "File " + fileName + " does not exist on HDFS on location: " + localPath;
    }

    return result;
The exception I get is the following:

12/07/13 14:57:46 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.io.IOException: Cannot run program "cygpath": CreateProcess error=2, The system cannot find the file specified
    at java.lang.ProcessBuilder.start(Unknown Source)
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:206)
    at org.apache.hadoop.util.Shell.run(Shell.java:188)
    at org.apache.hadoop.fs.FileUtil$CygPathCommand.<init>(FileUtil.java:412)
    at org.apache.hadoop.fs.FileUtil.makeShellPath(FileUtil.java:438)
    at org.apache.hadoop.fs.FileUtil.makeShellPath(FileUtil.java:465)
    at org.apache.hadoop.fs.RawLocalFileSystem.execCommand(RawLocalFileSystem.java:573)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:565)
    at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:403)
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:452)
    at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:420)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:774)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:755)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:654)
    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:259)
    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:232)
    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:183)
    at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1837)
    at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1806)
    at org.apache.hadoop.fs.FileSystem.copyToLocalFile(FileSystem.java:1782)
    at com.hmeter.hadoop.hdfs.hdfsoperations.HdfsOperations.fileCopyFromHdfsToLocal(HdfsOperations.java:75)
    at com.hmeter.hadoop.hdfs.hdfsoperations.HdfsOperations.main(HdfsOperations.java:148)
Caused by: java.io.IOException: CreateProcess error=2, The system cannot find the file specified
    at java.lang.ProcessImpl.create(Native Method)
    at java.lang.ProcessImpl.<init>(Unknown Source)
    at java.lang.ProcessImpl.start(Unknown Source)
    ... 22 more
Any idea what the problem could be? Why does it need cygpath from Cygwin? I am running this code on Windows 7.


Thanks

Try using this method from the API:

// delSrc: whether to delete the source; src and dst: the paths you already have;
// useRawLocalFileSystem: should be set to true in your case
hdfsFileSystem.copyToLocalFile(delSrc, src, dst, useRawLocalFileSystem);
In your case, replace:

hdfsFileSystem.copyToLocalFile(hdfs, local);
with:

hdfsFileSystem.copyToLocalFile(false, hdfs, local, true);
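
For reference, a minimal sketch of how the asker's method might look with that four-argument overload applied (class and method names are taken from the stack trace; the ${...} placeholders are the question's own and still need real values):

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsOperations {

        public static String fileCopyFromHdfsToLocal() throws IOException {
            Configuration conf = new Configuration();
            conf.set("fs.defaultFS", "${NAMENODE_URI}");
            FileSystem hdfsFileSystem = FileSystem.get(conf);

            Path local = new Path("${SOME_LOCAL_PATH}");
            Path hdfs = new Path("${SOME_HDFS_PATH}");
            String fileName = hdfs.getName();

            if (hdfsFileSystem.exists(hdfs)) {
                // delSrc = false: keep the source file on HDFS
                // useRawLocalFileSystem = true: write straight to the raw local file system,
                // which should avoid the permission handling that shells out to cygpath on Windows
                hdfsFileSystem.copyToLocalFile(false, hdfs, local, true);
                return "File " + fileName + " copied to local machine on location: " + local;
            }
            return "File " + fileName + " does not exist on HDFS on location: " + hdfs;
        }
    }
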
You can follow code like the one shown below:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public static void main(String args[]) {
    try {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://localhost:54310/user/hadoop/");
        FileSystem fs = FileSystem.get(conf);
        // List everything under the HDFS directory and copy each entry to the local directory
        FileStatus[] status = fs.listStatus(new Path("hdfsdirectory"));
        for (int i = 0; i < status.length; i++) {
            System.out.println(status[i].getPath());
            // delSrc = false: keep the source files on HDFS
            fs.copyToLocalFile(false, status[i].getPath(), new Path("localdir"));
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Comment: Do you think it works when the job is submitted through Oozie?
Reply: @Abhinay I don't know, I am not using it any more, sorry.
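
As a side note, if the listing loop above also has to run on Windows without Cygwin, the three-argument copyToLocalFile used there may hit the same cygpath error; a hedged variant using the four-argument overload from the previous answer could look like this (same placeholder directory names):

    // Hypothetical variant of the loop: write via the raw local file system
    // so no shell-out to cygpath is needed on Windows
    for (FileStatus s : fs.listStatus(new Path("hdfsdirectory"))) {
        System.out.println(s.getPath());
        fs.copyToLocalFile(false, s.getPath(), new Path("localdir"), true);
    }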