
Hadoop 2.6.0: Browsing the Filesystem in Java


I have set up a basic Hadoop cluster on CentOS 6.6 and want to write some simple programs against it (browse the filesystem, delete/add files, and so on), but I am having trouble getting even the most basic application to work.

When running some basic code that lists the contents of a directory to the console, I get the following error:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.ipc.RPC.getProxy(Ljava/lang/Class;JLjava/net/InetSocketAddress;Lorg/apache/hadoop/security/UserGroupInformation;Lorg/apache/hadoop/conf/Configuration;Ljavax/net/SocketFactory;ILorg/apache/hadoop/io/retry/RetryPolicy;Z)Lorg/apache/hadoop/ipc/VersionedProtocol;
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:135)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:280)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:245)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:100)
    at mapreducetest.MapreduceTest.App.main(App.java:36)

The error is thrown after the call to fs.initialize(). I really don't know what the problem is here. Am I missing a dependency? Are the versions wrong?

I was running the program by calling "java -jar app.jar …". I should have been using "hadoop jar app.jar".

Once I ran it that way, it worked as expected.
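
For reference, the hadoop jar launcher puts the cluster's own Hadoop jars and configuration on the classpath before invoking the main class, so the client links against the same RPC classes the NameNode uses. A hedged example of the invocation, assuming the main class is the mapreducetest.MapreduceTest.App seen in the stack trace:

hadoop jar app.jar mapreducetest.MapreduceTest.App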

<dependencies>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.6.0</version>
    </dependency>

    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-core</artifactId>
        <version>1.2.1</version>
    </dependency>
</dependencies>
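
Incidentally, this dependency mix is a likely source of the NoSuchMethodError itself: hadoop-core 1.2.1 is the old pre-YARN artifact, and its DFSClient calls RPC.getProxy with a 1.x signature that no longer exists in the org.apache.hadoop.ipc.RPC class shipped by hadoop-common 2.6.0, so whichever copy the classloader resolves first breaks the other. A minimal sketch of a consistent 2.6.0 client POM, assuming plain HDFS access is all that is needed (hadoop-client pulls in hadoop-common and hadoop-hdfs transitively):

<dependencies>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>2.6.0</version>
    </dependency>
</dependencies>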
import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.DistributedFileSystem;

public class App
{
    public static void main( String[] args ) throws IOException, URISyntaxException
    {
        Configuration conf = new Configuration();

        // Connect to the NameNode listening on localhost:9000.
        FileSystem fs = new DistributedFileSystem();
        fs.initialize(new URI("hdfs://localhost:9000/"), conf);

        // Print the name of each entry directly under the HDFS root.
        for (FileStatus f : fs.listStatus(new Path("/")))
        {
            System.out.println(f.getPath().getName());
        }

        fs.close();
    }
}
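
For the delete/add part of the question, the same FileSystem handle covers those operations as well. A minimal sketch that could sit in main() before fs.close(); the paths /user/demo and /tmp/local.txt are illustrative placeholders, not from the original post:

// Hypothetical paths, for illustration only.
Path dir = new Path("/user/demo");
fs.mkdirs(dir);                                    // create a directory
fs.copyFromLocalFile(new Path("/tmp/local.txt"),   // upload a local file
        new Path(dir, "local.txt"));
fs.delete(dir, true);                              // delete recursively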