Java NoSuchMethodError: org.apache.hadoop.io.retry.RetryUtils.getDefaultRetryPolicy


Previously I was creating a directory in HDFS from Java on a single-node cluster and it ran smoothly, but once I made the cluster multi-node I ran into this error. The stack trace I get is this:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.io.retry.RetryUtils.getDefaultRetryPolicy(Lorg/apache/hadoop/conf/Configuration;Ljava/lang/String;ZLjava/lang/String;Ljava/lang/String;Ljava/lang/Class;)Lorg/apache/hadoop/io/retry/RetryPolicy;
    at org.apache.hadoop.hdfs.NameNodeProxies.createNNProxyWithClientProtocol(NameNodeProxies.java:410)
    at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:316)
    at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:178)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:665)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:601)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:148)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2811)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:100)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2848)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2830)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:389)
    at CreateDirectory.main(CreateDirectory.java:44)
Below is the CreateDirectory class:

import java.io.BufferedReader;
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.MalformedURLException;
import java.net.URI;
import java.net.URL;
import java.sql.SQLException;
import java.util.List;
import java.util.Map;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CreateDirectory {

    public static void main(String[] args) throws SQLException, ClassNotFoundException {
        String hdfsUri = "hdfs://localhost:9000/";
       //String dirName = args[0];
        String dirName=null;
      // String filename = args[1];
        String filename;

        if (args.length <= 0) { dirName = "ekbana"; filename = "text.csv"; } // braces added so both defaults apply only when no args are given

        URL url = null;
        BufferedReader in = null;
        FileSystem hdfs = null;
        FSDataOutputStream outStream = null;
        HttpURLConnection conn = null;
        List<Map<String, String>> flatJson;
        Configuration con = new Configuration();
        try {
            url = new URL("http://crm.bigmart.com.np:81/export/export-sales-data.php?sdate=2016-12-01&edate=2016-12-02&key=jdhcvuicx8ruqe9djskjf90ueddishr0uy8v9hbjncvuw0er8idsnv");
        } catch (MalformedURLException ex) {
        }

        try {
            con.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
            con.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());
            hdfs = FileSystem.get(URI.create(hdfsUri), con); // this is line 44 of the original CreateDirectory.java, where the stack trace points
        } catch (IOException e) {
            e.printStackTrace();
        }

        try {
            System.out.println(hdfs.mkdirs(new Path(hdfsUri + "/" + dirName)));
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
Adding the Maven dependency helped:

```
    <dependency>
       <groupId>org.apache.hadoop</groupId>
       <artifactId>hadoop-hdfs</artifactId>
       <version>2.8.1</version>
    </dependency>

```

I very much agree with Yulia's answer, which helped me understand and resolve the problem I was having. You can follow her suggestion, and in addition add:

/path/to/hadoop-client/client/hadoop-hdfs-client-<version_number>.jar

to your classpath, so that the hadoop-hdfs client jar is included at runtime.
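
If you are not sure whether the classpath you run with actually exposes the expected method, a small reflection probe can confirm it before you re-run the job. This is a minimal sketch (the class and method names come from the stack trace above; the `RetryPolicyProbe` name is just illustrative):

```
import java.lang.reflect.Method;

public class RetryPolicyProbe {
    public static void main(String[] args) throws ClassNotFoundException {
        // Load RetryUtils through the same classpath the failing job uses
        Class<?> retryUtils = Class.forName("org.apache.hadoop.io.retry.RetryUtils");

        // List every getDefaultRetryPolicy overload that is actually present,
        // then compare the signatures with the one in the NoSuchMethodError
        for (Method m : retryUtils.getMethods()) {
            if ("getDefaultRetryPolicy".equals(m.getName())) {
                System.out.println(m);
            }
        }
    }
}
```

If the overload from the error message is missing from the output, the jar on your classpath comes from a different Hadoop version than the one your code was compiled against.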

Usually a NoSuchMethodError points to incompatible jar files.

@Jens Thanks for the input. I am using hadoop-2.8, and hadoop commons is also 2.8; in this version there is no retry package, as you can see.

@Jens I just tried it with the latest hadoop.io package (version 2.5) as well, but I ran into the same error. You can try to get an answer here:
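
Since the comments point at incompatible jars, a quick way to see which jar each Hadoop class is actually loaded from is to print its code source. A minimal sketch, assuming the classes named in the stack trace (the `JarLocator` name is illustrative; adjust the class list for your setup):

```
import java.security.CodeSource;

public class JarLocator {
    public static void main(String[] args) throws ClassNotFoundException {
        // Classes that appear in the stack trace; add others as needed
        String[] names = {
            "org.apache.hadoop.conf.Configuration",
            "org.apache.hadoop.io.retry.RetryUtils",
            "org.apache.hadoop.hdfs.DistributedFileSystem"
        };
        for (String name : names) {
            CodeSource src = Class.forName(name)
                    .getProtectionDomain().getCodeSource();
            // getCodeSource() can return null for bootstrap-loaded classes
            System.out.println(name + " -> "
                    + (src == null ? "(bootstrap)" : src.getLocation()));
        }
    }
}
```

If the printed locations show hadoop-common and hadoop-hdfs jars from different versions, that mismatch is the usual cause of this kind of NoSuchMethodError.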