java.lang.UnsupportedOperationException: Not implemented by the DistributedFileSystem FileSystem implementation

Tags: Java, Spring MVC, Hadoop

I am using a single-node Hadoop cluster (2.6.2) and need HDFS declared as a distributed file system. I added the hdfs, core, and common jars to the project build path, and the other required jars to the project lib. But now I get the error java.lang.UnsupportedOperationException: Not implemented by the DistributedFileSystem FileSystem implementation from what I configured in my application context:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

Configuration conf = new Configuration();

// Load the cluster configuration files
conf.addResource(new Path("/usr/local/hadoop-2.6.2/etc/hadoop/core-site.xml"));
conf.addResource(new Path("/usr/local/hadoop-2.6.2/etc/hadoop/hdfs-site.xml"));
conf.addResource(new Path("/usr/local/hadoop-2.6.2/etc/hadoop/mapred-site.xml"));

conf.set("fs.defaultFS", "hdfs://localhost:8088");
FileSystem fileSystem = FileSystem.get(conf);
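For diagnosing this, here is a minimal sketch (hypothetical, not from the original post) that prints which concrete FileSystem class gets resolved and which jar the FileSystem base class was loaded from; mixed Hadoop jar versions on the classpath are a common trigger for this exception:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class FsDiagnostic {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Assumed NameNode address; adjust to your cluster
        // (8088 is usually the YARN web UI port, not the HDFS RPC port)
        conf.set("fs.defaultFS", "hdfs://localhost:8020");

        FileSystem fs = FileSystem.get(conf);

        // Concrete implementation chosen for the hdfs:// scheme
        System.out.println("Implementation: " + fs.getClass().getName());

        // Jar that the FileSystem base class itself came from; a Hadoop 1.x
        // jar here alongside 2.x implementation jars indicates a version clash
        System.out.println("Loaded from: "
                + FileSystem.class.getProtectionDomain().getCodeSource().getLocation());
    }
}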

It looks like there is a problem with the dependency jars you are using.

I am using Hadoop 2.7.1. I tried your program against my cluster and got the correct result: it printed hdfs as the scheme.

Program:

package com.myorg.hadooptests;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class GetConfTest {
    public static void main(String[] args) throws Exception {

        Configuration conf = new Configuration();

        // Point fs.defaultFS at the NameNode RPC address (8020 is the default port)
        conf.set("fs.defaultFS", "hdfs://MBALLUR:8020");
        FileSystem fs = FileSystem.get(conf);

        // Prints "hdfs" when the DistributedFileSystem implementation resolves correctly
        System.out.println(fs.getScheme());
    }
}
Maven dependencies:

<dependencies>

    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.7.1</version>
    </dependency>

    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-mapreduce-client-core</artifactId>
        <version>2.7.1</version>
    </dependency>

    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-core</artifactId>
        <version>1.2.1</version>
    </dependency>

</dependencies>
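Note that this set mixes hadoop-common 2.7.1 with hadoop-core 1.2.1, and the Hadoop 1.x hadoop-core jar bundles its own copy of org.apache.hadoop.fs.FileSystem, so clashing copies of that class are a plausible source of exactly this exception. To see which Hadoop artifacts actually end up on the classpath, a standard Maven check (not part of the original answer) is:

mvn dependency:tree -Dincludes=org.apache.hadoop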

My classpath was set as follows (I am running on Windows):

E:\HadoopTests\target>echo %CLASSPATH%
.;e:\hdp\hadoop-2.7.1.2.3.0.0-2557\etc\hadoop\;e:\hdp\hadoop-2.7.1.2.3.0.0-2557\share\hadoop\common\*;e:\hdp\hadoop-2.7.1.2.3.0.0-2557\share\hadoop\common\lib\*;e:\hdp\hadoop-2.7.1.2.3.0.0-2557\share\hadoop\hdfs\*;e:\hdp\hadoop-2.7.1.2.3.0.0-2557\share\hadoop\hdfs\lib\*;e:\hdp\hadoop-2.7.1.2.3.0.0-2557\share\hadoop\mapreduce\*;e:\hdp\hadoop-2.7.1.2.3.0.0-2557\share\hadoop\mapreduce\lib\*;e:\hdp\hadoop-2.7.1.2.3.0.0-2557\share\hadoop\tools\*;e:\hdp\hadoop-2.7.1.2.3.0.0-2557\share\hadoop\tools\lib\*;e:\hdp\hadoop-2.7.1.2.3.0.0-2557\share\hadoop\yarn\*;e:\hdp\hadoop-2.7.1.2.3.0.0-2557\share\hadoop\yarn\lib\*
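With that classpath in place, the test can be run directly from the target directory (a hypothetical invocation matching the setup above):

E:\HadoopTests\target>java com.myorg.hadooptests.GetConfTest

which in my case printed hdfs.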


On which line do you get this exception? — On the last line, FileSystem fileSystem = FileSystem.get(conf);. Please help me, this problem has come up many times.
I will check it and post an answer. I worked on a similar problem a few months ago and got it resolved. Can you post your Maven dependencies?
Actually, I did start this project with Maven, but it ran into a problem, so I switched to a plain Java EE perspective.