Java initMiniDFSCluster throws NoClassDefFoundError (hadoop client testing)


I am writing a piece of software that should store files in Hadoop HDFS, and naturally I want to write test cases for this particular feature. Unfortunately, when I try to build() a MiniDFSCluster, I get the following:

16/10/07 16:16:33 INFO hdfs.MiniDFSCluster: starting cluster: numNameNodes=1, numDataNodes=2
16/10/07 16:16:33 INFO hdfs.MiniDFSCluster: Shutting down the Mini HDFS Cluster

java.lang.NoClassDefFoundError: org/apache/hadoop/net/StaticMapping

  at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:792)
  at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:475)
  at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:434)
  at de.tuberlin.cit.storageassistant.ArchiveManagerTest.setUp(ArchiveManagerTest.java:33)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:497)
  at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
  at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
  at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
  at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
  at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
  at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
  at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
  at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
  at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
  at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
  at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
  at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
  at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
  at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
  at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
  at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:69)
  at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:234)
  at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:74)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:497)
  at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.net.StaticMapping
  at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
  at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
  ... 31 more


Process finished with exit code 255

Any help or suggestions?

I faced the same problem; it can be solved by adding a dependency on hadoop-common with the tests classifier:

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>${hadoop.version}</version>
    <classifier>tests</classifier>
</dependency>
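Note that StaticMapping ships in the hadoop-common test jar, while MiniDFSCluster itself lives in the hadoop-hdfs test jar, so depending on your Hadoop version and setup you may also need the latter. A sketch of that additional dependency, reusing the same ${hadoop.version} property (the test scope here is an assumption about your build):

```xml
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>${hadoop.version}</version>
    <classifier>tests</classifier>
    <scope>test</scope>
</dependency>
```

Alternatively, the hadoop-minicluster artifact bundles the pieces needed for MiniDFSCluster-based tests.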

The corresponding class is missing from the hadoop-common artifact used for production code.
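One quick way to confirm whether a class is visible on the test classpath is to probe for it with Class.forName. A minimal, self-contained sketch (the package name org.example.DoesNotExist is made up for illustration):

```java
public class MissingClassDemo {
    // Returns true if the named class can be loaded from the current classpath.
    static boolean probe(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            // This is exactly what surfaces as the "Caused by" in the trace above
            return false;
        }
    }

    public static void main(String[] args) {
        // java.lang.String is always present; the probe below only succeeds
        // once the hadoop-common test jar is on the classpath
        System.out.println(probe("java.lang.String"));
        System.out.println(probe("org.apache.hadoop.net.StaticMapping"));
    }
}
```

Running such a probe from the failing test module shows immediately whether the dependency change took effect.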

import java.io.File;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileUtil;
import org.apache.hadoop.hdfs.MiniDFSCluster;

private MiniDFSCluster dfsCluster;
private String hdfsURI;

@org.junit.Before
public void setUp() throws Exception {
    Configuration conf = new Configuration();
    // Use a fresh base directory under target/ so repeated runs start clean
    File baseDir = new File("./target/hdfs/").getAbsoluteFile();
    FileUtil.fullyDelete(baseDir);
    conf.set(MiniDFSCluster.HDFS_MINIDFS_BASEDIR, baseDir.getAbsolutePath());
    dfsCluster = new MiniDFSCluster
            .Builder(conf)
            .checkExitOnShutdown(true)
            .numDataNodes(2)
            .format(true)
            .racks(null)
            .build();
    hdfsURI = "hdfs://localhost:" + dfsCluster.getNameNodePort() + "/";
}

@org.junit.After
public void tearDown() throws Exception {
    if (dfsCluster != null) {
        dfsCluster.shutdown();
    }
}