Java: starting a MiniDFSCluster in a test

Tags: java, testing, hadoop, junit, classloader

I am starting a MiniDFSCluster in my tests (my dependency version is 2.0.0-cdh4.5.0). I use a simple routine to start it:

import java.io.File;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileUtil;
import org.apache.hadoop.hdfs.MiniDFSCluster;

// Use a per-test directory under target/ and clear leftovers from earlier runs.
File baseDir = new File("./target/hdfs/" + RunWithHadoopCluster.class.getSimpleName()).getAbsoluteFile();
FileUtil.fullyDelete(baseDir);
Configuration conf = new Configuration();
conf.set(MiniDFSCluster.HDFS_MINIDFS_BASEDIR, baseDir.getAbsolutePath());
MiniDFSCluster.Builder builder = new MiniDFSCluster.Builder(conf);
MiniDFSCluster hdfsCluster = builder.build();
String hdfsURI = "hdfs://localhost:" + hdfsCluster.getNameNodePort() + "/";
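
For completeness, the matching teardown is sketched below; it assumes hdfsCluster is kept in a field of the test class and uses JUnit 4's @After hook:

import org.junit.After;

@After
public void tearDownCluster() {
    // Shut down the NameNode and DataNodes and release the ports they hold.
    if (hdfsCluster != null) {
        hdfsCluster.shutdown();
    }
}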
On startup I keep getting the following error:

12:02:15.994 [main] WARN  o.a.h.metrics2.impl.MetricsConfig - Cannot locate configuration: tried hadoop-metrics2-namenode.properties,hadoop-metrics2.properties
12:02:16.047 [main] INFO  o.a.h.m.impl.MetricsSystemImpl - Scheduled snapshot period at 10 second(s).
12:02:16.047 [main] INFO  o.a.h.m.impl.MetricsSystemImpl - NameNode metrics system started

java.lang.IncompatibleClassChangeError: Implementing class
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at org.apache.hadoop.metrics2.source.JvmMetrics.getEventCounters(JvmMetrics.java:162)
    at org.apache.hadoop.metrics2.source.JvmMetrics.getMetrics(JvmMetrics.java:96)
    at org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.getMetrics(MetricsSourceAdapter.java:194)
    at org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.updateJmxCache(MetricsSourceAdapter.java:171)
    at org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.getMBeanInfo(MetricsSourceAdapter.java:150)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getNewMBeanClassName(DefaultMBeanServerInterceptor.java:333)
    at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerMBean(DefaultMBeanServerInterceptor.java:319)
    at com.sun.jmx.mbeanserver.JmxMBeanServer.registerMBean(JmxMBeanServer.java:522)
    at org.apache.hadoop.metrics2.util.MBeans.register(MBeans.java:57)
    at org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.startMBeans(MetricsSourceAdapter.java:220)
    at org.apache.hadoop.metrics2.impl.MetricsSourceAdapter.start(MetricsSourceAdapter.java:95)
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.registerSource(MetricsSystemImpl.java:244)
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:222)
    at org.apache.hadoop.metrics2.source.JvmMetrics.create(JvmMetrics.java:80)
    at org.apache.hadoop.hdfs.server.namenode.metrics.NameNodeMetrics.create(NameNodeMetrics.java:94)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.initMetrics(NameNode.java:278)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:436)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:613)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:598)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1169)
    at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:879)
    at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:770)
    at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:628)
    at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:323)
    at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:113)
    at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:305)

What could be the cause?

Double-check your dependencies. This error indicates incompatible logging JAR versions on the classpath. I ran into a similar problem and had to exclude a log4j-over-slf4j dependency that was pulled in by another third-party library.
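
As an illustration, assuming a Maven build, such an exclusion looks like the following; the dependency shown is a hypothetical placeholder for whichever library pulls the bridge in:

<dependency>
    <groupId>com.example</groupId>            <!-- hypothetical offending library -->
    <artifactId>third-party-lib</artifactId>  <!-- replace with the real artifact -->
    <version>1.0</version>
    <exclusions>
        <!-- Keep its transitive log4j-over-slf4j off the classpath. -->
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>log4j-over-slf4j</artifactId>
        </exclusion>
    </exclusions>
</dependency>

Running mvn dependency:tree shows which dependency brings the bridge onto the classpath.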

Upgrading to slf4j 1.7.6 should solve the problem (though we used 1.7.7), because log4j-over-slf4j v1.7.5 is missing AppenderSkeleton.
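
Assuming Maven again, a sketch of forcing the newer bridge version through dependencyManagement, so every transitive use resolves to it:

<dependencyManagement>
    <dependencies>
        <!-- Force a bridge version that includes AppenderSkeleton (1.7.6+). -->
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>log4j-over-slf4j</artifactId>
            <version>1.7.7</version>
        </dependency>
    </dependencies>
</dependencyManagement>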

Most likely some class on the classpath uses Log4J and calls AppenderSkeleton somewhere, but the log4j-over-slf4j bridge that redirects Log4J through slf4j was missing that class, so it blew up with the stack trace shown in the post. The slf4j release notes confirm this was fixed in 1.7.6.


Logging in YARN is where we saw the problem: (link missing).

Exactly! When I removed the log4j dependency, no such error occurred. In fact, I am better off following your suggestion and removing the log4j-over-slf4j bridge pulled in by storm-core :).