NodeWritable java.lang.NoClassDefFoundError Hadoop Jena


My Jena Hadoop MapReduce example throws a java.lang.NoClassDefFoundError. This is for a professional project. I've read that it may be related to a missing dependency, but I can't figure out which one I'm missing! What is the problem?

Console log

java.lang.NoClassDefFoundError: org/apache/jena/hadoop/rdf/types/NodeWritable
        at org.apache.jena.hadoop.rdf.stats.RdfMapReduceExample.main(RdfMapReduceExample.java:29)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.ClassNotFoundException: org.apache.jena.hadoop.rdf.types.NodeWritable
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 7 more
Map code part 1

package org.apache.jena.hadoop.rdf.mapreduce.count;

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.jena.hadoop.rdf.types.AbstractNodeTupleWritable;
import org.apache.jena.hadoop.rdf.types.NodeWritable;

/**
 * Maps each RDF tuple to (node, 1) pairs so that node usages can be
 * counted by a summing reducer.
 */
public abstract class AbstractNodeTupleNodeCountMapper<TKey, TValue, T extends AbstractNodeTupleWritable<TValue>>
        extends Mapper<TKey, T, NodeWritable, LongWritable> {
    private LongWritable initialCount = new LongWritable(1);

    @Override
    protected void map(TKey key, T value, Context context) throws IOException, InterruptedException {
        NodeWritable[] ns = this.getNodes(value);
        for (NodeWritable n : ns) {
            context.write(n, this.initialCount);
        }
    }

    protected abstract NodeWritable[] getNodes(T tuple);
}
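For reference, a concrete subclass for triples could look like the sketch below. Jena Elephas ships ready-made mappers of this kind (e.g. TripleNodeCountMapper), so treat this as an illustration rather than the library's exact source; the class name here is hypothetical.

package org.apache.jena.hadoop.rdf.mapreduce.count;

import org.apache.hadoop.io.LongWritable;
import org.apache.jena.graph.Triple;
import org.apache.jena.hadoop.rdf.types.NodeWritable;
import org.apache.jena.hadoop.rdf.types.TripleWritable;

// Hypothetical concrete mapper: counts subject, predicate and object usages in triples
public class TripleNodeCountMapperSketch
        extends AbstractNodeTupleNodeCountMapper<LongWritable, Triple, TripleWritable> {

    @Override
    protected NodeWritable[] getNodes(TripleWritable tuple) {
        Triple t = tuple.get();
        // One NodeWritable per position of the triple
        return new NodeWritable[] {
                new NodeWritable(t.getSubject()),
                new NodeWritable(t.getPredicate()),
                new NodeWritable(t.getObject())
        };
    }
}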
pom.xml dependencies

<dependencies>
    <!-- https://mvnrepository.com/artifact/org.apache.jena/jena-elephas-common -->
    <dependency>
        <groupId>org.apache.jena</groupId>
        <artifactId>jena-elephas-common</artifactId>
        <version>3.1.1</version>
    </dependency>

    <dependency>
        <groupId>org.apache.jena</groupId>
        <artifactId>jena-elephas-io</artifactId>
        <version>3.1.1</version>
    </dependency>

    <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-mapreduce-client-common -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-mapreduce-client-common</artifactId>
        <version>2.7.1</version>
        <scope>provided</scope>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.7.1</version>
        <scope>provided</scope>
    </dependency>
</dependencies>


Your dependency declarations are correct, otherwise your code would not compile at all.

Your problem is that your JAR likely contains only your own code and none of the necessary dependencies. So when MapReduce tries to run your code, the dependencies simply aren't there.

Generally, when building for MapReduce it is best to create a fat JAR that contains your code plus all its dependencies. You can use the Maven Assembly plugin to do this (or Maven Shade if you prefer).

Add the following to your pom.xml:

      <plugin>
        <artifactId>maven-assembly-plugin</artifactId>
        <configuration>
          <descriptors>
            <descriptor>hadoop-job.xml</descriptor>
          </descriptors>
        </configuration>
        <executions>
          <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
              <goal>single</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
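Then create a file named hadoop-job.xml with the following content. Note that the descriptor path above is resolved relative to the project base directory, so the file should sit next to your pom.xml: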
<assembly>
  <id>hadoop-job</id>
  <formats>
    <format>jar</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <dependencySets>
    <dependencySet>
      <unpack>false</unpack>
      <scope>provided</scope>
      <outputDirectory>lib</outputDirectory>
      <excludes>
        <exclude>${groupId}:${artifactId}</exclude>
      </excludes>
    </dependencySet>
    <dependencySet>
      <unpack>true</unpack>
      <includes>
        <include>${groupId}:${artifactId}</include>
      </includes>
    </dependencySet>
  </dependencySets>
</assembly>

Essentially this asks Maven to build a fat JAR containing all the non-provided dependencies. It creates an additional artifact named your-artifact-VERSION-hadoop-job.jar, which is the one you should run instead of the normal JAR.
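For illustration, a typical build-and-run sequence could look like this (the artifact name and the input/output arguments are hypothetical; the main class is taken from the stack trace above):

# Build the project; the assembly plugin's "single" goal runs during the package phase
mvn clean package

# Run the assembled job JAR instead of the plain one
hadoop jar target/your-artifact-1.0-hadoop-job.jar org.apache.jena.hadoop.rdf.stats.RdfMapReduceExample <input> <output>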

Thanks, I have created the src/hadoop-job.xml file and added the new lines to my pom.xml, but when I run the JAR it keeps throwing the exception: java.lang.NoClassDefFoundError: org/apache/jena/hadoop/rdf/types/NodeWritable at org.apache.jena.hadoop.rdf.stats.RdfMapReduceExample.main(RdfMapReduceExample.java:29) ... Caused by: java.lang.ClassNotFoundException: org.apache.jena.hadoop.rdf.types.NodeWritable ... Also, a new warning is being printed: WARN fs.FileUtil: Failed to delete file or dir [C:\Users\...\DepJarElement.jar]: it still exists