Java: Error caused by adding Mahout dependencies to Gradle


I am trying to run a Hadoop job, using Gradle to build my project. The problem starts when I add the Mahout dependencies to my build file:

apply plugin: 'java'

repositories {
    mavenCentral()
}

dependencies {
    // Hadoop 3.1.2 client libraries
    compile group: 'org.apache.hadoop', name: 'hadoop-mapreduce-client-core', version: '3.1.2'
    compile group: 'org.apache.hadoop', name: 'hadoop-common', version: '3.1.2'

    // Mahout 0.13.0 modules
    compile group: 'org.apache.mahout', name: 'mahout-hdfs', version: '0.13.0'
    compile group: 'org.apache.mahout', name: 'mahout-mr', version: '0.13.0'
    compile group: 'org.apache.mahout', name: 'mahout-math', version: '0.13.0'
}

// Bundle every compile dependency, including transitive jars, into a fat jar.
jar {
    from configurations.compile.collect { it.isDirectory() ? it : zipTree(it) }
}

ext.hadoopVersion = "3.1.2"
With these dependencies in the build file, I get the following error:

Exception in thread "main" java.lang.IllegalAccessError: class org.apache.hadoop.hdfs.web.HftpFileSystem cannot access its superinterface org.apache.hadoop.hdfs.web.TokenAspect$TokenManagementDelegator
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
    at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
    at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
    at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:3217)
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3262)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3301)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:124)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3352)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3320)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:479)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:227)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:463)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:361)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.addInputPath(FileInputFormat.java:542)
    at ConvertText.ConvertTextJob.main(ConvertTextJob.java:25)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
Without the Mahout dependencies, though, the job runs fine.

Here is my code that uses the library:

package ConvertText;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat;
import org.apache.mahout.math.VectorWritable;

public class ConvertTextJob {
  public static void main(String[] args){
    try {
      //Setup for the first job
      Configuration conf = new Configuration();

      //Setup for jar of class
      Job job = Job.getInstance(conf, "Convert Text");
      job.setJarByClass(ConvertTextJob.class);

      // path to input/output in HDFS
      FileInputFormat.addInputPath(job, new Path(args[0]));
      FileOutputFormat.setOutputPath(job, new Path(args[1]));

      //Set Mapper class
      job.setMapperClass(ConvertTextMapper.class);

      // Outputs from the Mapper
      job.setOutputKeyClass(NullWritable.class);
      job.setOutputValueClass(VectorWritable.class);

      //Set format of the key/value format
      job.setOutputFormatClass(SequenceFileOutputFormat.class);

      job.setNumReduceTasks(0);

      // Block until the job is completed.
      System.exit(job.waitForCompletion(true) ? 0 : 1);

    } catch (IOException | InterruptedException | ClassNotFoundException e) {
      System.err.println(e.getMessage());
    }
  }

}

Does anyone know what the problem is, and how I can fix it so that I can use these dependencies? I am working on a project that involves Mahout and need them.

In mahout-0.13.0 the hdfs module was split out of the Mahout core. You have to add the mahout-hdfs artifact to your 0.13.0 build:

// https://mvnrepository.com/artifact/org.apache.mahout/mahout-hdfs
compile group: 'org.apache.mahout', name: 'mahout-hdfs', version: '0.13.0'

You need to find and remove/exclude hadoop-hdfs-2.x.x.jar from the classpath. It is the jar Mahout brings in, and it conflicts with the newer version of HDFS.
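
A minimal sketch of that exclusion in the Gradle build from the question (the coordinates are the standard Maven ones; this snippet is an illustration, not part of the original answer):

// Keep the transitive hadoop-hdfs 2.x jar out of every configuration.
// When the job is launched with "hadoop jar", the cluster already supplies
// its own HDFS classes, so nothing needs to replace it in the fat jar.
configurations.all {
    exclude group: 'org.apache.hadoop', module: 'hadoop-hdfs'
}

After adding this, running gradle dependencies --configuration compile should no longer show an org.apache.hadoop:hadoop-hdfs:2.x entry.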


Could you please post a code sample so we can see how you are using the library and where the error comes from? At the very least, VectorWritable depends on hdfs. Yes, just add the dependencies in your Gradle build; the spark and mahout-mr modules will also depend on it, I believe.

Thanks for the quick reply. I added all the dependencies you suggested (I have added the full build file to the question), but I still get the same error. Any idea what else might be missing?
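
Since the thread ends with the error unresolved, here is one way to check which jars the fat-jar step actually bundles, written as a sketch against the build file in the question (the task name is made up for illustration):

// Hypothetical diagnostic task: print every jar that the "jar { from ... }"
// block will unpack into the job jar. A stale hadoop-hdfs-2.x.x.jar in this
// list would explain the IllegalAccessError.
task printBundledJars {
    doLast {
        configurations.compile.each { println it.name }
    }
}

Running gradle printBundledJars prints the resolved jar names on the compile configuration.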