Error java.lang.NoClassDefFoundError in Spark application


I am trying to submit a JAR file to Spark from /usr/local/spark using ./bin/spark-submit --class DataframeExample --master local[2] ~/new/hbfinance-module-1.0-SNAPSHOT.jar. I am using Apache Spark 1.5.0, and the console shows the application loading, but then the following problem appears:

/usr/local/spark# ./bin/spark-submit --class "DataframeExample --master local[2] ~/new/hbfinance-module-1.0-SNAPSHOT.jar /
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/10/12 12:42:08 INFO SparkContext: Running Spark version 1.5.0
16/10/12 12:42:09 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/10/12 12:42:09 INFO SecurityManager: Changing view acls to: root
16/10/12 12:42:09 INFO SecurityManager: Changing modify acls to: root
16/10/12 12:42:09 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
16/10/12 12:42:10 INFO Slf4jLogger: Slf4jLogger started
16/10/12 12:42:10 INFO Remoting: Starting remoting
16/10/12 12:42:10 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@178.62.18.22:44153]
16/10/12 12:42:10 INFO Utils: Successfully started service 'sparkDriver' on port 44153.
16/10/12 12:42:10 INFO SparkEnv: Registering MapOutputTracker
16/10/12 12:42:10 INFO SparkEnv: Registering BlockManagerMaster
16/10/12 12:42:10 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-2416dd91-f441-4618-8141-e97257cdd17b
16/10/12 12:42:10 INFO MemoryStore: MemoryStore started with capacity 530.0 MB
16/10/12 12:42:10 INFO HttpFileServer: HTTP File server directory is /tmp/spark-f80de7bc-6613-4125-b047-eae7574df54e/httpd-a9849575-8777-4e8d-a6df-fdb2a08a0b87
16/10/12 12:42:10 INFO HttpServer: Starting HTTP Server
16/10/12 12:42:11 INFO Utils: Successfully started service 'HTTP file server' on port 44964.
16/10/12 12:42:11 INFO SparkEnv: Registering OutputCommitCoordinator
16/10/12 12:42:11 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/10/12 12:42:11 INFO SparkUI: Started SparkUI at http://178.62.18.22:4040
16/10/12 12:42:11 INFO SparkContext: Added JAR file:/root/new/hbfinance-module-1.0-SNAPSHOT.jar at http://178.62.18.22:44964/jars/hbfinance-module-1.0-SNAPSHOT.jar with timestamp 1476290531299
16/10/12 12:42:11 WARN MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
16/10/12 12:42:11 INFO Executor: Starting executor ID driver on host localhost
16/10/12 12:42:11 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 46409.
16/10/12 12:42:11 INFO NettyBlockTransferService: Server created on 46409
16/10/12 12:42:11 INFO BlockManagerMaster: Trying to register BlockManager
16/10/12 12:42:11 INFO BlockManagerMasterEndpoint: Registering block manager localhost:46409 with 530.0 MB RAM, BlockManagerId(driver, localhost, 46409)
16/10/12 12:42:11 INFO BlockManagerMaster: Registered BlockManager
Exception in thread "main" java.lang.NoClassDefFoundError: com/mongodb/hadoop/MongoInputFormat
    at com.hbfinance.DataframeExample.run(DataframeExample.java:47)
    at com.hbfinance.DataframeExample.main(DataframeExample.java:88)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.mongodb.hadoop.MongoInputFormat
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 11 more
16/10/12 12:42:11 INFO SparkContext: Invoking stop() from shutdown hook
16/10/12 12:42:11 INFO SparkUI: Stopped Spark web UI at http://178.62.18.22:4040
16/10/12 12:42:11 INFO DAGScheduler: Stopping DAGScheduler
16/10/12 12:42:12 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/10/12 12:42:12 INFO MemoryStore: MemoryStore cleared
16/10/12 12:42:12 INFO BlockManager: BlockManager stopped
16/10/12 12:42:12 INFO BlockManagerMaster: BlockManagerMaster stopped
16/10/12 12:42:12 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/10/12 12:42:12 INFO SparkContext: Successfully stopped SparkContext
16/10/12 12:42:12 INFO ShutdownHookManager: Shutdown hook called
16/10/12 12:42:12 INFO ShutdownHookManager: Deleting directory /tmp/spark-f80de7bc-6613-4125-b047-eae7574df54e
I am not sure where I went wrong. Here is my POM file:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.hbfinance</groupId>
<artifactId>hbfinance-module</artifactId>
<packaging>jar</packaging>
<version>1.0-SNAPSHOT</version>
<name>hbfinance-module</name>
<url>http://maven.apache.org</url>
<dependencies>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>3.8.1</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>1.5.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>1.5.1</version>
    </dependency>
    <dependency>
        <groupId>log4j</groupId>
        <artifactId>log4j</artifactId>
        <version>1.2.14</version>
    </dependency>
    <dependency>
        <groupId>org.mongodb.mongo-hadoop</groupId>
        <artifactId>mongo-hadoop-core</artifactId>
        <version>1.4.1</version>
    </dependency>
</dependencies>
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.1</version>
            <configuration>
                <source>1.6</source>
                <target>1.6</target>
            </configuration>
        </plugin>
    </plugins>
</build>
</project>
I should probably also add the code:

package com.hbfinance;

import org.apache.hadoop.conf.Configuration;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;
import org.bson.BSONObject;
import scala.Tuple2;

import com.mongodb.hadoop.MongoInputFormat;

public class DataframeExample {

   public void run() {
    JavaSparkContext sc = new JavaSparkContext(new SparkConf());
    // Set configuration options for the MongoDB Hadoop Connector.
    Configuration mongodbConfig = new Configuration();
    // MongoInputFormat allows us to read from a live MongoDB instance.
    // We could also use BSONFileInputFormat to read BSON snapshots.
    mongodbConfig.set("mongo.job.input.format", "com.mongodb.hadoop.MongoInputFormat");

    // MongoDB connection string naming a collection to use.
    // If using BSON, use "mapred.input.dir" to configure the directory
    // where BSON files are located instead.
    mongodbConfig.set("mongo.input.uri",
      "mongodb://hadoopUser:Pocup1ne9@localhost:27017/hbdata.ppt_logs");
    // mongodbConfig.set("mongo.input.uri",
    //   "mongodb://hadoopUser:Pocup1ne9@localhost:27017/hbdata.ppa_logs");
    // mongodbConfig.set("mongo.input.uri",
    //   "mongodb://hadoopUser:Pocup1ne9@localhost:27017/hbdata.dd_logs");
    // mongodbConfig.set("mongo.input.uri",
    //   "mongodb://hadoopUser:Pocup1ne9@localhost:27017/hbdata.fav_logs");
    // mongodbConfig.set("mongo.input.uri",
    //   "mongodb://hadoopUser:Pocup1ne9@localhost:27017/hbdata.pps_logs");

    // Create an RDD backed by the MongoDB collection.
    JavaPairRDD<Object, BSONObject> documents = sc.newAPIHadoopRDD(
      mongodbConfig,            // Configuration
      MongoInputFormat.class,   // InputFormat: read from a live cluster.
      Object.class,             // Key class
      BSONObject.class          // Value class
    );

    JavaRDD<AppLog> logs = documents.map(

      new Function<Tuple2<Object, BSONObject>, AppLog>() {

          public AppLog call(final Tuple2<Object, BSONObject> tuple) {
              AppLog log = new AppLog();
              BSONObject header =
                (BSONObject) tuple._2().get("headers");

              log.setTarget((String) header.get("target"));
              log.setAction((String) header.get("action"));

              return log;
          }
      }
    );

    SQLContext sqlContext = new org.apache.spark.sql.SQLContext(sc);

    DataFrame logsSchema = sqlContext.createDataFrame(logs, AppLog.class);
    logsSchema.registerTempTable("logs");

    DataFrame groupedMessages = sqlContext.sql(
      "select target, action, Count(*) from logs group by target, action");
      // "SELECT to, body FROM messages WHERE to = \"eric.bass@enron.com\"");



    groupedMessages.show();

    logsSchema.printSchema();
}

   public static void main(final String[] args) {
       new DataframeExample().run();
   }
}
Try opening hbfinance-module-1.0-SNAPSHOT.jar in an archive explorer and see which class files it contains. By default, Maven is designed to include only your own class files in the built jar, not the dependencies. When you run on your local machine the dependencies are already on the classpath, so you see no error, but when you send the jar to Spark the classes are missing and the exception is thrown.
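The same check can be done from the command line; a quick sketch using the jar path from the question:

    jar tf ~/new/hbfinance-module-1.0-SNAPSHOT.jar | grep -i mongo

If nothing is printed, the com/mongodb/hadoop classes were never packaged into the jar, which matches the NoClassDefFoundError above.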

You can fix this in two ways:

Ship the jar files of all the dependencies to Spark along with your application jar (see the spark-submit sketch below).

Build an uber jar of the hbfinance module with the Maven Shade plugin (a plugin sketch appears further down).
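For the first option, spark-submit's --jars flag takes a comma-separated list of extra jars to ship with the application. A minimal sketch, assuming the dependency jars have been copied to /root/new; the exact file names, in particular the mongo-java-driver jar that mongo-hadoop-core needs, are assumptions:

    # --jars ships the listed jars to the driver and executors alongside the app jar
    /usr/local/spark/bin/spark-submit \
      --class com.hbfinance.DataframeExample \
      --master local[2] \
      --jars /root/new/mongo-hadoop-core-1.4.1.jar,/root/new/mongo-java-driver-3.0.4.jar \
      /root/new/hbfinance-module-1.0-SNAPSHOT.jar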


A simple way to avoid this error is to export the jar from Eclipse, but in the library handling section make sure to select "Extract required libraries into generated JAR".


This gives you a bloated jar, but it simplifies referencing the jars on Spark.
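If you build with Maven rather than Eclipse, the Shade plugin from the second option above produces the same kind of fat jar. A minimal sketch to add to the plugins section of the POM shown earlier; the plugin version is an assumption, other 2.x versions should behave the same:

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>2.4.3</version>
        <executions>
            <execution>
                <!-- bind the shade goal to the package phase so that
                     "mvn package" also copies dependency classes into the jar -->
                <phase>package</phase>
                <goals>
                    <goal>shade</goal>
                </goals>
            </execution>
        </executions>
    </plugin>

With this in place, mvn package produces a jar that also contains mongo-hadoop-core and its transitive dependencies. Marking the spark-core and spark-sql dependencies as provided keeps that jar from ballooning, since Spark already supplies those classes at runtime.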

Your application cannot load MongoInputFormat.class, which is why it throws the NoClassDefFoundError... have a look at that.

Oh, does that mean it is missing from the jar? Or that the jar is not being loaded correctly?

Yes, it means the class was available at compile time but not at runtime. Check the jars you added to the project.

Wow, you guys are great. Can you help me figure out how to diagnose this? I opened the jar I built and cannot see the dependencies inside; does that mean they were not included? I am a bit of a newbie with the Maven compiler.

I tried the uber jar and it works, but it is a bit bulky. Now that I have read the steps you gave, could you outline how to do the first option? I am a bit lost!

If you open the application's pom.xml in NetBeans, for example, there is an option to show the project's dependency graph; Eclipse and IntelliJ should have similar tools. Check which dependencies appear in the graph, then go to the ~/.m2/ folder, which is where Maven stores all the .jar files. Copy the ones your application uses into the same directory as the application jar. Then run spark-submit as before, except add the --jars parameter listing all the jars you just copied.
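As a side note, instead of hunting through ~/.m2 by hand, the Maven dependency plugin can collect the jars for you; a sketch, using the plugin's default output directory:

    # copies every declared dependency jar into target/dependency
    mvn dependency:copy-dependencies
    ls target/dependency

Everything listed there can then be passed to spark-submit with --jars, as in the sketch earlier in the answers.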