java.lang.ClassNotFoundException:org.spark_project.guava.collect.MapMaker


I am trying to integrate Apache Spark with a Spring Boot Cassandra project. But when running the project, I get the following error:

    Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Handler dispatch failed; nested exception is java.lang.NoClassDefFoundError: org/spark_project/guava/collect/MapMaker] with root cause
    java.lang.ClassNotFoundException: org.spark_project.guava.collect.MapMaker
I checked my Maven dependencies, and the MapMaker class is present in spark-network-common_2.11.jar under "org/spark_project/guava/collect/". These are the dependencies in the pom file I am using:

<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter</artifactId>
    </dependency>

    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>

    <dependency>
        <groupId>org.springframework.data</groupId>
        <artifactId>spring-data-cassandra</artifactId>
    </dependency>

    <!-- https://mvnrepository.com/artifact/com.datastax.cassandra/cassandra-driver-core -->
    <dependency>
        <groupId>com.datastax.cassandra</groupId>
        <artifactId>cassandra-driver-core</artifactId>
        <version>3.5.0</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-network-common -->
    <!-- <dependency>
       <groupId>org.apache.spark</groupId>
       <artifactId>spark-network-common_2.10</artifactId>
       <version>1.3.0</version>
    </dependency> -->
    <!-- <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-network-common_2.11</artifactId>
        <version>2.2.1</version>
    </dependency> -->


    <!-- https://mvnrepository.com/artifact/com.datastax.cassandra/cassandra-driver-mapping -->
    <!-- <dependency> <groupId>com.datastax.cassandra</groupId> <artifactId>cassandra-driver-mapping</artifactId> 
        <version>3.5.0</version> </dependency> -->
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>


    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.2.1</version>
    </dependency>




    <!-- https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector -->
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.11</artifactId>
        <version>2.0.8</version>
    </dependency>


    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.2.1</version>
    </dependency>

</dependencies>

<build>
    <plugins>
        <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
        </plugin>
    </plugins>
</build>

spark-network-common_2.11.jar comes in as a transitive dependency of spark-core, and although I tried adding it separately, even that did not work. What could be the problem that prevents Spring Boot from finding the MapMaker class at runtime? Any help is greatly appreciated.

The package path org.spark_project.guava.collect.MapMaker means that, to avoid dependency hell, the Guava packages have been relocated into the spark_project namespace.

Relocations like this are controlled by the build process, and they are an easily overlooked source of incompatibility between libraries.
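To make the mechanism concrete, this is roughly how such a relocation is declared with the maven-shade-plugin. It is only an illustrative sketch, not Spark's actual build configuration:

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>3.2.4</version>
        <executions>
            <execution>
                <phase>package</phase>
                <goals>
                    <goal>shade</goal>
                </goals>
                <configuration>
                    <relocations>
                        <!-- Rewrites com.google.common.* references to
                             org.spark_project.guava.* inside the shaded jar -->
                        <relocation>
                            <pattern>com.google.common</pattern>
                            <shadedPattern>org.spark_project.guava</shadedPattern>
                        </relocation>
                    </relocations>
                </configuration>
            </execution>
        </executions>
    </plugin>

A jar built this way contains classes such as org.spark_project.guava.collect.MapMaker, so a library compiled against the relocated path will only find MapMaker if a jar providing that relocated class is on the runtime classpath.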

My gut feeling is that you are mixing library versions that do not match each other: they may look technically compatible, but they are not, because Guava ends up in a different location.

You have

    <artifactId>spark-network-common_2.10</artifactId>

commented out, while your other dependencies list 2.11.

Those suffixes indicate the version of the Scala language and runtime being used, and they should match across all Scala-based dependencies in a project.
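A minimal sketch of keeping those suffixes consistent (assuming you stay on Spark 2.2.1 and Scala 2.11) is to put the Scala binary version into a property and reference it from every Scala-suffixed artifactId:

    <properties>
        <!-- One place to change the Scala binary version for all Spark artifacts -->
        <scala.binary.version>2.11</scala.binary.version>
        <spark.version>2.2.1</spark.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_${scala.binary.version}</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_${scala.binary.version}</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector_${scala.binary.version}</artifactId>
            <version>2.0.8</version>
        </dependency>
    </dependencies>

That way a _2.10 artifact cannot sneak in next to _2.11 ones.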

Which class/library is trying to load the relocated Guava? That will give you a big hint as to which library may be outdated or mismatched.
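To find out, mvn dependency:tree -Dincludes=org.apache.spark lists every Spark artifact on the classpath together with the dependency path that pulls it in. If spark-network-common (or another module) arrives at a different version than spark-core, one possible fix, assuming 2.2.1 with the _2.11 suffix is the intended combination, is to pin its version via dependencyManagement instead of re-declaring it as a direct dependency:

    <dependencyManagement>
        <dependencies>
            <!-- Forces the transitive spark-network-common_2.11 to the same
                 version as the declared spark-core_2.11 -->
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-network-common_2.11</artifactId>
                <version>2.2.1</version>
            </dependency>
        </dependencies>
    </dependencyManagement>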