java.lang.ClassNotFoundException: com.datastax.spark.connector.japi.rdd.CassandraTableScanJavaRDD

Tags: java, maven, apache-kafka, spark-streaming, spark-cassandra-connector

I am using Kafka Spark Streaming with Cassandra. When I run the Java class from Eclipse it works fine, but when I build it with Maven and run it on the Spark shell, it throws the exception below:

java.lang.NoClassDefFoundError: com/datastax/spark/connector/japi/rdd/CassandraTableScanJavaRDD
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at org.apache.spark.util.Utils$.classForName(Utils.scala:230)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:712)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.datastax.spark.connector.japi.rdd.CassandraTableScanJavaRDD
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
I have tried using the Maven JAR plugin, but it did not work. My pom.xml configuration is:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.edureka.capstone</groupId>
    <artifactId>spark-cassandra-demo</artifactId>
    <version>0.0.1-SNAPSHOT</version>

    <dependencies>

        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11 -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.2.0</version>
            <scope>provided</scope>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming_2.10 -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.10</artifactId>
            <version>2.2.0</version>
            <scope>provided</scope>
        </dependency>

        <!-- https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector_2.11 -->
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector_2.11</artifactId>
            <version>2.0.5</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector-java_2.10 -->
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector-java_2.10</artifactId>
            <version>1.6.0-M1</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/com.datastax.cassandra/cassandra-driver-core -->
        <dependency>
            <groupId>com.datastax.cassandra</groupId>
            <artifactId>cassandra-driver-core</artifactId>
            <version>3.3.0</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-catalyst_2.10 -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-catalyst_2.10</artifactId>
            <version>2.2.0</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.10 -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>2.2.0</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-kafka_2.11</artifactId>
            <version>1.6.2</version>
        </dependency>

        <dependency>
            <groupId>com.twitter</groupId>
            <artifactId>bijection-avro_2.10</artifactId>
            <version>0.9.2</version>
        </dependency>

    </dependencies>

    <build>
        <resources>
            <resource>
                <directory>${basedir}/src/main/resources</directory>
            </resource>
        </resources>
        <pluginManagement>
            <plugins>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-compiler-plugin</artifactId>
                    <version>3.6.2</version>
                    <configuration>
                        <source>1.8</source>
                        <target>1.8</target>
                    </configuration>
                </plugin>
            </plugins>
        </pluginManagement>
    </build>
</project>

You need to build a "fat jar" that contains all of the dependencies (for example, by using the Maven Assembly plugin), or pass all of the other dependencies as arguments to spark-shell.
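
Since the question's pom.xml already has a <build> section, here is a minimal sketch of such an Assembly plugin entry (not part of the original answer; the plugin version is an assumption). It makes mvn package also produce an extra *-jar-with-dependencies.jar containing the application plus all compile/runtime dependencies:

        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-assembly-plugin</artifactId>
            <!-- version is an assumption; use whichever release fits your build -->
            <version>3.1.0</version>
            <configuration>
                <descriptorRefs>
                    <!-- built-in descriptor that bundles all runtime dependencies -->
                    <descriptorRef>jar-with-dependencies</descriptorRef>
                </descriptorRefs>
            </configuration>
            <executions>
                <execution>
                    <id>make-assembly</id>
                    <!-- bind to "package" so "mvn package" also builds the fat jar -->
                    <phase>package</phase>
                    <goals>
                        <goal>single</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>

One thing to watch: the question's pom declares plugins only under <pluginManagement>, and plugins listed there alone are not executed. The fat-jar plugin has to appear under a plain <plugins> element for the assembly to actually be built.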

Update: forgot to add that the core Spark dependencies should be marked as provided, to avoid conflicts.
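
For the command-line route, spark-shell and spark-submit both accept a --packages option that resolves artifacts from their Maven coordinates at launch time. A minimal sketch, reusing the coordinates from the question's pom.xml (the main class name and jar path below are placeholders, not taken from the question):

    # --class and the jar name are placeholders for your actual job
    spark-submit \
      --class com.edureka.capstone.YourMainClass \
      --packages com.datastax.spark:spark-cassandra-connector_2.11:2.0.5,org.apache.spark:spark-streaming-kafka_2.11:1.6.2 \
      target/spark-cassandra-demo-0.0.1-SNAPSHOT.jar

With --packages, the connector classes are downloaded and placed on both the driver and executor classpaths, so they do not need to be bundled into the application jar at all.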