
Maven spark 2.0.2 ClassNotFoundException:org.apache.kafka.clients.consumer.consumer

Below is my pom.xml. I built the jar with maven-shade, and I am quite sure org.apache.kafka.clients.consumer.consumer is included in my uber jar. I also put kafka-clients-0.10.1.0.jar on spark.driver.extraLibraryPath, and I tried the --jars option of the spark-submit command. But I still get the ClassNotFoundException.

   <dependencies>
            <dependency>
                <groupId>org.scala-lang</groupId>
                <artifactId>scala-reflect</artifactId>
                <version>2.11.8</version>
            </dependency>
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-core_2.11</artifactId>
                <version>2.0.2</version>
            </dependency>
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-streaming_2.11</artifactId>
                <version>2.0.2</version>
            </dependency>
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
                <version>2.0.2</version>
            </dependency>
            <dependency>
                <groupId>org.apache.kafka</groupId>
                <artifactId>kafka_2.11</artifactId>
                <version>0.10.1.0</version>
            </dependency>
            <dependency>
                <groupId>junit</groupId>
                <artifactId>junit</artifactId>
                <version>3.8.1</version>
                <scope>test</scope>
            </dependency>
        </dependencies>
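
The pom excerpt above omits the shade plugin itself. For context, a minimal maven-shade-plugin configuration that produces an uber jar during `package` might look like the sketch below (the plugin version is an assumption, not taken from the original pom):

```xml
<!-- Minimal shade setup; version 2.4.3 is assumed, any recent release works. -->
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>2.4.3</version>
            <executions>
                <execution>
                    <!-- Bind the shade goal to the package phase so
                         `mvn package` emits the uber jar. -->
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
```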

I found a workaround: add the jars to SPARK_HOME/jars. I launch with the spark-submit command and tried --jars and --driver-library-path; I am sure those options take effect, but the class was still not found. I arrived at the workaround from the driver logs.
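
As a sketch of why the tried options behave differently (the application class and paths here are hypothetical): spark.driver.extraLibraryPath sets the native library search path for the driver JVM, so it does not put jars on the classpath; --jars is the option that actually ships jar files to the driver and executors:

```shell
# Hypothetical class name and paths.
# spark.driver.extraLibraryPath affects java.library.path (native libs),
# not the classpath, which is why it did not resolve the missing class.
# --jars distributes the jar to the driver and executor classpaths:
spark-submit \
  --class com.example.KafkaStreamingApp \
  --jars /path/to/kafka-clients-0.10.1.0.jar \
  my-app-uber.jar
```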

Basically, you need:

<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.10.1.0</version>
</dependency>