Apache Spark: trying to run an example app with spark-submit, but getting ClassNotFoundException


I am trying to run an example from a book.

I am using a Maven project; this is my pom.xml:

<project>
    <groupId>com.streaming.example</groupId>
    <artifactId>streaming-example</artifactId>
    <modelVersion>4.0.0</modelVersion>
    <name>example</name>
    <packaging>jar</packaging>
    <version>0.0.1</version>
    <dependencies>
        <dependency> <!-- Spark dependency -->
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.1.0</version>
            <scope>provided</scope>
        </dependency>
        <dependency> <!-- Spark dependency -->
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.10</artifactId>
            <version>1.2.0</version>
        </dependency>
    </dependencies>
    <properties>
        <java.version>1.7</java.version>
    </properties>
    <build>
        <pluginManagement>
            <plugins>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-compiler-plugin</artifactId>
                    <configuration>
                        <source>1.6</source>
                        <target>1.6</target>
                    </configuration>
                </plugin>
                <plugin>
                    <groupId>net.alchim31.maven</groupId>
                    <artifactId>scala-maven-plugin</artifactId>
                    <version>3.1.6</version>
                    <executions>
                        <execution>
                            <goals>
                                <goal>compile</goal>
                                <goal>testCompile</goal>
                            </goals>
                        </execution>
                    </executions>
                    <configuration>
                        <args>
                            <!-- work-around for https://issues.scala-lang.org/browse/SI-8358 -->
                            <arg>-nobootcp</arg>
                        </args>
                    </configuration>
                </plugin>
            </plugins>
        </pluginManagement>
    </build>
</project>
I get this error:

> spark-submit --class src/scala/com/example/SimpleExample.scala \
> target/streaming-example-0.0.1.jar local[4]

java.lang.ClassNotFoundException: src/scala/com/example/SimpleExample
    at java.lang.Class.forName0(Native Method)

EDIT: jar contents:

$ jar tf target/streaming-example-0.0.1.jar 
META-INF/
META-INF/MANIFEST.MF
META-INF/maven/
META-INF/maven/com.streaming.example/
META-INF/maven/com.streaming.example/streaming-example/
META-INF/maven/com.streaming.example/streaming-example/pom.xml

The --class argument expects the fully qualified class name (the package structure), not a file path. Try this:

spark-submit --class com.example.SimpleExample target/streaming-example-0.0.1.jar
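For reference, a minimal sketch of the class side of that command. Only the name SimpleExample appears in the question; the package line and the stub body are assumptions to illustrate how spark-submit resolves --class against the package + object name, not the file path:

```scala
// Hypothetical minimal source, assumed to live at
// src/main/scala/com/example/SimpleExample.scala in the Maven layout.
package com.example

object SimpleExample {
  // Stub main: a real app would build a SparkConf/StreamingContext here.
  // The point is only that --class com.example.SimpleExample must match
  // this package declaration plus the object name.
  def main(args: Array[String]): Unit =
    println(getClass.getName.stripSuffix("$"))
}
```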

Your SimpleExample.class is not in the jar.

Check your Maven build plugins.
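One thing worth checking in the pom above: both plugins sit inside `<pluginManagement>`, which only supplies default configuration; plugins declared there do not actually run unless they are also listed under `<build><plugins>`. With that layout the scala-maven-plugin never compiles the Scala sources, so the jar ends up containing only META-INF entries, matching the `jar tf` output above. A possible fix, keeping the same plugin version as in the question, is to declare it directly:

```xml
<build>
    <plugins>
        <plugin>
            <groupId>net.alchim31.maven</groupId>
            <artifactId>scala-maven-plugin</artifactId>
            <version>3.1.6</version>
            <executions>
                <execution>
                    <goals>
                        <goal>compile</goal>
                        <goal>testCompile</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
```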

You could consider using the assembly plugin and building with:

mvn assembly:assembly 

When creating an uber jar, it will include all of your dependencies.
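A common configuration for this (a sketch, not part of the question's pom) is the maven-assembly-plugin with its built-in jar-with-dependencies descriptor:

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-assembly-plugin</artifactId>
    <configuration>
        <descriptorRefs>
            <!-- Built-in descriptor: bundles all runtime dependencies -->
            <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
    </configuration>
</plugin>
```

Running `mvn assembly:assembly` then produces `target/streaming-example-0.0.1-jar-with-dependencies.jar`, which you can pass to spark-submit. Dependencies marked `provided` (like spark-core above) are still excluded, which is what you want for Spark itself.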

Comments:

- Did you check via `unzip -l <jar>` that the class is in the generated jar? Also, try sbt.
- @Reactormonk, I'm not sure I follow you. It was generated with `mvn clean package`.
- Look at the generated jar to see whether it actually contains the class.
- @Reactormonk, it seems to be there. `$ jar tf target/streaming-example-0.0.1.jar` lists `META-INF/`, `META-INF/MANIFEST.MF`, and `META-INF/maven/com.streaming.example/streaming-example/pom.xml`.
- @eliasah, you are right. I missed it. The problem was that the pom was missing the correct plugin.
- Still the same.