
Executing a Spring Boot application with Spark

I am new to Apache Spark. I am trying to run a simple Spring Boot application with Spark, but I get the following exception:

ERROR ApplicationMaster: User class threw exception: 
java.lang.NoClassDefFoundError: org/springframework/boot/SpringApplication
java.lang.NoClassDefFoundError: org/springframework/boot/SpringApplication
Caused by: java.lang.ClassNotFoundException: org.springframework.boot.SpringApplication
However, I can run this project perfectly from my Eclipse IDE; the System.out.println I kept in it executes fine. Below is my pom.xml:

<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>1.3.3.RELEASE</version>
    <relativePath/> <!-- lookup parent from repository -->
</parent>

<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <java.version>1.8</java.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter</artifactId>
    </dependency>

    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>


    <dependency> <!-- Spark -->
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.4.0</version>
    </dependency>
    <dependency> <!-- Spark SQL -->
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>1.4.0</version>
    </dependency>
    <dependency> <!-- Jackson Scala module -->
        <groupId>com.fasterxml.jackson.module</groupId>
        <artifactId>jackson-module-scala_2.10</artifactId>
        <version>2.6.5</version>
    </dependency>
</dependencies>

<build>
    <plugins>
        <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
        </plugin>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>2.3.2</version>
            <configuration>
                <source>1.8</source>
                <target>1.8</target>
            </configuration>
        </plugin>
    </plugins>
</build>

I added the required dependencies to the spark-submit command using --jars "jarpath,anotherjarpath". You need to supply all of the jars, comma-separated, after --jars.
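For illustration, a spark-submit invocation along these lines should work; the master, package name, jar names, and paths below are placeholders, not values from the original question:

# Pass the Spring jars the driver and executors need via --jars (comma-separated, no spaces).
# --class points at the Spring Boot main class; the last argument is the application jar.
spark-submit \
  --master yarn \
  --class com.example.SparkS3Application \
  --jars /path/to/spring-boot-1.3.3.RELEASE.jar,/path/to/spring-core-4.2.5.RELEASE.jar \
  /path/to/my-app-0.0.1-SNAPSHOT.jar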

The second thing is to try running this on Spark 2.0. I was using Spark 1.6 and hit this problem, but it runs perfectly fine on Spark 2.0.


Hope this helps.

Take a look here: , there seems to be an example setup for executing a Spring Boot application in Spark.
Hi @AlexSavitsky, I tried that code as well, but it doesn't work. Did you manage to solve the problem?
@SrinivasValekar Yes, I was able to solve it. Alternatively, you can add the jars to the jars folder inside Spark.
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class SparkS3Application {

    public static void main(String[] args) {
        // Bootstraps the Spring context; the println confirms the main method actually ran.
        SpringApplication.run(SparkS3Application.class, args);
        System.out.println(" *************************** called *******************");
    }
}
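For the alternative mentioned in the comments (placing the jars into Spark's jars folder), a minimal sketch, assuming a Spark 2.x installation at $SPARK_HOME and jar paths that are purely illustrative:

# Copy the Spring jars the application needs into Spark's jars directory,
# so they end up on the classpath of every driver and executor (example paths).
cp /path/to/spring-boot-1.3.3.RELEASE.jar /path/to/spring-core-4.2.5.RELEASE.jar $SPARK_HOME/jars/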