Scala: Running a Spark job with SparkLauncher


From my local Scala application I want to launch a Spark job on the cluster. The job class is my.spark.SparkRunner and it is packaged in a jar stored on HDFS. This is how I configure it in my local program:

import org.apache.spark.launcher.SparkLauncher

val spark = new SparkLauncher()
  //.setSparkHome("C:/spark-1.6.0-bin-hadoop2.4")
  .setVerbose(true)
  .setAppResource("hdfs://192.168.10.183:8020/spark/myjar.jar") // application jar stored in HDFS
  .setMainClass("my.spark.SparkRunner")
  .setMaster("spark://192.168.10.183:7077")                     // standalone master URL
  //.setMaster("192.168.10.183:7077")
  .launch()                                                      // returns the spark-submit child Process

spark.waitFor()

It doesn't throw any errors, but it returns immediately and the job never starts. What am I doing wrong? Thanks…
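For what it's worth, launch() only hands back the spark-submit child process, so anything that process prints to stdout/stderr is easy to miss unless its streams are drained. A minimal debugging sketch, reusing the same assumed jar path, main class and master URL as above, and only the plain java.lang.Process and scala.io APIs:

import scala.io.Source
import org.apache.spark.launcher.SparkLauncher

val process = new SparkLauncher()
  .setVerbose(true)
  .setAppResource("hdfs://192.168.10.183:8020/spark/myjar.jar")
  .setMainClass("my.spark.SparkRunner")
  .setMaster("spark://192.168.10.183:7077")
  .launch()

// Echo whatever spark-submit writes, so submission errors become visible locally
new Thread(new Runnable {
  override def run(): Unit =
    Source.fromInputStream(process.getErrorStream).getLines().foreach(line => println(s"stderr: $line"))
}).start()
new Thread(new Runnable {
  override def run(): Unit =
    Source.fromInputStream(process.getInputStream).getLines().foreach(line => println(s"stdout: $line"))
}).start()

process.waitFor()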

I just added a thread that checks the launcher's state, and that was it:

import org.apache.spark.launcher.{SparkAppHandle, SparkLauncher}

val spark: SparkAppHandle = new SparkLauncher()
  //.setSparkHome("C:/spark-1.6.0-bin-hadoop2.4")
  .setVerbose(true)
  .setAppResource("hdfs://192.168.10.183:8020/spark/myjar.jar")
  .setMainClass("my.spark.SparkRunner")
  .setMaster("spark://192.168.10.183:7077")
  //.setMaster("192.168.10.183:7077")
  .startApplication()                     // returns a SparkAppHandle instead of a bare Process

// Poll the handle until the application reaches a final state;
// isFinal also covers FAILED and KILLED, so the loop cannot hang on a failed job
while (!spark.getState.isFinal) {
  println(spark.getState)
  Thread.sleep(1000)
}
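
Polling works, but the launcher can also push state changes to you: startApplication() accepts SparkAppHandle.Listener instances. Below is a minimal sketch of that variant, under the same assumed jar/class/master values; the CountDownLatch is just a convenient way to block the caller until the job is done.

import java.util.concurrent.CountDownLatch
import org.apache.spark.launcher.{SparkAppHandle, SparkLauncher}

val done = new CountDownLatch(1)

val handle = new SparkLauncher()
  .setVerbose(true)
  .setAppResource("hdfs://192.168.10.183:8020/spark/myjar.jar")
  .setMainClass("my.spark.SparkRunner")
  .setMaster("spark://192.168.10.183:7077")
  .startApplication(new SparkAppHandle.Listener {
    // Fired on every state transition (CONNECTED, RUNNING, FINISHED, FAILED, ...)
    override def stateChanged(h: SparkAppHandle): Unit = {
      println(s"state: ${h.getState}")
      if (h.getState.isFinal) done.countDown()
    }
    // Fired when application info such as the app id becomes available
    override def infoChanged(h: SparkAppHandle): Unit =
      println(s"app id: ${h.getAppId}")
  })

done.await()   // block until the application reaches a final state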