Java: Error while submitting a job to the local Spark master (No active SparkContext.)


I have a Spark master and a Spark slave set up and running on my local machine. I want to submit my code to the running Spark master via command-line configuration, as described in the documentation.

After building the .jar, I submit it via:

bin/spark-submit --class logAnalysis.myApp --name "myApp" --master "spark://some.server:7077" /jars/myApp-0.3.jar
Edit: I had previously also tried setting the master without the quotes.
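That is, presumably the same command with the quotes around the master URL dropped:

bin/spark-submit --class logAnalysis.myApp --name "myApp" --master spark://some.server:7077 /jars/myApp-0.3.jar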

After that, I get the following error:

17/03/22 12:23:02 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/03/22 12:23:04 ERROR StandaloneSchedulerBackend: Application has been killed. Reason: Master removed our application: FAILED
17/03/22 12:23:04 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:

org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
logAnalysis.myApp.main(myApp.java:48)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:606)
org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

The currently active SparkContext was created at:

(No active SparkContext.)

        at org.apache.spark.SparkContext.assertNotStopped(SparkContext.scala:101)
        at org.apache.spark.SparkContext.getSchedulingMode(SparkContext.scala:1658)
        at org.apache.spark.SparkContext.postEnvironmentUpdate(SparkContext.scala:2162)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:542)
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
        at logAnalysis.myApp.main(myApp.java:48)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Exception in thread "main" java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
However, when I submit the job with a local master instead, e.g.

bin/spark-submit --class logAnalysis.myApp --name "myApp" --master local[8] /jars/myApp-0.3.jar

it works fine.

I am using Spark 2.0.2, and my Scala version is not the problem, as discussed in this post:


Everything is on default settings. Why is this happening?

I have now added another node to the cluster; it is running successfully in a 1x master / 2x worker setup.

Other than adding the Elasticsearch-Hadoop connector settings to the configuration, nothing in the code was changed:

JavaSparkContext sc = new JavaSparkContext(new SparkConf().set("es.nodes", "node1").set("es.port", "9200"));

I don't know what the problem was; it was probably caused by the configuration. But as stated before, the job runs successfully when the master is set to local[*].
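For context, here is a minimal sketch of how the context is presumably created in myApp (the package and class names are taken from the --class flag and the stack trace; the rest of the main method is an assumption):

package logAnalysis;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class myApp {
    public static void main(String[] args) {
        // The master URL and app name come from spark-submit (--master, --name);
        // only the Elasticsearch-Hadoop settings are set in code.
        SparkConf conf = new SparkConf()
                .set("es.nodes", "node1")
                .set("es.port", "9200");

        // myApp.java:48 in the stack trace above
        JavaSparkContext sc = new JavaSparkContext(conf);

        // ... actual log-analysis job ...

        sc.stop();
    }
}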

In your first example you quote the master URL, i.e.

--master "spark://some.server:7077"

I don't believe that will work, and the docs never seem to quote it.

Reply: as mentioned in my edit, I had already tried it without the quotes before; nothing changed.