Apache Spark: submitting a Spark job from Java code and getting java.lang.NullPointerException
I want to submit a Spark job to a YARN cluster. I took my reference from "". Here is my code:
package learn.spark;

import org.apache.spark.deploy.yarn.Client;
import org.apache.spark.deploy.yarn.ClientArguments;
import org.apache.hadoop.conf.Configuration;
import org.apache.spark.SparkConf;

public class SubmitJob {
    public static void main(String[] arguments) throws Exception {
        String[] args = new String[] {
            "--jar", "lib/spark-examples-1.4.0-hadoop2.6.0.jar",
            "--class", "org.apache.spark.examples.mllib.JavaKMeans",
            "--num-executors", "32",
            "--executor-cores", "4",
            "--executor-memory", "16G",
            "--driver-memory", "8G",
            "--addJars", "./lib/spark-assembly-1.4.0-hadoop2.6.0.jar",
            "--arg", "/data/kmeans_data.txt",
            "--arg", "5",
            "--arg", "9"
        };
        Configuration config = new Configuration();
        System.setProperty("SPARK_YARN_MODE", "true");
        SparkConf sparkConf = new SparkConf();
        ClientArguments cArgs = new ClientArguments(args, sparkConf);
        Client client = new Client(cArgs, config, sparkConf);
        client.run();
    }
}
I compiled the source code and packaged it into a jar, then ran the program with the following command:
java -Xbootclasspath/a:./lib/spark-assembly-1.4.0-hadoop2.6.0.jar: -jar learn.spark.SubmitJob.jar
Unfortunately, I got the following error:
I don't understand what the problem is. How can I submit a Spark job to a YARN cluster?
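For comparison, the same job submitted through the stock spark-submit script would look roughly like the following. This is a sketch: the jar path, input path, and numeric arguments are copied from the code above, and the Spark 1.x yarn-cluster master string is assumed.

```shell
# Assumption: spark-submit from a Spark 1.4 distribution is on PATH and
# HADOOP_CONF_DIR points at the cluster's YARN configuration.
spark-submit \
  --master yarn-cluster \
  --class org.apache.spark.examples.mllib.JavaKMeans \
  --num-executors 32 \
  --executor-cores 4 \
  --executor-memory 16G \
  --driver-memory 8G \
  lib/spark-examples-1.4.0-hadoop2.6.0.jar \
  /data/kmeans_data.txt 5 9
```

If this command works but the Java program does not, the difference is usually in the environment (HADOOP_CONF_DIR, SPARK_HOME) that spark-submit sets up before calling into the same classes.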
Is there a better way? I found another example, but the code still confuses me. The ClientArguments used in my earlier code was correct, but when I try the approach from that link I get an error. I checked the Spark source code and found private[deploy] class ClientArguments(args: Array[String]). Why is it different? Change org.apache.spark.deploy.SparkSubmit and
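One reason ClientArguments looks different between versions is that org.apache.spark.deploy.yarn.Client and ClientArguments are private Spark internals, so their constructors change without notice. The supported way to submit from Java code is the SparkLauncher API, public since Spark 1.4. A minimal sketch, assuming a local Spark install at /opt/spark and the jar/argument values from the question:

```java
import org.apache.spark.launcher.SparkLauncher;

public class LauncherSubmit {
    public static void main(String[] args) throws Exception {
        // SparkLauncher forks a spark-submit process, so no private
        // deploy classes are touched and no SPARK_YARN_MODE hack is needed.
        Process spark = new SparkLauncher()
            .setSparkHome("/opt/spark")  // assumption: path to the Spark distribution
            .setMaster("yarn-cluster")   // Spark 1.x master string for YARN cluster mode
            .setAppResource("lib/spark-examples-1.4.0-hadoop2.6.0.jar")
            .setMainClass("org.apache.spark.examples.mllib.JavaKMeans")
            .setConf(SparkLauncher.DRIVER_MEMORY, "8G")
            .setConf(SparkLauncher.EXECUTOR_MEMORY, "16G")
            .setConf(SparkLauncher.EXECUTOR_CORES, "4")
            .addAppArgs("/data/kmeans_data.txt", "5", "9")
            .launch();
        // Wait for the child spark-submit process to finish.
        int exitCode = spark.waitFor();
        System.out.println("spark-submit exited with " + exitCode);
    }
}
```

Because the launcher spawns spark-submit, it reads HADOOP_CONF_DIR/YARN_CONF_DIR the same way the command line does, which avoids the configuration gaps that often cause a NullPointerException when calling the yarn Client directly.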