How to add the "--deploy-mode cluster" option in my Scala code

Tags: Scala, Apache Spark, Spark Streaming, Apache Spark Standalone

Hello, I want to add the "--deploy-mode cluster" option in my Scala code:

  val sparkConf = new SparkConf().setMaster("spark://192.168.60.80:7077")
without using the shell (the spark-submit command).

In Scala, you can use the "spark.submit.deployMode" property:

 val sparkConf = new SparkConf().setMaster("spark://192.168.60.80:7077").set("spark.submit.deployMode", "cluster")
Using SparkConf:

// set up the Spark configuration and create the context
// (note: .set() belongs on SparkConf, not on SparkContext)
val sparkConf = new SparkConf()
  .setAppName("SparkApp")
  .setMaster("spark://192.168.60.80:7077")
  .set("spark.submit.deployMode", "cluster")

val sc = new SparkContext(sparkConf)
With SparkSession:

val spark = SparkSession
   .builder()
   .appName("SparkApp")
   .master("spark://192.168.60.80:7077")
   .config("spark.submit.deployMode","cluster")
   .enableHiveSupport()
   .getOrCreate()
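As an aside, "spark.submit.deployMode" is normally consumed at launch time by spark-submit; setting it inside an application that is already running does not relocate the driver onto the cluster. If the goal is to trigger a cluster-mode submission programmatically rather than from a shell, Spark also provides org.apache.spark.launcher.SparkLauncher. A minimal sketch (the jar path and main class below are hypothetical placeholders, not taken from the question):

```scala
import org.apache.spark.launcher.SparkLauncher

object ClusterSubmit {
  def main(args: Array[String]): Unit = {
    // Programmatic equivalent of `spark-submit --deploy-mode cluster ...`.
    // The app resource and main class are placeholders for your own jar.
    val submission = new SparkLauncher()
      .setMaster("spark://192.168.60.80:7077")
      .setDeployMode("cluster")
      .setAppResource("/path/to/your-app.jar")   // hypothetical path
      .setMainClass("com.example.SparkApp")      // hypothetical class
      .launch()          // spawns spark-submit as a child process
    submission.waitFor() // block until the submission process exits
  }
}
```

launch() returns a java.lang.Process; in standalone cluster mode the master then starts the driver on a worker, so the jar must be reachable from the cluster machines.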

Comments:

I tried `val sparkConf = new SparkConf().set("spark.submit.deployMode", "cluster").setMaster("spark://master:7077").setAppName("wordcount")`, but my address is 192.168.60.90, which differs from the master host's address, 192.168.60.80. There are no imports and no configuration between the driver and the master. Note that I am using an Eclipse Maven project.

Which version of Spark are you using?

I am using Spark 2.1.0.