Scala Spark parallelism and cores


I have started learning Spark and tried to run this example:

package examples

import org.apache.spark.sql.SparkSession

object Test extends App {
  val spark: SparkSession = SparkSession.builder()
    .master("local[2]")
    .appName("SparkByExample")
    .getOrCreate()

  println("First SparkContext:")
  println("APP Name :"+spark.sparkContext.appName)
  println("Deploy Mode :"+spark.sparkContext.deployMode)
  println("Master :"+spark.sparkContext.master)
  println("Default Min parallelism" + spark.sparkContext.defaultMinPartitions)
  println("Default parallelism" + spark.sparkContext.defaultParallelism)

  val sparkSession2: SparkSession = SparkSession.builder()
    .master("local[1]")
    .appName("SparkByExample-test")
    .getOrCreate()

  println("Second SparkContext:")
  println("APP Name :"+sparkSession2.sparkContext.appName)
  println("Deploy Mode :"+sparkSession2.sparkContext.deployMode)
  println("Master :"+sparkSession2.sparkContext.master)
  println("Default Min parallelism" + sparkSession2.sparkContext.defaultMinPartitions)
  println("Default parallelism" + sparkSession2.sparkContext.defaultParallelism)
}


Here I create two Spark sessions, the first with two cores and the second with one core, but when I run it both sessions report a parallelism of 2, and I don't understand why:

First SparkContext:
APP Name :SparkByExample
Deploy Mode :client
Master :local[2]
Default Min parallelism2
Default parallelism2

Second SparkContext:
APP Name :SparkByExample
Deploy Mode :client
Master :local[2]
Default Min parallelism2
Default parallelism2
`sparkContext.defaultParallelism` returns the default level of parallelism defined on the SparkContext; in local mode its default is computed from the number of cores available to the application.

In your example the second call to `getOrCreate()` does not build a new context. Only one SparkContext can be active per JVM, so the builder returns the already-running session, and static settings such as `master` and `appName` cannot be changed on an existing context. That is why the second block of output still shows `APP Name :SparkByExample`, `Master :local[2]`, and a parallelism of 2, and `local[1]` is silently ignored.

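If you really want the second builder's settings to take effect, one option is to stop the first session before creating the second one, so the JVM has no active SparkContext left. A minimal sketch (the object name `TwoContexts` is just for illustration):

```scala
import org.apache.spark.sql.SparkSession

object TwoContexts extends App {
  val first = SparkSession.builder()
    .master("local[2]")
    .appName("SparkByExample")
    .getOrCreate()
  // local[2] -> two cores -> defaultParallelism is 2
  println(first.sparkContext.defaultParallelism)

  first.stop() // releases the JVM's single SparkContext

  val second = SparkSession.builder()
    .master("local[1]")
    .appName("SparkByExample-test")
    .getOrCreate() // no active context remains, so a fresh one is created
  // local[1] -> one core -> defaultParallelism is 1
  println(second.sparkContext.defaultParallelism)

  second.stop()
}
```

Note that stopping a context also invalidates any RDDs and DataFrames created from it, so in practice an application almost always keeps a single SparkSession for its whole lifetime.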