Scala: Append/concatenate two RDDs of type Set in Apache Spark


I am working with Spark RDDs. I need to append/concatenate two RDDs of type Set:

scala> var ek: RDD[Set[Int]] = sc.parallelize(Seq(Set(7)))
ek: org.apache.spark.rdd.RDD[Set[Int]] = ParallelCollectionRDD[31] at parallelize at <console>:32

scala> val vi: RDD[Set[Int]] = sc.parallelize(Seq(Set(3,5)))
vi: org.apache.spark.rdd.RDD[Set[Int]] = ParallelCollectionRDD[32] at parallelize at <console>:32

scala> val z = vi.union(ek)
z: org.apache.spark.rdd.RDD[Set[Int]] = UnionRDD[34] at union at <console>:36

scala> z.collect
res15: Array[Set[Int]] = Array(Set(3, 5), Set(7))

scala> val t = vi ++ ek
t: org.apache.spark.rdd.RDD[Set[Int]] = UnionRDD[40] at $plus$plus at <console>:36

scala> t.collect
res30: Array[Set[Int]] = Array(Set(3, 5), Set(7))

The expected result should look like this:

scala> val u = Set(3,5)
u: scala.collection.immutable.Set[Int] = Set(3, 5)

scala> val o = Set(7)
o: scala.collection.immutable.Set[Int] = Set(7)

scala> u.union(o)
res28: scala.collection.immutable.Set[Int] = Set(3, 5, 7)

Can anyone tell me how to do this?

You are applying the union to a Seq of Sets, which is why the resulting elements are the whole Sets rather than their individual values. Try this instead:

var ek = sc.parallelize(Set(7).toSeq)    // RDD[Int] holding 7
val vi = sc.parallelize(Set(3,5).toSeq)  // RDD[Int] holding 3 and 5
val z = vi.union(ek)                     // RDD[Int] holding 3, 5, 7
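
To get a single Set like the expected result, you can collect the union on the driver and convert it there; a minimal sketch building on the code above (note that, unlike Set.union, RDD.union keeps duplicates, hence the distinct):

// Deduplicate, bring the elements to the driver, and build a Set
val merged: Set[Int] = z.distinct().collect().toSet   // Set(3, 5, 7)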

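If you prefer to keep the question's original ek: RDD[Set[Int]] and vi: RDD[Set[Int]] instead of re-parallelizing, here is a minimal sketch of two equivalent options (assuming those original definitions are still in scope):

// Flatten each Set into its elements, giving an RDD[Int] with 3, 5, 7
val flat = vi.union(ek).flatMap(identity)

// Or merge all the Sets into one Set on the driver
val single: Set[Int] = vi.union(ek).reduce(_ union _)   // Set(3, 5, 7)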