Scala reduceByKey does not accept a generic parameter
I want to reduce the generic values of key/value pairs using a provided function, but it gives this error:
value reduceByKey is not a member of org.apache.spark.rdd.RDD[(String, T)]
val reduce = (rDD: RDD[(String, T)]) => rDD.reduceByKey((x, y) => y)
reduceByKey does not work with the generic type. Also, when I try to compose it with currying, as in
map andThen reduce
it again reports that the symbol cannot be found.
Update:
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD

class Aggragator[T](val rdd: RDD[String], function: Function[String, T]) {
  val mapped = rdd.map(x => (x, function.apply(x)))
  val reduce = mapped.reduceByKey((x, y) => y)
}

object Aggragator {
  val conf = new SparkConf()
    .setMaster("local[2]")
    .setAppName("xx")
  val sc = new SparkContext(conf)
  val rdd = sc.parallelize(List("1", "2"))
  val reduced = new Aggragator[Int](rdd, (x: String) => x.toInt).reduce.collect()

  def main(args: Array[String]) {
    println(reduced)
  }
}
Where is `T` defined? Please post your complete question. T is the type parameter of the class. Please show us a full code sample; this compiles fine for me. Hi, I created this class; the IDE shows no errors, but when I compile the code it throws the error. What is missing is a `ClassTag` context bound on the class: `class Aggragator[T : ClassTag]`.
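A minimal sketch of the fix, keeping the same names as the code above. `reduceByKey` is not defined on `RDD` itself; it comes from the implicit conversion to `PairRDDFunctions`, which requires `ClassTag` evidence for the key and value types, so the context bound on `T` is what makes the method resolve (this requires a Spark runtime to execute):

```scala
import scala.reflect.ClassTag
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD

// The ClassTag context bound [T : ClassTag] is the fix: it supplies the
// implicit ClassTag[T] needed by the conversion to PairRDDFunctions,
// which is where reduceByKey actually lives.
class Aggragator[T : ClassTag](val rdd: RDD[String], function: String => T) {
  val mapped: RDD[(String, T)] = rdd.map(x => (x, function(x)))
  val reduce: RDD[(String, T)] = mapped.reduceByKey((x, y) => y)
}

object Aggragator {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("xx")
    val sc = new SparkContext(conf)
    val rdd = sc.parallelize(List("1", "2"))
    val reduced = new Aggragator[Int](rdd, (x: String) => x.toInt).reduce.collect()
    // mkString so the array contents are printed, not the array reference
    println(reduced.mkString(", "))
    sc.stop()
  }
}
```

Without the bound, the compiler cannot summon a `ClassTag[T]` at the point where `reduceByKey` is called on `RDD[(String, T)]`, which is why the error reads as if the method does not exist.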