Scala: spark-shell cannot find the class to extend


Why can't I load a file containing the following code into the spark-shell?

import org.apache.spark.sql.types._
import org.apache.spark.sql.Encoder
import org.apache.spark.sql.Encoders
import org.apache.spark.sql.expressions.Aggregator

case class Data(i: Int)

val customSummer = new Aggregator[Data, Int, Int] {
  def zero: Int = 0
  def reduce(b: Int, a: Data): Int = b + a.i
  def merge(b1: Int, b2: Int): Int = b1 + b2
  def finish(r: Int): Int = r
}.toColumn()
Error:

<console>:47: error: object creation impossible, since:
it has 2 unimplemented members.
/** As seen from <$anon: org.apache.spark.sql.expressions.Aggregator[Data,Int,Int]>, the missing signatures are as follows.
 *  For convenience, these are usable as stub implementations.
 */
  def bufferEncoder: org.apache.spark.sql.Encoder[Int] = ???
  def outputEncoder: org.apache.spark.sql.Encoder[Int] = ???

       val customSummer =  new Aggregator[Data, Int, Int] {
And when loading it as a script:

loading ./script.sc...
import org.apache.spark.sql.expressions.Aggregator
<console>:11: error: not found: type Aggregator
       class MyClass extends Aggregator
Update (2017-12-03): It doesn't seem to work inside Zeppelin either.


According to the error message, you have not implemented bufferEncoder and outputEncoder. Check the list of abstract methods that must be implemented.

Implementing these two is enough:

def bufferEncoder: Encoder[Int] = Encoders.scalaInt
def outputEncoder: Encoder[Int] = Encoders.scalaInt
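To see why the compiler complains, here is a Spark-free sketch of the same situation. MiniAggregator and MiniEncoders are hypothetical stand-ins (not Spark API) that mirror the shape of org.apache.spark.sql.expressions.Aggregator: an anonymous subclass must implement every abstract member, including the two encoder methods, or you get "object creation impossible".

```scala
// Hypothetical stand-ins mirroring Spark's Aggregator/Encoder shape
// (illustration only, not the real Spark API).
trait MiniEncoder[T]
object MiniEncoders {
  val scalaInt: MiniEncoder[Int] = new MiniEncoder[Int] {}
}

abstract class MiniAggregator[IN, BUF, OUT] {
  def zero: BUF
  def reduce(b: BUF, a: IN): BUF
  def merge(b1: BUF, b2: BUF): BUF
  def finish(r: BUF): OUT
  // These two are also abstract: omitting them in the anonymous
  // subclass below reproduces the "object creation impossible" error.
  def bufferEncoder: MiniEncoder[BUF]
  def outputEncoder: MiniEncoder[OUT]
}

case class Data(i: Int)

val customSummer = new MiniAggregator[Data, Int, Int] {
  def zero: Int = 0
  def reduce(b: Int, a: Data): Int = b + a.i
  def merge(b1: Int, b2: Int): Int = b1 + b2
  def finish(r: Int): Int = r
  def bufferEncoder: MiniEncoder[Int] = MiniEncoders.scalaInt
  def outputEncoder: MiniEncoder[Int] = MiniEncoders.scalaInt
}

// Drive the aggregator by hand, the way Spark conceptually would:
val total = customSummer.finish(
  Seq(Data(1), Data(2), Data(3)).foldLeft(customSummer.zero)(customSummer.reduce))
// total == 6
```

With the real Spark types, the fix is the same: add the two encoder members (using Encoders.scalaInt, as in the answer above) inside the anonymous Aggregator before calling toColumn.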