Scala Spark 2.2.1 par function gives Janino compile error

Tags: scala, apache-spark, apache-spark-sql, scala-collections, janino

I have some code like the following:

def computeGroupByCount(columns: List[String], DF: DataFrame): List[JsValue] = {
  val result: ListBuffer[JsValue] = new ListBuffer[JsValue]()
  val encoder: Encoder[ColumnGroupByCount] = Encoders.product[ColumnGroupByCount]
  // Iterate over the columns with a parallel collection (.par)
  columns.par.foreach(colName => {
    // Group-by/count on the current column, map each row into a case class, and collect to the driver
    val groupByCount: Array[ColumnGroupByCount] = DF
      .groupBy(colName)
      .count()
      .map(x => ResponseOnGroupByCount(colName.toString, x.getString(0), x.getLong(1)))(encoder)
      .collect()
    result += Json.toJson(groupByCount)
  })
  result.toList
}
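For context, the snippet depends on definitions that are not shown in the question. A minimal sketch of what they might look like, assuming ColumnGroupByCount is a plain case class and the JSON serialization comes from play-json (both are assumptions; ResponseOnGroupByCount is presumably a factory or similarly shaped class producing ColumnGroupByCount, which is also not shown):

// Sketch of the assumed surrounding definitions; field names are guesses.
import play.api.libs.json.{Json, OFormat}

// Product type the grouped rows are mapped into
// (assumed shape: column name, grouped value, count).
case class ColumnGroupByCount(column: String, value: String, count: Long)

object ColumnGroupByCount {
  // play-json formatter so Json.toJson(Array[ColumnGroupByCount]) compiles.
  implicit val format: OFormat[ColumnGroupByCount] = Json.format[ColumnGroupByCount]
}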