Spark Scala: value add is not a member of scala.collection.mutable.ListBuffer


Below is my code, which tries to add DataFrame rows to a list and then return them as part of a tuple:

import scala.collection.mutable.ListBuffer

myDF.rdd.filter { row: Row => row.getString(6).length > 0 }.map {
  row: Row =>
  var rowList: ListBuffer[Row] = ListBuffer()
  rowList.add(row)
  (row.getString(1) + "_" + row.getString(2) + "_" + row.getString(6) + "_" + row.getString(7) + "_" + row.getString(14), rowList)
}.count()
Then I got the following error:

 error: value add is not a member of scala.collection.mutable.ListBuffer[org.apache.spark.sql.Row]
                rowList.add(row)

Does anyone know what I am doing wrong? Thanks.

add is not part of the standard ListBuffer API. Either import the following line:

import scala.collection.JavaConversions._

or use ListBuffer's own += operator: rowList += row
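
For reference, here is a minimal sketch of the corrected map step using +=, keeping the same myDF, column indices, and tuple layout from the question (those names come from the asker's code and are not verified here); rowList.append(row) would work equally well:

import scala.collection.mutable.ListBuffer
import org.apache.spark.sql.Row

// Same pipeline as in the question, with += instead of add.
// A val is enough here because the ListBuffer itself is mutable.
myDF.rdd
  .filter { row: Row => row.getString(6).length > 0 }
  .map { row: Row =>
    val rowList: ListBuffer[Row] = ListBuffer()
    rowList += row  // ListBuffer's append operator
    (row.getString(1) + "_" + row.getString(2) + "_" + row.getString(6) + "_" +
      row.getString(7) + "_" + row.getString(14), rowList)
  }
  .count()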