
Scala Spark DataFrame explode function error


I want to use the explode function on a DataFrame, so I wrote code similar to the one in the documentation:

    case class Url(url:String)
    val temp3 = temp2.explode($"urls"){
        case Row(urls:Array[String]) => urls.map(Url(_))
    }
However, the result is:

error: not found: value Row
The DataFrame temp2 looks like this:

temp2.printSchema()
root
 |-- userid: string (nullable = true)
 |-- urls: array (nullable = true)
 |    |-- element: string (containsNull = true)

Add the following import:

import org.apache.spark.sql.Row
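A further note beyond the missing import: Spark materializes array columns as a `Seq` (a `WrappedArray`), not a JVM `Array`, so even after importing `Row` the pattern `case Row(urls: Array[String])` will fail to match at runtime. A minimal sketch of both fixes, assuming Spark 1.x-style `Dataset.explode` (deprecated since 2.0) and a `temp2` with the schema shown above:

```scala
import org.apache.spark.sql.Row

case class Url(url: String)

// Match on Seq, not Array; @unchecked silences the erasure warning
// on the element type, which cannot be verified at runtime.
val temp3 = temp2.explode($"urls") {
  case Row(urls: Seq[String @unchecked]) => urls.map(Url(_))
}
```

In Spark 2.x and later, the idiomatic approach avoids `Row` pattern matching entirely by using the `explode` column function:

```scala
import org.apache.spark.sql.functions.explode

// One output row per element of the urls array.
val temp3 = temp2.select($"userid", explode($"urls").as("url"))
```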