Scala Spark DF to Hive ORC table - map type column


I am trying to write a map-type column from a Spark DataFrame to a Hive ORC table, but the write fails with a "mismatched column types" error.

Hive table:

CREATE EXTERNAL TABLE `default.test_map_col`(
test_col Map<String,String>)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\001'
COLLECTION ITEMS TERMINATED BY '\002'
MAP KEYS TERMINATED BY '\003'
LINES TERMINATED BY '\n'
STORED AS ORC
LOCATION '/hdfs/path'
TBLPROPERTIES (
'serialization.null.format'='')
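
For reference, the column types Hive reports for the table can be checked from Spark before writing; a minimal sketch, assuming a SparkSession named spark built with .enableHiveSupport():

// Sketch: confirm the column types Hive reports for the target table.
// Assumes a SparkSession named `spark` created with .enableHiveSupport().
spark.sql("DESCRIBE default.test_map_col").show(truncate = false)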
The map column in the inputDF is populated as:

("map_key_values", map(lit("testkey"), lit("testval"))
I also tried using a UDF to populate the map column in the DF:

val toStruct = udf((c1: Map[String, String]) => c1.map {
case (k, v) => k + "\u0003" + v}.toSeq)
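
Applied to the DataFrame, that UDF would be used roughly like this (a sketch; the column names follow the hypothetical inputDF above):

import org.apache.spark.sql.functions.col

// Replace the map column with a Seq[String] of "key\u0003value" entries,
// using the toStruct UDF defined above.
val flattenedDF = inputDF.withColumn("test_col", toStruct(col("map_key_values")))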

Any ideas on how to get this write to work?

Based on the answer here, I was able to get the write to work.
val toStruct = udf((c1: Map[String, String]) => c1.map {
case (k, v) => k + "\u0003" + v}.toSeq)
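
For completeness, the write itself might then be issued along these lines (a hedged sketch only; the exact call used in the original answer is not shown, and the select and insertInto steps are assumptions):

// Sketch: convert the map column with the UDF above, then insert into the
// existing Hive table by position. Assumes the inputDF and toStruct from above.
inputDF
  .withColumn("test_col", toStruct(col("map_key_values")))
  .select("test_col")
  .write
  .mode("append")
  .insertInto("default.test_map_col")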