
Convert MapType col to JSON: DataFrame

Tags: json, scala, apache-spark, dataframe

I have a DataFrame that contains the following:

+---+-----------------------------------------------------------------------------------------------------------------------------------+
|id |mapData                                                                                                                            |
+---+-----------------------------------------------------------------------------------------------------------------------------------+
|1  |Map(e1 -> WrappedArray({"number":"n1","strData":"d1","intData":2}), e2 -> WrappedArray({"number":"n1","strData":"d1","intData":2}))|
+---+-----------------------------------------------------------------------------------------------------------------------------------+
I want to convert it into JSON with the same structure, like:

+---+-----------------------------------------------------------------------------------------------------------------------------------+
|1  |{"e1": [{"number": "n1","strData": "d1","intData": 2}],"e2": [{"number": "n1","strData": "d1","intData": 2}]}                      |
+---+-----------------------------------------------------------------------------------------------------------------------------------+
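For reference, the mapData column is a map<string, array<string>> whose array elements are already serialized JSON strings. My real mapData comes out of a custom aggregator, but a minimal sketch that reproduces the same schema (column names and values copied from the sample above; the construction itself is just an illustration) would be:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

// Illustration only: the real column is built by a custom aggregator,
// but this yields the same schema, map<string, array<string>>,
// where each array element is already a JSON string.
val df = Seq(
  (1, Map(
    "e1" -> Seq("""{"number":"n1","strData":"d1","intData":2}"""),
    "e2" -> Seq("""{"number":"n1","strData":"d1","intData":2}""")
  ))
).toDF("id", "mapData")

df.printSchema()
// root
//  |-- id: integer (nullable = false)
//  |-- mapData: map (nullable = true)
//  |    |-- key: string
//  |    |-- value: array (valueContainsNull = true)
//  |    |    |-- element: string (containsNull = true)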
I tried

df.withColumn("jsonData", to_json(col("mapData")))

but got an AnalysisException when running it:

Exception in thread "main" org.apache.spark.sql.AnalysisException: cannot resolve 'structstojson(`mapData`)' due to data type mismatch: Input type map<string,array<string>> must be a struct or array of structs.;;
'Project [id#9, mapData#98, structstojson(mapData#98, Some(Asia/Kolkata)) AS jsonData#138]
+- Aggregate [id#9], [id#9, customaggregator2(executor_id#36, data#32, CustomAggregator2@3c38e2bf, 0, 0) AS mapData#98]
   +- Union
      :- Project [id#9, data#32, e1 AS executor_id#36]
      :  +- Aggregate [id#9], [id#9, collect_list(data#18, 0, 0) AS data#32]
      :     +- Project [id#9, number#10, strData#11, intData#12, structstojson(named_struct(number, number#10, strData, strData#11, intData, intData#12), Some(Asia/Kolkata)) AS data#18]
      :        +- Project [_1#4 AS id#9, _2#5 AS number#10, _3#6 AS strData#11, _4#7 AS intData#12]
      :           +- LocalRelation [_1#4, _2#5, _3#6, _4#7]
      +- Project [id#9, data#55, e2 AS executor_id#59]
         +- Aggregate [id#9], [id#9, collect_list(data#41, 0, 0) AS data#55]
            +- Project [id#9, number#10, strData#11, intData#12, structstojson(named_struct(number, number#10, strData, strData#11, intData, intData#12), Some(Asia/Kolkata)) AS data#41]
               +- Project [_1#4 AS id#9, _2#5 AS number#10, _3#6 AS strData#11, _4#7 AS intData#12]
                  +- LocalRelation [_1#4, _2#5, _3#6, _4#7]
Scala: 2.11
Spark: 2.2
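For completeness, one direction that sidesteps to_json entirely: since the array elements in mapData are already JSON strings, a plain Scala UDF can assemble the final document by string concatenation. This is only a rough, unverified sketch (and I believe newer Spark releases, 2.4+, let to_json accept MapType directly, although with string-typed array elements the inner JSON would then come out escaped rather than embedded as objects):

import org.apache.spark.sql.functions.{col, udf}

// Rough sketch: wrap the already-serialized JSON strings in a
// {"key": [elem, ...], ...} envelope by hand. Null maps are not handled.
val mapToJson = udf { m: Map[String, Seq[String]] =>
  m.map { case (key, values) => s""""$key": [${values.mkString(",")}]""" }
    .mkString("{", ",", "}")
}

val result = df.withColumn("jsonData", mapToJson(col("mapData")))
// jsonData: {"e1": [{"number":"n1","strData":"d1","intData":2}],"e2": [{"number":"n1","strData":"d1","intData":2}]}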