Scala: I have a dataframe with two rows and many columns. How can I convert it into one with two columns and many rows?


I have a Spark dataframe like this:

+---+---+---+---+---+---+---+
| f1| f2| f3| f4| f5| f6| f7|
+---+---+---+---+---+---+---+
|  5|  4|  5|  2|  5|  5|  5|
+---+---+---+---+---+---+---+
How can I turn it into this:

+---+---+
| f1|  5|
+---+---+
| f2|  4|
+---+---+
| f3|  5|
+---+---+
| f4|  2|
+---+---+
| f5|  5|
+---+---+
| f6|  5|
+---+---+
| f7|  5|
+---+---+
Is there a simple way to do this transpose in Spark Scala?
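For a frame that is known to contain just one row, a minimal sketch (only an illustration, assuming the row fits in driver memory and that stringified values are acceptable) is to collect that row and rebuild a (column, value) frame:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

// Rebuild the single-row frame from the question.
val df = Seq((5, 4, 5, 2, 5, 5, 5)).toDF("f1", "f2", "f3", "f4", "f5", "f6", "f7")

// Collect the one row on the driver and pair each column name with its value.
val row = df.head()
val transposed = df.columns
  .zip(row.toSeq.map(_.toString))
  .toSeq
  .toDF("columns", "value")

transposed.show(false)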

scala> df.show()
+---+---+---+---+---+---+---+
| f1| f2| f3| f4| f5| f6| f7|
+---+---+---+---+---+---+---+
|  5|  4|  5|  2|  5|  5|  5|
+---+---+---+---+---+---+---+


scala> import org.apache.spark.sql.DataFrame

scala> import org.apache.spark.sql.functions._   // explode, array, struct, lit, col

scala> import spark.implicits._                  // for $"..."; already in scope in spark-shell

scala> def transposeUDF(transDF: DataFrame, transBy: Seq[String]): DataFrame = {
     |   // every column not listed in transBy is melted into (columns, value) pairs
     |   val (cols, types) = transDF.dtypes.filter{ case (c, _) => !transBy.contains(c)}.unzip
     |   require(types.distinct.size == 1, "all transposed columns must share the same type")
     |
     |   val kvs = explode(array(
     |     cols.map(c => struct(lit(c).alias("columns"), col(c).alias("value"))): _*
     |   ))
     |
     |   val byExprs = transBy.map(col(_))
     |
     |   transDF
     |     .select(byExprs :+ kvs.alias("_kvs"): _*)
     |     .select(byExprs ++ Seq($"_kvs.columns", $"_kvs.value"): _*)
     | }

scala> val df1 = df.withColumn("tempColumn", lit("1"))

scala> transposeUDF(df1, Seq("tempColumn")).drop("tempColumn").show(false)
+-------+-----+
|columns|value|
+-------+-----+
|f1     |5    |
|f2     |4    |
|f3     |5    |
|f4     |2    |
|f5     |5    |
|f6     |5    |
|f7     |5    |
+-------+-----+
Hope it helps you.
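The same melt can also be written without the temporary column by using the built-in stack() SQL generator; this is just a sketch and, like the function above, it assumes all of the transposed columns share one (coercible) type. On Spark 3.4+ there is also a first-class Dataset.unpivot / melt API for this.

// Build the argument list "'f1', `f1`, 'f2', `f2`, ..." from the column names.
val stackArgs = df.columns.map(c => s"'$c', `$c`").mkString(", ")

df.selectExpr(s"stack(${df.columns.length}, $stackArgs) as (columns, value)")
  .show(false)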

I wrote a function:

import org.apache.spark.rdd.RDD
import org.apache.spark.sql.{DataFrame, Row}
import org.apache.spark.sql.functions.col
import org.apache.spark.sql.types.{DataType, DataTypes, DoubleType}

object DT {
  val KEY_COL_NAME = "dt_key"
  val VALUE_COL_NAME = "dt_value"

  def pivot(df: DataFrame, valueDataType: DataType, cols: Array[String], keyColName: String, valueColName: String): DataFrame = {
    // Emit one (column name, value) row per selected column; values are carried as
    // strings to match the declared schema, then cast back to valueDataType below.
    val tempData: RDD[Row] = df.rdd.flatMap { row =>
      row.getValuesMap[Any](cols).map { case (k, v) => Row(k, if (v == null) null else v.toString) }
    }
    val keyStructField = DataTypes.createStructField(keyColName, DataTypes.StringType, false)
    val valueStructField = DataTypes.createStructField(valueColName, DataTypes.StringType, true)
    val structType = DataTypes.createStructType(Array(keyStructField, valueStructField))
    df.sparkSession.createDataFrame(tempData, structType)
      .select(col(keyColName), col(valueColName).cast(valueDataType))
  }

  def pivot(df: DataFrame, valueDataType: DataType): DataFrame = {
    pivot(df, valueDataType, df.columns, KEY_COL_NAME, VALUE_COL_NAME)
  }
}
It works:

df.show()
DT.pivot(df,DoubleType).show()
like this:

+---+---+-----------+---+---+       +------+-----------+
| f1| f2|         f3| f4| f5|       |dt_key|   dt_value|
+---+---+-----------+---+---+  to   +------+-----------+
|100|  1|0.355072464|  0| 31|       |    f1|      100.0|
+---+---+-----------+---+---+       |    f5|       31.0|
                                    |    f3|0.355072464|
                                    |    f4|        0.0|
                                    |    f2|        1.0|
                                    +------+-----------+


Very nice.

Can you share what you have tried?
With more than one input row, the same call gives:
+---+---+-----------+-----------+---+        +------+-----------+
| f1| f2|         f3|         f4| f5|        |dt_key|   dt_value|
+---+---+-----------+-----------+---+  to    +------+-----------+
|100|  1|0.355072464|          0| 31|        |    f1|      100.0|
| 63|  2|0.622775801|0.685809375| 16|        |    f5|       31.0|
+---+---+-----------+-----------+---+        |    f3|0.355072464|
                                             |    f4|        0.0|
                                             |    f2|        1.0|
                                             |    f1|       63.0|
                                             |    f5|       16.0|
                                             |    f3|0.622775801|
                                             |    f4|0.685809375|
                                             |    f2|        2.0|
                                             +------+-----------+
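One caveat visible in the output above: getValuesMap returns an unordered Map, so the transposed rows do not come back in the original column order (f1, f5, f3, f4, f2, ...). If a stable order matters and sorting the key lexically is acceptable, ordering the result is enough:

DT.pivot(df, DoubleType).orderBy("dt_key").show()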