SQL Scala - convert array data to a table or DataFrame?
I want to create and save a table filled with random numbers. So far so good, but I don't understand how to get the multidimensional array tmp into a DataFrame with the schema defined at the top:
import org.apache.spark.sql.types.{
  StructType, StructField, StringType, IntegerType, DoubleType}
import org.apache.spark.sql.Row

val schema = StructType(
  StructField("rowId", IntegerType, true) ::
  StructField("t0_1", DoubleType, true) ::
  StructField("t0_2", DoubleType, true) ::
  StructField("t0_3", DoubleType, true) ::
  StructField("t0_4", DoubleType, true) ::
  StructField("t0_5", DoubleType, true) ::
  StructField("t0_6", DoubleType, true) ::
  StructField("t0_7", DoubleType, true) ::
  StructField("t0_8", DoubleType, true) ::
  StructField("t0_9", DoubleType, true) ::
  StructField("t0_10", DoubleType, true) :: Nil)

val columnNo = 10
val rowNo = 50
val tmp = Array.ofDim[Double](columnNo, rowNo)

for (r <- 1 to rowNo) {
  for (c <- 1 to columnNo) {
    val temp = new scala.util.Random
    tmp(c - 1)(r - 1) = temp.nextDouble
    println("Value of " + c + "/" + r + ": " + tmp(c - 1)(r - 1))
  }
}

val df = sc.parallelize(tmp).toDF  // this is where it fails
df.show
You can't convert an array of arrays to a DataFrame; you need an array of tuples or of case-class instances. Here the rows are built from a case class matching the desired schema:
case class Record(
  rowID: Option[Int],
  t0_1: Option[Double],
  t0_2: Option[Double],
  t0_3: Option[Double],
  t0_4: Option[Double],
  t0_5: Option[Double],
  t0_6: Option[Double],
  t0_7: Option[Double],
  t0_8: Option[Double],
  t0_9: Option[Double],
  t0_10: Option[Double]
)

val rowNo = 50
val temp = new scala.util.Random

val data = (1 to rowNo).map(r =>
  Record(
    Some(r),
    Some(temp.nextDouble),
    Some(temp.nextDouble),
    Some(temp.nextDouble),
    Some(temp.nextDouble),
    Some(temp.nextDouble),
    Some(temp.nextDouble),
    Some(temp.nextDouble),
    Some(temp.nextDouble),
    Some(temp.nextDouble),
    Some(temp.nextDouble)
  )
)

val df = sc.parallelize(data).toDF
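If you would rather keep the explicit schema from the question instead of a case class, another route is to build `Row` objects and pass them to `createDataFrame` together with the schema. A minimal sketch, assuming a Spark 2.x `spark-shell` session where `spark` and `sc` are already in scope (the variable names `rand` and `rows` are illustrative, not from the original):

```scala
import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{StructType, StructField, IntegerType, DoubleType}

// Same shape as the question's schema: an Int id plus 10 Double columns,
// generated here instead of spelled out field by field.
val schema = StructType(
  StructField("rowId", IntegerType, true) +:
  (1 to 10).map(i => StructField(s"t0_$i", DoubleType, true))
)

val rand = new scala.util.Random

// Row.fromSeq builds an untyped Row from a Seq whose element types
// must line up with the schema (Int first, then 10 Doubles).
val rows = (1 to 50).map { r =>
  Row.fromSeq(r +: Seq.fill(10)(rand.nextDouble))
}

val df = spark.createDataFrame(sc.parallelize(rows), schema)
df.show()
```

The trade-off: the case-class version gets you compile-time field names and types, while the `Row` version keeps the schema as data, which is handy when the number of columns isn't fixed at compile time.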
Thank you very much! That solved my problem and shortened my code considerably!