Apache Spark: creating a DataFrame in Spark Streaming

I have connected Kafka and Spark Streaming, and I have trained an Apache Spark MLlib model to make predictions on the streaming text. My problem is that, to get a prediction, I need to pass a DataFrame to the model.

// Kafka stream
val stream = KafkaUtils.createDirectStream[String, String](
  ssc,
  PreferConsistent,
  Subscribe[String, String](topics, kafkaParams)
)

// load the MLlib pipeline model
val model = PipelineModel.load(modelPath)

stream.foreachRDD { rdd =>
  rdd.foreach { record =>
    // to get a prediction I need to pass a DataFrame
    val toPredict = spark.createDataFrame(Seq(
      (1L, record.value())
    )).toDF("id", "review")
    val prediction = model.transform(toPredict)
  }
}

My problem is that Spark Streaming does not allow me to create a DataFrame there. Is there a way around this? Can I use a case class or a struct?

You can create a DataFrame or a Dataset from an RDD just as in core Spark. To do that, we need to apply a schema. Inside foreachRDD we can then transform the resulting RDD into a DataFrame, which can be used further with an ML pipeline:

// we use a schema in the form of a case class
case class MyStructure(field: Type, ...)

// and we implement our custom transformation from String to our structure
object MyStructure {
  def parse(str: String): Option[MyStructure] = ...
}

val stream = KafkaUtils.createDirectStream...

// give the stream a schema using the case class
val structStream = stream.flatMap(cr => MyStructure.parse(cr.value))

structStream.foreachRDD { rdd =>
  import sparkSession.implicits._
  // convert the structured RDD to a DataFrame via the case class schema
  val df = rdd.toDF()
  val prediction = model.transform(df)
  // do something with the prediction
}
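To make the template concrete, here is a minimal sketch that fills it in for the two-column ("id", "review") schema from the question. The Review case class, its parse helper, and the synthetic hash-based id are illustrative assumptions, not part of the original answer; stream and modelPath are the values from the question's code.

import org.apache.spark.ml.PipelineModel
import org.apache.spark.sql.SparkSession

// schema matching the question's ("id", "review") DataFrame
case class Review(id: Long, review: String)

object Review {
  // hypothetical parser: uses the Kafka record value as the review text and
  // derives a synthetic id; adapt this to the actual message format
  def parse(str: String): Option[Review] =
    Option(str).filter(_.nonEmpty).map(s => Review(s.hashCode.toLong, s))
}

val spark = SparkSession.builder.getOrCreate()
val model = PipelineModel.load(modelPath)

// give the Kafka stream a schema using the case class
val structStream = stream.flatMap(cr => Review.parse(cr.value))

structStream.foreachRDD { rdd =>
  if (!rdd.isEmpty()) {
    import spark.implicits._
    // the case class gives the DataFrame its columns: id, review
    val df = rdd.toDF()
    val prediction = model.transform(df)
    prediction.show()
  }
}

The key design point is that the DataFrame is created once per micro-batch inside foreachRDD on the driver, rather than per record inside rdd.foreach on the executors, which is what fails in the question's code.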

DataFramework or DataFrame? How can I do the same thing using PySpark? I have exactly the same question for PySpark; for PySpark, see: