Apache Spark: Spark SQL over Streaming - ArrayIndexOutOfBoundsException
I have the following code that runs a SQL query over a stream. My problem is that after one of the results it throws an ArrayIndexOutOfBoundsException. Why does this happen?
import org.apache.spark._
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.StreamingContext._
import org.apache.spark.sql.SQLContext
import org.apache.spark.streaming.Duration
import org.apache.spark.sql.functions.udf

object StreamingSQL {

  case class Persons(name: String, age: Int)

  def main(args: Array[String]) {
    val sparkConf = new SparkConf().setMaster("local").setAppName("HdfsWordCount")
    val sc = new SparkContext(sparkConf)
    // Create the context
    val ssc = new StreamingContext(sc, Seconds(2))
    val lines = ssc.textFileStream("/home/cloudera/Smartcare/stream/")
    lines.foreachRDD(rdd => rdd.foreach(println))
    val sqc = new SQLContext(sc)
    //import sqc.createSchemaRDD
    import sqc.implicits._
    // Create the FileInputDStream on the directory and use the
    // stream to count words in new files created
    lines.foreachRDD { rdd =>
      val persons = rdd.map(_.split(",")).map(p => Persons(p(0), p(1).trim.toInt)).toDF()
      persons.registerTempTable("data")
      val teenagers = sqc.sql("SELECT name FROM data WHERE age >= 13 AND age <= 19")
      teenagers.foreach(println)
    }
    ssc.start()
    ssc.awaitTermination()
  }
}
My txt file is:
Ana,31
Edgar,16
Luis,22
Noelia,26
Isabel50
Pablo,34
Laura,18
Paco,17
That's because the line Isabel50 has no comma. Your split(",") returns only one value for that line, so p(1) fails on it. I would first check whether each row of the RDD actually contains two fields. I can't test it right now, but you at least provided a sample of the input data, which is good. Could you try reading it as a plain RDD, applying the same transformation, and checking whether it produces the same error?
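A minimal sketch of the failure mode and the guard suggested above (plain Scala without Spark, using a few of the sample lines, so it can be tested in isolation): rows that do not split into exactly two fields are filtered out before `p(1)` is indexed. The same filter can be applied to the RDD inside `foreachRDD`.

```scala
object SplitGuard {
  case class Persons(name: String, age: Int)

  def main(args: Array[String]): Unit = {
    val lines = Seq("Ana,31", "Isabel50", "Laura,18")

    // "Isabel50".split(",") returns Array("Isabel50"): a single element,
    // so p(1) throws ArrayIndexOutOfBoundsException on that row.
    println(lines.map(_.split(",").length)) // List(2, 1, 2)

    // Guard: keep only rows that split into exactly two fields,
    // then parse safely.
    val persons = lines
      .map(_.split(","))
      .filter(_.length == 2)
      .map(p => Persons(p(0).trim, p(1).trim.toInt))

    persons.foreach(println)
  }
}
```

With the guard in place, the malformed row is simply dropped instead of crashing the batch; alternatively, such rows could be routed to a side output for inspection.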