
Scala object cannot be cast to a streaming Receiver

Tags: scala, apache-spark, spark-streaming, spray-json

In a Spark Streaming application, I am getting a message that one of the serializable classes I pass as the item class of an RDD cannot be cast to a Receiver. I am not trying to cast it to a Receiver, but here it is:

15/04/18 18:30:22 ERROR Executor: Exception in task 1.0 in stage 0.0 (TID 1)
java.lang.ClassCastException: vehicles.Vehicle_Status cannot be cast to org.apache.spark.streaming.receiver.Receiver
at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$8.apply(ReceiverTracker.scala:295)
at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$8.apply(ReceiverTracker.scala:290)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1497)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1497)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
at org.apache.spark.scheduler.Task.run(Task.scala:64)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Here is roughly the code that gets me to that point:

val rmqReceiver = new RMQReceiver(qAddress, "vehicle_data")
val customReceiverStream = aContext.receiverStream(rmqReceiver)
val handler = (rdd: RDD[List[String]]) => {
  this.handleStreamResult(rdd)
}
customReceiverStream.foreachRDD(handler)

def jsonToVehicleStatus(aRecord: String): Vehicle_Status = {
  val ast = aRecord.parseJson
  val aMap = ast.convertTo[Vehicle_Status]
  return aMap
}

def handleStreamResult(rdd: RDD[List[String]]): Unit = {
  rdd.foreach { list =>
    val mapList = list.map(jsonToVehicleStatus)
    val pMapList = sparkContext.parallelize(mapList)
    pMapList.saveToCassandra("vehicle_data", "vehicles", AllColumns)
    println()
  }
}
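Not part of the original post, but one detail worth flagging for readers: handleStreamResult calls sparkContext.parallelize inside rdd.foreach, which executes on the executors, where the driver's SparkContext is not usable, and serializing the enclosing object into that closure is a common source of confusing cast and serialization errors. A minimal sketch of the usual alternative, keeping everything as DStream/RDD transformations; it assumes the RMQReceiver class, Vehicle_Status with a spray-json JsonFormat in scope, and the spark-cassandra-connector from the question, and the keyspace/table names "vehicle_data"/"vehicles" are guesses carried over from the code above:

```scala
// Sketch only, not the original author's code. Assumes RMQReceiver,
// Vehicle_Status (with an implicit spray-json JsonFormat), and the
// spark-cassandra-connector are on the classpath, as in the question.
import org.apache.spark.streaming.StreamingContext
import com.datastax.spark.connector._ // adds saveToCassandra to RDDs
import spray.json._

def wireStream(aContext: StreamingContext, rmqReceiver: RMQReceiver): Unit = {
  aContext.receiverStream(rmqReceiver)
    .flatMap(identity)                          // DStream[List[String]] -> DStream[String]
    .map(_.parseJson.convertTo[Vehicle_Status]) // parse JSON on the executors
    .foreachRDD(_.saveToCassandra("vehicle_data", "vehicles"))
}
```

The key design point is that the RDD produced by the receiver is transformed and saved directly, so no second SparkContext call is needed inside a worker-side closure.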
Does anyone know why it is trying to cast this class to a Receiver?