
Scala: case class runtime error


This demo runs correctly. But when I move the code into a method of another class (in an earlier project of mine) and call that method, it fails to compile.

    import org.apache.spark.SparkContext
    import org.bson.BasicBSONObject

    object DFMain {
      case class Person(name: String, age: Double, t: String)

      def main(args: Array[String]): Unit = {
        val sc = new SparkContext("local", "Scala Word Count")
        val sqlContext = new org.apache.spark.sql.SQLContext(sc)
        import sqlContext.implicits._

        val bsonRDD = sc.parallelize(("foo", 1, "female") ::
                                     ("bar", 2, "male") ::
                                     ("baz", -1, "female") :: Nil)
          .map { tuple =>
            val bson = new BasicBSONObject()
            bson.put("name", "bfoo")
            bson.put("value", 0.1)
            bson.put("t", "female")
            (null, bson)
          }

        val tDf = bsonRDD.map(_._2)
          .map(f => Person(f.get("name").toString,
                           f.get("value").toString.toDouble,
                           f.get("t").toString)).toDF()

        tDf.limit(1).show()
      }
    }
But `MySQLDao.insertIntoMySQL()` gives a compile error:

    import org.apache.spark.SparkContext
    import org.bson.BasicBSONObject

    object MySQLDao {
      private val sc = new SparkContext("local", "Scala Word Count")
      val sqlContext = new org.apache.spark.sql.SQLContext(sc)
      import sqlContext.implicits._

      case class Person(name: String, age: Double, t: String)

      def insertIntoMySQL(): Unit = {
        val bsonRDD = sc.parallelize(("foo", 1, "female") ::
                                     ("bar", 2, "male") ::
                                     ("baz", -1, "female") :: Nil)
          .map { tuple =>
            val bson = new BasicBSONObject()
            bson.put("name", "bfoo")
            bson.put("value", 0.1)
            bson.put("t", "female")
            (null, bson)
          }

        val tDf = bsonRDD.map(_._2).map(f => Person(f.get("name").toString,
                                                    f.get("value").toString.toDouble,
                                                    f.get("t").toString)).toDF()

        tDf.limit(1).show()
      }
    }
Well, when I call `MySQLDao.insertIntoMySQL()`, I get

    value typedProductIterator is not a member of object scala.runtime.ScalaRunTime

pointing at

    case class Person(name: String, age: Double, t: String)

I suspect the case class is not visible inside the closure of the map function. Move it to the package level:

    case class Person(name: String, age: Double, t: String)
    object MySQLDao {
      ...
    }
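To make the fix concrete, here is a sketch of the whole corrected file with `Person` hoisted out of `MySQLDao` to the top level. It assumes Spark 1.x's `SQLContext` and `org.bson.BasicBSONObject` from the MongoDB Java driver, since the question does not show its imports:

    import org.apache.spark.SparkContext
    import org.apache.spark.sql.SQLContext
    import org.bson.BasicBSONObject

    // Person is defined at the top level of the file, so the compiler can
    // materialize the implicit machinery that toDF() needs for it without
    // referring to an enclosing object's scope.
    case class Person(name: String, age: Double, t: String)

    object MySQLDao {
      private val sc = new SparkContext("local", "Scala Word Count")
      private val sqlContext = new SQLContext(sc)
      import sqlContext.implicits._

      def insertIntoMySQL(): Unit = {
        val bsonRDD = sc.parallelize(("foo", 1, "female") ::
                                     ("bar", 2, "male") ::
                                     ("baz", -1, "female") :: Nil)
          .map { _ =>
            val bson = new BasicBSONObject()
            bson.put("name", "bfoo")
            bson.put("value", 0.1)
            bson.put("t", "female")
            (null, bson)
          }

        val tDf = bsonRDD.map(_._2).map(f =>
          Person(f.get("name").toString,
                 f.get("value").toString.toDouble,
                 f.get("t").toString)).toDF()

        tDf.limit(1).show()
      }
    }

Spark's reflection-based schema inference generally expects case classes declared at the top level (or at least in a stable, outer scope); nesting them inside the object whose method calls `toDF()` is what appears to trip the compiler here.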


Strange error; that looks like a Scala compiler bug. Does it appear when you compile the code, or when you run it? Also, try to make the example self-contained so that people can reproduce the error; for instance, BasicBSONObject is not defined.

Yes, it is a compile-time error. Sorry, I expressed it badly. I created a new project and there it works correctly; however, I still don't know why.

Hi, I am working on a Spark project in Scala, but my Scala is weak. Could you give me some advice?