Apache Spark: generating an RDD with map over (1 to N) fails
This works:
val rdd = sc.makeRDD((1 to 10))
val rdd2 = rdd.map(x => (1, 1,"2019-01-01", "2019-01-01",1,2,"XXXXXXXXXXXXXXXXXXXXXXXXXX"))
This does not:
val rdd = sc.makeRDD((1 to 10)).map((1, 1,"2019-01-01", "2019-01-01",1,2,"XXXXXXXXXXXXXXXXXXXXXXXXXX"))
The error I get is:
notebook:1: error: type mismatch;
found : (Int, Int, String, String, Int, Int, String)
required: Int => ?
This also works, which is fine:
val rdd = sc.makeRDD((1 to 10)).map((_, 1,"2019-01-01", "2019-01-01",1,2,"XXXXXXXXXXXXXXXXXXXXXXXXXX"))
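The underscore here is Scala's placeholder syntax: the compiler expands the tuple expression into an anonymous function, which is why this variant type-checks. A minimal sketch using plain Scala collections (the same expansion applies inside `RDD.map`; the short tuple and values are illustrative, not from the original post):

```scala
// Placeholder syntax: the compiler rewrites (_, 1, "a") into
// x => (x, 1, "a"), which IS a function Int => (Int, Int, String).
val g: Int => (Int, Int, String) = (_, 1, "a")

// Mapping with g pairs each element with the constant fields.
val out = (1 to 3).map(g)
// out: Vector((1,1,"a"), (2,1,"a"), (3,1,"a"))
```

So `map((_, 1, ...))` succeeds for the same reason `map(x => (x, 1, ...))` does: both hand `map` a function, not a tuple value.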
I'm missing some finer point here.

The point being missed is that you must pass a function. A tuple of type
(Int, Int, String, String, Int, Int, String)
is not a function.
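The signature of `RDD.map` is `def map[U](f: T => U): RDD[U]`, so its argument must be a function from the element type; a tuple literal is a value of a tuple type, not a `Function1`, hence the `required: Int => ?` error. A minimal sketch of the distinction, using a plain Scala collection in place of an RDD (the tuple is shortened for brevity):

```scala
// A tuple is a value, not a function:
val tuple = (1, "2019-01-01")              // type: (Int, String)

// A function wrapping that tuple:
val f = (x: Int) => (x, "2019-01-01")      // type: Int => (Int, String)

val mapped = (1 to 3).map(f)               // compiles: f matches Int => U
// (1 to 3).map(tuple)                     // would NOT compile:
//   found: (Int, String), required: Int => ?
```

The commented-out line reproduces the same "found tuple, required `Int => ?`" mismatch reported in the error above.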
If you want to generate constants:
sc.makeRDD((1 to 10)).map(_ => (1, 1,"2019-01-01", "2019-01-01",1,2,"XXXXXXXXXXXXXXXXXXXXXXXXXX"))
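Here `_ =>` is an anonymous function that ignores its argument, so every input element maps to the same constant tuple. A small sketch with plain collections (again standing in for the RDD; the short tuple is illustrative):

```scala
// `_ => const` ignores its input, producing one copy of the
// constant per element of the source range.
val const = (1, 1, "2019-01-01")
val rows = (1 to 3).map(_ => const)        // three identical rows

// Without a source collection, Seq.fill expresses the same idea:
val same = Seq.fill(3)(const)
```

For an RDD the `map(_ => const)` form is still the natural way to keep the data distributed across partitions; `Seq.fill` is only the local-collection analogue.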
But why? Is it because of makeRDD? Something like lit? At least the x => seems to be required, otherwise it fails, I suspect.