Hadoop NoSuchMethodError: org.apache.spark.sql.SQLContext.applySchema
I am trying to query a file stored in HDFS using the SQLContext provided by Apache Spark, with the code below, but I run into a NoSuchMethodError:
package SQL

import org.apache.spark.SparkContext
import org.apache.spark.sql._
import org.apache.spark.sql.types._

object SparSQLCSV {
  def main(args: Array[String]) {
    val sc = new SparkContext("local[*]", "home")
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    val people = sc.textFile("/home/devan/Documents/dataset/peoplesTest.csv")
    val delimiter = ","
    val schemaString = "a,b".split(delimiter) // CSV header
    // Automated schema creation
    val schema = StructType(schemaString.map(fieldName => StructField(fieldName, StringType, true)))
    val peopleLines = people.flatMap(x => x.split("\n"))
    val rowRDD = peopleLines.map(p => {
      Row.fromSeq(p.split(delimiter))
    })
    val peopleSchemaRDD = sqlContext.applySchema(rowRDD, schema)
    peopleSchemaRDD.registerTempTable("people")
    sqlContext.sql("SELECT b FROM people").foreach(println)
  }
}
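In Spark 1.3 and later, applySchema was deprecated and renamed to SQLContext.createDataFrame, which is one common source of this exact NoSuchMethodError when compile-time and runtime Spark versions differ. A minimal sketch of the equivalent call, assuming the same rowRDD and schema as above:

    // Spark 1.3+: createDataFrame replaces the deprecated applySchema.
    // rowRDD is an RDD[Row] and schema a StructType, exactly as built above.
    val peopleDF = sqlContext.createDataFrame(rowRDD, schema)
    peopleDF.registerTempTable("people")
    sqlContext.sql("SELECT b FROM people").collect().foreach(println)

This returns a DataFrame rather than the older SchemaRDD, but registerTempTable and sql work the same way on it.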
Exception in thread "main" java.lang.NoSuchMethodError:
org.apache.spark.sql.SQLContext.applySchema(Lorg/apache/spark/rdd/RDD;Lorg/apache/spark/sql/types/StructType;)Lorg/apache/spark/sql/DataFrame;
    at SQL.SparSQLCSV$.main(SparSQLCSV.scala:34)
    at SQL.SparSQLCSV.main(SparSQLCSV.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I have tried the same thing from the spark-shell command line that ships with Spark, and it works; but when I create a Scala project and try to run it, I get the above error. What am I doing wrong?

NoSuchMethodError usually means there is an incompatibility between libraries. In this particular case, you are probably compiling against a spark-csv version that requires Spark 1.3 while running against an older version of Spark.

A comment also suggested the rowRDD line in the posted snippet is missing a closing brace, i.e. changing `val rowRDD = peopleLines.map(p => { Row.fromSeq(p.split(delimiter)) }` to `val rowRDD = peopleLines.map(p => { Row.fromSeq(p.split(delimiter)) })`.
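One way to avoid this kind of mismatch is to keep every Spark module in the build on the same version, and mark them "provided" so that the cluster's own Spark jars are used at runtime instead of whatever was bundled at compile time. A hypothetical build.sbt sketch (the version numbers are illustrative, not taken from the question):

    scalaVersion := "2.10.4"

    // All Spark artifacts pinned to one version; "provided" keeps them
    // out of the assembly jar so the runtime Spark is the one that counts.
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "1.3.0" % "provided",
      "org.apache.spark" %% "spark-sql"  % "1.3.0" % "provided"
    )

If spark-csv (or any other third-party connector) is added, its required Spark version should match the one pinned here.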