Apache Spark: Why does importing SparkSession fail in spark-shell with "object SparkSession is not a member of package org.apache.spark.sql"?

Tags: apache-spark, cloudera-cdh, apache-spark-1.6

I am using Spark 1.6.0 on my Cloudera VM.

I am trying to insert some data into a Hive table from the spark-shell. For this I tried to use SparkSession, but the import below does not work:

scala> import org.apache.spark.sql.SparkSession
<console>:33: error: object SparkSession is not a member of package org.apache.spark.sql
         import org.apache.spark.sql.SparkSession
Without this, I cannot execute the following statement:

val spark = SparkSession.builder.master("local[2]").enableHiveSupport().config("hive.exec.dynamic.partition","true").config("hive.exec.dynamic.partition.mode", "nonstrict").config("spark.sql.warehouse.dir", warehouseLocation).config("hive.metastore.warehouse.dir","/user/hive/warehouse").getOrCreate()
<console>:33: error: not found: value SparkSession
         val spark = SparkSession.builder.master("local[2]").enableHiveSupport().config("hive.exec.dynamic.partition","true").config("hive.exec.dynamic.partition.mode", "nonstrict").config("spark.sql.warehouse.dir", warehouseLocation).config("hive.metastore.warehouse.dir","/user/hive/warehouse").getOrCreate()
Can someone tell me what mistake I am making here?

SparkSession is only available as of Spark 2.0, so on Spark 1.6 you should use SQLContext instead (or upgrade your Spark to the latest and greatest version).

Quoting the Spark 1.6.0 documentation:

The entry point into all functionality in Spark SQL is the SQLContext class, or one of its descendants.

In addition to the basic SQLContext, you can also create a HiveContext, which provides a superset of the functionality provided by the basic SQLContext.
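
As a minimal sketch of the Spark 1.6 equivalent (assuming the Hive-enabled Spark build that ships with the Cloudera VM): HiveContext plays the role that SparkSession.builder.enableHiveSupport() plays in Spark 2.x, and spark-shell already provides the SparkContext as sc. The table names in the last statement are hypothetical placeholders:

import org.apache.spark.sql.hive.HiveContext

// spark-shell creates `sc` (a SparkContext) for you; wrap it in a HiveContext
val hiveContext = new HiveContext(sc)

// The same Hive settings the question passes to SparkSession.builder
hiveContext.setConf("hive.exec.dynamic.partition", "true")
hiveContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")
hiveContext.setConf("hive.metastore.warehouse.dir", "/user/hive/warehouse")

// Write into a dynamically partitioned Hive table (table names are hypothetical)
hiveContext.sql("INSERT INTO TABLE target_table PARTITION (dt) SELECT * FROM source_table")

Note that on many Hive-enabled builds (including CDH's) the sqlContext that spark-shell pre-creates is already a HiveContext, so you may be able to use it directly instead of constructing a new one.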


So, are you 100% sure you have Spark 2? Which version of Spark are you using? Have you made sure you can open spark-shell without any errors? @PraveenKumarKrishnaiyer There are no errors when opening spark-shell.
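
A quick way to settle the version question from inside the shell itself; both of these calls exist in Spark 1.x and 2.x:

scala> sc.version                      // e.g. "1.6.0" on this Cloudera VM
scala> org.apache.spark.SPARK_VERSION  // the same information via a package-object constant

From the command line, spark-submit --version prints the same information.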