Apache Spark: Spark SQL - registered temp table not found

I run the following command:

spark-shell --packages datastax:spark-cassandra-connector:1.6.0-s_2.10
Then I stop the context with:

sc.stop
Then I run the following code in the REPL:

val conf = new org.apache.spark.SparkConf(true).set("spark.cassandra.connection.host", "127.0.0.1")
val sc = new org.apache.spark.SparkContext(conf)
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
val cc = new org.apache.spark.sql.cassandra.CassandraSQLContext(sc)

cc.setKeyspace("ksp")

cc.sql("SELECT * FROM continents").registerTempTable("conts")

val allContinents = sqlContext.sql("SELECT * FROM conts").collect
And I get:

org.apache.spark.sql.AnalysisException: Table not found: conts;
The keyspace ksp and the table continents are defined in Cassandra, so I suspect the error is not on that side.


(Spark 1.6.0, 1.6.1)

This happens because you use different contexts to register the temp table and to run the query: registerTempTable puts conts into cc's catalog, but the query goes through sqlContext, which has its own separate catalog and therefore cannot see it.
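To see the scoping in isolation, here is a minimal sketch (the toy data and context names are made up for illustration): two SQLContexts built on the same SparkContext each keep their own catalog, so a temp table registered in one is invisible from the other.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

val sc = new SparkContext(new SparkConf(true).setMaster("local[*]").setAppName("temp-table-scope"))
val ctxA = new SQLContext(sc)
val ctxB = new SQLContext(sc)

// Register a temp table in ctxA's catalog only
ctxA.createDataFrame(Seq((1, "Europe"), (2, "Asia")))
    .toDF("id", "name")
    .registerTempTable("conts")

ctxA.sql("SELECT * FROM conts").collect  // works
ctxB.sql("SELECT * FROM conts").collect  // AnalysisException: Table not found: conts

Applied to your session, the fix is to run the query through the same context that registered the table: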

val conf = new org.apache.spark.SparkConf(true).set("spark.cassandra.connection.host", "127.0.0.1")
val sc = new org.apache.spark.SparkContext(conf)
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
val cc = new org.apache.spark.sql.cassandra.CassandraSQLContext(sc)

cc.setKeyspace("ksp")

cc.sql("SELECT * FROM continents").registerTempTable("conts")

// use cc instead of sqlContext
val allContinents = cc.sql("SELECT * FROM conts").collect
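
Alternatively, if you would rather keep using the plain sqlContext, the connector also exposes Cassandra tables through Spark's generic data source API, so you can load the table and register the temp table in that same context. This is a sketch assuming spark-cassandra-connector 1.6's org.apache.spark.sql.cassandra source:

// Load ksp.continents as a DataFrame via the data source API
val continents = sqlContext.read
  .format("org.apache.spark.sql.cassandra")
  .options(Map("keyspace" -> "ksp", "table" -> "continents"))
  .load()

// Register and query in the same context, so both use one catalog
continents.registerTempTable("conts")
val allContinents = sqlContext.sql("SELECT * FROM conts").collect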