Hadoop Apache Spark: problem when querying a table


I am trying out Spark with Hive, and I want to select from a table:

hiveContext.hql("select * from final_table").collect()
But I get this error:

ERROR Hive: NoSuchObjectException(message:default.final_table table not found)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1569)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:106)
at com.sun.proxy.$Proxy27.get_table(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:1008)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:90)
at com.sun.proxy.$Proxy28.getTable(Unknown Source)
at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:1000)
at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:974)
at org.apache.spark.sql.hive.HiveMetastoreCatalog.lookupRelation(HiveMetastoreCatalog.scala:70)
at org.apache.spark.sql.hive.HiveContext$$anon$2.org$apache$spark$sql$catalyst$analysis$OverrideCatalog$$super$lookupRelation(HiveContext.scala:253)
at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$$anonfun$lookupRelation$3.apply(Catalog.scala:141)
at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$$anonfun$lookupRelation$3.apply(Catalog.scala:141)
at scala.Option.getOrElse(Option.scala:120)
But when I try this:

hiveContext.hql("CREATE TABLE IF NOT EXISTS TestTable (key INT, value STRING)")
I have no problem and the table is created.

Any ideas about this problem? Is there any solution?


Thanks

Which command do you use to start Spark? Most likely you have not set up the use of the Hive metastore correctly, which means that every time you start the cluster you are creating a new temporary local metastore. To use the Hive metastore, follow these guides: ( and ). That way, the tables you create will be persisted in the Hive metastore across cluster restarts.
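As a minimal sketch of what such a setup can look like (assuming Spark 1.x and a MySQL-backed Hive metastore; the host name, database name, and credentials below are placeholders for your environment), you would place a `hive-site.xml` on Spark's classpath, typically in its `conf/` directory, pointing at the shared metastore instead of letting Spark fall back to a local embedded Derby one:

```xml
<!-- conf/hive-site.xml: tells Spark's HiveContext where the shared
     Hive metastore database lives, so tables survive cluster restarts.
     metastore-host, metastore_db, hiveuser, and hivepassword are
     hypothetical values; substitute your own. -->
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://metastore-host:3306/metastore_db?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hiveuser</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hivepassword</value>
  </property>
</configuration>
```

With this file in place, `HiveContext` queries like your `select * from final_table` and the standalone Hive CLI both talk to the same metastore, so tables created in one are visible in the other.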