Apache Spark HiveContext: can't view temporary table through JDBC client

Tags: apache-spark, hive, pyspark-sql

I registered a temporary table in PySpark:
from pyspark.sql import HiveContext
sqlContext = HiveContext(sc)
df = sqlContext.sql("select * from test")
df.registerTempTable("testing")
sqlContext.sql("show tables").show()
+--------------------+-----------+
| tableName|isTemporary|
+--------------------+-----------+
| testing| true|
| check| false|
+--------------------+-----------+
I can see the temporary table "testing" from pyspark.

I then started the Spark Thrift Server, launched a JDBC client (beeline), and connected to the Thrift Server:
$ ./bin/beeline
beeline> !connect jdbc:hive2://ip:10000
Connecting to jdbc:hive2://ip:10000
Enter username for jdbc:hive2://ip:
Enter password for jdbc:hive2://ip:10000:
16/03/06 13:17:41 INFO jdbc.Utils: Supplied authorities: :10000
16/03/06 13:17:41 INFO jdbc.Utils: Resolved authority: :10000
16/03/06 13:17:41 INFO jdbc.HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://ip:10000
Connected to: Spark SQL (version 1.5.2)
Driver: Spark Project Core (version 1.5.2)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://ip.> show tables;
+-------------+--------------+--+
| tableName | isTemporary |
+-------------+--------------+--+
| check | false |
+-------------+--------------+--+
2 rows selected (0.842 seconds)
0: jdbc:hive2://ip.>
I can't see the temporary table "testing" from beeline. Is there something I'm missing?

A temporary table is only valid within the session that created it. That means the new session you open through beeline cannot see the temp table testing.
Thanks, I see. After registering the temp table, I started a Thrift server programmatically, pointed at that specific HiveContext. Then it worked fine.
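For reference, a minimal sketch of what "starting the Thrift server programmatically against the same HiveContext" can look like from PySpark. `HiveThriftServer2.startWithContext` is the Scala-side Spark API for this; reaching it through the py4j gateway and the `_ssql_ctx` handle as below relies on internal, version-dependent details (shown here for Spark 1.5.x-era code), so treat it as an assumption rather than a stable API:

```python
from pyspark.sql import HiveContext

# Assumes a live SparkContext `sc`, e.g. from the pyspark shell.
sqlContext = HiveContext(sc)

# Register the temp table in this HiveContext first.
df = sqlContext.sql("select * from test")
df.registerTempTable("testing")

# Start HiveThriftServer2 inside this same JVM, bound to this
# HiveContext, so JDBC sessions can see its temp tables.
# _ssql_ctx exposes the underlying Scala SQLContext (internal API).
jvm = sc._gateway.jvm
jvm.org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 \
    .startWithContext(sqlContext._ssql_ctx)
```

After this, connecting with beeline to port 10000 should show "testing" in `show tables`, because the JDBC session now shares the HiveContext that registered it, instead of a fresh session with its own temp-table namespace.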