Apache Spark: connecting to HBase through the Spark Phoenix connector


I am trying to load an HBase table through the Spark SQL connector, and I am able to read the table's schema:

 import java.util.Properties

 // configuration is a Hadoop Configuration for the cluster;
 // getEscapedFullTableName is a local helper (not shown here)
 val port = s"${configuration.get(ZOOKEEPER_CLIENT_PORT, "2181")}"
 val znode = s"${configuration.get(ZOOKEEPER_ZNODE_PARENT, "/hbase")}"
 val zkUrl = s"${configuration.get(ZOOKEEPER_QUORUM, "localhost")}"
 val url = s"jdbc:phoenix:$zkUrl:$port:$znode"
 val props = new Properties()
 val table = "SOME_Metrics_Test"
 props.put("driver", "org.apache.phoenix.jdbc.PhoenixDriver")
 val df = spark.read.jdbc(url, getEscapedFullTableName(table), props)
scala> df.printSchema
root
 |-- PK: string (nullable = false)
 |-- status: string (nullable = true)
 |-- other_Status: string (nullable = true)
But when I execute df.show, I get the following error:

org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table undefined. tableName=SOME_Metrics_Test
  at org.apache.phoenix.query.ConnectionQueryServicesImpl.getAllTableRegions(ConnectionQueryServicesImpl.java:542)
  at org.apache.phoenix.iterate.BaseResultIterators.getParallelScans(BaseResultIterators.java:480)
Any idea why this error occurs and what I can do to resolve it? When starting spark-shell, I added the following to the spark-shell command:

phoenix-4.7.0-HBase-1.1-client-spark.jar
hbase-site.xml
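
For reference, the JDBC URL assembled above follows the pattern jdbc:phoenix:&lt;zookeeper quorum&gt;:&lt;client port&gt;:&lt;znode parent&gt;. A minimal sketch with the code's own default values (localhost, 2181, /hbase are the fallbacks from the configuration lookups, not necessarily the real cluster):

```scala
// Sketch of the Phoenix JDBC URL built in the question,
// using the default fallback values from configuration.get(...).
val zkUrl = "localhost" // ZooKeeper quorum
val port  = "2181"      // ZooKeeper client port
val znode = "/hbase"    // HBase znode parent
val url   = s"jdbc:phoenix:$zkUrl:$port:$znode"
// url: "jdbc:phoenix:localhost:2181:/hbase"
```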

Try reading the table through the phoenix-spark data source instead:

val phoenixDF = spark.read.format("org.apache.phoenix.spark")
      .option("table", "my_table") 
      .option("zkUrl", "0.0.0.0:2181") 
      .load()
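
A likely cause of the TableNotFoundException is the mixed-case table name: Phoenix upper-cases unquoted identifiers, so SOME_Metrics_Test is only found when the name is double-quoted in the query. As a hedged sketch of what a quoting helper might look like (the asker's getEscapedFullTableName is not shown, so quotePhoenixTableName below is a hypothetical stand-in, not Phoenix API):

```scala
// Hypothetical helper: double-quote each part of a Phoenix table name
// so that mixed-case identifiers like SOME_Metrics_Test keep their case
// instead of being upper-cased by the Phoenix parser.
def quotePhoenixTableName(fullName: String): String =
  fullName.split('.').map(part => "\"" + part + "\"").mkString(".")
```

For example, quotePhoenixTableName("SOME_Metrics_Test") yields "SOME_Metrics_Test" wrapped in double quotes, and a schema-qualified name such as MY_SCHEMA.SOME_Metrics_Test has each part quoted separately.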