Apache Spark 2.2.0 cannot connect to the metastore after upgrading the Hive metastore

Running spark-shell produces the following error:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
18/01/30 18:22:27 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/01/30 18:22:29 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
18/01/30 18:22:29 WARN Utils: Service 'SparkUI' could not bind on port 4041. Attempting port 4042.

java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':
Caused by: scala.MatchError: 2.3.0 (of class java.lang.String)
at org.apache.spark.sql.hive.client.IsolatedClientLoader$.hiveVersion(IsolatedClientLoader.scala:89)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:279)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:266)
at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:194)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:194)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:194)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:193)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:105)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:93)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:35)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:289)
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1050)
... 61 more
<console>:14: error: not found: value spark
import spark.implicits._
^
<console>:14: error: not found: value spark
import spark.sql
^
The latest version supported as of today (Spark 2.3.0 RC) is 2.1:

Hive 2.3 is not supported.

So Apache Hive 2.3.2 is not supported with Apache Spark 2.2.0? Is there a version-compatibility matrix for Apache Hadoop, Apache Hive, and Apache Spark? I need to run Hive on Spark; which recent versions should I use? I have not seen such a matrix, but the code below should be authoritative, so Hive 2.1 is the highest supported version. Thanks; in that case I plan to download a prebuilt Apache Spark 2.0 and Apache Hive 2.1 and configure Hive on Spark. My versions are Hadoop 2.8, Hive 2.1, Spark 2.0, and Zeppelin 0.7.3. Anything else? (For the original Spark 2.2 setup, a workaround sketch follows the snippet below.)
// Excerpt from Spark's IsolatedClientLoader, the frame at the top of the
// stack trace above. There is no case for "2.3.x", so a version string such
// as "2.3.0" falls through the match and throws scala.MatchError.
def hiveVersion(version: String): HiveVersion = version match {
  case "12" | "0.12" | "0.12.0" => hive.v12
  case "13" | "0.13" | "0.13.0" | "0.13.1" => hive.v13
  case "14" | "0.14" | "0.14.0" => hive.v14
  case "1.0" | "1.0.0" => hive.v1_0
  case "1.1" | "1.1.0" => hive.v1_1
  case "1.2" | "1.2.0" | "1.2.1" | "1.2.2" => hive.v1_2
  case "2.0" | "2.0.0" | "2.0.1" => hive.v2_0
  case "2.1" | "2.1.0" | "2.1.1" => hive.v2_1
}
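
If you stay on Spark 2.2.x against the upgraded metastore, the usual knob is to pin the metastore client version that Spark builds: as far as I can tell, the 2.3.0 in the MatchError is the value of spark.sql.hive.metastore.version that Spark picked up from the configuration. Below is a minimal sketch of that workaround (the object name is illustrative, and 2.1.1 assumes the upgraded metastore can still serve a 2.1 client); treat it as a direction to try, not a guaranteed fix.

import org.apache.spark.sql.SparkSession

object PinMetastoreClient extends App {
  // Pin the Hive metastore client to a version that hiveVersion() above
  // actually matches. "maven" asks Spark to download the matching Hive
  // client jars; on a cluster without internet access, point
  // spark.sql.hive.metastore.jars at a local classpath instead.
  val spark = SparkSession.builder()
    .appName("pin-metastore-client") // illustrative name
    .config("spark.sql.hive.metastore.version", "2.1.1") // must match a case above
    .config("spark.sql.hive.metastore.jars", "maven")
    .enableHiveSupport()
    .getOrCreate()

  // Touching the catalog forces the Hive client to be instantiated, so a
  // bad version string would fail here.
  spark.sql("SHOW DATABASES").show()

  spark.stop()
}

For spark-shell the same settings go on the command line (spark-shell --conf spark.sql.hive.metastore.version=2.1.1 --conf spark.sql.hive.metastore.jars=maven), because the shell builds its session before any typed code runs. One caveat: if the metastore database schema itself was upgraded to 2.3.0, an older client can still be rejected by Hive's schema verification (hive.metastore.schema.verification in hive-site.xml), so downgrading the client alone may not be enough.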