Scala: Exception in thread "main" java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState'

Tags: scala, apache-spark, hadoop, hive, apache-spark-2.0

I am trying to connect to Hive through IntelliJ. I am using Scala version 2.11.4, and spark-core, spark-hive, and spark-sql are all at version 2.1.1. Below is the code snippet I use to connect remotely from a Windows machine. On connecting I hit the error below; can anyone help me resolve it?

Note: some threads I read mention checking the permissions on the tmp directory, in this case /tmp/hive/warehouse. It has the appropriate permissions for the user xyz used for the connection, and with this functional ID I can connect manually from a Unix server. I even tried spark.sql("show databases"), but I get the same error.

import org.apache.hadoop.fs.{LocalFileSystem, Path}
import org.apache.hadoop.hdfs.DistributedFileSystem
import org.apache.spark.sql.SparkSession

def main(args: Array[String]): Unit = {
    createKerberosTicket() // obtain a Kerberos TGT before touching HDFS/Hive
    val spark: SparkSession =
      SparkSession
        .builder()
        .master("local")
        .appName("SparkHiveTest")
        .config("hive.exec.dynamic.partition.mode", "nonstrict")
        .config("hive.exec.dynamic.partition", "true")
        .config("mapreduce.job.queuename", "root.XYZ_Pool")
        .enableHiveSupport()
        .getOrCreate()

    // Point the Hadoop configuration at the cluster's client config files
    // and pin the filesystem implementations explicitly.
    spark.sparkContext.hadoopConfiguration.addResource(new Path("core-site.xml"))
    spark.sparkContext.hadoopConfiguration.addResource(new Path("hdfs-site.xml"))
    spark.sparkContext.hadoopConfiguration.addResource(new Path("hive-site.xml"))
    spark.sparkContext.hadoopConfiguration.set("fs.hdfs.impl", classOf[DistributedFileSystem].getName)
    spark.sparkContext.hadoopConfiguration.set("fs.file.impl", classOf[LocalFileSystem].getName)

    val listOfDBs = spark.sqlContext.sql("show databases")
}
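For reference, a minimal build.sbt matching the Scala and Spark versions mentioned above might look like the sketch below; only the versions come from the question, and the project name is hypothetical.

name := "spark-hive-test"  // hypothetical project name

scalaVersion := "2.11.4"

// The three Spark modules mentioned in the question, all at 2.1.1.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.1.1",
  "org.apache.spark" %% "spark-sql"  % "2.1.1",
  "org.apache.spark" %% "spark-hive" % "2.1.1"
)

Running the program then fails with the output below: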

18/05/02 23:59:13 INFO SharedState: spark.sql.warehouse.dir is not set, but hive.metastore.warehouse.dir is set. Setting spark.sql.warehouse.dir to the value of hive.metastore.warehouse.dir ('/tmp/hive/warehouse').
18/05/02 23:59:13 INFO SharedState: Warehouse path is '/tmp/hive/warehouse'.
18/05/02 23:59:14 INFO HiveUtils: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
18/05/02 23:59:14 INFO metastore: Trying to connect to metastore with URI thrift://xyz.net:1234
18/05/02 23:59:14 INFO metastore: Connected to metastore.
18/05/02 23:59:18 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:981)
    at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:110)
    at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:878)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:878)
    at spark.SparkPlusHive$.main(SparkPlusHive.scala:25)
    at spark.SparkPlusHive.main(SparkPlusHive.scala)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$reflect(SparkSession.scala:978)
    ... 12 more
Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':
    at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:169)
    at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:86)
    at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
    at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:101)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:101)
    at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:100)
    at org.apache.spark.sql.internal.SessionState.<init>(SessionState.scala:157)
    at org.apache.spark.sql.hive.HiveSessionState.<init>(HiveSessionState.scala:32)
    ... 17 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$reflect(SharedState.scala:166)
    ... 25 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:358)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:262)
    at org.apache.spark.sql.hive.HiveExternalCatalog.<init>(HiveExternalCatalog.scala:66)
    ... 30 more
Caused by: java.lang.RuntimeException: java.lang.NullPointerException
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
    at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:188)
    ... 38 more
Caused by: java.lang.NullPointerException
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1012)
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:505)
    at org.apache.hadoop.util.Shell.run(Shell.java:478)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:738)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:831)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:814)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:712)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:470)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:510)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:488)
    at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:309)
    at org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:639)
    at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:567)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
    ... 39 more
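The innermost cause is a NullPointerException thrown from ProcessBuilder.start while Hive's SessionState tries to create its local session directories, so the failure happens on the local (Windows) side before any query runs. One answer suggests pointing spark.sql.warehouse.dir at an explicit local warehouse directory instead of letting Spark fall back to hive.metastore.warehouse.dir, which the first INFO line in the log shows it doing: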
val warehouseLocation = "file:${system:user.dir}/spark-warehouse"
val spark = SparkSession
  .builder()
  .appName("***")
  .master("***")
  .config("spark.sql.warehouse.dir", warehouseLocation)
  .enableHiveSupport()
  .getOrCreate()
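A further note, which is an assumption on my part rather than something stated in the thread: on Windows, a NullPointerException from ProcessBuilder.start inside org.apache.hadoop.util.Shell usually means Hadoop cannot locate winutils.exe because hadoop.home.dir (or the HADOOP_HOME environment variable) is not set. A minimal sketch of that workaround, assuming winutils.exe has been placed under C:\hadoop\bin:

// Assumption: winutils.exe is installed at C:\hadoop\bin\winutils.exe.
// This must run before the SparkSession is created.
System.setProperty("hadoop.home.dir", "C:\\hadoop")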