
Spark HiveContext hql issue when running a Spark job in Java on YARN


We have a Spark application that runs on YARN via spark-submit. The job calls

sparkHiveContext.hql("show databases")
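For context, the surrounding job looks roughly like the following minimal sketch, assuming the Spark 1.0/1.1-era API in which HiveContext still exposes hql(...); the class and app names are made up for illustration:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.hive.HiveContext;

public class ShowDatabasesJob {
    public static void main(String[] args) {
        // Master and deploy mode are supplied by spark-submit when running on YARN
        SparkConf conf = new SparkConf().setAppName("ShowDatabasesJob");
        JavaSparkContext jsc = new JavaSparkContext(conf);

        // HiveContext wraps the underlying Scala SparkContext
        HiveContext sparkHiveContext = new HiveContext(jsc.sc());

        // The call from the question; this is what triggers the
        // HiveMetaStoreClient / DataNucleus failure on YARN
        sparkHiveContext.hql("show databases").collect();

        jsc.stop();
    }
}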

Running this, we get the following exception:

ClassLoaderResolver for class "" gave error on creation : {1} org.datanucleus.exceptions.NucleusUserException: ClassLoaderResolver for class "" gave error on creation : {1}
at org.datanucleus.NucleusContext.getClassLoaderResolver(NucleusContext.java:1087)
at org.datanucleus.PersistenceConfiguration.validatePropertyValue(PersistenceConfiguration.java:797)
at org.datanucleus.PersistenceConfiguration.setProperty(PersistenceConfiguration.java:714)
at org.datanucleus.PersistenceConfiguration.setPersistenceProperties(PersistenceConfiguration.java:693)
at org.datanucleus.NucleusContext.<init>(NucleusContext.java:273)
at org.datanucleus.NucleusContext.<init>(NucleusContext.java:247)
at org.datanucleus.NucleusContext.<init>(NucleusContext.java:225)
Further down the stack trace, I got a clue:

Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1412)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
... 27 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
... 32 more
Caused by: javax.jdo.JDOFatalInternalException: Unexpected exception caught.

But... running the same query in the spark-sql console works. What is the problem here?

The reason is that the DataNucleus libraries do not like being packaged into a fat/uber jar: each DataNucleus dependency jar contains a pom.xml at the same path, so when the jars are merged into one, a single copy overwrites the others. You need to add these jars to the classpath separately, as sketched below.
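A sketch of how that can be done at submit time (not taken from the question): keep the DataNucleus jars out of the shaded artifact and pass them explicitly with --jars. The jar paths and versions below are illustrative and should match whatever ships in your Spark lib directory; shipping hive-site.xml via --files is likewise an assumption about the cluster setup.

spark-submit \
  --master yarn-cluster \
  --class com.example.ShowDatabasesJob \
  --jars /opt/spark/lib/datanucleus-api-jdo-3.2.6.jar,/opt/spark/lib/datanucleus-core-3.2.10.jar,/opt/spark/lib/datanucleus-rdbms-3.2.9.jar \
  --files /etc/hive/conf/hive-site.xml \
  spark-app.jar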

Ouch! I had exactly the same problem, but in a different scenario. It is related to the Hive metastore, so I think one of the causes is that Spark was built against Hive 0.13 while we are now running HiveServer 0.14.0 / Hive 1.0. In the end I gave up and used a Hive client to connect and run the HQL statements instead.
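For reference, that workaround might look like the following minimal sketch, which goes through HiveServer2 with the standard Hive JDBC driver instead of HiveContext; the host, port, and credentials are illustrative:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcClient {
    public static void main(String[] args) throws Exception {
        // Register the HiveServer2 JDBC driver (hive-jdbc must be on the classpath)
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // 10000 is the default HiveServer2 port; host and user are placeholders
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://hiveserver-host:10000/default", "hive", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("show databases")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}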