Java Hive: metastore connection failure at startup


I have started getting the following failure when running Hive commands:

Logging initialized using configuration in file:/usr/local/someuser/hive/conf/hive-log4j.properties
Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1412)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
... 7 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
... 12 more
Caused by: javax.jdo.JDOFatalUserException: There is no available StoreManager of type "rdbms". Make sure that you have put the relevant DataNucleus store plugin in your CLASSPATH and if defining a connection via JNDI or DataSource you also need to provide persistence property "datanucleus.storeManagerType"
NestedThrowables:
org.datanucleus.exceptions.NucleusUserException: There is no available StoreManager of type "rdbms". Make sure that you have put the relevant DataNucleus store plugin in your CLASSPATH and if defining a connection via JNDI or DataSource you also need to provide persistence property "datanucleus.storeManagerType"
at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:528)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
at java.security.AccessController.doPrivileged(Native Method)
at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:310)
at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:339)
at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:248)
at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:223)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:497)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:475)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:523)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:397)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:356)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4944)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:171)
... 17 more
Caused by: org.datanucleus.exceptions.NucleusUserException: There is no available StoreManager of type "rdbms". Make sure that you have put the relevant DataNucleus store plugin in your CLASSPATH and if defining a connection via JNDI or DataSource you also need to provide persistence property "datanucleus.storeManagerType"
at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1217)
at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
... 46 more
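
The root cause at the bottom of the trace is the JDOFatalUserException: DataNucleus cannot find a StoreManager of type "rdbms", which generally means the datanucleus-rdbms jar is not visible on Hive's classpath (it normally sits in $HIVE_HOME/lib). As a quick diagnostic, a minimal sketch like the one below can be compiled and run with the same classpath Hive uses to confirm whether the plugin class resolves; the class name org.datanucleus.store.rdbms.RDBMSStoreManager is assumed here based on the DataNucleus package layout, not taken from the trace above.

    // Minimal diagnostic sketch (assumption: org.datanucleus.store.rdbms.RDBMSStoreManager
    // is the store-manager class shipped in datanucleus-rdbms-*.jar). Run it on the same
    // classpath Hive uses, e.g. with $HIVE_HOME/lib/* appended.
    public class CheckDataNucleusRdbms {
        public static void main(String[] args) {
            String storeManagerClass = "org.datanucleus.store.rdbms.RDBMSStoreManager";
            try {
                Class<?> c = Class.forName(storeManagerClass);
                java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
                System.out.println("Found " + storeManagerClass
                        + (src != null ? " in " + src.getLocation() : ""));
            } catch (ClassNotFoundException e) {
                System.out.println(storeManagerClass
                        + " is NOT on the classpath; check for datanucleus-rdbms-*.jar in $HIVE_HOME/lib");
            }
        }
    }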
Here is my hive-site.xml:

<configuration>
<property>
  <name>mapred.reduce.tasks</name>
  <value>6</value>
</property>

<property>
  <name>mapred.map.tasks</name>
  <value>7</value>
</property>

<property>
  <name>hive.exec.scratchdir</name>
  <value>/data/cloud/hive/logs/hive-${user.name}</value>
</property>

<property>
  <name>hive.exec.local.scratchdir</name>
  <value>/data/cloud/hive/logs/hive-${user.name}</value>
</property>

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://somehost:3306/hive?createDatabaseIfNotExist=true</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
  <description>Driver class name for a JDBC metastore</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>xx</value>
  <description>username to use against metastore database</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>xx</value>
  <description>password to use against metastore database</description>
</property>


<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/user/hive/warehouse</value>
  <description>location of default database for the warehouse</description>
</property>


</configuration>
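
The javax.jdo.option.* properties above are what the metastore hands to DataNucleus as its JDBC settings, so once the classpath issue is sorted out it is worth confirming they actually connect. A minimal sketch, assuming the MySQL connector jar is on the classpath and using placeholder credentials in place of the masked xx values:

    import java.sql.Connection;
    import java.sql.DriverManager;

    // Minimal sketch to verify the JDBC metastore settings from hive-site.xml outside of Hive.
    // The URL, user and password below are placeholders mirroring the (masked) values above.
    public class CheckMetastoreJdbc {
        public static void main(String[] args) throws Exception {
            // javax.jdo.option.ConnectionDriverName
            Class.forName("com.mysql.jdbc.Driver");
            // javax.jdo.option.ConnectionURL
            String url = "jdbc:mysql://somehost:3306/hive?createDatabaseIfNotExist=true";
            try (Connection conn = DriverManager.getConnection(url, "xx", "xx")) {
                System.out.println("Connected to the metastore database: "
                        + conn.getMetaData().getURL());
            }
        }
    }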
