Error while running Hive: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException, and more


Tags: java, hadoop, mahout, hive

I am getting the following error when running Hive:
    Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:578)
        at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:518)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:239)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:153)
    Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
        at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:226)
        at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:366)
        at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:310)
        at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:290)
        at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:266)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:545)
        ... 9 more
    Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1627)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:80)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:130)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:101)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3317)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3356)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3336)
        at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3590)
        at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:236)
        at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:221)
        ... 14 more
    Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1625)
        ... 23 more
    Caused by: java.lang.IllegalArgumentException: Unrecognized Hadoop major version number: 3.0.1
        at org.apache.hadoop.hive.shims.ShimLoader.getMajorVersion(ShimLoader.java:169)
        at org.apache.hadoop.hive.shims.ShimLoader.loadShims(ShimLoader.java:136)
        at org.apache.hadoop.hive.shims.ShimLoader.getHadoopShims(ShimLoader.java:95)
        at org.apache.hadoop.hive.metastore.ObjectStore.getDataSourceProps(ObjectStore.java:402)
        at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:275)
        at org.apache.hadoop.util.ReflectionUtils.setConf(Refl

My .bashrc:
export JAVA_HOME=/usr/lib/jvm/java-8-oracle 

export PATH=$PATH:$JAVA_HOME/bin
export HADOOP_HOME=/usr/local/hadoop 
export HADOOP_MAPRED_HOME=$HADOOP_HOME 
export HADOOP_COMMON_HOME=$HADOOP_HOME 
export HADOOP_HDFS_HOME=$HADOOP_HOME 
export YARN_HOME=$HADOOP_HOME 
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/native"
export HADOOP_INSTALL=$HADOOP_HOME
export PATH=$PATH:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/sbin
export MAHOUT_HOME=/usr/local/mahout

# HBase
export HBASE_HOME=/usr/local/hbase
export PATH=$PATH:$HBASE_HOME/bin
export HIVE_HOME=/home/yash/hive/apache-hive-2.1.0-bin
export PATH=$PATH:$HIVE_HOME/bin

My hive-site.xml:

<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost/metastore?createDatabaseIfNotExist=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>root</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>root</value>
</property>
<property>
  <name>datanucleus.autoCreateSchema</name>
  <value>true</value>
</property>
<property>
  <name>datanucleus.fixedDatastore</name>
  <value>true</value>
</property>
<property>
  <name>datanucleus.autoCreateTables</name>
  <value>true</value>
</property>
<property>
    <name>hive.querylog.location</name>
    <value>/home/yash/hive/apache-hive-2.1.0-bin/iotmp</value>
    <description>Location of Hive run time structured log file</description>
  </property>
  <property>
    <name>hive.exec.local.scratchdir</name>
    <value>/home/yash/hive/apache-hive-2.1.0-bin/iotmp</value>
    <description>Local scratch space for Hive jobs</description>
  </property>
  <property>
    <name>hive.downloaded.resources.dir</name>
    <value>/home/yash/hive/apache-hive-2.1.0-bin/iotmp</value>
    <description>Temporary local directory for added resources in the remote file system.</description>
  </property>
</configuration>
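A side note on the DataNucleus flags above: `datanucleus.fixedDatastore=true` tells DataNucleus the schema must not be changed, which works against the two auto-create flags set to `true`. With Hive 2.x the metastore schema is normally initialized once with `schematool` instead of being auto-created; a sketch of how the configuration then usually looks (this property is an assumption on my part, not taken from the original post):

```xml
<!-- Sketch: with a schematool-managed metastore, schema auto-creation is
     typically turned off and schema version verification turned on. -->
<property>
  <name>hive.metastore.schema.verification</name>
  <value>true</value>
</property>
```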
ERROR:

    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/home/yash/hive/apache-hive-2.1.0-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

    Logging initialized using configuration in jar:file:/home/yash/hive/apache-hive-2.1.0-bin/lib/hive-common-2.1.0.jar!/hive-log4j2.properties Async: true
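The deepest `Caused by` in this failure comes from Hive's `ShimLoader`, which maps the Hadoop major version to a shim layer; Hive 2.1.0 only ships shims for Hadoop 2.x, so Hadoop 3.0.1 is rejected with `IllegalArgumentException: Unrecognized Hadoop major version number`. A minimal sketch of that version check (the version string is hard-coded here purely for illustration):

```shell
# Mimic ShimLoader.getMajorVersion()'s decision in shell: take the first
# dotted component of the Hadoop version string and accept only major
# version 2, the sole shim layer that Hive 2.1.0 ships with.
hadoop_version="3.0.1"   # in practice, taken from `hadoop version` output
major="${hadoop_version%%.*}"
if [ "$major" = "2" ]; then
  echo "Hadoop $hadoop_version: supported by Hive 2.1.0"
else
  echo "Unrecognized Hadoop major version number: $hadoop_version"
fi
```

The usual ways out of this mismatch are to run a Hive release built for Hadoop 3 (Hive 3.x) or to pair Hive 2.1.0 with a Hadoop 2.x installation.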