Integration: ClassNotFoundException when using the Java client with Hive + HBase integration

I have a Hive + HBase integrated cluster.

When I execute a query through Hive's JDBC client from Java, a ClassNotFoundException sometimes occurs.

My Java code:

final Connection conn = DriverManager.getConnection(URL);
final Statement stmt = conn.createStatement(); // executeQuery() lives on Statement, not Connection
final ResultSet rs = stmt.executeQuery("SELECT count(*) FROM test_table WHERE (source = '0' AND ur_createtime BETWEEN '20121031000000' AND '20121031235959')");
I can run the same SQL in the Hive CLI (SELECT count(*) FROM test_table WHERE source = '0' AND ur_createtime BETWEEN '20121031000000' AND '20121031235959') and get the query result, so there is no error in my SQL.

Client-side exception:

Caused by: java.sql.SQLException: Query returned non-zero code: 9, cause: FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
    at org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:189)
... 23 more
Server-side exception (from the Hadoop JobTracker log):

2012-11-05 18:55:39,443 INFO org.apache.hadoop.mapred.TaskInProgress: Error from attempt_201210301133_0112_m_000000_3: java.io.IOException: Cannot create an instance of InputSplit class = org.apache.hadoop.hive.hbase.HBaseSplit:org.apache.hadoop.hive.hbase.HBaseSplit
    at org.apache.hadoop.hive.ql.io.HiveInputFormat$HiveInputSplit.readFields(HiveInputFormat.java:146)
    at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:67)
    at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:40)
    at org.apache.hadoop.mapred.MapTask.getSplitDetails(MapTask.java:396)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:412)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Unknown Source)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.hbase.HBaseSplit
    at java.net.URLClassLoader$1.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Unknown Source)
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:819)
    at org.apache.hadoop.hive.ql.io.HiveInputFormat$HiveInputSplit.readFields(HiveInputFormat.java:143)
    ... 10 more
My Hive environment:

export HIVE_AUX_JARS_PATH=/data/install/hive-0.9.0/lib/hive-hbase-handler-0.9.0.jar,/data/install/hive-0.9.0/lib/hbase-0.92.0.jar,/data/install/hive-0.9.0/lib/zookeeper-3.4.2.jar
My hive-site.xml

<property>
    <name>hive.zookeeper.quorum</name>
    <value>hadoop01,hadoop02,hadoop03</value>
    <description>The list of zookeeper servers to talk to. This is only needed for read/write locks.</description>
</property>

The server-side error log says that HBaseSplit cannot be found. But why? And how can I fix it?

The workaround for this problem is to copy the jar files (hive-hbase-handler-0.9.0-cdh4.1.2, hbase-0.92.1-cdh4.1.2-security, etc.) into the Hadoop lib folder, or to add the paths of these JARs to the HADOOP_CLASSPATH environment variable, as sketched below.
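A minimal sketch of those two options, assuming the jar locations from the question (/data/install/hive-0.9.0/lib) and a $HADOOP_HOME-based install; adjust paths and versions to your own layout:

# Option 1: copy the handler/HBase/ZooKeeper jars next to the Hadoop jars (on every node)
cp /data/install/hive-0.9.0/lib/hive-hbase-handler-0.9.0.jar \
   /data/install/hive-0.9.0/lib/hbase-0.92.0.jar \
   /data/install/hive-0.9.0/lib/zookeeper-3.4.2.jar \
   $HADOOP_HOME/lib/

# Option 2: append them to HADOOP_CLASSPATH in $HADOOP_HOME/conf/hadoop-env.sh
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/data/install/hive-0.9.0/lib/hive-hbase-handler-0.9.0.jar:/data/install/hive-0.9.0/lib/hbase-0.92.0.jar:/data/install/hive-0.9.0/lib/zookeeper-3.4.2.jar

Either way, restart the TaskTrackers so the map tasks pick up the new classpath.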

Alternatively, create a folder named auxlib under $HIVE_HOME and put all of the hive-hbase-handler and HBase JARs into that folder (see the sketch below).
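A sketch of that layout, again assuming the jar locations from the question; the bin/hive launcher picks up everything in $HIVE_HOME/auxlib automatically:

mkdir -p $HIVE_HOME/auxlib
cp /data/install/hive-0.9.0/lib/hive-hbase-handler-0.9.0.jar \
   /data/install/hive-0.9.0/lib/hbase-0.92.0.jar \
   /data/install/hive-0.9.0/lib/zookeeper-3.4.2.jar \
   $HIVE_HOME/auxlib/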

Then add the following property to $HIVE_HOME/conf/hive-site.xml (the value is a comma-separated list of file:// paths, one per jar):

<property>
 <name>hive.aux.jars.path</name>
 <value>file:///<absolute-path-of-all-auxlib-jars></value>
</property>
Then restart the Hive server.
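On Hive 0.9 the standalone JDBC server is HiveServer1; a minimal restart sketch, assuming it was started with the stock launcher on the default port 10000 (stop the old server process first, then):

hive --service hiveserver &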


If you don't have access to the configuration files, you can add the JARs to the Hive CLI classpath with the --auxpath switch:

hive --auxpath /path/to/hive-hbase-handler-0.10.0-cdh4.2.0.jar,/path/to/hbase.jar
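For example, the query from the question can be run non-interactively with -e while passing --auxpath (the jar paths below are placeholders, as above):

hive --auxpath /path/to/hive-hbase-handler-0.10.0-cdh4.2.0.jar,/path/to/hbase.jar \
     -e "SELECT count(*) FROM test_table WHERE (source = '0' AND ur_createtime BETWEEN '20121031000000' AND '20121031235959')"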