java.lang.ClassNotFoundException: org.apache.hadoop.hive.ql.io.HivePassThroughOutputFormat


I followed the linked guide to install Shark on CDH5. I got it installed, but as the guide itself notes:

This -skipRddReload is only needed when you have some table with a Hive/HBase mapping, because of some issues in PassthroughOutputFormat by the Hive HBase handler.

The error message is something like:
"Property value must not be null"
or
"java.lang.ClassNotFoundException: org.apache.hadoop.hive.ql.io.HivePassThroughOutputFormat"
I created an external table in Hive to access an HBase table. When I started Shark with
-skipRddReload
it came up fine, but when I try to access that same external table from Shark, I get this error:

java.lang.ClassNotFoundException: org.apache.hadoop.hive.ql.io.HivePassThroughOutputFormat
Is there any solution for this?

EDIT

HBase with Hive

CREATE EXTERNAL TABLE abc (key STRING, LPID STRING, Value INT, ts1 STRING, ts2 STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES
("hbase.columns.mapping" = ":key,cf1:LPID,cf1:Value,cf1:ts1,cf1:ts2")
TBLPROPERTIES("hbase.table.name" = "abc");

This is the table abc that I want to access in Shark. Any solution?

What is your table definition? Are you using any SerDes? @visakh I have edited the question, thanks. Make sure hive-serdes-1.0-SNAPSHOT.jar is correctly linked in both Shark and CDH. I am not sure how Cloudera packages it, but it is most likely in
cdh/lib/hive/lib
Also make sure it is present on all worker nodes.
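Since the ClassNotFoundException points at a missing Hive HBase handler class, another thing worth trying is registering the handler jars explicitly in the Shark/Hive session before querying the table. This is only a sketch: the exact jar names and the parcel path below are assumptions for a typical CDH5 layout, so adjust them to whatever your installation actually contains.

```sql
-- Sketch, assuming a standard CDH5 parcel layout; jar names and
-- versions vary by release, so verify the paths on your cluster.
ADD JAR /opt/cloudera/parcels/CDH/lib/hive/lib/hive-hbase-handler.jar;
ADD JAR /opt/cloudera/parcels/CDH/lib/hbase/hbase.jar;

SELECT * FROM abc LIMIT 10;
```

If the ADD JAR approach works, the same jars can be made permanent by putting them on the classpath used to launch Shark so every session picks them up.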