
Scala: Connecting to HBase from the Spark shell in CDH 5.1


I currently have a fresh CDH 5.1 image for VirtualBox, and I ran into a problem when trying to connect to HBase from the spark-shell. Here is the Scala code:

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.{HBaseAdmin,HTable,Put,Get}
import org.apache.hadoop.hbase.util.Bytes
val conf = new HBaseConfiguration()
val admin = new HBaseAdmin(conf)
Here is the error:

java.io.IOException: java.lang.reflect.InvocationTargetException
    at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:416)
    at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:393)
    at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:274)
    at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:192)
.
.
.
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:414)
    ... 43 more
Caused by: java.lang.NoClassDefFoundError: org/cloudera/htrace/Trace
    at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:195)
    at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:479)
    at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65)
    at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:857)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:662)
    ... 48 more
Caused by: java.lang.ClassNotFoundException: org.cloudera.htrace.Trace
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 54 more

It could be a version problem, since it is caused by some form of NoClassDefFoundError. Is your HBase also 5.1?

The problem was that I had not set up the Spark configuration correctly. I had to add the following to the Spark configuration.

In spark-defaults.conf:

spark.executor.extraClassPath /usr/lib/hive/lib/hive-hbase-handler.jar:/usr/lib/hbase/hbase-server.jar:/usr/lib/hbase/hbase-protocol.jar:/usr/lib/hbase/hbase-hadoop2-compat.jar:/usr/lib/hbase/hbase-common.jar:/usr/lib/hbase/lib/htrace-core.jar:/etc/hbase/conf

In spark-env.sh:

export SPARK_CLASSPATH=/usr/lib/hbase/hbase-server.jar:/usr/lib/hbase/hbase-protocol.jar:/usr/lib/hbase/hbase-hadoop2-compat.jar:/usr/lib/hbase/hbase-client.jar:/usr/lib/hbase/lib/htrace-core.jar:/etc/hbase/conf
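For a one-off session, an alternative to editing spark-env.sh is to set the classpath only for that shell invocation. This is a sketch assuming the default CDH 5.1 jar locations; the exact jar names and paths may differ on your install:

```shell
# Hypothetical one-off launch: put the HBase client jars and the HBase
# config directory on the classpath just for this spark-shell session.
# Paths below assume the CDH 5.1 parcel/package layout.
HBASE_JARS=/usr/lib/hbase/hbase-server.jar:/usr/lib/hbase/hbase-protocol.jar
HBASE_JARS=$HBASE_JARS:/usr/lib/hbase/hbase-hadoop2-compat.jar
HBASE_JARS=$HBASE_JARS:/usr/lib/hbase/hbase-client.jar
HBASE_JARS=$HBASE_JARS:/usr/lib/hbase/lib/htrace-core.jar
# /etc/hbase/conf supplies hbase-site.xml (ZooKeeper quorum, etc.)
HBASE_JARS=$HBASE_JARS:/etc/hbase/conf

SPARK_CLASSPATH="$HBASE_JARS" spark-shell
```

Note that htrace-core.jar is the one that matters for the NoClassDefFoundError above: org.cloudera.htrace.Trace lives in that jar, which is on HBase's classpath but not Spark's by default.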

Yup, it's 5.1. I pulled the CDH 5.1 image, so all of the services are version 5.1.