
Remote connection to HBase on a Hadoop cluster

Tags: hadoop, hbase, remote-access

I have set up a Hadoop cluster across 7 AWS instances and installed HBase on it. I am trying to operate on an HBase table from a machine that is not part of the cluster. I wrote the following code using the HBase API, but it only works when it is executed on one of the cluster nodes:

// Imports required by this snippet (log4j plus the Hadoop and HBase client APIs)
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HConstants;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.hdfs.HdfsConfiguration;
import org.apache.log4j.BasicConfigurator;

BasicConfigurator.configure();

// This reads hdfs-site.xml
HdfsConfiguration hdfsConf = new HdfsConfiguration();

System.setProperty("hadoop.home.dir", new Path("/usr/hdp/current/hadoop-client").toString());

// This reads core-site.xml and hbase-site.xml
Configuration conf = HBaseConfiguration.create();
// PUBLIC_DNS is a constant defined elsewhere holding the public DNS name of a quorum host
conf.set(HConstants.ZOOKEEPER_QUORUM, PUBLIC_DNS);

// Fails fast if the HBase master cannot be reached
HBaseAdmin.checkHBaseAvailable(conf);

// Open a connection, read one row and print a single column value
Connection conn = ConnectionFactory.createConnection(conf);
TableName tName = TableName.valueOf("history_sample");
Table t = conn.getTable(tName);
Get get = new Get(Bytes.toBytes("100005379"));
Result res = t.get(get);
byte[] geoloc_date = res.getValue(Bytes.toBytes("general_data"), Bytes.toBytes("geoloc_date"));
System.out.println("Example value: " + Bytes.toString(geoloc_date));

t.close();
conn.close();
What I want is similar code that works from my local machine, which sits outside the Hadoop cluster, so that I do not have to copy the jar to a cluster node and run it there.
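
For reference, here is a minimal sketch of a purely client-side setup that builds the configuration in code instead of reading the cluster's XML files. It assumes the ZooKeeper quorum is reachable from outside on its public DNS names and the default client port 2181; the host names, the znode parent and the class name RemoteHBaseClient are placeholders, and the table, row and column names are taken from the question:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class RemoteHBaseClient {
    public static void main(String[] args) throws Exception {
        // Build a client-side configuration from scratch; no cluster config files are read here.
        Configuration conf = HBaseConfiguration.create();

        // Public DNS names of the ZooKeeper quorum hosts (placeholders).
        conf.set("hbase.zookeeper.quorum",
                 "ec2-xx-xx-xx-1.compute.amazonaws.com,ec2-xx-xx-xx-2.compute.amazonaws.com");
        conf.set("hbase.zookeeper.property.clientPort", "2181");
        // Must match the znode parent the cluster uses (placeholder value shown).
        conf.set("zookeeper.znode.parent", "/hbase-unsecure");

        // Read one row and print a single column value, closing resources automatically.
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Table t = conn.getTable(TableName.valueOf("history_sample"))) {
            Result res = t.get(new Get(Bytes.toBytes("100005379")));
            byte[] value = res.getValue(Bytes.toBytes("general_data"), Bytes.toBytes("geoloc_date"));
            System.out.println("Example value: " + Bytes.toString(value));
        }
    }
}

Note that reaching ZooKeeper alone is not enough: the HBase client looks up region locations in ZooKeeper and then connects directly to the region servers under the host names they registered with, so those names must also resolve and be reachable from the machine outside the cluster.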