
Cannot scan a table in HBase using the Java client API


I am trying to scan a table in HBase and retrieve all of its records. This is the method I use to scan the table. I build the project with Maven.

 import java.io.IOException;

 import org.apache.hadoop.hbase.KeyValue;
 import org.apache.hadoop.hbase.client.HTable;
 import org.apache.hadoop.hbase.client.Result;
 import org.apache.hadoop.hbase.client.ResultScanner;
 import org.apache.hadoop.hbase.client.Scan;

 // "configuration" is an HBase Configuration field initialized elsewhere in the class
 public void getAllRecord (String tableName) {
    try{
        HTable table = new HTable(configuration, tableName);
        Scan s = new Scan();
        ResultScanner ss = table.getScanner(s);
        for(Result r : ss){
            for(KeyValue kv : r.raw()){
                System.out.print(new String(kv.getRow()) + " ");
                System.out.print(new String(kv.getFamily()) + ":");
                System.out.print(new String(kv.getQualifier()) + " ");
                System.out.print(kv.getTimestamp() + " ");
                System.out.println(new String(kv.getValue()));

                break; // only the first cell of each row is printed
            }
        }
        ss.close();
        table.close();
    } catch (IOException e){
        e.printStackTrace();
    }
}
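
For context, the method above references a configuration field that is not shown in the post. A minimal sketch of how such a field is typically created (the ZooKeeper host and client port below are placeholder values, not taken from the original post) might look like this:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;

    // Hypothetical setup for the "configuration" field used by getAllRecord();
    // the quorum host and client port are placeholders, not values from the post.
    Configuration configuration = HBaseConfiguration.create();
    configuration.set("hbase.zookeeper.quorum", "localhost");
    configuration.set("hbase.zookeeper.property.clientPort", "2181");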
These are my Maven dependencies:

 <dependencies>
    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-thrift</artifactId>
        <version>0.98.2-hadoop2</version>
        <type>jar</type>
    </dependency>
    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-client</artifactId>
        <version>0.96.0-hadoop2</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-core</artifactId>
        <version>1.2.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-auth</artifactId>
        <version>2.2.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.opennlp</groupId>
        <artifactId>opennlp-tools</artifactId>
        <version>1.5.3</version>
    </dependency>
    <dependency>
        <groupId>org.apache.opennlp</groupId>
        <artifactId>opennlp-maxent</artifactId>
        <version>3.0.3</version>
    </dependency>
    <dependency>
        <groupId>org.apache.solr</groupId>
        <artifactId>solr-solrj</artifactId>
        <version>4.9.0</version>
    </dependency>

</dependencies>

But the problem is that when I run this method, I get the following error:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/util/SoftValueSortedMap

Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.util.SoftValueSortedMap


How can I resolve this issue?

It looks like some JARs are missing from your classpath. You may have forgotten to add the util JAR to the classpath, or it is not being included for some reason (perhaps Maven is deliberately excluding the util JAR via an exclusion tag).
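
As a quick sanity check (not part of the original answer), a small diagnostic class like the following can be run from the same project to confirm whether the class named in the stack trace is actually visible on the runtime classpath:

    // Diagnostic sketch, not from the original answer: tries to load the class
    // reported in the NoClassDefFoundError from the current classpath.
    public class ClasspathCheck {
        public static void main(String[] args) {
            try {
                Class.forName("org.apache.hadoop.hbase.util.SoftValueSortedMap");
                System.out.println("SoftValueSortedMap is on the classpath");
            } catch (ClassNotFoundException e) {
                System.out.println("SoftValueSortedMap is NOT on the classpath: " + e);
            }
        }
    }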

The problem was solved by adding the following dependency to the project:

    <dependency>
        <groupId>org.apache.hbase</groupId>
        <artifactId>hbase-client</artifactId>
        <version>0.96.0-hadoop2</version>
    </dependency>
