Java HBase region server going down with error "DataXceiver error processing WRITE_BLOCK operation"

I have an AWS cluster with the Cloudera Hadoop distribution 5.4, configured with 1 NameNode and 2 DataNodes.

I have an HBase table containing 100K records and perform scan operations on top of it using Java. Based on the user's selections on the front end, I generate where conditions on different columns of the table to fetch the records.

When I have multiple filter conditions and try to fetch data from the HBase table, I get the following exception:

node3.xxx.com:50010:DataXceiver error processing WRITE_BLOCK operation  src: /xxx.xx.xx.194:58862 dst: /xxx.xx.xx.193:50010
java.io.IOException: Premature EOF from inputStream
    at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:201)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doReadFully(PacketReceiver.java:213)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doRead(PacketReceiver.java:134)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.receiveNextPacket(PacketReceiver.java:109)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:466)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:780)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:783)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:137)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:74)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:243)
    at java.lang.Thread.run(Thread.java:745)


node3.xxx.com:50010:DataXceiver error processing WRITE_BLOCK operation  src: /xxx.xx.xx.194:35615 dst: /xxx.xx.xx.193:50010
java.io.IOException: Premature EOF from inputStream
    at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:201)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doReadFully(PacketReceiver.java:213)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doRead(PacketReceiver.java:134)
    at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.receiveNextPacket(PacketReceiver.java:109)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receivePacket(BlockReceiver.java:466)
    at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.receiveBlock(BlockReceiver.java:780)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:783)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:137)
    at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:74)
    at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:243)
    at java.lang.Thread.run(Thread.java:745)
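
(Side note, an assumption rather than something this log alone confirms: "Premature EOF from inputStream" in DataXceiver logs is commonly attributed either to clients dropping connections mid-transfer or to the DataNode exhausting its data-transfer threads under concurrent scan/write load. If thread exhaustion is the cause, raising the limit in hdfs-site.xml can help; dfs.datanode.max.transfer.threads is the successor of the deprecated dfs.datanode.max.xcievers setting, and 8192 below is only an illustrative value.)

<property>
    <!-- Illustrative value; tune for your cluster. -->
    <name>dfs.datanode.max.transfer.threads</name>
    <value>8192</value>
</property>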
Below is the dependency I added to connect to HBase from Java:

<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-client</artifactId>
    <version>1.0.0-cdh5.4.3</version>
</dependency>

Below is the Java code I use to fetch records from the HBase table:

public synchronized List<Map<String, String>> getFilterData(Map<String, String> map) {
    Connection connection = null;
    Table table = null;
    ResultScanner resultScanner = null;

    List<Map<String, String>> resultList = new ArrayList<Map<String, String>>();

    try {
        connection = ConnectionFactory.createConnection(config);
        table = connection.getTable(TableName.valueOf(fileReader.getProperty("HBASE_FILTER_DATA_TABLE_NAME")));

        Scan scan = new Scan();
        scan.setCaching(1000);
        scan.addColumn(Bytes.toBytes("data"), Bytes.toBytes("avgValue"));
        scan.addColumn(Bytes.toBytes("data"), Bytes.toBytes("ts"));
        scan.addColumn(Bytes.toBytes("filter"), Bytes.toBytes("dataType"));

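        // Note (a presumption about the data model, not stated in the question):
        // "-1" appears to be the sentinel meaning "no selection"; each default
        // filter below is replaced when the user actually supplied that field.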
        SingleColumnValueFilter ageGroupFilter = new SingleColumnValueFilter(Bytes.toBytes("filter"), Bytes.toBytes("ageGroup"), CompareOp.EQUAL, Bytes.toBytes("-1"));
        SingleColumnValueFilter applicationFilter = new SingleColumnValueFilter(Bytes.toBytes("filter"), Bytes.toBytes("appName"), CompareOp.EQUAL, Bytes.toBytes("-1"));
        SingleColumnValueFilter deviceFilter = new SingleColumnValueFilter(Bytes.toBytes("filter"), Bytes.toBytes("deviceModel"), CompareOp.EQUAL, Bytes.toBytes("-1"));
        SingleColumnValueFilter genderFilter = new SingleColumnValueFilter(Bytes.toBytes("filter"), Bytes.toBytes("gender"), CompareOp.EQUAL, Bytes.toBytes("-1"));

        if (map.containsKey("ageGroup")) {
            ageGroupFilter = new SingleColumnValueFilter(Bytes.toBytes("filter"), Bytes.toBytes("ageGroup"), CompareOp.EQUAL, Bytes.toBytes(map.get("ageGroup")));
            scan.addColumn(Bytes.toBytes("filter"), Bytes.toBytes("ageGroup"));
        }

        if (map.containsKey("appName")) {
            applicationFilter = new SingleColumnValueFilter(Bytes.toBytes("filter"), Bytes.toBytes("appName"), CompareOp.EQUAL, Bytes.toBytes(map.get("appName").toLowerCase().replace(" ", "_")));
            scan.addColumn(Bytes.toBytes("filter"), Bytes.toBytes("appName"));
        }

        if (map.containsKey("deviceModel")) {
            deviceFilter = new SingleColumnValueFilter(Bytes.toBytes("filter"), Bytes.toBytes("deviceModel"), CompareOp.EQUAL, Bytes.toBytes(map.get("deviceModel").toLowerCase().replace(" ", "_")));
            scan.addColumn(Bytes.toBytes("filter"), Bytes.toBytes("deviceModel"));
        }

        if (map.containsKey("gender")) {
            genderFilter = new SingleColumnValueFilter(Bytes.toBytes("filter"), Bytes.toBytes("gender"), CompareOp.EQUAL, Bytes.toBytes(map.get("gender").toLowerCase()));
            scan.addColumn(Bytes.toBytes("filter"), Bytes.toBytes("gender"));
        }

        FilterList filters = new FilterList(FilterList.Operator.MUST_PASS_ALL);
        filters.addFilter(ageGroupFilter);
        filters.addFilter(applicationFilter);
        filters.addFilter(deviceFilter);
        filters.addFilter(genderFilter);

        if (map.containsKey("dataType")) {
            SingleColumnValueFilter dataTypeFilter = new SingleColumnValueFilter(Bytes.toBytes("filter"), Bytes.toBytes("dataType"), CompareOp.EQUAL, Bytes.toBytes(map.get("dataType").toLowerCase()));
            filters.addFilter(dataTypeFilter);
        }

        if (map.containsKey("startDate") && map.containsKey("endDate")) {
            String startDate = dateFormat.format(new Date(Long.parseLong(map.get("startDate"))));
            String endDate = dateFormat.format(new Date(Long.parseLong(map.get("endDate"))));

            SingleColumnValueFilter startDateFilter = new SingleColumnValueFilter(Bytes.toBytes("data"), Bytes.toBytes("hour"), CompareOp.GREATER_OR_EQUAL, Bytes.toBytes(startDate));
            SingleColumnValueFilter endDateFilter = new SingleColumnValueFilter(Bytes.toBytes("data"), Bytes.toBytes("hour"), CompareOp.LESS_OR_EQUAL, Bytes.toBytes(endDate));

            filters.addFilter(startDateFilter);
            filters.addFilter(endDateFilter);
        }

        scan.setFilter(filters);

        resultScanner = table.getScanner(scan);

        for (Result result = resultScanner.next(); result != null; result = resultScanner.next()) {
            Map<String, String> row = new HashMap<String, String>();
            for (Cell cell : result.listCells()) {
                row.put(Bytes.toString(CellUtil.cloneQualifier(cell)), Bytes.toString(CellUtil.cloneValue(cell)));
            }

            resultList.add(row);
        }
    } catch (Exception e) {
        logger.error("Exception in getFilterData() : ", e.getMessage(), e); 
    } finally {
        try {
            if (resultScanner != null) {
                resultScanner.close();
            }

            if (table != null) {
                table.close();
            }

            if (connection != null && !connection.isClosed()) {
                connection.close();
            }
        } catch (Exception e2) {
            // Don't silently swallow cleanup failures.
            logger.error("Exception while closing HBase resources : ", e2);
        }
    }

    return resultList;
}
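
A side observation about the code above (an observation, not a confirmed cause of the region server going down): a SingleColumnValueFilter can only evaluate columns that the scan actually returns, and with its default filterIfMissing = false it passes every row where the tested column is absent. The date filters, for example, test data:hour, but data:hour is never added to the scan, so they end up passing all rows. A minimal sketch of the usual fix, reusing the names from the code above:

// Let the filter see the column it tests by including it in the scan,
// and drop rows that lack the column instead of passing them through.
scan.addColumn(Bytes.toBytes("data"), Bytes.toBytes("hour"));
startDateFilter.setFilterIfMissing(true);
endDateFilter.setFilterIfMissing(true);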

Hi Prasad. I don't know whether this helps, but I would start by using a try-with-resources block and dropping the finally, so I could be sure nothing goes wrong inside the finally block itself. Worth a try ;)
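
For reference, a minimal sketch of the try-with-resources version the comment suggests, assuming the same config, fileReader, and logger fields as in the code above. Connection, Table, and ResultScanner all implement Closeable in the hbase-client API, so they are closed automatically in reverse order even when an exception is thrown:

List<Map<String, String>> resultList = new ArrayList<Map<String, String>>();

try (Connection connection = ConnectionFactory.createConnection(config);
     Table table = connection.getTable(TableName.valueOf(fileReader.getProperty("HBASE_FILTER_DATA_TABLE_NAME")))) {

    Scan scan = new Scan();
    // ... build the columns and the FilterList exactly as before ...

    try (ResultScanner resultScanner = table.getScanner(scan)) {
        // ResultScanner is Iterable<Result>, so a for-each loop replaces the
        // explicit next() calls from the original code.
        for (Result result : resultScanner) {
            Map<String, String> row = new HashMap<String, String>();
            for (Cell cell : result.listCells()) {
                row.put(Bytes.toString(CellUtil.cloneQualifier(cell)), Bytes.toString(CellUtil.cloneValue(cell)));
            }
            resultList.add(row);
        }
    }
} catch (Exception e) {
    logger.error("Exception in getFilterData(): {}", e.getMessage(), e);
}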