
Problem putting data into HBase from Java Spark Streaming

Tags: java, apache-spark, hbase, spark-streaming

I'm a beginner in this area, so I don't have a good feel for it yet.

  • HBase version: 0.98.24-hadoop2
  • Spark version: 2.1.0
The code below tries to put data received from the Kafka producer through Spark Streaming into HBase.

  • The Kafka input data format looks like this:

    line1,TAG1,123
    line1,TAG2,134

The Spark Streaming job splits each received line on the delimiter "," and then puts the data into HBase. However, my application runs into an error when it calls the htable.put() method. Can anyone explain why the code below throws an error?

Thanks in advance.

JavaDStream<String> records = lines.flatMap(new FlatMapFunction<String, String>() {   
    private static final long serialVersionUID = 7113426295831342436L;

    HTable htable; 
    public HTable set() throws IOException{ 
        Configuration hconfig = HBaseConfiguration.create();
        hconfig.set("hbase.zookeeper.property.clientPort", "2222");
        hconfig.set("hbase.zookeeper.quorum", "127.0.0.1");  

        HConnection hconn = HConnectionManager.createConnection(hconfig);  

        htable = new HTable(hconfig, tableName); 

        return htable;  
    };  
    @Override
    public Iterator<String> call(String x) throws IOException {  

        ////////////// Put into HBase   ///////////////////// 
        String[] data = x.split(",");   

        if (null != data && data.length > 2 ){ 
            SimpleDateFormat sdf = new SimpleDateFormat("yyyyMMddHHmmss");   
            String ts = sdf.format(new Date());  

            Put put = new Put(Bytes.toBytes(ts)); 

            put.addImmutable(Bytes.toBytes(familyName), Bytes.toBytes("LINEID"), Bytes.toBytes(data[0]));
            put.addImmutable(Bytes.toBytes(familyName), Bytes.toBytes("TAGID"), Bytes.toBytes(data[1]));
            put.addImmutable(Bytes.toBytes(familyName), Bytes.toBytes("VAL"), Bytes.toBytes(data[2]));

/* I've checked the data passed in; it looks like this:
{"totalColumns":3,"row":"20170120200927",
 "families":{"TAGVALUE":
  [{"qualifier":"LINEID","vlen":3,"tag":[],"timestamp":9223372036854775807},
   {"qualifier":"TAGID","vlen":3,"tag":[],"timestamp":9223372036854775807},
   {"qualifier":"VAL","vlen":6,"tag":[],"timestamp":9223372036854775807}]}} */


//********************* ERROR *******************//   
            htable.put(put);  
            htable.close();  


        }

        return Arrays.asList(COLDELIM.split(x)).iterator(); 
    } 
}); 

You never call this method:

public HTable set() throws IOException

which is what returns the htable instance.

Since the htable instance is null and you are trying to perform an operation on null,

htable.put()

you get an NPE like the one below:

 stage 23.0 failed 1 times, most recent failure: Lost task 0.0 in stage 23.0 (TID 23, localhost, executor driver): java.lang.NullPointerException
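
For reference, a minimal fix is to initialize the table lazily inside call() before the first put. The sketch below is only illustrative: it uses the same HBase 0.98 client API as the question, and it assumes tableName, familyName, and COLDELIM are fields of the enclosing class, as in the original code.

    @Override
    public Iterator<String> call(String x) throws IOException {
        // Lazily create the HTable the first time call() runs on an executor;
        // in the original code set() was never invoked, so htable stayed null.
        if (htable == null) {
            htable = set();
        }

        String[] data = x.split(",");
        if (data.length > 2) {
            String ts = new SimpleDateFormat("yyyyMMddHHmmss").format(new Date());

            Put put = new Put(Bytes.toBytes(ts));
            put.addImmutable(Bytes.toBytes(familyName), Bytes.toBytes("LINEID"), Bytes.toBytes(data[0]));
            put.addImmutable(Bytes.toBytes(familyName), Bytes.toBytes("TAGID"), Bytes.toBytes(data[1]));
            put.addImmutable(Bytes.toBytes(familyName), Bytes.toBytes("VAL"), Bytes.toBytes(data[2]));

            htable.put(put);
            // Do not close the table after every record, or the next call() fails;
            // close it once when the stream shuts down instead.
        }

        return Arrays.asList(COLDELIM.split(x)).iterator();
    }

Note that opening a table per record (or even per task) is expensive. The more idiomatic Spark pattern is to write from foreachRDD with rdd.foreachPartition, creating one HBase connection per partition and closing it after that partition's records are written, which also avoids having to serialize the connection with the function.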

Thank you for your help. That solved a problem I couldn't get past...