
Apache Spark: "Connection closed" when trying to connect to kerberized HBase

Tags: apache-spark, hbase, kerberos

I am trying to connect to a kerberized HBase cluster from outside the HDP cluster. I have the configuration below (see the code), and I have also tried passing hbase-site.xml via the --files option of spark-submit. In either case I see the exception reproduced after the sketch below (hostnames masked); I am not sure what is causing the connection to be closed.
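For reference, when hbase-site.xml is shipped with the job (for example via --files), the client configuration can be loaded from that file instead of setting every key by hand. This is a minimal sketch, not the asker's code, and the path is a placeholder:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;

// Load the cluster-provided client config; with spark-submit --files,
// the shipped file lands in the container's working directory.
Configuration config = HBaseConfiguration.create();
config.addResource(new Path("/path/to/hbase-site.xml")); // placeholder path
```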

```
Caused by: java.net.SocketTimeoutException: callTimeout=60000, callDuration=69264: Connection closed row 'mytable,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=myhbase-master,16020,1602115323155, seqNum=-1
        at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:159)
        at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:80)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed
        at org.apache.hadoop.hbase.ipc.NettyRpcDuplexHandler.channelInactive(NettyRpcDuplexHandler.java:211)
        at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
        at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
        at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
        at org.apache.hbase.thirdparty.io.netty.handler.codec.ByteToMessageDecoder.channelInputClosed(ByteToMessageDecoder.java:377)
        at org.apache.hbase.thirdparty.io.netty.handler.codec.ByteToMessageDecoder.channelInactive(ByteToMessageDecoder.java:342)
        at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
        at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
        at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
        at org.apache.hbase.thirdparty.io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
        at org.apache.hbase.thirdparty.io.netty.handler.timeout.IdleStateHandler.channelInactive(IdleStateHandler.java:277)
        at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
        at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
        at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
        at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1354)
        at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
        at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
        at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:917)
        at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:822)
        at org.apache.hbase.thirdparty.io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
        at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
        at org.apache.hbase.thirdparty.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
        at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
        at org.apache.hbase.thirdparty.io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
```
The code:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableInputFormat;
import org.apache.hadoop.security.UserGroupInformation;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;

SparkConf conf = new SparkConf().setAppName("Hbase SPARK").setMaster("local");
JavaSparkContext javaSparkContext = new JavaSparkContext(conf);

System.setProperty("java.security.krb5.conf", "/etc/krb5.conf");
System.setProperty("sun.security.krb5.debug", "true");
// Note: HADOOP_CONF_DIR is an environment variable; setting it as a JVM
// system property has no effect on the Hadoop libraries.
System.setProperty("HADOOP_CONF_DIR", "/pathto/hadoopconf/");

final Configuration config = HBaseConfiguration.create();

config.set("hadoop.security.authentication", "kerberos");
config.set("hbase.security.authentication", "kerberos"); // must be "kerberos", not "true"
config.set("hbase.rpc.protection", "authentication");
config.set("hbase.master.kerberos.principal", "hbase/_HOST@myhost.com");
config.set("hbase.regionserver.kerberos.principal", "hbase/_HOST@myhost.com");
config.set("hbase.zookeeper.quorum", "xxxx.yyy.zzz.com,xxx.yyy.zzz.com");
// The next five keys were set on the SparkConf (conf) by mistake and used a
// doubled "hbase.hbase." prefix; they belong on the HBase Configuration:
config.set("hbase.rpc.timeout", "60000"); // original key "hbase.connection-timeout" is not a standard HBase property
config.set("zookeeper.session.timeout", "30000");
config.set("hbase.client.retries.number", "3");
config.set("hbase.client.pause", "1000");
config.set("zookeeper.recovery.retry", "1");
config.set("hbase.zookeeper.property.clientPort", "2181");
config.set("zookeeper.znode.parent", "/hbase-secure");
config.set(TableInputFormat.INPUT_TABLE, "mytable");

// Create the connection:
Connection connection = null;
try {
    UserGroupInformation.setConfiguration(config);
    // principal and keytabLocation are defined elsewhere in the program,
    // e.g. "me@MYHOST.COM" and "/path/to/me.keytab".
    UserGroupInformation.loginUserFromKeytab(principal, keytabLocation);

    connection = ConnectionFactory.createConnection(config);
    System.out.println("connected************ " + connection.getConfiguration().toString());
} catch (IOException e) {
    e.printStackTrace();
}

// Read the data (the original referenced an undeclared `conn`; use `connection`):
JavaPairRDD<ImmutableBytesWritable, Result> javaPairRdd = javaSparkContext.newAPIHadoopRDD(
        connection.getConfiguration(), TableInputFormat.class,
        ImmutableBytesWritable.class, Result.class);
System.out.println("Count:" + javaPairRdd.count());
```