How does Java pass client_protocol to the JDBC driver?


I am trying to connect to HiveServer2 using the dplyr.spark.hive package, but I am hitting an error where I cannot pass a username to the dbConnect function, and that is probably why I am getting the error about a NULL client_protocol.

Does anyone know how to solve this, or how to pass a user/username to the dbConnect function when the driver is JDBC?

This beeline call works fine for me:

beeline  -u "jdbc:hive2://host:port/dbname;auth=noSasl" -n mkosinski --outputformat=tsv --incremental=true -f sql_statement.sql > sql_output
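(For reference, a sketch of how the working beeline flags map onto RJDBC arguments; this is not part of the original session: -u carries the JDBC url, and -n carries the user name, which RJDBC forwards to the driver as the standard "user" connection property.)

library(RJDBC)

# Sketch under the assumptions above: host, port and dbname are placeholders
# exactly as in the beeline call; "-n mkosinski" becomes user = "mkosinski".
drv = JDBC("org.apache.hive.jdbc.HiveDriver", classPath = Sys.getenv("HADOOP_JAR"))
con = dbConnect(drv, "jdbc:hive2://host:port/dbname;auth=noSasl", user = "mkosinski")

Note that RJDBC's formal argument is user, so a username = ... argument is passed to the driver as an extra property instead of being treated as the login.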
The R equivalent below, however, does not:

> library(dplyr.spark.hive)
Warning: changing locked binding for ‘over’ in ‘dplyr’ whilst loading ‘dplyr.spark.hive’
Warning: changing locked binding for ‘partial_eval’ in ‘dplyr’ whilst loading ‘dplyr.spark.hive’
Warning: changing locked binding for ‘default_op’ in ‘dplyr’ whilst loading ‘dplyr.spark.hive’

Attaching package: ‘dplyr.spark.hive’

The following object is masked from ‘package:SparkR’:

    cache

Warning messages:
1: replacing previous import by ‘purrr::%>%’ when loading ‘dplyr.spark.hive’ 
2: replacing previous import by ‘purrr::order_by’ when loading ‘dplyr.spark.hive’ 
> Sys.setenv(HADOOP_JAR = "/opt/spark-1.5.0-bin-hadoop2.4/lib/spark-assembly-1.5.0-hadoop2.4.0.jar")
> Sys.setenv(HIVE_SERVER2_THRIFT_BIND_HOST = 'tools-1.hadoop.srv')
> Sys.setenv(HIVE_SERVER2_THRIFT_PORT = '10000')
> host = 'tools-1.hadoop.srv'
> port = 10000
> driverclass = "org.apache.hive.jdbc.HiveDriver"
> Sys.setenv(HADOOP_JAR = "/opt/spark-1.5.0-bin-hadoop2.4/lib/spark-assembly-1.5.0-hadoop2.4.0.jar")
> library(RJDBC)
> dr = JDBC(driverclass, Sys.getenv("HADOOP_JAR"))
> #url = paste0("jdbc:hive2://", host, ":", port)
> url = paste0("jdbc:hive2://", host, ":", port,"/loghost;auth=noSasl")
> class = "Hive"
> con.class = paste0(class, "Connection") # class = "Hive"
> con = new(con.class, dbConnect(dr, url, username = "mkosinski", database = "loghost"))
log4j:WARN No appenders could be found for logger (org.apache.hive.jdbc.Utils).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Error in .jcall(drv@jdrv, "Ljava/sql/Connection;", "connect", as.character(url)[1],  : 
  java.sql.SQLException: Could not establish connection to jdbc:hive2://tools-1.hadoop.srv:10000/loghost;auth=noSasl: Required field 'client_protocol' is unset! Struct:TOpenSessionReq(client_protocol:null, configuration:{use:database=loghost})
> con = new(con.class, dbConnect(dr, url, username = "mkosinski"))
Error in .jcall(drv@jdrv, "Ljava/sql/Connection;", "connect", as.character(url)[1],  : 
  java.sql.SQLException: Could not establish connection to jdbc:hive2://tools-1.hadoop.srv:10000/loghost;auth=noSasl: Required field 'client_protocol' is unset! Struct:TOpenSessionReq(client_protocol:null, configuration:{use:database=loghost})
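(Side note: the "Required field 'client_protocol' is unset" failure usually indicates a Thrift version mismatch between the hive-jdbc classes on the client, here loaded from the Spark assembly jar, and HiveServer2. One way to see which jars the R session actually loaded is rJava's class-path query:)

# Prints every entry on the JVM class path of the current R session, which
# answers "which JDBC driver is my R job actually picking up".
library(rJava)
.jclassPath()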
EDIT 1

I have tried connecting with a different .jar (as suggested in the comments), and the previous problem seems to be solved (I was probably using the wrong .jar), but now I get an error about a missing Hadoop class:

> Sys.setenv(HADOOP_HOME="/usr/share/hadoop/share/hadoop/common/")
> Sys.setenv(HIVE_HOME = '/opt/hive/lib/')
> host = 'tools-1.hadoop.srv'
> port = 10000
> driverclass = "org.apache.hive.jdbc.HiveDriver"
> library(RJDBC)
Loading required package: DBI
Loading required package: rJava
> dr = JDBC(driverclass,classPath = c("/opt/hive/lib/hive-jdbc-1.0.0-standalone.jar"))
> dr2 = JDBC(driverclass,classPath = c("/opt/hive/lib/hive-jdbc-1.0.0-standalone.jar",
+                                      "/opt/hive/lib/commons-configuration-1.6.jar"))
> url = paste0("jdbc:hive2://", host, ":", port)
> dbConnect(dr, url, username = "mkosinski", database = "loghost") -> cont
log4j:WARN No appenders could be found for logger (org.apache.hive.jdbc.Utils).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Error in .jcall(drv@jdrv, "Ljava/sql/Connection;", "connect", as.character(url)[1],  : 
  java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
> dbConnect(dr2, url, username = "mkosinski", database = "loghost") -> cont
Error in .jcall(drv@jdrv, "Ljava/sql/Connection;", "connect", as.character(url)[1],  : 
  java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
> sessionInfo()
R version 3.1.3 (2015-03-09)
Platform: x86_64-redhat-linux-gnu (64-bit)
Running under: CentOS Linux 7 (Core)

locale:
 [1] LC_CTYPE=en_US.UTF-8          LC_NUMERIC=C                  LC_TIME=en_US.UTF-8           LC_COLLATE=en_US.UTF-8       
 [5] LC_MONETARY=en_US.UTF-8       LC_MESSAGES=en_US.UTF-8       LC_PAPER=en_US.UTF-8          LC_NAME=en_US.UTF-8          
 [9] LC_ADDRESS=en_US.UTF-8        LC_TELEPHONE=en_US.UTF-8      LC_MEASUREMENT=en_US.UTF-8    LC_IDENTIFICATION=en_US.UTF-8

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base     

other attached packages:
[1] RJDBC_0.2-5 rJava_0.9-7 DBI_0.3.1  

loaded via a namespace (and not attached):
[1] tools_3.1.3
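(The NoClassDefFoundError above points at a class that is still missing from classPath: org.apache.hadoop.conf.Configuration ships in hadoop-common, not in commons-configuration, which would explain why the dr2 attempt failed as well. A sketch, where the hadoop-common jar name and version are assumptions:)

# Assumed jar name/version: hadoop-common is the jar that provides
# org.apache.hadoop.conf.Configuration.
dr3 = JDBC(driverclass,
           classPath = c("/opt/hive/lib/hive-jdbc-1.0.0-standalone.jar",
                         "/usr/share/hadoop/share/hadoop/common/hadoop-common-2.4.0.jar"))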

The problem was a wrong .jar specification (the classPath argument to JDBC) and a wrong HiveServer2 url. Explained below.
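A minimal sketch of the corrected call with both fixes applied (the hadoop-common jar name is an assumption; adjust it to the local installation):

library(RJDBC)

# Fix 1: classPath lists the standalone hive-jdbc driver plus hadoop-common.
# Fix 2: the url carries the database name and the auth=noSasl session
#        variable, matching the working beeline call.
drv = JDBC("org.apache.hive.jdbc.HiveDriver",
           classPath = c("/opt/hive/lib/hive-jdbc-1.0.0-standalone.jar",
                         "/usr/share/hadoop/share/hadoop/common/hadoop-common-2.4.0.jar"))
con = dbConnect(drv,
                "jdbc:hive2://tools-1.hadoop.srv:10000/loghost;auth=noSasl",
                user = "mkosinski")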

Comments:

SamsonScharfrichter: My two cents: that error message smells more like a client-side configuration issue with the Hadoop/Hive JARs. Which version of HiveServer2 are you connecting to? And which version of the JDBC driver is actually picked up from the classpath of your R job?

@SamsonScharfrichter Thanks for opening my eyes. I will check whether all the versions are compatible: the Hive client, hive-jdbc and HiveServer2.

@SamsonScharfrichter I have checked that my Hive version is 1.0.0 and my hive-jdbc.jar is version 1.0.0 as well. I do not know how to check the version of HiveServer2, and I also do not know what the classpath of the R job is.

OK, so I understand that the CLASSPATH is a vector of paths to the JARs needed to connect to HiveServer2 when using JDBC. Maybe I should add more JARs.

SamsonScharfrichter: With Hive 1.0 it should boil down to hive-jdbc-standalone (the big one, ~18 MB) plus, for Kerberized clusters only, hadoop-common / hadoop-auth / commons-configuration, plus optionally, just to shut down the warnings, slf4j-log4j12.jar / log4j.jar.
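Turned into a classPath vector, that checklist might look like the sketch below; every file name and version is an assumption that depends on the local installation:

# Hive 1.0 jar checklist from the comment above; uncomment what applies.
jars = c("/opt/hive/lib/hive-jdbc-1.0.0-standalone.jar"  # the big ~18 MB driver
  # Kerberized clusters only:
  #, "/usr/share/hadoop/share/hadoop/common/hadoop-common-2.4.0.jar"
  #, "/usr/share/hadoop/share/hadoop/common/lib/hadoop-auth-2.4.0.jar"
  #, "/opt/hive/lib/commons-configuration-1.6.jar"
  # Optional, just to silence the log4j warnings:
  #, "/opt/hive/lib/slf4j-log4j12-1.7.5.jar"
  #, "/opt/hive/lib/log4j-1.2.16.jar"
)
drv = JDBC("org.apache.hive.jdbc.HiveDriver", classPath = jars)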