Connecting to Hive over JDBC in the Cloudera Docker container
I installed the Cloudera quickstart Docker container locally and exposed the Hive port as shown below:
docker run --hostname=quickstart.cloudera --privileged=true -t -i -p 8888:8888 -p 80:80 -p 10000:10000 --name cloudera2 cloudera/quickstart /usr/bin/docker-quickstart
I want to connect to it over JDBC; my code looks like this:
import java.sql.{Connection, DriverManager}

val driver = "org.apache.hive.jdbc.HiveDriver"
val url = "jdbc:hive2://localhost:10000/default"
val username = ""
val password = ""

// load the Hive JDBC driver class
try {
  Class.forName(driver)
} catch {
  case e: ClassNotFoundException => e.printStackTrace()
}

// open (and immediately close) the connection
val connection: Connection = DriverManager.getConnection(url, username, password)
connection.close()
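For reference, a resource-safe variant of the snippet above could look like the following. This is only a sketch: the credentials "cloudera"/"cloudera" are an assumption, and it obviously requires a HiveServer2 actually listening on localhost:10000.

```scala
import java.sql.DriverManager

// Sketch only: assumes a HiveServer2 on localhost:10000 and that
// "cloudera"/"cloudera" are valid credentials (both are assumptions).
object HiveQuery {
  val Url = "jdbc:hive2://localhost:10000/default"

  // Runs a query and returns the first column of every row,
  // closing the connection even if the query throws.
  def firstColumn(sql: String): List[String] = {
    val conn = DriverManager.getConnection(Url, "cloudera", "cloudera")
    try {
      val rs = conn.createStatement().executeQuery(sql)
      Iterator.continually(rs).takeWhile(_.next()).map(_.getString(1)).toList
    } finally conn.close()
  }
}
```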
But when I try to run it, I get a NoClassDefFoundError:
log4j:WARN No appenders could be found for logger (org.apache.hive.jdbc.Utils).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
at org.apache.hive.jdbc.HiveConnection.createUnderlyingTransport(HiveConnection.java:362)
at org.apache.hive.jdbc.HiveConnection.createBinaryTransport(HiveConnection.java:382)
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:193)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:167)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:247)
at ScalaJdbcConnectSelect$.main(ScalaJdbcConnectSelect.scala:32)
Maven dependency:
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-jdbc</artifactId>
<version>1.1.0-cdh5.7.0</version>
</dependency>
I'm not sure whether this is caused by the username and password, but I tried "cloudera"/"cloudera", "hive", and "".
I found that I needed to add the hadoop-common dependency; in my case it looks like this:
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.6.0-cdh5.7.0</version>
</dependency>
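The stack trace above is a runtime linkage failure: the driver class itself loads, but it references org.apache.hadoop.conf.Configuration, which only hadoop-common provides. A quick way to confirm whether a given class is reachable on the classpath (a small generic sketch, not Hive-specific):

```scala
// Returns true if the fully-qualified class name can be loaded
// from the current classpath, false otherwise.
def onClasspath(fqcn: String): Boolean =
  try { Class.forName(fqcn); true }
  catch { case _: ClassNotFoundException => false }

// e.g. onClasspath("org.apache.hadoop.conf.Configuration") should return
// true once hadoop-common is on the classpath.
```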
And with that it works fine.