Java cannot connect to Hive via JDBC


I'm using Gradle to run my program, based on the sample code.

I can run the example from the repo without it failing:

// Apply the java plugin to add support for Java
apply plugin: 'java'
apply plugin: 'application'

mainClassName = "com.my.impala.fetcher.Fetcher"
// In this section you declare where to find the dependencies of your project
repositories {
    // Use 'jcenter' for resolving your dependencies.
    // You can declare any Maven/Ivy/file repository here.
    mavenCentral()
    maven {
        url "https://repository.cloudera.com/artifactory/cloudera-repos/"
    }
}

run {
     if (project.hasProperty("params")) {
         args params
         // args Eval.me(params)
     }
}

test {
        testLogging {
            events "passed", "skipped", "failed", "standardOut", "standardError"
    }
}


// In this section you declare the dependencies for your production and test code
dependencies {
    // The production code uses the SLF4J logging API at compile time
    compile 'org.slf4j:slf4j-api:1.7.7'
    compile 'org.apache.hadoop:hadoop-client:2.6.0-cdh5.4.4.1'
    compile 'org.json:json:20140107'
    compile 'org.apache.hive:hive-jdbc:1.1.0'
}
Here is the sample code:

Connection con = null;
ResultSet rs = null;
try {
    Class.forName(JDBC_DRIVER_NAME);
    con = DriverManager.getConnection(CONNECTION_HOST);
    Statement stmt = con.createStatement();
    rs = stmt.executeQuery(sqlStatement);
} catch (SQLException e) {
    e.printStackTrace();
    System.exit(-1);
} catch (Exception e) {
    e.printStackTrace();
    System.exit(-1);
} finally {
    try {
        con.close();
    } catch (Exception e) {
        e.printStackTrace();
        System.exit(-1);
    }
}

rs.next();
where
rs.next()
throws the following exception:

org.apache.thrift.transport.TTransportException: Cannot write to null outputStream
    at org.apache.thrift.transport.TIOStreamTransport.write(TIOStreamTransport.java:142)
    at org.apache.thrift.protocol.TBinaryProtocol.writeI32(TBinaryProtocol.java:178)
    at org.apache.thrift.protocol.TBinaryProtocol.writeMessageBegin(TBinaryProtocol.java:106)
    at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:62)
    at org.apache.hive.service.cli.thrift.TCLIService$Client.send_FetchResults(TCLIService.java:495)
    at org.apache.hive.service.cli.thrift.TCLIService$Client.FetchResults(TCLIService.java:487)
    at org.apache.hive.jdbc.HiveQueryResultSet.next(HiveQueryResultSet.java:360)
    at com.my.impala.fetcher.Fetcher.main(Fetcher.java:54)
Exception in thread "main" java.sql.SQLException: Error retrieving next row
    at org.apache.hive.jdbc.HiveQueryResultSet.next(HiveQueryResultSet.java:388)
    at com.my.impala.fetcher.Fetcher.main(Fetcher.java:54)
Caused by: org.apache.thrift.transport.TTransportException: Cannot write to null outputStream
    at org.apache.thrift.transport.TIOStreamTransport.write(TIOStreamTransport.java:142)
    at org.apache.thrift.protocol.TBinaryProtocol.writeI32(TBinaryProtocol.java:178)
    at org.apache.thrift.protocol.TBinaryProtocol.writeMessageBegin(TBinaryProtocol.java:106)
    at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:62)
    at org.apache.hive.service.cli.thrift.TCLIService$Client.send_FetchResults(TCLIService.java:495)
    at org.apache.hive.service.cli.thrift.TCLIService$Client.FetchResults(TCLIService.java:487)
    at org.apache.hive.jdbc.HiveQueryResultSet.next(HiveQueryResultSet.java:360)
    ... 1 more
I don't know which part I'm missing.


Thanks.

You have already closed the connection, so of course you cannot read the results.
rs.next()
should come right after the
executeQuery()
line, at the end of the try block, like this:

rs = stmt.executeQuery(sqlStatement);
while (rs.next()) {
    // handle the record
}

ResultSet
is a database cursor: it does not hold the entire dataset, it is the handle through which you access the data in the database.
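Putting that together, a minimal corrected sketch of the fetcher might look like the following. It assumes a running HiveServer2, and the connection URL, table name, and query here are placeholders for whatever the original Fetcher uses; try-with-resources closes the ResultSet, Statement, and Connection in reverse order, but only after the loop has finished reading:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class Fetcher {
    // Placeholder values -- substitute your actual connection URL and query.
    private static final String JDBC_DRIVER_NAME = "org.apache.hive.jdbc.HiveDriver";
    private static final String CONNECTION_HOST = "jdbc:hive2://localhost:10000/default";

    public static void main(String[] args) throws Exception {
        Class.forName(JDBC_DRIVER_NAME);
        String sqlStatement = "SELECT * FROM some_table"; // hypothetical query
        // All reading happens inside the try block, while the connection
        // is still open; the resources are closed automatically afterwards.
        try (Connection con = DriverManager.getConnection(CONNECTION_HOST);
             Statement stmt = con.createStatement();
             ResultSet rs = stmt.executeQuery(sqlStatement)) {
            while (rs.next()) {
                // handle the record, e.g. read the first column
                System.out.println(rs.getString(1));
            }
        } catch (SQLException e) {
            e.printStackTrace();
            System.exit(-1);
        }
    }
}
```

This also removes the manual finally block from the question: once the ResultSet is consumed inside the try, there is no way to touch it after the connection is closed.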

This was very helpful.