Java HivePreparedStatement example (Hadoop / JDBC / Hive / prepared statement)


Can someone show a sample Java program that uses HivePreparedStatement to connect to a Hive table and retrieve data from it?

I tried the following code, but could not complete it because the constructor takes many parameters:

      TCLIService.Iface client =null;
      EmbeddedThriftBinaryCLIService embeddedClient = new EmbeddedThriftBinaryCLIService();
      embeddedClient.init(null);
      client = embeddedClient;

      TOpenSessionReq openReq = new TOpenSessionReq();

      JdbcConnectionParams connParams = Utils.parseURL("jdbc:hive2://168.61.32.157:10000/nadb");

      Map<String, String> openConf = new HashMap<String, String>();
      // for remote JDBC client, try to set the conf var using 'set foo=bar'
       for (Map.Entry<String, String> hiveConf : connParams.getHiveConfs().entrySet()) {
        openConf.put("set:hiveconf:" + hiveConf.getKey(), hiveConf.getValue());
      }
      // For remote JDBC client, try to set the hive var using 'set hivevar:key=value'
      for (Map.Entry<String, String> hiveVar : connParams.getHiveVars().entrySet()) {
        openConf.put("set:hivevar:" + hiveVar.getKey(), hiveVar.getValue());
      }
      // switch the database
      openConf.put("use:database", connParams.getDbName());
      // set the fetchSize
      openConf.put("set:hiveconf:hive.server2.thrift.resultset.default.fetch.size",
        Integer.toString(fetchSize));

      // set the session configuration
      Map<String, String> sessVars = connParams.getSessionVars();
      if (sessVars.containsKey(HiveAuthFactory.HS2_PROXY_USER)) {
        openConf.put(HiveAuthFactory.HS2_PROXY_USER,
            sessVars.get(HiveAuthFactory.HS2_PROXY_USER));
      }
      openReq.setConfiguration(openConf);

      // Store the user name in the open request in case no non-sasl authentication
      if (JdbcConnectionParams.AUTH_SIMPLE.equals(sessConfMap.get(JdbcConnectionParams.AUTH_TYPE))) {
        openReq.setUsername(sessConfMap.get(JdbcConnectionParams.AUTH_USER));
        openReq.setPassword(sessConfMap.get(JdbcConnectionParams.AUTH_PASSWD));
      }

      TOpenSessionResp openResp = client.OpenSession(openReq);

      // validate connection
      Utils.verifySuccess(openResp.getStatus());
      if (!supportedProtocols.contains(openResp.getServerProtocolVersion())) {
        throw new TException("Unsupported Hive2 protocol");
      }
      protocol = openResp.getServerProtocolVersion();
      sessHandle = openResp.getSessionHandle();


      HivePreparedStatement hivePreparedStatement = new HivePreparedStatement(con, client, sessHandle, sql);

Is there any simple way to do this?

Have you tried anything yourself? Have a look at the unit tests? — I have updated the question, and thanks for the link, but I don't see where to add the connection parameters for the Hive JDBC driver, or the username and password. In the JUnit tests they just mock the connection and the other constructor arguments.

You should not instantiate HivePreparedStatement yourself; you should use Connection.prepareStatement(…) instead. That is the same for all JDBC drivers; have a look at any JDBC tutorial. However, the rest of your code suggests you are not even using the Hive JDBC driver, but a lower-level protocol implementation. You probably cannot use HivePreparedStatement that way (or at least not easily). (Posting as a comment, since I have no Hive-specific knowledge.)

Connection.prepareStatement returns a PreparedStatement object, not a HivePreparedStatement. I tried inserting data into a Hive table with a JDBC prepared statement and got a java.sql.SQLException: Method not supported. What can be done here? It should work according to this question.
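Following the advice in the comments, the usual approach is to let the Hive JDBC driver build the session instead of wiring up the Thrift client by hand. A minimal sketch, assuming hive-jdbc (and its dependencies) are on the classpath, a HiveServer2 is listening at the URL from the question, and that the table name, column, and credentials shown here are placeholders:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class HiveJdbcExample {
    public static void main(String[] args) throws Exception {
        // Older hive-jdbc versions need explicit driver registration;
        // newer ones register via the JDBC ServiceLoader mechanism.
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // Host, port, and database taken from the question's URL;
        // username and password are placeholders for your setup.
        String url = "jdbc:hive2://168.61.32.157:10000/nadb";

        try (Connection con = DriverManager.getConnection(url, "hiveuser", "hivepassword");
             // prepareStatement returns a java.sql.PreparedStatement;
             // the Hive-specific class stays an implementation detail.
             PreparedStatement ps = con.prepareStatement(
                     "SELECT * FROM some_table WHERE id = ?")) {
            ps.setInt(1, 42); // bind the placeholder
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            }
        }
    }
}
```

The driver opens the Thrift session, negotiates the protocol version, and applies the hiveconf/hivevar settings from the URL itself, which is what the hand-rolled TOpenSessionReq code above was trying to do. Session-level settings can be passed in the URL, e.g. `jdbc:hive2://host:10000/nadb?hive.server2.thrift.resultset.default.fetch.size=1000`.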