
Java: permission denied on the Hive scratch directory when establishing a connection with the Hive Streaming API


I am writing a sample program using the HCatalog streaming API.

I have Hadoop, HiveServer and the Hive metastore server running.

I wrote a Java program to connect to the Hive metastore:

import java.util.ArrayList;

import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hive.hcatalog.streaming.ConnectionError;
import org.apache.hive.hcatalog.streaming.HiveEndPoint;
import org.apache.hive.hcatalog.streaming.ImpersonationFailed;
import org.apache.hive.hcatalog.streaming.InvalidPartition;
import org.apache.hive.hcatalog.streaming.InvalidTable;
import org.apache.hive.hcatalog.streaming.PartitionCreationFailed;
import org.apache.hive.hcatalog.streaming.StreamingConnection;

public class HCatalogueStreamingclient {
    public static void main(String[] args) {
        // Required on Windows so Hadoop can locate winutils.exe
        System.setProperty("hadoop.home.dir", "E:\\midhun\\hadoop\\hive\\winutils");

        String dbName = "hive_streaming";
        String tblName = "alerts";
        ArrayList<String> partitionVals = new ArrayList<String>(2);
        partitionVals.add("Asia");
        partitionVals.add("India");

        // Metastore Thrift endpoint, target table and partition values
        HiveEndPoint hiveEP = new HiveEndPoint("thrift://192.168.10.149:8000", dbName, tblName, partitionVals);
        HiveConf conf = new HiveConf();
        conf.set("hive.exec.scratchdir", "/tmp/hivetmp");

        try {
            // true = create the partition if it does not exist yet
            StreamingConnection connection = hiveEP.newConnection(true, conf);
        } catch (ConnectionError | InvalidPartition | InvalidTable
                | PartitionCreationFailed | ImpersonationFailed | InterruptedException e) {
            e.printStackTrace();
        }
    }
}
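Once the connection succeeds, records are written through a transaction batch. A minimal sketch of the write path, assuming the endpoint and connection from the code above and a comma-delimited table (the column names `id` and `msg` are hypothetical, not from the question):

```java
// Sketch only: continues from the try block above; requires hive-hcatalog-streaming
// on the classpath and a reachable metastore, so it is not runnable stand-alone.
DelimitedInputWriter writer = new DelimitedInputWriter(
        new String[] { "id", "msg" },  // hypothetical column names of the target table
        ",",                           // field delimiter in the raw records
        hiveEP);

// Reserve a batch of 10 transactions on this connection
TransactionBatch batch = connection.fetchTransactionBatch(10, writer);
batch.beginNextTransaction();
batch.write("1,hello".getBytes());     // one delimited record
batch.commit();
batch.close();
connection.close();
```

Each `TransactionBatch` amortizes metastore round-trips over several commits, which is why the API hands out batches rather than single transactions.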
When I run the program, I get the following exception:

Exception in thread "main" java.lang.RuntimeException: The root scratch dir: /tmp/hivetmp on HDFS should be writable. Current permissions are: rw-rw-rw-
    at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:690)
    at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:622)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:550)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:513)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.createPartitionIfNotExists(HiveEndPoint.java:445)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:314)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint$ConnectionImpl.<init>(HiveEndPoint.java:278)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnectionImpl(HiveEndPoint.java:215)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnection(HiveEndPoint.java:192)
    at org.apache.hive.hcatalog.streaming.HiveEndPoint.newConnection(HiveEndPoint.java:113)
    at com.mj.HCatalogueStreamingclient.main(HCatalogueStreamingclient.java:27)
Does anyone know how to grant write permission on the HDFS folder /tmp/hivetmp?

The problem turned out to have two parts:

  • We need to grant full permissions on the HDFS directory /tmp/hivetmp. The command is `$HADOOP_HOME/bin/hadoop fs -chmod -R 777 /tmp/hivetmp`.
  • I was running the program on Windows 7 64-bit, so we need to download a winutils build that matches the operating system and Hadoop version.
After completing the two points above, I was able to establish a connection using the Hive HCatalog streaming API.
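The two fixes can be applied as follows; the scratch path is the one from the question, while the `winutils` layout shown is an assumption about a typical Windows setup:

```shell
# 1. Make the Hive scratch directory on HDFS writable (recursive).
#    Note the flag order: -chmod -R 777 <path>
$HADOOP_HOME/bin/hadoop fs -chmod -R 777 /tmp/hivetmp

# 2. On Windows, hadoop.home.dir must point at a directory whose bin\
#    subfolder contains winutils.exe built for your Hadoop version, e.g.
#    E:\midhun\hadoop\hive\winutils\bin\winutils.exe
```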