Hadoop: Hive permission denied when using INSERT OVERWRITE DIRECTORY


When I run the following command, I get a permission denied failure from HDFS:

hive -e "insert overwrite directory '/user/hadoop/a/b/c/d/e/f' select * from table_name limit 10;"
The error message is:

Permission denied: user=hadoop, access=WRITE, inode="/user/hadoop/a/b":hdfs:hive:drwxrwxr-x
But when I run hadoop fs -ls /user/hadoop/a, I get:

It looks like I have already opened up full permissions on folder b, so why am I still getting permission denied?


PS: I have already set hive.insert.into.multilevel.dirs=true in the Hive configuration file.
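As a side note, the same property can also be passed for a single run instead of being edited into the configuration file. A minimal sketch, assuming the standard Hive CLI --hiveconf option and the same table and target path as above:

# Set the property only for this invocation and rerun the failing statement
hive --hiveconf hive.insert.into.multilevel.dirs=true \
     -e "insert overwrite directory '/user/hadoop/a/b/c/d/e/f' select * from table_name limit 10;"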

Open a new terminal and try the following:

1. Switch to the root user:

su
2. Switch to the hdfs user:

su hdfs
3. Then run the following command:

hadoop fs -chown -R hadoop /user/hadoop/a
Now you can try the command you were running again.


Hope this helps.
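If sudo access to the hdfs account is available, the three steps above can be collapsed into one command and the result verified afterwards. A sketch, assuming the path from the question and that hdfs is the HDFS superuser:

# Run the chown directly as the hdfs superuser
sudo -u hdfs hadoop fs -chown -R hadoop /user/hadoop/a

# Verify the new owner on the directory itself (-d lists the directory entry, not its contents)
hadoop fs -ls -d /user/hadoop/a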

I ran into the same problem and solved it simply by using a fully qualified HDFS path, like this:

hive -e "insert overwrite directory 'hdfs://<cluster>/user/hadoop/a/b/c/d/e/f' select * from table_name limit 10;"
This issue has been mentioned elsewhere as well.


I don't know the root cause, though, but it does not appear to be related to permissions.
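To build the fully qualified path, the cluster's default filesystem URI can be read straight from the configuration. A sketch, assuming the standard hdfs getconf utility is available:

# Print the default filesystem URI (e.g. hdfs://<cluster>) to prepend to the target path
hdfs getconf -confKey fs.defaultFS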

The problem is actually not about directory permissions. Hive has to be granted access to the path itself; that is, access is granted at the path level, not at the file level.

Below are the steps for granting a user/group access to an HDFS path and to a database. The comment above each command starts with #.


#Log in as the Hive superuser to perform the steps below
create role <role_name_x>;

#Grant access to the database
grant all on database <database_name> to role <role_name_x>;

#Grant access to the HDFS path
grant all on URI '/hdfs/path' to role <role_name_x>;

#Grant the role to the group of the user that will run the Hive job
grant role <role_name_x> to group <your_user_name>;

#After performing the steps above, you can validate them with the commands below.
#show grant role should list the URI and database access when run against the role name:

show grant role <role_name_x>;

#To check that the user's group has been granted the role:

show role grant group <your_user_name>;

Comments on the question:

What about the other nested directories? Have you tried hadoop fs -chmod -R 777 /user/hadoop/a?

@arghtype Yes, I tried that, but it still does not work. I wonder why Hive reports the permission drwxrwxr-x while hadoop fs -ls shows drwxrwxrwx for the same folder?

Just check with hadoop fs -ls /user/hadoop/ which user has access to it. Is it hadoop or hive?
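Picking up the last comment, a quick way to see who actually owns each level of the path, and which groups the hadoop and hive users resolve to, might look like this; a sketch, using the paths from the question:

# List each directory level itself (-d shows the directory entry, not its contents)
hadoop fs -ls -d /user/hadoop /user/hadoop/a /user/hadoop/a/b

# Show the group membership of both users as resolved by the NameNode
hdfs groups hadoop hive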
 hdfs dfs -getfacl /tmp/
# file: /tmp
# owner: hdfs
# group: supergroup
# flags: --t
user::rwx
group::rwx
other::rwx
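If the goal is to give the hive user write access without opening the whole tree to everyone, HDFS ACLs, as shown by the getfacl output above, are a narrower alternative. A sketch, assuming ACLs are enabled on the cluster (dfs.namenode.acls.enabled=true) and using the path from the question:

# Grant the hive user rwx on the existing tree
hdfs dfs -setfacl -R -m user:hive:rwx /user/hadoop/a

# Add a default entry on the parent so newly created subdirectories inherit it
hdfs dfs -setfacl -m default:user:hive:rwx /user/hadoop/a

# Confirm the result
hdfs dfs -getfacl /user/hadoop/a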