Unable to start Hive CLI on Hadoop (MapR)


I am trying to use the Hive CLI, but it fails to start with the access-control problem below. Strangely, I can query Hive data from Hue without any access-control issue, yet the Hive CLI does not work. I am on a MapR cluster.

Any help is much appreciated.

[<user_name>@<edge_node> ~]$ hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/mapr/hive/hive-2.1/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/mapr/lib/slf4j-log4j12-1.7.12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Logging initialized using configuration in file:/opt/mapr/hive/hive-2.1/conf/hive-log4j2.properties Async: true
2017-09-23 23:52:08,988 WARN [main] DataNucleus.General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/mapr/spark/spark-2.1.0/jars/datanucleus-api-jdo-4.2.4.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/mapr/hive/hive-2.1/lib/datanucleus-api-jdo-4.2.1.jar."
2017-09-23 23:52:08,993 WARN [main] DataNucleus.General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/mapr/spark/spark-2.1.0/jars/datanucleus-core-4.1.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/mapr/hive/hive-2.1/lib/datanucleus-core-4.1.6.jar."
2017-09-23 23:52:09,004 WARN [main] DataNucleus.General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/mapr/spark/spark-2.1.0/jars/datanucleus-rdbms-4.1.19.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/mapr/hive/hive-2.1/lib/datanucleus-rdbms-4.1.7.jar."
2017-09-23 23:52:09,038 INFO [main] DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
2017-09-23 23:52:09,039 INFO [main] DataNucleus.Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
2017-09-23 23:52:14,2251 ERROR JniCommon fs/client/fileclient/cc/jni_MapRClient.cc:2172 Thread: 20235 mkdirs failed for /user/<user_name>, error 13
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: User <user_name>(user id 50005586) has been denied access to create <user_name>
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:617)
at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:531)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:714)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:646)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.hadoop.security.AccessControlException: User <user_name>(user id 50005586) has been denied access to create <user_name>
at com.mapr.fs.MapRFileSystem.makeDir(MapRFileSystem.java:1256)
at com.mapr.fs.MapRFileSystem.mkdirs(MapRFileSystem.java:1276)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1913)
at org.apache.hadoop.hive.ql.exec.tez.DagUtils.getDefaultDestDir(DagUtils.java:823)
at org.apache.hadoop.hive.ql.exec.tez.DagUtils.getHiveJarDirectory(DagUtils.java:917)
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.createJarLocalResource(TezSessionState.java:616)
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.openInternal(TezSessionState.java:256)
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.beginOpen(TezSessionState.java:220)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:614)
... 10 more

The error means you have been denied access to create a directory in the filesystem. That directory is most likely

/user/<user_name>

and it needs to be created for you by the HDFS / MapR FS superuser.
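As a rough sketch, the superuser step might look like the following (the `<user_name>` placeholder comes from the error log above; error 13 in the JniCommon line is EACCES, i.e. "Permission denied"). The exact account and paths depend on your cluster:

```shell
# Run as a filesystem superuser (e.g. the mapr admin user).
# Create the missing home directory and hand ownership to the user,
# so Hive/Tez can create its scratch and jar directories under it.
hadoop fs -mkdir -p /user/<user_name>
hadoop fs -chown <user_name>:<user_name> /user/<user_name>

# Verify the ownership and permissions afterwards:
hadoop fs -ls /user
```

This is an administrative fragment that requires a live cluster and superuser privileges; it is not something the denied user can run themselves.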

I am able to query Hive data from Hue without the AccessControl issue

Hue communicates with HiveServer2 over Thrift.

The Hive CLI bypasses HiveServer2 and is deprecated.

You should use Beeline instead:

beeline -n $(whoami) -u jdbc:hive2://hiveserver:10000/default
If you are on a kerberized cluster, then you need some extra options.
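For example, on a Kerberos-secured cluster the JDBC URL typically has to carry the HiveServer2 service principal. The realm and hostname below are placeholders; the principal must match what HiveServer2 is configured with (`hive.server2.authentication.kerberos.principal`):

```shell
# Obtain a Kerberos ticket first, then connect with the service
# principal embedded in the JDBC URL (the _HOST token is substituted
# with the HiveServer2 host at connection time).
kinit <user_name>@EXAMPLE.COM
beeline -u "jdbc:hive2://hiveserver:10000/default;principal=hive/_HOST@EXAMPLE.COM"
```

This fragment assumes a working Kerberos client configuration and cannot run outside the cluster.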