Hadoop cannot read HiveServer2 configs from ZooKeeper

Tags: hadoop, hive, hortonworks-data-platform, kylin

I am using HDP 3.1, with Ambari deploying the Hadoop cluster and Hive. After deployment, I can run Hive successfully from the shell. I then deployed Apache Kylin 2.6, and it can sync Hive tables. But when I build a cube, I get the following error:

java.io.IOException: OS command error exit with return code: 1, error message: SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/3.1.0.0-78/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.1.0.0-78/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Connecting to jdbc:hive2://datacenter1:2181,datacenter2:2181,datacenter3:2181/default;password=hdfs;serviceDiscoveryMode=zooKeeper;user=hdfs;zooKeeperNamespace=hiveserver2
19/02/15 10:04:53 [main]: INFO jdbc.HiveConnection: Connected to datacenter3:10000
19/02/15 10:04:53 [main]: WARN jdbc.HiveConnection: Failed to connect to datacenter3:10000
19/02/15 10:04:53 [main]: ERROR jdbc.Utils: Unable to read HiveServer2 configs from ZooKeeper
Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify dfs.replication at runtime. It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0)
Cannot run commands specified using -e. No current connection
The command is: 
hive -e "USE default;
I ran the Hive command in the shell, and it succeeded. The connection string is the same one used when Kylin builds the cube. I don't understand why it succeeds in the shell but fails during the cube build:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/3.1.0.0-78/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.1.0.0-78/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Connecting to jdbc:hive2://datacenter1:2181,datacenter2:2181,datacenter3:2181/default;password=hdfs;serviceDiscoveryMode=zooKeeper;user=hdfs;zooKeeperNamespace=hiveserver2
19/02/15 12:10:19 [main]: INFO jdbc.HiveConnection: Connected to datacenter3:10000
Connected to: Apache Hive (version 3.1.0.3.1.0.0-78)
Driver: Hive JDBC (version 3.1.0.3.1.0.0-78)
Transaction isolation: TRANSACTION_REPEATABLE_READ
Beeline version 3.1.0.3.1.0.0-78 by Apache Hive
0: jdbc:hive2://datacenter1:2181,datacenter2:> 

You can try adding these two properties to hive-site.xml:

<property>
  <name>hive.security.authorization.sqlstd.confwhitelist</name>
  <value>mapred.*|hive.*|mapreduce.*|spark.*</value>
</property>

<property>
  <name>hive.security.authorization.sqlstd.confwhitelist.append</name>
  <value>mapred.*|hive.*|mapreduce.*|spark.*</value>
</property>
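After restarting HiveServer2, one way to check that the whitelist took effect is to print the property from a Beeline session; in Hive, `set <name>;` with no value shows the current setting. This is a sketch only — the property name is the one from the answer above, and `mapreduce.job.reduces` is just an example of a property matched by the `mapreduce.*` pattern:

```sql
-- Print the effective whitelist append pattern
set hive.security.authorization.sqlstd.confwhitelist.append;

-- A property matched by the whitelist can now be modified at runtime
set mapreduce.job.reduces=4;
```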


Finally, I found the root cause. The error log contains the message "Cannot modify dfs.replication at runtime." Kylin sets this property in $KYLIN_HOME/conf/kylin_hive_conf.xml, and when it runs a Hive command it automatically appends the properties from that file, so the final command becomes:

hive --hiveconf dfs.replication=2 ...

It appears the dfs.replication property cannot be applied to a Hive command this way. I removed the property from kylin_hive_conf.xml, and now it works.
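For reference, the offending entry in $KYLIN_HOME/conf/kylin_hive_conf.xml would look roughly like this (the value `2` is inferred from the appended `--hiveconf dfs.replication=2` above; the exact description text in your file may differ). Deleting this whole `<property>` block is what resolves the error:

```xml
<!-- Remove this block so Kylin stops appending
     "-hiveconf dfs.replication=2" to the Hive commands it runs. -->
<property>
    <name>dfs.replication</name>
    <value>2</value>
</property>
```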

Comment: It looks like the multiple SLF4J bindings are causing your problem — remove one of them and retry. Reply: I removed one of the SLF4J jars, but the problem still exists.