
Hadoop: Installing the HDFS Client on a Server Fails


I am hitting the following error while installing the HDFS client through Ambari. I have reset the server several times, but the problem persists. Any idea how to fix it?

stderr:

Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_client.py", line 120, in <module>
    HdfsClient().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_client.py", line 36, in install
    self.configure(env)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs_client.py", line 41, in configure
    hdfs()
  File "/usr/lib/python2.6/site-packages/ambari_commons/os_family_impl.py", line 89, in thunk
    return fn(*args, **kwargs)
  File "/var/lib/ambari-agent/cache/common-services/HDFS/2.1.0.2.0/package/scripts/hdfs.py", line 61, in hdfs
    group=params.user_group
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/xml_config.py", line 67, in action_create
    encoding = self.resource.encoding
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 87, in action_create
    raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
resource_management.core.exceptions.Fail: Applying File['/usr/hdp/current/hadoop-client/conf/hadoop-policy.xml'] failed, parent directory /usr/hdp/current/hadoop-client/conf doesn't exist

Creating

/usr/hdp/current/hadoop-client/conf

on the failing host fixes the problem.

It is a soft link pointing to /etc/hadoop/conf, so both ends of the link have to exist; see the sketch below.
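
A minimal sketch of recreating both the directory and the link, assuming the default HDP layout (paths may differ on your cluster):

# create the real configuration directory (assumed default location)
mkdir -p /etc/hadoop/conf
# recreate the symlink that the HDFS client install expects
ln -s /etc/hadoop/conf /usr/hdp/current/hadoop-client/conf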

I ran

python /usr/lib/python2.6/site-packages/ambari_agent/HostCleanup.py --silent --skip=users

After running it, the script deletes

/etc/hadoop/conf

but a reinstall does not recreate it, so you may have to recreate all the conf files yourself, for example by copying them from a healthy node (see the sketch below).

Hope someone patches this.
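
One possible way to restore the deleted configuration, assuming another node in the cluster where the client installed cleanly (the hostname below is a placeholder):

# copy the working configuration from a healthy node (hostname is hypothetical)
scp -r good-node:/etc/hadoop/conf /etc/hadoop/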

I ran into the same problem; I was using HDP 2.3.2 on CentOS 7.

First problem: some of the conf symlinks point to the /etc//conf directory (as expected), but /etc//conf in turn points to another conf directory, which produces an endless symlink loop.

I was able to fix this by deleting the /etc//conf symlink and creating the directory in its place; see the sketch below.
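
A minimal sketch of breaking such a loop, using the Hadoop conf directory as an example (which component's link is circular depends on your host):

# a circular link shows up as /etc/hadoop/conf pointing back into /usr/hdp
readlink /etc/hadoop/conf
# remove the circular symlink and replace it with a real directory
rm /etc/hadoop/conf
mkdir /etc/hadoop/conf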

Second problem: if you run the Python cleanup script and start over, several directories, such as the hadoop-client conf directory, are not recreated, which leads to exactly this error message. On top of that, the cleanup script itself does not work properly: it fails to remove several users and directories, so you have to clean those up with userdel and groupdel (see the sketch below).
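
For example, something along these lines, assuming the usual HDP service accounts (the account list on your hosts may differ, so verify before deleting):

# remove leftover HDP service accounts that the cleanup script missed (typical defaults)
for u in hdfs yarn mapred zookeeper ambari-qa; do
  userdel -r "$u"
done
groupdel hadoop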

Update: this appears to be an HDP 2.3.2 problem. With HDP 2.3.4 I no longer run into it.

If you have run the installation several times, some packages may not get cleaned up. To remove all HDP packages and start the installation over, erase hdp-select:

yum -y erase hdp-select

If that does not help, delete all versions from

/usr/hdp

if this directory contains more than one HDP version.

Then remove all installed packages such as hadoop, hdfs, zookeeper, etc.:

yum remove zookeeper* hadoop* hdp*
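
A quick sanity check before reinstalling, to confirm nothing HDP-related is still lying around (a sketch; package names follow the usual HDP naming):

# list any HDP versions still present on disk
ls /usr/hdp
# list anything HDP-related still registered with yum
yum list installed | grep -Ei 'hdp|hadoop|zookeeper'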

You should be using HDP 2.3.2 and Ambari 2.1. @adouang, the ambari-server version is 2.1.2 and the HDP stack is 2.3.

This doesn't actually make a mess, and it may not even fix the problem. All configuration should live in subfolders under /etc, so deleting /etc/hadoop/conf (which is a symlink back to /usr/hdp/current/hadoop-client/conf) and recreating the conf dir is another workaround, though still a dirty one.

That didn't work for me; I started the installation from a fresh OS. Just

yum -y erase hdp-select

helped me fix the failed node.