Hadoop: installing HDP 3.1.0.0 with Ambari 2.7.4.0 on CentOS 7

I get the following error while installing HDP-3.1.0.0 with Ambari 2.7.4. Can you help me resolve it? I use a proxy configured in yum.conf to connect to the internet, and the Ambari server is running as the root user.
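For reference, a proxy setup like the one described usually lives in `/etc/yum.conf`; the fragment below is only an illustration, and the proxy host, port, and credentials are placeholders, not values from the original post:

```shell
# /etc/yum.conf -- illustrative proxy settings; host/port are placeholders
proxy=http://proxy.example.com:3128
# Uncomment if the proxy requires authentication:
# proxy_username=youruser
# proxy_password=yourpass
```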

=============

`stderr: 
There was an error communicating with RHN.
Red Hat Satellite or RHN Classic support will be disabled.
rhn-plugin: Error communicating with server. The message was:
Connection refused
There was an error communicating with RHN.
Red Hat Satellite or RHN Classic support will be disabled.
rhn-plugin: Error communicating with server. The message was:
Connection refused
There was an error communicating with RHN.
Red Hat Satellite or RHN Classic support will be disabled.
rhn-plugin: Error communicating with server. The message was:
Connection refused
2019-11-22 16:25:32,084 - Reporting component version failed
Traceback (most recent call last):
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 363, in execute
    self.save_component_version_to_structured_out(self.command_name)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 223, in save_component_version_to_structured_out
    stack_select_package_name = stack_select.get_package_name()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 109, in get_package_name
    package = get_packages(PACKAGE_SCOPE_STACK_SELECT, service_name, component_name)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 223, in get_packages
    supported_packages = get_supported_packages()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 147, in get_supported_packages
    raise Fail("Unable to query for supported packages using {0}".format(stack_selector_path))
Fail: Unable to query for supported packages using /usr/bin/hdp-select
There was an error communicating with RHN.
Red Hat Satellite or RHN Classic support will be disabled.
rhn-plugin: Error communicating with server. The message was:
Connection refused
2019-11-22 16:28:13,557 - Reporting component version failed
Traceback (most recent call last):
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 363, in execute
    self.save_component_version_to_structured_out(self.command_name)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 223, in save_component_version_to_structured_out
    stack_select_package_name = stack_select.get_package_name()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 109, in get_package_name
    package = get_packages(PACKAGE_SCOPE_STACK_SELECT, service_name, component_name)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 223, in get_packages
    supported_packages = get_supported_packages()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 147, in get_supported_packages
    raise Fail("Unable to query for supported packages using {0}".format(stack_selector_path))
Fail: Unable to query for supported packages using /usr/bin/hdp-select
2019-11-22 16:28:14,080 - Reporting component version failed
Traceback (most recent call last):
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 363, in execute
    self.save_component_version_to_structured_out(self.command_name)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 223, in save_component_version_to_structured_out
    stack_select_package_name = stack_select.get_package_name()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 109, in get_package_name
    package = get_packages(PACKAGE_SCOPE_STACK_SELECT, service_name, component_name)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 223, in get_packages
    supported_packages = get_supported_packages()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 147, in get_supported_packages
    raise Fail("Unable to query for supported packages using {0}".format(stack_selector_path))
Fail: Unable to query for supported packages using /usr/bin/hdp-select
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stack-hooks/after-INSTALL/scripts/hook.py", line 39, in <module>
    AfterInstallHook().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stack-hooks/after-INSTALL/scripts/hook.py", line 32, in hook
    setup_stack_symlinks(self.stroutfile)
  File "/var/lib/ambari-agent/cache/stack-hooks/after-INSTALL/scripts/shared_initialization.py", line 53, in setup_stack_symlinks
    stack_packages = stack_select.get_packages(stack_select.PACKAGE_SCOPE_INSTALL)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 223, in get_packages
    supported_packages = get_supported_packages()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 147, in get_supported_packages
    raise Fail("Unable to query for supported packages using {0}".format(stack_selector_path))
resource_management.core.exceptions.Fail: Unable to query for supported packages using /usr/bin/hdp-select
 stdout:
2019-11-22 16:24:59,287 - Stack Feature Version Info: Cluster Stack=3.1, Command Stack=None, Command Version=None -> 3.1
2019-11-22 16:24:59,297 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2019-11-22 16:24:59,300 - Group['kms'] {}
2019-11-22 16:24:59,302 - Group['livy'] {}
2019-11-22 16:24:59,302 - Group['spark'] {}
2019-11-22 16:24:59,303 - Group['ranger'] {}
2019-11-22 16:24:59,303 - Group['hdfs'] {}
2019-11-22 16:24:59,304 - Group['zeppelin'] {}
2019-11-22 16:24:59,304 - Group['hadoop'] {}
2019-11-22 16:24:59,304 - Group['users'] {}
2019-11-22 16:24:59,305 - Group['knox'] {}
2019-11-22 16:24:59,306 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-11-22 16:24:59,314 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-11-22 16:24:59,319 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-11-22 16:24:59,325 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-11-22 16:24:59,331 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-11-22 16:24:59,336 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2019-11-22 16:24:59,342 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-11-22 16:24:59,347 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-11-22 16:24:59,353 - User['ranger'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['ranger', 'hadoop'], 'uid': None}
2019-11-22 16:24:59,358 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2019-11-22 16:24:59,363 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['zeppelin', 'hadoop'], 'uid': None}
2019-11-22 16:24:59,369 - User['kms'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['kms', 'hadoop'], 'uid': None}
2019-11-22 16:24:59,375 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-11-22 16:24:59,380 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'hadoop'], 'uid': None}
2019-11-22 16:24:59,386 - User['druid'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-11-22 16:24:59,392 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['spark', 'hadoop'], 'uid': None}
2019-11-22 16:24:59,398 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'users'], 'uid': None}
2019-11-22 16:24:59,404 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-11-22 16:24:59,409 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop'], 'uid': None}
2019-11-22 16:24:59,415 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-11-22 16:24:59,421 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-11-22 16:24:59,426 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-11-22 16:24:59,433 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2019-11-22 16:24:59,444 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop', 'knox'], 'uid': None}
2019-11-22 16:24:59,455 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-11-22 16:24:59,458 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2019-11-22 16:24:59,473 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2019-11-22 16:24:59,474 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2019-11-22 16:24:59,478 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-11-22 16:24:59,481 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-11-22 16:24:59,483 - call['/var/lib/ambari-agent/tmp/changeUid.sh hbase'] {}
2019-11-22 16:24:59,505 - call returned (0, '1033')
2019-11-22 16:24:59,506 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1033'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2019-11-22 16:24:59,518 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase 1033'] due to not_if
2019-11-22 16:24:59,519 - Group['hdfs'] {}
2019-11-22 16:24:59,520 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': ['hdfs', 'hadoop', u'hdfs']}
2019-11-22 16:24:59,523 - FS Type: HDFS
2019-11-22 16:24:59,524 - Directory['/etc/hadoop'] {'mode': 0755}
2019-11-22 16:24:59,525 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2019-11-22 16:24:59,554 - Repository['HDP-3.1-repo-153'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.1.0.0', 'action': ['prepare'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-153', 'mirror_list': None}
2019-11-22 16:24:59,567 - Repository['HDP-3.1-GPL-repo-153'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.1.0.0', 'action': ['prepare'], 'components': [u'HDP-GPL', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-153', 'mirror_list': None}
2019-11-22 16:24:59,571 - Repository['HDP-UTILS-1.1.0.22-repo-153'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7', 'action': ['prepare'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'ambari-hdp-153', 'mirror_list': None}
2019-11-22 16:24:59,574 - Repository[None] {'action': ['create']}
2019-11-22 16:24:59,576 - File['/tmp/tmp0eMg1W'] {'content': '[HDP-3.1-repo-153]\nname=HDP-3.1-repo-153\nbaseurl=http://public-repo-1.hortonworks.com/HDP/centos7/3.x/updates/3.1.0.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-3.1-GPL-repo-153]\nname=HDP-3.1-GPL-repo-153\nbaseurl=http://public-repo-1.hortonworks.com/HDP-GPL/centos7/3.x/updates/3.1.0.0\n\npath=/\nenabled=1\ngpgcheck=0\n[HDP-UTILS-1.1.0.22-repo-153]\nname=HDP-UTILS-1.1.0.22-repo-153\nbaseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2019-11-22 16:24:59,577 - Writing File['/tmp/tmp0eMg1W'] because contents don't match
2019-11-22 16:24:59,578 - Rewriting /etc/yum.repos.d/ambari-hdp-153.repo since it has changed.
2019-11-22 16:24:59,578 - File['/etc/yum.repos.d/ambari-hdp-153.repo'] {'content': StaticFile('/tmp/tmp0eMg1W')}
2019-11-22 16:24:59,580 - Writing File['/etc/yum.repos.d/ambari-hdp-153.repo'] because it doesn't exist
2019-11-22 16:24:59,581 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-11-22 16:25:00,081 - Skipping installation of existing package unzip
2019-11-22 16:25:00,081 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-11-22 16:25:00,380 - Skipping installation of existing package curl
2019-11-22 16:25:00,381 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-11-22 16:25:00,697 - Installing package hdp-select ('/usr/bin/yum -y install hdp-select')
2019-11-22 16:25:32,084 - Reporting component version failed
Traceback (most recent call last):
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 363, in execute
    self.save_component_version_to_structured_out(self.command_name)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 223, in save_component_version_to_structured_out
    stack_select_package_name = stack_select.get_package_name()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 109, in get_package_name
    package = get_packages(PACKAGE_SCOPE_STACK_SELECT, service_name, component_name)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 223, in get_packages
    supported_packages = get_supported_packages()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 147, in get_supported_packages
    raise Fail("Unable to query for supported packages using {0}".format(stack_selector_path))
Fail: Unable to query for supported packages using /usr/bin/hdp-select
2019-11-22 16:25:32,505 - Command repositories: HDP-3.1-repo-153, HDP-3.1-GPL-repo-153, HDP-UTILS-1.1.0.22-repo-153
2019-11-22 16:25:32,505 - Applicable repositories: HDP-3.1-repo-153, HDP-3.1-GPL-repo-153, HDP-UTILS-1.1.0.22-repo-153
2019-11-22 16:25:32,506 - Looking for matching packages in the following repositories: HDP-3.1-repo-153, HDP-3.1-GPL-repo-153, HDP-UTILS-1.1.0.22-repo-153
2019-11-22 16:26:06,035 - Package['accumulo_3_1_0_0_78'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-11-22 16:26:06,268 - Installing package accumulo_3_1_0_0_78 ('/usr/bin/yum -y install accumulo_3_1_0_0_78')
2019-11-22 16:28:13,372 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2019-11-22 16:28:13,376 - XmlConfig['accumulo-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/accumulo-client/conf', 'mode': 0644, 'configuration_attributes': {}, 'owner': 'accumulo', 'configurations': ...}
2019-11-22 16:28:13,396 - Generating config: /usr/hdp/current/accumulo-client/conf/accumulo-site.xml
2019-11-22 16:28:13,397 - File['/usr/hdp/current/accumulo-client/conf/accumulo-site.xml'] {'owner': 'accumulo', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': 0644, 'encoding': 'UTF-8'}
2019-11-22 16:28:13,415 - Writing File['/usr/hdp/current/accumulo-client/conf/accumulo-site.xml'] because contents don't match
2019-11-22 16:28:13,423 - File['/usr/hdp/current/accumulo-client/conf/accumulo-env.sh'] {'content': InlineTemplate(...), 'owner': 'accumulo', 'group': 'hadoop', 'mode': 0644}
2019-11-22 16:28:13,424 - Writing File['/usr/hdp/current/accumulo-client/conf/accumulo-env.sh'] because contents don't match
2019-11-22 16:28:13,428 - PropertiesFile['/usr/hdp/current/accumulo-client/conf/client.conf'] {'owner': 'accumulo', 'group': 'hadoop', 'properties': {'instance.zookeeper.host': u'server1.aaa.com:2181,server3.aaa.com:2181,server2.aaa.com:2181', 'instance.name': u'hdp-accumulo-instance', 'instance.zookeeper.timeout': u'30s'}}
2019-11-22 16:28:13,438 - Generating properties file: /usr/hdp/current/accumulo-client/conf/client.conf
2019-11-22 16:28:13,439 - File['/usr/hdp/current/accumulo-client/conf/client.conf'] {'owner': 'accumulo', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
2019-11-22 16:28:13,442 - Writing File['/usr/hdp/current/accumulo-client/conf/client.conf'] because contents don't match
2019-11-22 16:28:13,445 - File['/usr/hdp/current/accumulo-client/conf/log4j.properties'] {'content': ..., 'owner': 'accumulo', 'group': 'hadoop', 'mode': 0644}
2019-11-22 16:28:13,447 - TemplateConfig['/usr/hdp/current/accumulo-client/conf/auditLog.xml'] {'owner': 'accumulo', 'template_tag': None, 'group': 'hadoop'}
2019-11-22 16:28:13,453 - File['/usr/hdp/current/accumulo-client/conf/auditLog.xml'] {'content': Template('auditLog.xml.j2'), 'owner': 'accumulo', 'group': 'hadoop', 'mode': None}
2019-11-22 16:28:13,456 - TemplateConfig['/usr/hdp/current/accumulo-client/conf/generic_logger.xml'] {'owner': 'accumulo', 'template_tag': None, 'group': 'hadoop'}
2019-11-22 16:28:13,460 - File['/usr/hdp/current/accumulo-client/conf/generic_logger.xml'] {'content': Template('generic_logger.xml.j2'), 'owner': 'accumulo', 'group': 'hadoop', 'mode': None}
2019-11-22 16:28:13,463 - TemplateConfig['/usr/hdp/current/accumulo-client/conf/monitor_logger.xml'] {'owner': 'accumulo', 'template_tag': None, 'group': 'hadoop'}
2019-11-22 16:28:13,467 - File['/usr/hdp/current/accumulo-client/conf/monitor_logger.xml'] {'content': Template('monitor_logger.xml.j2'), 'owner': 'accumulo', 'group': 'hadoop', 'mode': None}
2019-11-22 16:28:13,470 - File['/usr/hdp/current/accumulo-client/conf/accumulo-metrics.xml'] {'content': StaticFile('accumulo-metrics.xml'), 'owner': 'accumulo', 'group': 'hadoop', 'mode': 0644}
2019-11-22 16:28:13,473 - TemplateConfig['/usr/hdp/current/accumulo-client/conf/tracers'] {'owner': 'accumulo', 'template_tag': None, 'group': 'hadoop'}
2019-11-22 16:28:13,476 - File['/usr/hdp/current/accumulo-client/conf/tracers'] {'content': Template('tracers.j2'), 'owner': 'accumulo', 'group': 'hadoop', 'mode': None}
2019-11-22 16:28:13,479 - TemplateConfig['/usr/hdp/current/accumulo-client/conf/gc'] {'owner': 'accumulo', 'template_tag': None, 'group': 'hadoop'}
2019-11-22 16:28:13,487 - File['/usr/hdp/current/accumulo-client/conf/gc'] {'content': Template('gc.j2'), 'owner': 'accumulo', 'group': 'hadoop', 'mode': None}
2019-11-22 16:28:13,491 - TemplateConfig['/usr/hdp/current/accumulo-client/conf/monitor'] {'owner': 'accumulo', 'template_tag': None, 'group': 'hadoop'}
2019-11-22 16:28:13,499 - File['/usr/hdp/current/accumulo-client/conf/monitor'] {'content': Template('monitor.j2'), 'owner': 'accumulo', 'group': 'hadoop', 'mode': None}
2019-11-22 16:28:13,503 - TemplateConfig['/usr/hdp/current/accumulo-client/conf/slaves'] {'owner': 'accumulo', 'template_tag': None, 'group': 'hadoop'}
2019-11-22 16:28:13,506 - File['/usr/hdp/current/accumulo-client/conf/slaves'] {'content': Template('slaves.j2'), 'owner': 'accumulo', 'group': 'hadoop', 'mode': None}
2019-11-22 16:28:13,508 - TemplateConfig['/usr/hdp/current/accumulo-client/conf/masters'] {'owner': 'accumulo', 'template_tag': None, 'group': 'hadoop'}
2019-11-22 16:28:13,511 - File['/usr/hdp/current/accumulo-client/conf/masters'] {'content': Template('masters.j2'), 'owner': 'accumulo', 'group': 'hadoop', 'mode': None}
2019-11-22 16:28:13,514 - TemplateConfig['/usr/hdp/current/accumulo-client/conf/hadoop-metrics2-accumulo.properties'] {'owner': 'accumulo', 'template_tag': None, 'group': 'hadoop'}
2019-11-22 16:28:13,524 - File['/usr/hdp/current/accumulo-client/conf/hadoop-metrics2-accumulo.properties'] {'content': Template('hadoop-metrics2-accumulo.properties.j2'), 'owner': 'accumulo', 'group': 'hadoop', 'mode': None}
2019-11-22 16:28:13,557 - Reporting component version failed
Traceback (most recent call last):
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 363, in execute
    self.save_component_version_to_structured_out(self.command_name)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 223, in save_component_version_to_structured_out
    stack_select_package_name = stack_select.get_package_name()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 109, in get_package_name
    package = get_packages(PACKAGE_SCOPE_STACK_SELECT, service_name, component_name)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 223, in get_packages
    supported_packages = get_supported_packages()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 147, in get_supported_packages
    raise Fail("Unable to query for supported packages using {0}".format(stack_selector_path))
Fail: Unable to query for supported packages using /usr/bin/hdp-select
2019-11-22 16:28:14,031 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2019-11-22 16:28:14,080 - Reporting component version failed
Traceback (most recent call last):
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 363, in execute
    self.save_component_version_to_structured_out(self.command_name)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 223, in save_component_version_to_structured_out
    stack_select_package_name = stack_select.get_package_name()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 109, in get_package_name
    package = get_packages(PACKAGE_SCOPE_STACK_SELECT, service_name, component_name)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 223, in get_packages
    supported_packages = get_supported_packages()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 147, in get_supported_packages
    raise Fail("Unable to query for supported packages using {0}".format(stack_selector_path))
Fail: Unable to query for supported packages using /usr/bin/hdp-select

Command failed after 1 tries`
=============
When I run the command `/usr/bin/hdp-select` manually on the server, I do get output from it.
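The repeated RHN "Connection refused" messages in the stderr above come from yum's `rhn-plugin`, which serves no purpose on a CentOS 7 host; the actual failure is that `hdp-select` was not usable when the agent invoked it right after installation. A sketch of checks and a likely fix, assuming the plugin configuration lives in the usual `/etc/yum/pluginconf.d/rhnplugin.conf` location:

```shell
# Silence the rhn-plugin noise (harmless on CentOS, but it slows every yum run)
sudo sed -i 's/^enabled *= *1/enabled = 0/' /etc/yum/pluginconf.d/rhnplugin.conf

# Re-check hdp-select the way the agent uses it: full path, non-interactive shell
/usr/bin/hdp-select versions

# If the query still fails for the agent, reinstalling the package often helps
sudo yum clean all
sudo yum -y reinstall hdp-select
```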

Completely remove the Ambari server and agents from all nodes, and delete the Ambari repo files. Then run `apt-get update` (note: on CentOS 7 the equivalent is `yum clean all` followed by `yum makecache`).


Then download everything again and reinstall.
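On CentOS 7, the removal-and-reinstall steps above would look roughly like the following on every node; the repo-file glob is an assumption based on the log, which shows `/etc/yum.repos.d/ambari-hdp-153.repo`:

```shell
# Stop and remove the Ambari agent (and the server, on the server host)
sudo ambari-agent stop || true
sudo yum -y erase ambari-agent ambari-server

# Remove the Ambari-managed repo files seen in the log
sudo rm -f /etc/yum.repos.d/ambari*.repo

# Refresh yum metadata (CentOS equivalent of 'apt-get update')
sudo yum clean all
sudo yum makecache

# ...then download the Ambari repository file again and reinstall.
```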
