Apache Spark: Installing Spark2 via Ambari on Ubuntu 16.04 fails with "E: Unable to locate package spark-atlas-connector-3-0-1-0-187"

Tags: apache-spark, ubuntu-16.04, ambari, apache-spark-2.0

Tried installing the Spark2 History Server from the Ambari UI.

stderr:

  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/SPARK2/package/scripts/job_history_server.py", line 102, in <module>
    JobHistoryServer().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 352, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/3.0/services/SPARK2/package/scripts/job_history_server.py", line 42, in install
    self.install_packages(env)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 849, in install_packages
    retry_count=agent_stack_retry_count)
  File "/usr/lib/ambari-agent/lib/resource_management/core/base.py", line 166, in __init__
    self.env.run()
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/ambari-agent/lib/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/ambari-agent/lib/resource_management/core/providers/packaging.py", line 30, in action_install
    self._pkg_manager.install_package(package_name, self.__create_context())
  File "/usr/lib/ambari-agent/lib/ambari_commons/repo_manager/apt_manager.py", line 35, in wrapper
    return function_to_decorate(self, name, *args[2:], **kwargs)
  File "/usr/lib/ambari-agent/lib/ambari_commons/repo_manager/apt_manager.py", line 279, in install_package
    shell.repository_manager_executor(cmd, self.properties, context, env=self.properties.install_cmd_env)
  File "/usr/lib/ambari-agent/lib/ambari_commons/shell.py", line 753, in repository_manager_executor
    raise RuntimeError(message)
RuntimeError: Failed to execute command '/usr/bin/apt-get -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install spark-atlas-connector-3-0-1-0-187', exited with code '100', message: 'E: Unable to locate package spark-atlas-connector-3-0-1-0-187
'
Ambari: HDP-3.0.1.0

Ubuntu 16.04

Repo: Ambari main

It always fails when installing from the Ambari UI.

Is there any way to get this installation to succeed, or some other way to work around the problem?


Am I missing something that needs to be checked or configured?
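
One way to narrow this down (a diagnostic sketch; the package name comes from the error above and the ambari-hdp-1.list file name from the agent log below) is to ask apt on the failing host whether it can see the package at all:

    # Refresh the index for all configured repos
    sudo apt-get update

    # Ask apt which repo, if any, provides the package named in the error
    apt-cache policy spark-atlas-connector-3-0-1-0-187

    # Look for any spark-atlas-connector package the repos do carry
    apt-cache search spark-atlas-connector

    # Inspect the HDP repo definitions that Ambari wrote for this host
    cat /etc/apt/sources.list.d/ambari-hdp-1.list

If apt-cache policy also reports "Unable to locate package", apt genuinely cannot see it in the configured HDP 3.0.1.0 repository, i.e. the problem is on the repository side rather than in the Spark2 service itself.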

Did you find a solution to the problem? I have exactly the same problem.
@Depa I did not find a solution, but I was able to install the stack successfully by using a different version URL.
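
Regarding the "different version URL" workaround above: the repository base URLs can be changed in Ambari's version/repository settings, or, for a quick test on a single host, directly in the repo file the agent writes. A minimal sketch, assuming the file name from the log; <NEW_VERSION> is a placeholder for whichever updates release actually contains the package:

    # The file currently points at 3.x/updates/3.0.1.0 (see the agent log).
    # Swap in a different updates release; <NEW_VERSION> is a placeholder.
    sudo sed -i 's|3.x/updates/3.0.1.0|3.x/updates/<NEW_VERSION>|' \
        /etc/apt/sources.list.d/ambari-hdp-1.list
    sudo apt-get update

Note that the agent rewrites this file on every install attempt (visible in the log below), so a lasting fix has to change the base URL in Ambari's version definition, not just on the host. The full agent output (stdout) from the failed install follows.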
2019-12-24 14:48:57,890 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -> 3.0
2019-12-24 14:48:57,894 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2019-12-24 14:48:57,895 - Group['livy'] {}
2019-12-24 14:48:57,896 - Group['ubuntu'] {}
2019-12-24 14:48:57,896 - Group['spark'] {}
2019-12-24 14:48:57,896 - Group['hdfs'] {}
2019-12-24 14:48:57,897 - User['livy'] {'gid': 'ubuntu', 'fetch_nonlocal_groups': True, 'groups': ['livy', 'ubuntu'], 'uid': None}
2019-12-24 14:48:57,897 - User['ubuntu'] {'gid': 'ubuntu', 'fetch_nonlocal_groups': True, 'groups': ['ubuntu'], 'uid': None}
2019-12-24 14:48:57,898 - User['spark'] {'gid': 'ubuntu', 'fetch_nonlocal_groups': True, 'groups': ['ubuntu', 'spark'], 'uid': None}
2019-12-24 14:48:57,898 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2019-12-24 14:48:57,899 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ubuntu /tmp/hadoop-ubuntu,/tmp/hsperfdata_ubuntu,/home/ubuntu,/tmp/ubuntu,/tmp/sqoop-ubuntu 0'] {'not_if': '(test $(id -u ubuntu) -gt 1000) || (false)'}
2019-12-24 14:48:57,921 - Group['ubuntu'] {}
2019-12-24 14:48:57,922 - User['ubuntu'] {'fetch_nonlocal_groups': True, 'groups': ['ubuntu', u'ubuntu']}
2019-12-24 14:48:57,922 - FS Type: HDFS
2019-12-24 14:48:57,922 - Directory['/etc/hadoop'] {'mode': 0755}
2019-12-24 14:48:57,933 - File['/usr/hdp/3.0.1.0-187/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'ubuntu', 'group': 'ubuntu'}
2019-12-24 14:48:57,934 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'ubuntu', 'group': 'ubuntu', 'mode': 01777}
2019-12-24 14:48:57,947 - Repository['HDP-3.0-repo-1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/ubuntu16/3.x/updates/3.0.1.0', 'action': ['prepare'], 'components': [u'HDP', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2019-12-24 14:48:57,951 - Repository with url http://public-repo-1.hortonworks.com/HDP-GPL/ubuntu16/3.x/updates/3.0.1.0 is not created due to its tags: set([u'GPL'])
2019-12-24 14:48:57,951 - Repository['HDP-UTILS-1.1.0.22-repo-1'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/ubuntu16', 'action': ['prepare'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'ambari-hdp-1', 'mirror_list': None}
2019-12-24 14:48:57,952 - Repository[None] {'action': ['create']}
2019-12-24 14:48:57,954 - File['/tmp/tmpv32KZc'] {'content': 'deb http://public-repo-1.hortonworks.com/HDP/ubuntu16/3.x/updates/3.0.1.0 HDP main\ndeb http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.22/repos/ubuntu16 HDP-UTILS main'}
2019-12-24 14:48:57,954 - Writing File['/tmp/tmpv32KZc'] because contents don't match
2019-12-24 14:48:57,954 - File['/tmp/tmpaUg0V6'] {'content': StaticFile('/etc/apt/sources.list.d/ambari-hdp-1.list')}
2019-12-24 14:48:57,955 - Writing File['/tmp/tmpaUg0V6'] because contents don't match
2019-12-24 14:48:57,955 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-12-24 14:48:57,993 - Skipping installation of existing package unzip
2019-12-24 14:48:57,994 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-12-24 14:48:58,028 - Skipping installation of existing package curl
2019-12-24 14:48:58,029 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-12-24 14:48:58,063 - Skipping installation of existing package hdp-select
2019-12-24 14:48:58,067 - The repository with version 3.0.1.0-187 for this command has been marked as resolved. It will be used to report the version of the component which was installed
2019-12-24 14:48:58,214 - Using hadoop conf dir: /usr/hdp/3.0.1.0-187/hadoop/conf
2019-12-24 14:48:58,223 - Package['spark2-3-0-1-0-187'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-12-24 14:48:58,261 - Skipping installation of existing package spark2-3-0-1-0-187
2019-12-24 14:48:58,261 - Package['spark2-3-0-1-0-187-python'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-12-24 14:48:58,295 - Skipping installation of existing package spark2-3-0-1-0-187-python
2019-12-24 14:48:58,296 - Package['livy2-3-0-1-0-187'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-12-24 14:48:58,322 - Skipping installation of existing package livy2-3-0-1-0-187
2019-12-24 14:48:58,322 - Package['spark-atlas-connector-3-0-1-0-187'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2019-12-24 14:48:58,357 - Installing package spark-atlas-connector-3-0-1-0-187 ('/usr/bin/apt-get -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install spark-atlas-connector-3-0-1-0-187')
2019-12-24 14:49:30,887 - The repository with version 3.0.1.0-187 for this command has been marked as resolved. It will be used to report the version of the component which was installed

Command failed after 1 tries
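
Worth noting in the log above: every other Spark2 package (spark2, spark2-python, livy2) is already installed and skipped; only spark-atlas-connector-3-0-1-0-187 is missing. To confirm whether the HDP repo from the log even ships that package, its apt index can be inspected directly. A sketch, assuming the standard Debian repository layout implied by the "deb ... HDP main" line the agent wrote (the dists path is an assumption, not taken from the log):

    # List the package names published in the HDP 3.0.1.0 / Ubuntu 16.04 repo
    # and look for the Atlas connector. An empty result means the repo does
    # not ship the package this stack version asks for.
    curl -s http://public-repo-1.hortonworks.com/HDP/ubuntu16/3.x/updates/3.0.1.0/dists/HDP/main/binary-amd64/Packages \
      | grep -i '^Package:.*atlas'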