HDP on Google Cloud Platform


I created a 6-node Dataproc cluster and I am facing the following problem while installing bdutil:

******************* gcloud compute stderr *******************
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
 - Insufficient Permission
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
 - Insufficient Permission
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
 - Insufficient Permission
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
 - Insufficient Permission
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
 - Insufficient Permission
************ ERROR logs from gcloud compute stderr ************
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
******************* Exit codes and VM logs *******************
Sun Sep 23 23:54:02 UTC 2018: Exited 1 : gcloud --project=hdpgcp-217320 --quiet --verbosity=info compute disks create --size=1500 --type=pd-standard hadoop-w-0-pd --zone=zone(unset)
Sun Sep 23 23:54:02 UTC 2018: Exited 1 : gcloud --project=hdpgcp-217320 --quiet --verbosity=info compute disks create --size=1500 --type=pd-standard hadoop-w-1-pd --zone=zone(unset)
Sun Sep 23 23:54:02 UTC 2018: Exited 1 : gcloud --project=hdpgcp-217320 --quiet --verbosity=info compute disks create --size=1500 --type=pd-standard hadoop-w-2-pd --zone=zone(unset)
Sun Sep 23 23:54:02 UTC 2018: Exited 1 : gcloud --project=hdpgcp-217320 --quiet --verbosity=info compute disks create --size=1500 --type=pd-standard hadoop-w-3-pd --zone=zone(unset)
Sun Sep 23 23:54:02 UTC 2018: Exited 1 : gcloud --project=hdpgcp-217320 --quiet --verbosity=info compute disks create --size=1500 --type=pd-standard hadoop-m-pd --zone=zone(unset)

HDP and Dataproc are different products. What I mean is that running bdutil does not require creating a Dataproc cluster; running it from a single instance is enough, because all the required configuration is set in bdutil_env.sh / ambari.conf. bdutil does not create any Dataproc cluster; instead it creates custom VM instances to host HDP.
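
For reference, here is a minimal sketch of the kind of values bdutil reads from bdutil_env.sh before it starts creating VMs and disks; the variable names below are assumptions based on common bdutil releases, so verify them against the comments in your own copy of the file:

    # bdutil_env.sh -- illustrative values only; check the variable names
    # against the comments shipped with your bdutil version
    PROJECT=hdpgcp-217320           # GCP project that will own the VMs and disks
    CONFIGBUCKET=my-staging-bucket  # hypothetical GCS bucket used for staging files
    GCE_ZONE=us-central1-a          # leaving the zone unset produces the zone(unset) errors above
    NUM_WORKERS=4                   # number of worker VMs to create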

Here are some steps that are not well documented:

  • I set the variables and the permission problem went away. Most likely this is the problem you are facing.
    1.1 If that does not work, run the following command:
    gcloud auth activate-service-account --key-file=/PATH/JSON_CREDENTIALS

  • If other errors show up as "Invalid value zone(unset)", just set the zone in bdutil_env.sh.
    2.1 If the same error persists, go directly to platforms/hdp/ambari.conf and update the configuration there.

  • You need to set up permissive firewall rules so you can reach the instances, so the nodes can communicate with each other, and so you can reach Ambari on the master node (see the sketch after this list).


  • After completing the steps above, I was able to install HDP through Ambari.
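
As a concrete illustration of the zone and firewall steps above, the commands below show one possible setup; the rule names, ports, and IP ranges are examples for this sketch (Ambari's web UI commonly listens on port 8080) and should be adapted to your project:

    # Set a default compute zone so gcloud stops reporting zone(unset);
    # the zone itself is just an example
    gcloud config set compute/zone us-central1-a

    # Hypothetical rule: allow SSH and the Ambari web UI from your own address range
    gcloud compute firewall-rules create allow-ambari-and-ssh \
        --network=default \
        --allow=tcp:22,tcp:8080 \
        --source-ranges=203.0.113.0/24

    # Hypothetical rule: let the cluster nodes talk to each other;
    # 10.128.0.0/9 covers the default-network subnets in many projects, verify yours
    gcloud compute firewall-rules create allow-internal-hdp \
        --network=default \
        --allow=tcp,udp,icmp \
        --source-ranges=10.128.0.0/9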

But what are you trying to do? Please post your code, otherwise we cannot tell where the error occurs.

I am installing Ambari on Google Cloud and I am following this documentation. When I try to deploy Ambari I run into the problem above. The command is ./bdutil -e ambari deploy