
Python 3.x: gcloud-equivalent command for creating a Dataproc cluster in Python

How do I replicate the following gcloud command in Python?

gcloud beta dataproc clusters create spark-nlp-cluster \
     --region global \
     --metadata 'PIP_PACKAGES=google-cloud-storage spark-nlp==2.5.3' \
     --worker-machine-type n1-standard-1 \
     --num-workers 2 \
     --image-version 1.4-debian10 \
     --initialization-actions gs://dataproc-initialization-actions/python/pip-install.sh \
     --optional-components=JUPYTER,ANACONDA \
     --enable-component-gateway 
Here is what I have so far in Python:


    cluster_data = {
        "project_id": project,
        "cluster_name": cluster_name,
        "config": {
            "gce_cluster_config": {"zone_uri": zone_uri},
            "master_config": {"num_instances": 1, "machine_type_uri": "n1-standard-1"},
            "worker_config": {"num_instances": 2, "machine_type_uri": "n1-standard-1"},
            "software_config": {
                "image_version": "1.4-debian10",
                "optional_components": ["JUPYTER", "ANACONDA"],
            },
        },
    }

    cluster = dataproc.create_cluster(
        request={"project_id": project, "region": region, "cluster": cluster_data}
    )
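
The snippet above references dataproc without showing how the client is built; a minimal sketch of the assumed setup with the google-cloud-dataproc library:

    from google.cloud import dataproc_v1

    # Assumed client setup: the default endpoint serves the "global" region used
    # above; for a specific region (e.g. "us-central1") pass
    # client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}.
    dataproc = dataproc_v1.ClusterControllerClient()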
I'm not sure how to translate these gcloud flags to Python:

     --metadata 'PIP_PACKAGES=google-cloud-storage spark-nlp==2.5.3' \  
     --initialization-actions gs://dataproc-initialization-actions/python/pip-install.sh \
     --enable-component-gateway
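
A minimal sketch of how those three flags could map onto the ClusterConfig fields of the v1 API (the metadata value and init-action path come from the gcloud command above; project, region, cluster_name, zone_uri and the dataproc client are assumed from the earlier snippet):

    cluster_data = {
        "project_id": project,
        "cluster_name": cluster_name,
        "config": {
            "gce_cluster_config": {
                "zone_uri": zone_uri,
                # --metadata 'PIP_PACKAGES=...' becomes an entry in the metadata map
                "metadata": {"PIP_PACKAGES": "google-cloud-storage spark-nlp==2.5.3"},
            },
            "master_config": {"num_instances": 1, "machine_type_uri": "n1-standard-1"},
            "worker_config": {"num_instances": 2, "machine_type_uri": "n1-standard-1"},
            "software_config": {
                "image_version": "1.4-debian10",
                "optional_components": ["JUPYTER", "ANACONDA"],
            },
            # --initialization-actions becomes a list of NodeInitializationAction messages
            "initialization_actions": [
                {"executable_file": "gs://dataproc-initialization-actions/python/pip-install.sh"}
            ],
            # --enable-component-gateway maps to endpoint_config.enable_http_port_access
            "endpoint_config": {"enable_http_port_access": True},
        },
    }

    operation = dataproc.create_cluster(
        request={"project_id": project, "region": region, "cluster": cluster_data}
    )
    result = operation.result()  # create_cluster returns a long-running operation

Note that create_cluster returns a long-running operation, so waiting on operation.result() blocks until the cluster is actually provisioned.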