Submitting a Hadoop job to Dataproc via the Python client


I'm trying to use the Dataproc API and convert a gcloud command into API calls, but I can't find a good example in the documentation.

%pip install google-cloud-dataproc
The only good sample I found is this one, and it works fine:

from google.cloud import dataproc_v1

# Client for managing (listing, creating, deleting) clusters
client = dataproc_v1.ClusterControllerClient()

project_id = 'test-project'
region = 'global'

# List the clusters in the project/region and print their names
for element in client.list_clusters(project_id, region):
    print('Dataproc cluster name:', element.cluster_name)
I need to convert the following gcloud command into Python code:

gcloud dataproc jobs submit hadoop --cluster "${CLUSTER_NAME}" \
    --class com.mycompany.product.MyClass \
    --jars "${JAR_FILE}" -- \
    --job_venv=venv.zip \
    --job_binary_path=venv/bin/python3.5 \
    --job_executes program.py
This works:

from google.cloud import dataproc_v1

project_id = 'your-project'
region = 'global'

# Replace with the GCS URI of your application JAR (the ${JAR_FILE} in the gcloud command)
JAR_FILE = 'gs://your-bucket/path/to/your.jar'

# Define job arguments (everything after "--" in the gcloud command)
job_args = ['--job_venv=venv.zip',
            '--job_binary_path=venv/bin/python3.5',
            '--job_executes program.py']

job_client = dataproc_v1.JobControllerClient()

# Create the Hadoop job definition
hadoop_job = dataproc_v1.types.HadoopJob(
    jar_file_uris=[JAR_FILE],
    main_class='com.mycompany.product.MyClass',
    args=job_args)

# Define the remote cluster to send the job to
job_placement = dataproc_v1.types.JobPlacement()
job_placement.cluster_name = 'your_cluster_name'

# Define the job configuration
main_job = dataproc_v1.types.Job(hadoop_job=hadoop_job, placement=job_placement)

# Submit the job; the returned Job carries the generated job ID
submitted_job = job_client.submit_job(project_id, region, main_job)
print('Submitted job:', submitted_job.reference.job_id)

# Monitor in Dataproc UI or perform another API call to track status
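If you want to track the status from the API instead of the Dataproc UI, you can poll get_job with the job ID returned by submit_job. Here is a minimal sketch, assuming the same older client version as above (positional project_id/region arguments and the dataproc_v1.enums module); the polling interval and the set of terminal states are illustrative:

import time

# Poll the job until it reaches a terminal state
# (assumes the older positional-argument JobControllerClient used above)
job_id = submitted_job.reference.job_id

while True:
    job = job_client.get_job(project_id, region, job_id)
    state = dataproc_v1.enums.JobStatus.State(job.status.state).name
    print('Job state:', state)
    if state in ('DONE', 'ERROR', 'CANCELLED'):
        break
    time.sleep(10)  # illustrative polling interval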