Is there a way to run a Python script in a Cloud Build step?


I have a series of Cloud Build steps that upload a pipeline to Kubeflow on GCP. Now I want to run that pipeline in the next step. To do that, I wrote a Python script, and I want the next Cloud Build step to execute it.

Here is my Python script:

import kfp
import os

EXPERIMENT_NAME = 'Covertype_Classifier_Training'
RUN_ID = 'Run_001'
SOURCE_TABLE = 'covertype_dataset.covertype'
DATASET_ID = 'splits'
EVALUATION_METRIC = 'accuracy'
EVALUATION_METRIC_THRESHOLD = '0.69'
MODEL_ID = 'covertype_classifier'
VERSION_ID = 'v01'
REPLACE_EXISTING_VERSION = 'True'
ARTIFACT_STORE_URI = 'gs://hostedkfp-default-e8c59nl4zo'
GCS_STAGING_PATH = '{}/staging'.format(ARTIFACT_STORE_URI)
REGION = 'us-central1'

runname = 'testind'
params = {
    'EXPERIMENT_NAME': 'Covertype_Classifier_Training',
    'project_id': 'kkkkk',
    'gcs_root': 'gs://hostedkfp-default-e8c59nl4zo/staging',
    'region': 'us-central1',
    'source_table_name': 'covertype_dataset.covertype',
    'dataset_id': 'splits',
    'evaluation_metric_name': 'accuracy',
    'evaluation_metric_threshold': '0.69',
    'model_id': 'covertype_classifier',
    'version_id': 'v01',
    'replace_existing_version': 'True',
}

exp_id = 'Covertype_Classifier_Training_4'
client = kfp.Client(host=myhost)  # myhost: the KFP endpoint URL
pipelines = client.list_pipelines()
total_pipeline = len(pipelines.pipelines)
# Pick the most recently listed pipeline and start a run of it.
pipeline_id = pipelines.pipelines[total_pipeline - 1].id
client.run_pipeline(experiment_id=exp_id, job_name=runname, pipeline_id=pipeline_id, params=params)
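A Cloud Build step can run a Python script by using a `python` image and overriding the step's entrypoint. A minimal sketch of such a step in `cloudbuild.yaml`, assuming the script above is saved as `run_pipeline.py` in the repository root (the file name and the extra `pip install` are assumptions, not from the question):

```yaml
steps:
  # Install the KFP SDK, then run the script with the Python interpreter.
  - name: 'python:3.7'
    entrypoint: 'bash'
    args:
      - '-c'
      - 'pip install kfp && python run_pipeline.py'
```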



This might help. I tried this step, but it gives this error: "step #4: docker.io/library/python:3.7 step #4: /usr/local/bin/python: No module named pipeline.__main__; 'pipeline' is a package and cannot be directly executed". This error is discussed here. Can you share your Cloud Build pipeline?
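That error message usually means the step invoked `python -m pipeline` on a package directory that has no `__main__.py`; invoking the script file directly avoids it. A sketch of the two invocations as Cloud Build steps (the file name `run_pipeline.py` is an assumption):

```yaml
steps:
  # Fails when 'pipeline' is a package without a __main__.py:
  # - name: 'python:3.7'
  #   args: ['python', '-m', 'pipeline']

  # Running the script file directly does not require a __main__.py:
  - name: 'python:3.7'
    entrypoint: 'python'
    args: ['run_pipeline.py']
```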