Google Cloud Platform: how to pass and access parameters when submitting a PySpark job from the console?


Currently we have a sample.py file on Google Storage, and we need to pass parameters to this script from the console.

#sample.py
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
import sys


# Positional job arguments; sys.argv[0] is the script path itself
reg = sys.argv[1]
month = sys.argv[2]
current_date = sys.argv[3]
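
Note that values in sys.argv always arrive as strings; if month or current_date need to be typed, a minimal conversion sketch (assuming an integer month, an ISO-formatted date, and Python 3.7+ for date.fromisoformat) would look like:

from datetime import date
import sys

reg = sys.argv[1]                               # e.g. 'abc'
month = int(sys.argv[2])                        # e.g. 11
current_date = date.fromisoformat(sys.argv[3])  # e.g. datetime.date(2019, 12, 5)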
We are trying to submit the job using the following command:

gcloud dataproc jobs submit pyspark --project=my_project --cluster=my_cluster --region=region_1 gs://shashi/python-scripts/sample.py abc 11 2019-12-05
It gives the following error:

ERROR: (gcloud.dataproc.jobs.submit.pyspark) argument --properties: Bad syntax for dict arg: [spark.driver.memory]. Please see `gcloud topic flags-file` or `gcloud topic escaping` for information on providing list or dictionary flag values with special characters.
Usage: gcloud dataproc jobs submit pyspark PY_FILE --cluster=CLUSTER [optional flags] [-- JOB_ARGS ...]
  optional flags may be  --archives | --async | --bucket | --driver-log-levels |
                         --files | --help | --jars | --labels |
                         --max-failures-per-hour | --properties | --py-files |
                         --region

You forgot to include the `--` separator before your job arguments:

gcloud dataproc jobs submit pyspark --project=my_project --cluster=my_cluster --region=region_1 gs://shashi/python-scripts/sample.py -- abc 11 2019-12-05
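
Everything after the standalone `--` is forwarded to the script verbatim as JOB_ARGS (see the Usage line in the error message) instead of being parsed as gcloud flags. Because flag-like tokens after the separator are forwarded untouched as well, you could also switch to named arguments for readability; a minimal sketch (the --reg/--month/--current-date flag names here are illustrative, not part of the original script):

#sample.py, hypothetical variant using named flags via argparse
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--reg")
parser.add_argument("--month")
parser.add_argument("--current-date")          # accessible as args.current_date
args = parser.parse_args()                     # parses sys.argv[1:], i.e. everything after '--'

which would be submitted as:

gcloud dataproc jobs submit pyspark --project=my_project --cluster=my_cluster --region=region_1 gs://shashi/python-scripts/sample.py -- --reg=abc --month=11 --current-date=2019-12-05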