PySpark Airflow SparkSubmitOperator

I issue a spark2-submit command as follows:

    from airflow.models import Variable
    from airflow.contrib.operators.ssh_operator import SSHOperator  # Airflow 2+: airflow.providers.ssh.operators.ssh

    value = Variable.get('value')
    # The trailing backslashes make Python strip the newlines, so the remote
    # shell receives a single spark2-submit command instead of one command per line.
    cmd = """
        spark2-submit --master yarn --deploy-mode cluster \
            --driver-memory=10G \
            --conf spark.dynamicAllocation.minExecutors=5 \
            --conf spark.dynamicAllocation.maxExecutors=10 \
            --queue test \
            --executor-memory=10G \
            --executor-cores=2 \
            --conf spark.yarn.driver.memoryOverhead=5120 \
            --conf spark.driver.maxResultSize=2G \
            --conf spark.yarn.executor.memoryOverhead=5120 \
            --conf spark.kryoserializer.buffer.max=1000m \
            --conf spark.executor.extraJavaOptions=-XX:+UseG1GC \
            --conf spark.network.timeout=15000s \
            --conf spark.executor.heartbeatInterval=1500s \
            --conf spark.task.maxDirectResultSize=8G \
            --principal test-host@test \
            --keytab /home/test-host.keytab \
            --conf spark.ui.view.acls="*" \
            /home/test/test.py {0}
        """.format(value)

    test = SSHOperator(task_id='TEST',
                       ssh_conn_id='test-conn',
                       command=cmd)

I want to convert this to a SparkSubmitOperator. Also, I need it to use spark2-submit.

How can I convert the above to a SparkSubmitOperator? So far I have tried:

    SparkSubmitOperator(task_id='TEST',
                        conn_id='test-conn',
                        application='/home/test/test.py {0}'.format(value),
                        executor_cores=2,
                        executor_memory='10g')


The options required by Airflow's SparkSubmitOperator can be sent in dictionaries. Keep in mind that the keys in the dictionaries should match the operator's parameter names.

Create the following two dictionaries:

    # keys use underscores so they match the operator's parameter names when unpacked with **
    base_config = {
        "task_id": "TEST",
        "conn_id": "test-conn",
        "application": "/home/test/test.py",
        "executor_memory": "10G",
        "driver_memory": "10G",
        "executor_cores": 2,
        "principal": "test-host@test",
        "keytab": "/home/test-host.keytab",
        "env_vars": {"SPARK_MAJOR_VERSION": "2"}
    }

    spark_config = {
        # master and deploy mode are normally taken from the Airflow connection;
        # they are listed here to mirror the original spark2-submit command,
        # which used --deploy-mode cluster
        "spark.master": "yarn",
        "spark.submit.deployMode": "cluster",
        "spark.yarn.queue": "test",
        "spark.dynamicAllocation.minExecutors": 5,
        "spark.dynamicAllocation.maxExecutors": 10,
        "spark.yarn.driver.memoryOverhead": 5120,
        "spark.driver.maxResultSize": "2G",
        "spark.yarn.executor.memoryOverhead": 5120,
        "spark.kryoserializer.buffer.max": "1000m",
        "spark.executor.extraJavaOptions": "-XX:+UseG1GC",
        "spark.network.timeout": "15000s",
        "spark.executor.heartbeatInterval": "1500s",
        "spark.task.maxDirectResultSize": "8G",
        "spark.ui.view.acls": "*"
    }

    SparkSubmitOperator(**base_config, conf=spark_config)

This keeps your task configuration data-driven from Airflow.
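
For completeness, here is a minimal sketch of how the pieces might fit together in a DAG file, using the base_config and spark_config dictionaries above. The import path assumes the apache-airflow-providers-apache-spark package (on Airflow 1.x it would come from airflow.contrib.operators.spark_submit_operator), and application_args is used to carry the value that the original command appended as {0}:

    from airflow.models import Variable
    from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

    value = Variable.get('value')

    # base_config and spark_config are the dictionaries defined above
    test = SparkSubmitOperator(
        **base_config,
        conf=spark_config,
        application_args=[value]  # the argument the original command passed as {0}
    )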

Will it use spark2-submit or just spark-submit?

That depends on your Spark installation; by default it runs spark-submit. You can pass a dictionary of environment variables, and I have added that change to the answer.
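
If you need to force spark2-submit explicitly, the hook behind SparkSubmitOperator can also be told which binary to run, depending on your Airflow version. A hedged sketch; check that the spark_binary argument (and the spark-binary connection extra) exist in your version before relying on them:

    # Option A: per task, if your version exposes the spark_binary argument
    SparkSubmitOperator(**base_config,
                        conf=spark_config,
                        spark_binary='spark2-submit')

    # Option B: once, on the 'test-conn' connection, via its Extra field:
    # {"spark-binary": "spark2-submit"}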