How to enable Hive dynamic partitioning in a local PySpark session
I am trying to enable dynamic partitioning in a local Spark session (not in application mode). I ran the following commands in the pyspark shell (using Spark 2.4):

spark.sqlContext.setConf("hive.exec.dynamic.partition", "true")
spark.sqlContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")

and got the error below:
AttributeError: 'SparkSession' object has no attribute 'sqlContext'

Can you try getting the context as follows?
from pyspark.sql import SQLContext
sqlContext = SQLContext(spark.sparkContext)
sqlContext.setConf("hive.exec.dynamic.partition", "true")
sqlContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")
Seems to be a duplicate of ?