Apache Spark: collecting column statistics for a specific partition


How can I get Spark to collect column statistics for only a specific partition?

WARN SparkSqlAstBuilder: Partition specification is ignored when collecting column statistics: PARTITION(myPart='myValue')
The warning suggests that my partition filter is being ignored in the following statement:

ANALYZE TABLE ${fullyQualifiedTable}
PARTITION(${table.partitionColumn} = '$partitionVal')
COMPUTE STATISTICS
FOR COLUMNS ${co.mkString(", ")}
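Since the warning indicates the partition specification is dropped when collecting column statistics, one possible workaround is to compute the per-partition column statistics manually with DataFrame aggregations. Below is a minimal Scala sketch under that assumption; fullyQualifiedTable, partitionColumn, partitionVal and the column list are placeholders standing in for the interpolated values above.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

val spark = SparkSession.builder().enableHiveSupport().getOrCreate()

// Placeholders standing in for the question's interpolated values.
val fullyQualifiedTable = "mydb.mytable"
val partitionColumn     = "myPart"
val partitionVal        = "myValue"
val columns             = Seq("col1", "col2")

// Restrict to the single partition; the filter on the partition column
// should be pushed down so only that partition is scanned.
val partitionDf = spark.table(fullyQualifiedTable)
  .where(col(partitionColumn) === partitionVal)

// Manually compute basic column statistics (non-null count, null count,
// approximate distinct count, min/max) for the requested columns.
val statsExprs = columns.flatMap { c =>
  Seq(
    count(col(c)).as(s"${c}_count"),
    sum(when(col(c).isNull, 1).otherwise(0)).as(s"${c}_nulls"),
    approx_count_distinct(col(c)).as(s"${c}_distinct"),
    min(col(c)).as(s"${c}_min"),
    max(col(c)).as(s"${c}_max")
  )
}
val stats = partitionDf.agg(statsExprs.head, statsExprs.tail: _*)
stats.show(truncate = false)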

Possibly related: can statistics be computed on a temporarily registered table (i.e. a Spark DataFrame filtered down to the desired partition) (yes), and can those statistics be reliably stored in the table properties of the original table (?).
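Continuing the sketch above, the computed values could be written back to the original table as custom table properties with ALTER TABLE ... SET TBLPROPERTIES. The property keys below are made up for illustration; Spark's cost-based optimizer will not read them, so this only records the numbers in the metastore rather than feeding the CBO.

// Collect the single row of computed stats and write each value back to the
// original table as custom table properties, namespaced by partition value.
val row = stats.collect().head
stats.schema.fieldNames.zipWithIndex.foreach { case (name, i) =>
  val key   = s"custom.stats.$partitionVal.$name"   // hypothetical key scheme
  val value = Option(row.get(i)).map(_.toString).getOrElse("null")
  spark.sql(
    s"ALTER TABLE $fullyQualifiedTable SET TBLPROPERTIES ('$key' = '$value')"
  )
}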