Window frame ranges in Apache Spark pyspark-sql
When I run:
spark.sql('''
select client,avg(amount) over (partition by client
order by my_timestamp
range between interval 30 days preceding and current row) as moving_avg_amount
from my_table''')
it works.
But if I want to exclude the last X days, it fails:
... range between interval 30 days preceding and 12 days preceding ..
or:
... range between interval 30 days preceding and interval 12 days preceding ..
What is the correct syntax? That post suggests that range intervals don't work properly in Spark SQL. Thanks @RickyG — looks like an old bug; I don't understand why it hasn't been fixed...
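For reference, a commonly cited workaround (a sketch, reusing the table and column names from the question) is to order the window by the timestamp cast to epoch seconds, so that both frame bounds can be plain numeric literals instead of interval expressions:

```sql
-- Workaround sketch: cast(my_timestamp as long) gives epoch seconds in Spark,
-- so the frame can use numeric bounds: 30 days = 2592000 s, 12 days = 1036800 s.
select client,
       avg(amount) over (
         partition by client
         order by cast(my_timestamp as long)
         range between 2592000 preceding and 1036800 preceding
       ) as moving_avg_amount
from my_table
```

This sidesteps the interval-bound parsing issue entirely, at the cost of readability; the arithmetic assumes `my_timestamp` is a timestamp-typed column.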