SQL: GROUP BY inside a subquery
I am using SQL inside Databricks. The query below works, but I would also like to group by a column called sale_id. How can I do this?
%sql
select
  (select count(distinct time)
   from table
   where sign_up > 0)
  /
  (select count(distinct time)
   from table
   where action > 0 or click > 0)
  as cc3
Write the query using conditional aggregation instead:
select (count(distinct case when sign_up > 0 then time end) /
count(distinct case when action > 0 or click > 0 then time end)
) as cc3
from table;
Then grouping by a column is easy:
select col,
(count(distinct case when sign_up > 0 then time end) /
count(distinct case when action > 0 or click > 0 then time end)
) as cc3
from table
group by col;
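As a sanity check, the grouped conditional-aggregation query can be run against a small in-memory SQLite table. This is a minimal sketch: the table name `events`, the sample rows, and the use of SQLite are assumptions for illustration (the question uses Databricks/Spark SQL), and `1.0 *` is added because SQLite truncates integer division.

```python
import sqlite3

# In-memory table standing in for the Databricks table; name and rows
# are made up for this example.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE events (
        sale_id INTEGER, time TEXT,
        sign_up INTEGER, action INTEGER, click INTEGER
    )
""")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?, ?, ?)",
    [
        (1, "t1", 1, 1, 0),  # sale 1: sign-up and action at t1
        (1, "t2", 0, 1, 0),  # sale 1: action only at t2
        (2, "t1", 1, 0, 1),  # sale 2: sign-up and click at t1
        (2, "t2", 1, 0, 0),  # sale 2: sign-up only at t2
    ],
)

# The answer's grouped query; 1.0 * forces float division in SQLite.
rows = conn.execute("""
    SELECT sale_id,
           1.0 * COUNT(DISTINCT CASE WHEN sign_up > 0 THEN time END)
               / COUNT(DISTINCT CASE WHEN action > 0 OR click > 0 THEN time END)
               AS cc3
    FROM events
    GROUP BY sale_id
""").fetchall()
print(rows)  # sale 1: 1 sign-up time / 2 action-or-click times = 0.5
```

For sale_id 2 the ratio is 2 distinct sign-up times over 1 distinct action-or-click time, i.e. 2.0, showing the per-group counts are computed independently.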
Works like a charm. Thank you very much.