
SQL: Partition by 2 columns in BigQuery


Suppose I want to rank the top 3 revenue months, along with the revenue amount, for each flat. I can do that with no problem:

SELECT *
FROM (SELECT Flat,
             EXTRACT(YEAR FROM pay_day) AS Year,
             EXTRACT(MONTH FROM pay_day) AS Month,
             RANK() OVER (PARTITION BY Flat ORDER BY SUM(USD_amt) DESC) AS rank,
             SUM(USD_amt) AS Revenue
      FROM `finances.reservations` AS f
      GROUP BY Flat, Year, Month)
WHERE rank <= 3 AND Flat IS NOT NULL
ORDER BY Flat, Year, Rank, Month ASC
But now, say I want the top 3 months per flat per year. I thought I would just modify the partition by adding the year, like this:

RANK() OVER(PARTITION BY Flat, EXTRACT(YEAR FROM pay_day) ORDER BY SUM(USD_amt) DESC) AS rank
I would like the results to look like this:

Flat Year Month Rank Revenue
   1 2019    12    1    3281
   1 2019     4    2    3031
   1 2019     1    3    2031
   1 2020     4    1    3031
   1 2020     9    2    3001
   1 2020     7    3    2919
But this results in the error "PARTITION BY expression references column pay_day which is neither grouped nor aggregated at [4:50]", and I am wondering what I am doing wrong.

The table schema looks like this:

Field name            | Type    | Mode
----------------------+---------+------------
Flat                  | STRING  | NULLABLE
pay_day               | DATE    | NULLABLE
nights                | INTEGER | NULLABLE
check_in              | DATE    | NULLABLE
check_out             | DATE    | NULLABLE
nights__in_month_     | STRING  | NULLABLE
nights_outside_month_ | STRING  | NULLABLE
cleaning              | INTEGER | NULLABLE
currency              | STRING  | NULLABLE
USD_amt               | INTEGER | NULLABLE
EANR                  | STRING  | NULLABLE
name                  | STRING  | NULLABLE
people                | INTEGER | NULLABLE
country               | STRING  | NULLABLE
reservation_no_       | STRING  | NULLABLE
payment_processor     | STRING  | NULLABLE
Check_in_day          | STRING  | NULLABLE
Cleaner               | STRING  | NULLABLE
Review                | STRING  | NULLABLE

I think you can reference column aliases in BigQuery:

SELECT *
FROM (SELECT Flat, EXTRACT(YEAR FROM pay_day) AS Year,
             EXTRACT(MONTH FROM pay_day) AS Month,
             RANK() OVER (PARTITION BY Flat, Year ORDER BY SUM(USD_amt) DESC) AS rank, 
             SUM(USD_amt) AS Revenue
      FROM `finances.reservations` AS f
      GROUP BY Flat, Year, Month
     ) ym
WHERE rank <= 3 AND Flat IS NOT NULL
ORDER BY Flat, Year, Rank, Month ASC
Alternatively, the minimum date within each GROUP BY group has the same year, so partitioning on that is effectively the same. I realize this is how I usually approach this problem.
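The aggregate variant mentioned above can be sketched like this (an assumption based on the answer's remark about the minimum date per group; aggregates are allowed inside a window's PARTITION BY after grouping, and within one (Flat, Year, Month) group the earliest pay_day falls in that group's year):

```sql
-- Sketch (assumption): partition on an aggregate of pay_day instead of the
-- Year alias. EXTRACT(YEAR FROM MIN(pay_day)) equals Year for each group,
-- so this ranks months within each flat and year, same as the alias version.
SELECT *
FROM (SELECT Flat,
             EXTRACT(YEAR FROM pay_day) AS Year,
             EXTRACT(MONTH FROM pay_day) AS Month,
             RANK() OVER (PARTITION BY Flat, EXTRACT(YEAR FROM MIN(pay_day))
                          ORDER BY SUM(USD_amt) DESC) AS rank,
             SUM(USD_amt) AS Revenue
      FROM `finances.reservations` AS f
      GROUP BY Flat, Year, Month
     ) ym
WHERE rank <= 3 AND Flat IS NOT NULL
ORDER BY Flat, Year, rank, Month
```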