SQL: How to write a query that attaches a row number (1 to n) to each record of each group

Tags: sql, vertica

I have a dataset like the one below:

|date|flag|
|20190503|0|
|20190504|1|
|20190505|1|
|20190506|1|
|20190507|1|
|20190508|0|
|20190509|0|
|20190510|0|
|20190511|1|
|20190512|1|
|20190513|0|
|20190514|0|
|20190515|1|
What I want to achieve is to group consecutive dates where flag = 1 and add a counter column that is 1 on the first day of each run of consecutive flag = 1 dates, 2 on the second day, and so on, while flag = 0 rows get 0:

|date|flag|counter|
|20190503|0|0|
|20190504|1|1|
|20190505|1|2|
|20190506|1|3|
|20190507|1|4|
|20190508|0|0|
|20190509|0|0|
|20190510|0|0|
|20190511|1|1|
|20190512|1|2|
|20190513|0|0|
|20190514|0|0|
|20190515|1|1|
I have tried analytic functions and hierarchical queries but still haven't found a solution. Any hints would be greatly appreciated.

Thanks,
Hong

You can use a cumulative sum of the zeros to define the groups, then use a row number:
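A minimal sketch of that idea, assuming a table t(date, flag) holding the sample rows (the same names the query further down uses) and that the data starts with a flag = 0 row; the column name date may need quoting depending on the database:

-- Sketch only: assumes a table t(date, flag) and that the data begins with a
-- flag = 0 row, as in the sample. Every flag = 0 row bumps the running sum,
-- so each run of flag = 1 dates (plus the flag = 0 row that opens it) forms
-- one group; subtracting 1 from the row number skips that opening row.
SELECT date,
       flag,
       CASE WHEN flag = 0
            THEN 0
            ELSE ROW_NUMBER() OVER (PARTITION BY grp ORDER BY date) - 1
       END AS counter
FROM (
      SELECT t.*,
             SUM(CASE WHEN flag = 0 THEN 1 ELSE 0 END)
                 OVER (ORDER BY date) AS grp
      FROM t
     ) x
ORDER BY date;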

Another, quite different, approach is to take the difference between the current date and the cumulative maximum of the flag = 0 dates:

select t.*,
       datediff(day,
                max(case when flag = 0 then date end) over (order by date),
                date
               ) as counter
from t;


Note that the logic of the two methods is different, although they should produce the same results for the data you have provided. The difference shows up with missing dates: the first method simply ignores them, while the second increments the counter across the gap. For example, if 2019-05-06 were missing, the first method would give 2019-05-07 a counter of 3, while the second would give it 4.

Hmm - Vertica has a very nice CONDITIONAL_CHANGE_EVENT function that helps you achieve exactly this.

It returns an integer that is incremented by 1 every time the expression between the parentheses changes. So every time the flag changes, it gives you a new group identifier, which is exactly the criterion to partition by. Hence, we first run a SELECT to obtain the grouping information, and then partition by the group we obtained. Here goes:

WITH
input(dt,flag) AS (
          SELECT '2019-05-03'::DATE,0
UNION ALL SELECT '2019-05-04'::DATE,1
UNION ALL SELECT '2019-05-05'::DATE,1
UNION ALL SELECT '2019-05-06'::DATE,1
UNION ALL SELECT '2019-05-07'::DATE,1
UNION ALL SELECT '2019-05-08'::DATE,0
UNION ALL SELECT '2019-05-09'::DATE,0
UNION ALL SELECT '2019-05-10'::DATE,0
UNION ALL SELECT '2019-05-11'::DATE,1
UNION ALL SELECT '2019-05-12'::DATE,1
UNION ALL SELECT '2019-05-13'::DATE,0
UNION ALL SELECT '2019-05-14'::DATE,0
UNION ALL SELECT '2019-05-15'::DATE,1
)
,
grp_input AS (
SELECT
*
-- increments by 1 each time "flag" changes, so every run of equal flags gets its own group id
, CONDITIONAL_CHANGE_EVENT(flag) OVER(ORDER BY dt) AS grp
FROM input
)
SELECT
dt
, flag
, CASE FLAG
WHEN 0 THEN 0
ELSE ROW_NUMBER() OVER(PARTITION BY grp ORDER BY dt)
END AS counter
FROM grp_input;
-- out      dt     | flag | counter 
-- out ------------+------+---------
-- out  2019-05-03 |    0 |       0
-- out  2019-05-04 |    1 |       1
-- out  2019-05-05 |    1 |       2
-- out  2019-05-06 |    1 |       3
-- out  2019-05-07 |    1 |       4
-- out  2019-05-08 |    0 |       0
-- out  2019-05-09 |    0 |       0
-- out  2019-05-10 |    0 |       0
-- out  2019-05-11 |    1 |       1
-- out  2019-05-12 |    1 |       2
-- out  2019-05-13 |    0 |       0
-- out  2019-05-14 |    0 |       0
-- out  2019-05-15 |    1 |       1
-- out (13 rows)
-- out 