SQL: rewriting a query to use analytic functions

I have an events table that records inserts, updates, and deletes of events. See the MWE here:

DDL statements

CREATE TABLE "EVENTS" 
   (
    "EVENT_ID" VARCHAR2(30 CHAR), --Name of the Event
    "EVENT_LOCATION" VARCHAR2(60 CHAR), --Location on which the event occured
    "EVENT_TRIGGER" VARCHAR2(2 CHAR),  --Trigger which protocolled the event (I,U or D)
    "EVENT_CHANGE_ID" NUMBER,  --Unique Sequence Number
    "EVENT_CHANGE_DATE" DATE DEFAULT SYSTIMESTAMP
   );

INSERT INTO EVENTS (EVENT_ID,EVENT_LOCATION,EVENT_TRIGGER,EVENT_CHANGE_ID,EVENT_CHANGE_DATE) 
VALUES ('EVENT1','LOC1','I',1,SYSTIMESTAMP-1);
INSERT INTO EVENTS (EVENT_ID,EVENT_LOCATION,EVENT_TRIGGER,EVENT_CHANGE_ID,EVENT_CHANGE_DATE) 
VALUES ('EVENT1','LOC2','U',11,SYSTIMESTAMP-1);
INSERT INTO EVENTS (EVENT_ID,EVENT_LOCATION,EVENT_TRIGGER,EVENT_CHANGE_ID,EVENT_CHANGE_DATE) 
VALUES ('EVENT1','LOC4','U',117,SYSTIMESTAMP-1);
INSERT INTO EVENTS (EVENT_ID,EVENT_LOCATION,EVENT_TRIGGER,EVENT_CHANGE_ID,EVENT_CHANGE_DATE) 
VALUES ('EVENT1','LOC7','D',1430,SYSTIMESTAMP-1);

INSERT INTO EVENTS (EVENT_ID,EVENT_LOCATION,EVENT_TRIGGER,EVENT_CHANGE_ID,EVENT_CHANGE_DATE) 
VALUES ('EVENT2','LOC1','I',2,SYSTIMESTAMP-1/48);
INSERT INTO EVENTS (EVENT_ID,EVENT_LOCATION,EVENT_TRIGGER,EVENT_CHANGE_ID,EVENT_CHANGE_DATE) 
VALUES ('EVENT2','LOC2','U',131,SYSTIMESTAMP-1/48);
INSERT INTO EVENTS (EVENT_ID,EVENT_LOCATION,EVENT_TRIGGER,EVENT_CHANGE_ID,EVENT_CHANGE_DATE) 
VALUES ('EVENT2','LOC5','D',11337,SYSTIMESTAMP-1/48);
INSERT INTO EVENTS (EVENT_ID,EVENT_LOCATION,EVENT_TRIGGER,EVENT_CHANGE_ID,EVENT_CHANGE_DATE) 
VALUES ('EVENT2','LOC7','D',14430,SYSTIMESTAMP-1/48);
I want to determine the number of events that were inserted at LOC1 and deleted at LOC7, with no other delete in between:

SELECT COUNT(*) AS QTY, TRUNC(A.EVENT_CHANGE_DATE) AS DAY
FROM (SELECT EVENT_ID, EVENT_CHANGE_ID, EVENT_CHANGE_DATE
        FROM EVENTS
       WHERE EVENT_TRIGGER = 'I' AND EVENT_LOCATION = 'LOC1') A,
     (SELECT EVENT_ID, EVENT_CHANGE_ID, EVENT_CHANGE_DATE
        FROM EVENTS
       WHERE EVENT_TRIGGER = 'D' AND EVENT_LOCATION = 'LOC7') B
WHERE B.EVENT_CHANGE_ID > A.EVENT_CHANGE_ID
  AND A.EVENT_ID = B.EVENT_ID
  AND NOT EXISTS (SELECT 1
                    FROM EVENTS
                   WHERE EVENT_TRIGGER = 'D'
                     AND EVENT_CHANGE_ID > A.EVENT_CHANGE_ID
                     AND EVENT_CHANGE_ID < B.EVENT_CHANGE_ID
                     AND EVENT_ID = A.EVENT_ID)
GROUP BY TRUNC(A.EVENT_CHANGE_DATE)
ORDER BY TRUNC(A.EVENT_CHANGE_DATE);
My naive approach works, but I would like to know whether the query can be rewritten with analytic functions. The real table contains up to one million records, so three full table scans are a problem in terms of execution time and performance.


Can this query even be made more efficient with analytic functions?
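One general way to judge whether a rewrite actually helps on a million-row table is to compare execution plans; a minimal sketch using Oracle's EXPLAIN PLAN and DBMS_XPLAN (the statement shown is only a placeholder for the original or rewritten query):

-- Sketch: capture and display the plan of a candidate query
EXPLAIN PLAN FOR
SELECT COUNT(*)                         -- placeholder: put the query under test here
FROM events
WHERE event_trigger = 'I' AND event_location = 'LOC1';

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);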

This looks like a perfect fit for SQL pattern matching (MATCH_RECOGNIZE):

select * from events
match_recognize (
  partition by event_id                  -- evaluate the pattern separately per event
  order by event_change_date
  measures 
    count ( ins.* ) ins_count,           -- number of matched inserts (one per match here)
    min ( event_change_date ) dt         -- date of the first row of the match
  pattern ( ins upd* del )               -- an insert, any number of updates, then a delete
  define 
    ins as event_trigger = 'I' and event_location = 'LOC1',
    upd as event_trigger = 'U',
    del as event_trigger = 'D' and event_location = 'LOC7'
);

INS_COUNT    DT                     
           1 16-MAR-2020 12:33:58 

This searches for an insert at LOC1, followed by a delete at LOC7, with any number of updates in between.
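If the per-day counts of the original query are also needed, the MATCH_RECOGNIZE output can be aggregated; a minimal sketch, assuming the measured dt (the date of the matched insert) is the day to group by:

-- Sketch: one row per match, then grouped by the day of the insert
SELECT TRUNC(dt) AS day, COUNT(*) AS qty
FROM events
MATCH_RECOGNIZE (
  PARTITION BY event_id
  ORDER BY event_change_date
  MEASURES min(event_change_date) AS dt
  PATTERN ( ins upd* del )
  DEFINE
    ins AS event_trigger = 'I' AND event_location = 'LOC1',
    upd AS event_trigger = 'U',
    del AS event_trigger = 'D' AND event_location = 'LOC7'
)
GROUP BY TRUNC(dt)
ORDER BY TRUNC(dt);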

Using only classic analytic functions:

First, filter to the relevant events only:

(EVENT_TRIGGER = 'I' AND EVENT_LOCATION = 'LOC1')  OR  -- only LOC1 inserts
 EVENT_TRIGGER = 'D'                                   -- all deletes
Then take the next event with LEAD and check whether it is a delete at the right location:

with evnt as
(
  select EVENT_ID, EVENT_LOCATION, EVENT_TRIGGER, EVENT_CHANGE_DATE,
    lead(EVENT_TRIGGER) over (PARTITION BY EVENT_ID 
                                  order by EVENT_CHANGE_DATE, EVENT_LOCATION)
      as EVENT_TRIGGER_LEAD,
    lead(EVENT_LOCATION) over (PARTITION BY EVENT_ID
                                   order by EVENT_CHANGE_DATE, EVENT_LOCATION)
      as EVENT_LOCATION_LEAD
  from EVENTS
  where (EVENT_TRIGGER = 'I' AND EVENT_LOCATION = 'LOC1') OR EVENT_TRIGGER = 'D'
)
select 
  EVENT_ID, EVENT_LOCATION, EVENT_TRIGGER, EVENT_CHANGE_DATE,
  EVENT_TRIGGER_LEAD, EVENT_LOCATION_LEAD
from evnt
where EVENT_TRIGGER = 'I'
  and EVENT_TRIGGER_LEAD = 'D' 
  and EVENT_LOCATION_LEAD = 'LOC7'
order by EVENT_ID, EVENT_CHANGE_DATE, EVENT_LOCATION;
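To reproduce the per-day counts of the original query with this approach, the filtered rows can be aggregated in an outer query; a minimal sketch, grouping by the insert's day as the question does:

-- Sketch: same LEAD logic as above, reduced to the columns needed and aggregated per day
select count(*) as QTY, trunc(EVENT_CHANGE_DATE) as DAY
from (
  select EVENT_ID, EVENT_TRIGGER, EVENT_CHANGE_DATE,
         lead(EVENT_TRIGGER)  over (partition by EVENT_ID
                                    order by EVENT_CHANGE_DATE, EVENT_LOCATION) as EVENT_TRIGGER_LEAD,
         lead(EVENT_LOCATION) over (partition by EVENT_ID
                                    order by EVENT_CHANGE_DATE, EVENT_LOCATION) as EVENT_LOCATION_LEAD
  from EVENTS
  where (EVENT_TRIGGER = 'I' and EVENT_LOCATION = 'LOC1') or EVENT_TRIGGER = 'D'
)
where EVENT_TRIGGER = 'I'
  and EVENT_TRIGGER_LEAD = 'D'
  and EVENT_LOCATION_LEAD = 'LOC7'
group by trunc(EVENT_CHANGE_DATE)
order by trunc(EVENT_CHANGE_DATE);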

You can use the analytic SUM function: add 1 to the running total when the row is an insert at LOC1 and -1 when it is a delete. The qualifying records are then those where the running SUM is 0 and the location is LOC7.

See below:

SQL> SELECT EVENT_ID FROM
  2      ( SELECT SUM(CASE
  3                  WHEN EVENT_LOCATION = 'LOC1' AND EVENT_TRIGGER = 'I' THEN 1
  4                  WHEN EVENT_TRIGGER = 'D' THEN - 1
  5               END) OVER( PARTITION BY EVENT_ID ORDER BY EVENT_CHANGE_DATE ) AS SM,
  6               T.*
  7          FROM EVENTS T
  8      ) T
  9  WHERE EVENT_LOCATION = 'LOC7' AND SM = 0;

EVENT_ID
------------
EVENT1

SQL>
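If the daily counts from the question are wanted instead of the event IDs, the same inner query can be aggregated; a minimal sketch, assuming it is acceptable to count by the day of the qualifying delete row:

-- Sketch: count the qualifying deletes per day
SELECT COUNT(*) AS QTY, TRUNC(EVENT_CHANGE_DATE) AS DAY
FROM ( SELECT SUM(CASE
                  WHEN EVENT_LOCATION = 'LOC1' AND EVENT_TRIGGER = 'I' THEN 1
                  WHEN EVENT_TRIGGER = 'D' THEN -1
               END) OVER( PARTITION BY EVENT_ID ORDER BY EVENT_CHANGE_DATE ) AS SM,
               T.*
          FROM EVENTS T
     ) T
WHERE EVENT_LOCATION = 'LOC7' AND SM = 0
GROUP BY TRUNC(EVENT_CHANGE_DATE)
ORDER BY TRUNC(EVENT_CHANGE_DATE);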

Cheers

Using the LEAD analytic function:

SELECT COUNT(*) AS qty,
       TRUNC(event_change_date) AS day
FROM (
      SELECT
          event_location,
          event_trigger,
          event_change_date,
          lead(event_trigger)
            OVER (PARTITION BY trunc(event_change_date)
                  ORDER BY to_number(substr(event_location, -1, 1))) rn
      FROM events
     )
WHERE event_trigger <> 'D'
  AND rn <> 'D'
  AND event_trigger = rn
GROUP BY trunc(event_change_date);

QTY        DAY     
---------- --------
         1 16-03-20
Logic:

Group the events of each day and order them by location from 1 to 7, using SUBSTR to take the digit from the end of the location string. Then use LEAD to compare each event_trigger with the one that follows it. Within each day's partition group there must be no DELETE between locations 1 and 7.
Comments:

- Include the code in the question. Don't rely on external sites.
- It doesn't work.
- I updated the question; somehow sqlfiddle doesn't work.
- Nice example of MATCH_RECOGNIZE!
- Very nice solution. I had never heard of it before, but it is a powerful tool.
- Interesting approach, but it is limited to the case where every delete follows an insert and there are no repeated inserts at the same location.