
SQL: group timestamp records by 5, 10, 15 minute blocks


I have per-minute financial records stored in my table in a format like this:

         dt          |   open   |   high   |   low    |  close   |  vol  
---------------------+----------+----------+----------+----------+-------
 2018-05-04 15:30:00 | 171.0000 | 171.3000 | 170.9000 | 171.0000 | 42817
 2018-05-04 15:29:00 | 170.8000 | 171.0000 | 170.8000 | 170.9500 | 32801
 2018-05-04 15:28:00 | 170.8500 | 171.0000 | 170.8000 | 170.8000 | 22991
 2018-05-04 15:27:00 | 170.8500 | 170.8500 | 170.7500 | 170.8000 | 40283
 2018-05-04 15:26:00 | 170.9500 | 171.0000 | 170.8000 | 170.8500 | 46636
and so on.

I want to group them into 5-minute, 10-minute, and 60-minute blocks, like candlesticks. Using date_trunc('hour', dt) is not an option, because I want to group them into the last 60 minutes, the last 15 minutes, and so on.


I am using PostgreSQL.

You can use generate_series() to create whatever ranges you want, and then check whether your records fall within each range.
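
For example, a rough sketch of that idea (assuming the table is called mytable, as in the answer below, and picking an arbitrary one-hour window; the bounds and aggregates are placeholders to adjust to your data):

 -- one row per generated 5-minute bucket, joined to the records it covers
 SELECT g.bucket_start,
        MAX(t.high) AS max_high,
        MIN(t.low)  AS min_low,
        SUM(t.vol)  AS sum_vol
 FROM generate_series(timestamp '2018-05-04 15:00:00',
                      timestamp '2018-05-04 15:55:00',
                      interval '5 minutes') AS g(bucket_start)
 JOIN mytable t
   ON t.dt >= g.bucket_start
  AND t.dt <  g.bucket_start + interval '5 minutes'
 GROUP BY g.bucket_start
 ORDER BY g.bucket_start;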

You should use GROUP BY with

floor(extract('epoch' from dt) / 300)

to group the data into 5-minute intervals. 300 is the number of seconds in 5 minutes, so for 10-minute blocks divide by 600 instead, and for 1-hour blocks by 3600.

Use floor() if you want the intervals to start at 00, 05, 10, and so on; use ceil() if you want them to end at 00, 05, 10.
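
To see what this does to a single row, here is a quick self-contained check using the 15:28 timestamp from the sample data (note that to_timestamp() returns a timestamptz, so psql displays it in the session's time zone, which is presumably why the results further down show 17:xx for 15:xx input):

 -- floor() snaps 15:28 down to the start of its 5-minute bucket (15:25),
 -- ceil() snaps it up to the end of the bucket (15:30); divide by 600 or
 -- 3600 instead of 300 for 10-minute or 1-hour buckets.
 SELECT to_timestamp(floor(extract('epoch' from timestamp '2018-05-04 15:28:00') / 300) * 300) AS bucket_start,
        to_timestamp(ceil(extract('epoch' from timestamp '2018-05-04 15:28:00') / 300) * 300)  AS bucket_end;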

In the SELECT clause you should convert that group key back into a readable timestamp, as done with to_timestamp(floor(extract('epoch' from dt) / 300) * 300) in the query below.

It's not clear whether you want all of the "block" results in the same query; since you want a candlestick chart, I assume you do. I have also derived what looks like the right aggregate function for each column (MIN, MAX, AVG, SUM) from its name; you may have to adapt this.

Here we go:

 SELECT '5 minutes' as block,
        to_timestamp(floor((extract('epoch' from dt) / 300)) * 300)  as ts, 
        round(AVG(open),4) as avg_open,  
        round(MAX(high),4) as max_high, 
        round(MIN(low),4) as min_low, 
        round(AVG(close),4) as avg_close,  
        SUM(vol) as sum_vol  
 FROM mytable
 GROUP BY floor(extract('epoch' from dt) / 300)

 UNION ALL

  SELECT '10 minutes' as block,
        to_timestamp(floor((extract('epoch' from dt) / 600)) * 600)  as ts, 
        round(AVG(open),4) as avg_open,  
        round(MAX(high),4) as max_high, 
        round(MIN(low),4) as min_low, 
        round(AVG(close),4) as avg_close,  
        SUM(vol) as sum_vol  
 FROM mytable
 GROUP BY floor(extract('epoch' from dt) / 600)

  UNION ALL

  SELECT '1 hour' as block,
        to_timestamp(floor((extract('epoch' from dt) / 3600)) * 3600)  as ts, 
        round(AVG(open),4) as avg_open,  
        round(MAX(high),4) as max_high, 
        round(MIN(low),4) as min_low, 
        round(AVG(close),4) as avg_close,  
        SUM(vol) as sum_vol  
 FROM mytable
 GROUP BY floor(extract('epoch' from dt) / 3600)
The result:

    block       ts                  avg_open    max_high    min_low     avg_close   sum_vol
    5 minutes   04.05.2018 17:30:00 171         171,3       170,9       171         42817
    5 minutes   04.05.2018 17:25:00 170,8625    171         170,75      170,85      142711
    10 minutes  04.05.2018 17:20:00 170,8625    171         170,75      170,85      142711
    10 minutes  04.05.2018 17:30:00 171         171,3       170,9       171         42817
    1 hour      04.05.2018 17:00:00 170,89      171,3       170,75      170,88      185528

Show us the db schema, sample data, and current and expected output to test it on. Please read this; it is a good place to learn how to improve the quality of your questions and get better answers.

Thank you very much, it worked! I had to make some slight adjustments and now it is perfect:

 SELECT '5 minutes' as block,
        to_timestamp(ceil(extract('epoch' from dt) / 300) * 300) at time zone 'UTC' as ts,
        (array_agg(open ORDER BY dt ASC))[1] o,
        MAX(high) as h,
        MIN(low) as l,
        (array_agg(close ORDER BY dt DESC))[1] c,
        SUM(vol) as v
 FROM data
 WHERE dt >= '2018-05-03'
 GROUP BY ceil(extract('epoch' from dt) / 300)
 ORDER BY ts DESC
 LIMIT 20

It won't register my upvote because of my reputation, new here :D
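
One detail worth pointing out in that adjusted query: for a true candlestick, open and close should be the first and last values inside each bucket rather than averages, and the (array_agg(... ORDER BY dt))[1] pattern does exactly that. A minimal illustration of just that part, using the same table and column names as the answer above:

 -- first open and last close within each 5-minute bucket
 SELECT to_timestamp(floor(extract('epoch' from dt) / 300) * 300) AS ts,
        (array_agg(open  ORDER BY dt ASC))[1]  AS first_open,
        (array_agg(close ORDER BY dt DESC))[1] AS last_close
 FROM mytable
 GROUP BY floor(extract('epoch' from dt) / 300);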