MySQL: counting records in a child table, grouped by date

I have two tables: one containing tasks, and one, "Tasklog", that stores the progress of those tasks.

Tasks
ID Description 
1  task1
2  task2
3  task3
4  task4
5  task5
Tasklog

ID taskId dtTime        Status
1   1       2016-01-01  new
2   1       2016-02-10  in progress
3   1       2016-03-03  closed
4   2       2016-01-01  new
5   2       2016-01-10  in progress
6   2       2016-01-11  closed
7   3       2016-01-01  new 
8   4       2016-01-01  new
9   5       2016-01-01  new
10  5       2016-01-01  in progress
A task can have multiple tasklog records. Now I want to create a report that shows how many tasks were "open" in each of the past 24 months (open meaning there is no tasklog record with status "closed" for that task in that month).

So I need something like this:

2016-01 4
2016-02 4
2016-03 3
I believe these are the actual numbers the query should return: task 2 is closed on 2016-01-11, leaving four tasks open in January and February, and task 1 is closed on 2016-03-03, leaving three open in March.

I'm struggling to write a proper query to achieve this. I think I'm close, but with the query below I'm counting tasklog records, whereas I want to count task records, regardless of how many tasklog records they have:

select month(dtTime) month, year(dtTime) year, count(t.id) as number
    from tasklog tl
    join tasks t on t.id = taskId
    where dtTime > DATE_ADD(now(), INTERVAL -2 YEAR)
    and t.id not in ( 
        select id from tasklog tl1 
        where status in ('closed')
        and month(tl1.dtTime) = month(tl.dtTime) 
        and year(tl1.dtTime) = year(tl.dtTime)
    )
    group by month(dtTime), year(dtTime)
    order by dtTime;
Any suggestions on how best to do this?

Code to create and populate the tables:

CREATE TABLE tasks
    (`id` int, `description` varchar(5))
;

INSERT INTO tasks
    (`id`, `description`)
VALUES
    (1, 'task1'),
    (2, 'task2'),
    (3, 'task3'),
    (4, 'task4'),
    (5, 'task5')
;


CREATE TABLE tasklog
    (`ID` int, `taskId` int, `dtTime` datetime, `Status` varchar(11))
;

INSERT INTO tasklog
    (`ID`, `taskId`, `dtTime`, `Status`)
VALUES
    (1, 1, '2016-01-01 00:00:00', 'new'),
    (2, 1, '2016-02-10 00:00:00', 'in progress'),
    (3, 1, '2016-03-03 00:00:00', 'closed'),
    (4, 2, '2016-01-01 00:00:00', 'new'),
    (5, 2, '2016-01-10 00:00:00', 'in progress'),
    (6, 2, '2016-01-11 00:00:00', 'closed'),
    (7, 3, '2016-01-01 00:00:00', 'new'),
    (8, 4, '2016-01-01 00:00:00', 'new'),
    (9, 5, '2016-01-01 00:00:00', 'new'),
    (10, 5, '2016-01-01 00:00:00', 'in progress')
;
Update in response to Stefano Zanini

It seems neither solution gives the correct result. The correct result, I believe, would be:

2016-01 4
2016-02 4
2016-03 3
Your query only gives me:

1   2016    4
If I add distinct to the query, I get:

1   2016    5
2   2016    1
3   2016    1 

I'd go with a left join:

select month(tl.dtTime) month, year(tl.dtTime) year, count(t.id) as number
    from tasks t
    join tasklog tl
    on t.id = tl.taskId and tl.status = 'new'
    left join tasklog tl2
    on t.id = tl2.taskId and month(tl.dtTime) = month(tl2.dtTime) and tl2.status = 'closed'
    where dtTime > DATE_ADD(now(), INTERVAL -2 YEAR)
    and tl2.taskId is null
    group by month(tl.dtTime), year(tl.dtTime)
    order by dtTime;
Edit

…but you may just need to add a distinct clause to the count:

count(distinct t.id)
Edit after further explanation

I had misunderstood the requirement. This will do the trick:

select  distinct mm.month, count(distinct tl.taskId)
from    (select distinct month(dtTime) month from tasklog) mm
join    tasklog tl
 on     mm.month >= month(tl.dtTime) and tl.Status = 'new'
left join
        tasklog tl2
 on     mm.month >= month(tl2.dtTime) and tl2.Status = 'closed' and tl.taskId = tl2.taskId
where   tl2.taskId is null
group by mm.month
order by 1;
The first join attaches to each month all the tasks opened up to and including that month. The second join (the left one) produces a value for each month/taskId pair if the task was closed in that month or an earlier one. From there, you can filter down to the tasks open in each month by keeping only the rows that don't have this value.


You'll have to add the year, and perhaps refine the join clauses, but with the sample data you provided this query works nicely.
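The filtering step in that second join is the classic "left join / is null" anti-join. In isolation, the pattern looks like this (a minimal sketch run against SQLite through Python rather than MySQL, with made-up tables `a` and `b`, purely to show the mechanism):

```python
import sqlite3

# Minimal anti-join demo: find rows in `a` with no matching row in `b`.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE a (id INTEGER);
    CREATE TABLE b (a_id INTEGER);
    INSERT INTO a VALUES (1), (2), (3);
    INSERT INTO b VALUES (2);          -- only id 2 has a match
""")
rows = conn.execute("""
    SELECT a.id
    FROM a
    LEFT JOIN b ON b.a_id = a.id   -- unmatched rows get NULL for b.a_id
    WHERE b.a_id IS NULL           -- keep only the unmatched ones
    ORDER BY a.id
""").fetchall()
print(rows)  # [(1,), (3,)] -- ids 1 and 3 have no row in b
```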

This is what I ended up with:

select  distinct mm.year as year, 
                 mm.month as month, 
                 count(distinct tl.taskId) number
from    (select distinct month(dtTime) month, year(dtTime) year from tasklog) mm
join    tasklog tl
 on     mm.month >= month(tl.dtTime) and mm.year >= year(tl.dtTime) and tl.Status !='closed'
left join
        tasklog tl2
 on     mm.month >= month(tl2.dtTime) and mm.year >= year(tl2.dtTime) and tl2.Status = 'closed' and tl.taskId = tl2.taskId
where   tl2.taskId is null
group by mm.year, mm.month
order by mm.year, mm.month;
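As a self-contained re-check of this logic (a sketch using SQLite through Python instead of MySQL, so year()/month() become a single strftime('%Y-%m', …) key; comparing those 'YYYY-MM' strings also keeps the "on or before" test safe across year boundaries), the sample data does produce 4, 4, 3:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE tasklog (ID int, taskId int, dtTime text, Status text);
    INSERT INTO tasklog VALUES
        (1, 1, '2016-01-01', 'new'),
        (2, 1, '2016-02-10', 'in progress'),
        (3, 1, '2016-03-03', 'closed'),
        (4, 2, '2016-01-01', 'new'),
        (5, 2, '2016-01-10', 'in progress'),
        (6, 2, '2016-01-11', 'closed'),
        (7, 3, '2016-01-01', 'new'),
        (8, 4, '2016-01-01', 'new'),
        (9, 5, '2016-01-01', 'new'),
        (10, 5, '2016-01-01', 'in progress');
""")
# strftime('%Y-%m', ...) stands in for MySQL's year()/month() pair;
# lexical comparison of 'YYYY-MM' strings orders months correctly.
rows = conn.execute("""
    SELECT mm.ym, COUNT(DISTINCT tl.taskId)
    FROM (SELECT DISTINCT strftime('%Y-%m', dtTime) ym FROM tasklog) mm
    JOIN tasklog tl
      ON mm.ym >= strftime('%Y-%m', tl.dtTime) AND tl.Status <> 'closed'
    LEFT JOIN tasklog tl2
      ON mm.ym >= strftime('%Y-%m', tl2.dtTime)
     AND tl2.Status = 'closed' AND tl2.taskId = tl.taskId
    WHERE tl2.taskId IS NULL
    GROUP BY mm.ym
    ORDER BY mm.ym
""").fetchall()
print(rows)  # [('2016-01', 4), ('2016-02', 4), ('2016-03', 3)]
```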

Please see my update in the question (it was too long for a comment).

Sorry, I misunderstood your requirement. I've edited my answer with a solution that should be exactly what you need.

Thanks Stefano, this query does indeed give me the result I was looking for. It took me a while to figure out what was going on, but now I get it. Seems like a clever solution, thanks!