
Manipulating user data in MySQL


I'm new to MySQL and need help transforming the user data in Table 1 into the structure shown in Table 2.

Table 1

Table 2

A user session is defined as a period of user activity with requests no more than 30 minutes apart. When a user is inactive for more than 30 minutes, the session ends.

Does anyone know how to write the MySQL code that transforms Table 1 into Table 2?

The following code can be used to create the log table:

CREATE TABLE log
( user_id int, request_timestamp datetime);

INSERT INTO log
VALUES
(1, '2014-10-26 10:51:18'), (1, '2014-10-26 10:52:20'), (1, '2014-10-26 11:15:03'), (1, '2014-10-26 11:39:18'), (1, '2014-10-26 15:01:18'), (1, '2014-10-26 15:01:21'), (1, '2014-10-27 21:22:19'),
(2, '2014-10-15 12:19:01'), (2, '2014-10-15 12:19:12'), (2, '2014-10-15 12:19:45'), (2, '2014-10-15 12:20:03'), (2, '2014-10-17 14:55:13'), (2, '2014-10-17 14:55:19'),(2, '2014-10-17 14:55:22')
;
Try this:

SELECT user_id,
    count(*) as request_count,
    min(request_timestamp) as session_start,
    max(request_timestamp) as session_end,
    timestampdiff(
                SECOND,
                min(request_timestamp), 
                max(request_timestamp) 
            ) as session_duration 
FROM `log` 
GROUP BY user_id
Addendum

Now, with @Drew's valuable answer, you can get exactly the Table 2 you asked for:

Using my query above as the outer SELECT, insert his code inside the parentheses:

SELECT user_id,
    sessionnum as `session`,
    count(*) as request_count,
    min(request_timestamp) as session_start,
    max(request_timestamp) as session_end,
    timestampdiff(
                SECOND,
                min(request_timestamp), 
                max(request_timestamp) 
            ) as session_duration 
FROM (put code of drew here) ttt
GROUP BY user_id, sessionnum
However:

  • I still think you would be better off keeping the session number in a separate column, populated by a trigger fired from the table whose activity is being logged, to avoid a heavy database load later once the log grows large.

  • Stop using reserved words and MySQL function names as table and column names or aliases (e.g. `log` and `session` in the example).

  • The plan: first, we will name the following so we can visualize it:

    Note: the 1800 below is 30 minutes × 60 seconds/minute.

    Specimen A
    -----  
    select l.user_id,l.request_timestamp,
    @sessionnum := 
    if((@curuser = user_id and TIME_TO_SEC(TIMEDIFF(request_timestamp,@theDt))>1800),@sessionnum + 1, 
    if(@curuser <> user_id,1,@sessionnum))  as sessionnum,
    @curuser := user_id as v_curuser,
    @theDt:=request_timestamp as v_theDt
    from log l cross join
    (select @curuser := '', @sessionnum := 0,@theDt:='') gibberish
    order by l.user_id,l.request_timestamp
    +---------+---------------------+------------+-----------+---------------------+
    | user_id | request_timestamp   | sessionnum | v_curuser | v_theDt             |
    +---------+---------------------+------------+-----------+---------------------+
    |       1 | 2014-10-26 10:51:18 | 1          |         1 | 2014-10-26 10:51:18 |
    |       1 | 2014-10-26 10:52:20 | 1          |         1 | 2014-10-26 10:52:20 |
    |       1 | 2014-10-26 11:15:03 | 1          |         1 | 2014-10-26 11:15:03 |
    |       1 | 2014-10-26 11:39:18 | 1          |         1 | 2014-10-26 11:39:18 |
    |       1 | 2014-10-26 15:01:18 | 2          |         1 | 2014-10-26 15:01:18 |
    |       1 | 2014-10-26 15:01:21 | 2          |         1 | 2014-10-26 15:01:21 |
    |       1 | 2014-10-27 21:22:19 | 3          |         1 | 2014-10-27 21:22:19 |
    |       2 | 2014-10-15 12:19:01 | 1          |         2 | 2014-10-15 12:19:01 |
    |       2 | 2014-10-15 12:19:12 | 1          |         2 | 2014-10-15 12:19:12 |
    |       2 | 2014-10-15 12:19:45 | 1          |         2 | 2014-10-15 12:19:45 |
    |       2 | 2014-10-15 12:20:03 | 1          |         2 | 2014-10-15 12:20:03 |
    |       2 | 2014-10-17 14:55:13 | 2          |         2 | 2014-10-17 14:55:13 |
    |       2 | 2014-10-17 14:55:19 | 2          |         2 | 2014-10-17 14:55:19 |
    |       2 | 2014-10-17 14:55:22 | 2          |         2 | 2014-10-17 14:55:22 |
    +---------+---------------------+------------+-----------+---------------------+
    

    Note the OP's definition of a session. It hinges on inactivity, not on total duration.

    Comments:

    • Why write the code for you? You posted the insert statements. Also, are you using MySQL or SQL Server? They are not the same thing.

    • It looks like dts86 wants to transform `log` into Table 2. I would suggest an insert trigger on the log table.

    • This almost works, but it does not account for: "A user session is defined as a period of user activity with requests no more than 30 minutes apart. When a user is inactive for more than 30 minutes, the session ends." Do you know how to incorporate that into the code?

    • @dts86 I think it is possible, but I would suggest you first store the session number in the log table itself. The log table appears to be populated by triggers from other tables, and after a while it may become huge. Since you already plan to populate it that way, having the database set a third int column, incrementing the session number whenever the current user's previous activity was more than 30 minutes earlier, would not be a heavy burden. Use the timestampdiff() function to compare the newest entry against current_timestamp() for this purpose.

    • @dts86 If you add this column, just add its name to the GROUP BY part of this query.

    • @dts86 I have appended my answer.
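    The trigger-based approach suggested above could look something like the sketch below. This is an illustrative assumption, not code from the thread: the table is renamed `request_log` (avoiding the problematic name `log`), the trigger name and column name `session_num` are made up, and standard MySQL trigger syntax is assumed.

    ```sql
    -- Hypothetical sketch: session_num is assigned by a BEFORE INSERT trigger,
    -- so sessions are numbered once at write time instead of on every query.
    CREATE TABLE request_log (
        user_id int,
        request_timestamp datetime,
        session_num int
    );

    DELIMITER //
    CREATE TRIGGER request_log_set_session
    BEFORE INSERT ON request_log
    FOR EACH ROW
    BEGIN
        DECLARE last_ts DATETIME;
        DECLARE last_num INT;

        -- Fetch this user's most recent request, if any
        -- (variables stay NULL when the user has no rows yet).
        SELECT request_timestamp, session_num INTO last_ts, last_num
        FROM request_log
        WHERE user_id = NEW.user_id
        ORDER BY request_timestamp DESC
        LIMIT 1;

        IF last_ts IS NULL THEN
            SET NEW.session_num = 1;                -- user's first request
        ELSEIF TIMESTAMPDIFF(SECOND, last_ts, NEW.request_timestamp) > 1800 THEN
            SET NEW.session_num = last_num + 1;     -- gap over 30 minutes: new session
        ELSE
            SET NEW.session_num = last_num;         -- still within the same session
        END IF;
    END//
    DELIMITER ;
    ```

    With something like this in place, Table 2 falls out of a plain GROUP BY user_id, session_num, and no per-row user-variable scan is needed at query time.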
    Wrapping Specimen A in an outer query then yields the session numbering:

    select user_id,request_timestamp,sessionnum
    from
    (   select l.user_id,l.request_timestamp,
        @sessionnum := 
        if((@curuser = user_id and TIME_TO_SEC(TIMEDIFF(request_timestamp,@theDt))>1800),@sessionnum + 1, 
        if(@curuser <> user_id,1,@sessionnum))  as sessionnum,
        @curuser := user_id as v_curuser,
        @theDt:=request_timestamp as v_theDt
        from log l cross join
        (select @curuser := '', @sessionnum := 0,@theDt:='') gibberish
        order by l.user_id,l.request_timestamp
    ) SpecimenA
    order by user_id,sessionnum
    +---------+---------------------+------------+
    | user_id | request_timestamp   | sessionnum |
    +---------+---------------------+------------+
    |       1 | 2014-10-26 10:51:18 | 1          |
    |       1 | 2014-10-26 10:52:20 | 1          |
    |       1 | 2014-10-26 11:15:03 | 1          |
    |       1 | 2014-10-26 11:39:18 | 1          |
    |       1 | 2014-10-26 15:01:18 | 2          |
    |       1 | 2014-10-26 15:01:21 | 2          |
    |       1 | 2014-10-27 21:22:19 | 3          |
    |       2 | 2014-10-15 12:19:01 | 1          |
    |       2 | 2014-10-15 12:19:12 | 1          |
    |       2 | 2014-10-15 12:19:45 | 1          |
    |       2 | 2014-10-15 12:20:03 | 1          |
    |       2 | 2014-10-17 14:55:13 | 2          |
    |       2 | 2014-10-17 14:55:19 | 2          |
    |       2 | 2014-10-17 14:55:22 | 2          |
    +---------+---------------------+------------+
    14 rows in set (0.02 sec)
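
    For completeness, on MySQL 8.0 and later the user-variable trick above can be replaced by window functions, which are now the idiomatic way to do this. A sketch against the same log table, not from the original answers:

    ```sql
    -- Flag each request that starts a new session (gap > 1800 s since the
    -- same user's previous request), then number sessions with a running sum.
    SELECT user_id,
           request_timestamp,
           1 + SUM(new_session) OVER (PARTITION BY user_id
                                      ORDER BY request_timestamp) AS sessionnum
    FROM (
        SELECT user_id,
               request_timestamp,
               CASE WHEN TIMESTAMPDIFF(
                        SECOND,
                        LAG(request_timestamp) OVER (PARTITION BY user_id
                                                     ORDER BY request_timestamp),
                        request_timestamp) > 1800
                    THEN 1 ELSE 0 END AS new_session
        FROM log
    ) flagged
    ORDER BY user_id, request_timestamp;
    ```

    For a user's first request, LAG() returns NULL, the comparison is not true, and the flag is 0, so numbering starts at 1. Grouping this by user_id, sessionnum with the same aggregates as above then produces Table 2 directly.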