Size of the MySQL general query log on AWS RDS


I am trying to find the size of the general query log. I can't get it through the mysql interface, because the log is stored via the CSV engine and the status query just shows 0:

show table status from mysql;

# Name, Engine, Version, Row_format, Rows, Avg_row_length, Data_length, Max_data_length, Index_length, Data_free, Auto_increment, Create_time, Update_time, Check_time, Collation, Checksum, Create_options, Comment

'general_log', 'CSV', '10', 'Dynamic', '1', '0', '0', '0', '0', '0', NULL, NULL, NULL, NULL, 'utf8_general_ci', NULL, '', 'General log'
I know there are at least 100k rows in it, which I have mostly checked by hand with:

select * from mysql.general_log;
The problem is that I can't seem to find a way to get at the log from the AWS side either. In the Management Console there is only a general log containing information like:

/rdsdbbin/mysql/bin/mysqld, Version: 5.6.23-log (MySQL Community Server (GPL)). started with: Tcp port: 3306 Unix socket: /tmp/mysql.sock

... and that's about it.

I can't access the actual CSV file, since I have no control over the underlying server.


Does anyone have a clever way to determine the size of the table? Worst case, could I calculate the length of each field and estimate it from the row count?

What I ended up doing was querying from the MySQL console and estimating the size based on the column types:

DESCRIBE mysql.general_log;
SELECT COUNT(*) FROM mysql.general_log;
You will see six columns, each of a relatively fixed size, except for the two MEDIUMTEXT columns, which you will have to estimate. That still gives you a decent ballpark figure.
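
If you want that estimate in one query, something along these lines should work. This is only a sketch: it assumes the stock MySQL 5.6 layout of mysql.general_log (event_time, user_host, thread_id, server_id, command_type, argument) and ignores CSV quoting overhead:

-- Rough size of the general log: sum the variable-length columns in bytes
-- and add a small constant per row for the fixed-width ones (assumed ~30 bytes).
SELECT COUNT(*) AS row_count,
       SUM(LENGTH(user_host) + LENGTH(command_type) + LENGTH(argument) + 30) AS approx_bytes
FROM mysql.general_log;

The 30 bytes per row is only a guess for the timestamp and numeric columns; the procedure in the next answer removes that guesswork.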

You can add the following procedure, which determines the exact size of a CSV table from the mysql console. It reconstructs what the CSV engine writes for each row: the column values joined by commas, two quote characters around every non-numeric column, escaped characters counted twice, and one line feed per row:

DELIMITER //
DROP PROCEDURE IF EXISTS checkcsv//
CREATE PROCEDURE checkcsv(IN databasename CHAR(200),IN tablename CHAR(200))
BEGIN
  SET SESSION group_concat_max_len=10*1024*1024; /* 10 MB buffer for the GROUP_CONCAT column list below */
  SELECT GROUP_CONCAT(COLUMN_NAME) INTO @columnames FROM INFORMATION_SCHEMA.COLUMNS WHERE (TABLE_SCHEMA = databasename AND TABLE_NAME = tablename);
  /* LF (0A), double quote (22) and backslash (5C) are written as two bytes in the CSV file, so count each occurrence twice */
  SET @get_colsizes_stmt = CONCAT("SELECT SUM(CHAR_LENGTH(REPLACE(REPLACE(REPLACE(CONCAT_WS(',',",@columnames,"),UNHEX('0A'),'nn'),UNHEX('22'),'nn'),UNHEX('5C'),'nn'))) INTO @total_length FROM ",databasename,".",tablename,";");
  PREPARE get_colsizes FROM @get_colsizes_stmt;
  EXECUTE get_colsizes;
  DEALLOCATE PREPARE get_colsizes;
  SET @get_count_stmt = CONCAT('SELECT COUNT(*) INTO @rowcount FROM ',databasename,'.',tablename,';');
  PREPARE get_count FROM @get_count_stmt;
  EXECUTE get_count;
  DEALLOCATE PREPARE get_count;
  SELECT 2*COUNT(COLUMN_NAME) INTO @non_numeric_cols_count FROM INFORMATION_SCHEMA.COLUMNS WHERE (TABLE_SCHEMA = databasename AND TABLE_NAME = tablename AND NUMERIC_SCALE IS NULL); /* Counting quotes */
  SET @total_size=@total_length+(@rowcount*@non_numeric_cols_count) /* Adding counted quotes */ +@rowcount /* one LineFeed per row */;
  SET @avg_row_length=@total_size/@rowcount;
  SET @output_stmt = CONCAT ("SELECT CONCAT('",databasename,"','.','",tablename,"') AS 'Table', ",@rowcount," AS 'Number Of Rows', ROUND(@avg_row_length) AS 'Average Row Length', ",ROUND(@total_size)," AS 'Total size' FROM ",databasename,".",tablename," LIMIT 1;");
  PREPARE outputr FROM @output_stmt;
  EXECUTE outputr;
  DEALLOCATE PREPARE outputr;
END;
//
DELIMITER ;
----- 

----- Usage Example: ----- 
mysql> CALL checkcsv("mysql","general_log");
+-------------------+----------------+--------------------+------------+
| Table             | Number Of Rows | Average Row Length | Total size |
+-------------------+----------------+--------------------+------------+
| mysql.general_log |             53 |                183 |       9673 |
+-------------------+----------------+--------------------+------------+
1 row in set (0.01 sec)

Query OK, 0 rows affected (0.06 sec)
-----
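
The same procedure should work against any other CSV-engine table on the instance. For example, the slow query log (mysql.slow_log) is also stored via the CSV engine when log_output is set to TABLE, so, assuming the slow query log is enabled, a call like this reports its size the same way:

mysql> CALL checkcsv("mysql","slow_log");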