Hive: Calculating the Weekly Sum of Columns in Hive


I have a table in Hive, say
testTable
(containing 3 years of data), with the following columns:

retailers, order_total, order_total_qty, order_date
I have to create a new table with the following columns:

'source_name' as source, sum(retailers), sum(order_total), sum(order_total_qty) 
aggregated for each week, starting from the earliest order date.

I am stuck on this. How can I group this data into weekly aggregates?

Use the
WEEKOFYEAR()
function to compute the aggregates per week:

select 
  'source_name'           source, 
   sum(retailers)         sum_retailers, 
   sum(order_total)       sum_order_total, 
   sum(order_total_qty)   sum_order_total_qty,
   WEEKOFYEAR(order_date) week,
   year(order_date)       year
from testTable 
where order_date >= '2015-01-01' --start_date
group by WEEKOFYEAR(order_date), year(order_date)
order by year, week; --order if necessary
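One caveat with this approach: WEEKOFYEAR() follows ISO-8601 numbering, so the last few days of December can fall into week 1 of the following year, and grouping by (week, year) then splits or mislabels that week. A sketch that avoids this groups by the week's start date instead, using Hive's datediff(), pmod(), and date_sub() functions. The output table name weekly_totals and the Monday reference date '2015-01-05' are assumptions for illustration, not part of the original question.

```sql
-- Sketch only: assumes order_date is a date/'yyyy-MM-dd' string and that
-- weeks start on Monday ('2015-01-05' is an arbitrary reference Monday).
CREATE TABLE weekly_totals AS
SELECT
  'source_name'                                         AS source,
  -- Roll each order_date back to the Monday of its week.
  date_sub(order_date,
           pmod(datediff(order_date, '2015-01-05'), 7)) AS week_start,
  sum(retailers)                                        AS sum_retailers,
  sum(order_total)                                      AS sum_order_total,
  sum(order_total_qty)                                  AS sum_order_total_qty
FROM testTable
WHERE order_date >= '2015-01-01'
GROUP BY date_sub(order_date,
                  pmod(datediff(order_date, '2015-01-05'), 7));
```

Because week_start is a real date, the result sorts naturally across year boundaries and can be joined back to calendar tables without reconstructing dates from (year, week) pairs.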