
SQL: maximum count of grouped elements


The database is AdventureWorks.

select COUNT(*) as 'Number of times a product is sold at same quantity'
from Sales.SalesOrderDetail
group by OrderQty, ProductID
order by COUNT(*) desc
This returns data like the following:

Number of times a product is sold at same quantity
--------------------------------------------------
4279
3216
3095
2376
2334
2319
2234
2201
2121
2025
1712
1488
1396
1161
1044
…plus more than 2,600 further rows.

I'm interested in getting 4279 as the output.

I can't apply MAX, because it doesn't work on aggregate functions or subqueries. I tried anyway; it didn't work.

I'm guessing I can't, because COUNT(*) is not a column. But if there is a way:


How can I get the maximum of this output?

Just add TOP to limit the number of results:

select TOP 1 COUNT(*) as 'Number of times a product is sold at same quantity'
from Sales.SalesOrderDetail
group by OrderQty, ProductID
order by COUNT(*) desc
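As a side note, the asker's original idea of using MAX does work if the grouped count is wrapped in a derived table, and that form is portable beyond T-SQL. Below is a minimal sketch using Python's built-in sqlite3 on made-up sample data (the table contents are illustrative, not from AdventureWorks); SQLite's LIMIT 1 plays the role of T-SQL's TOP 1.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE SalesOrderDetail (ProductID INT, OrderQty INT)")
# Toy rows: (ProductID=1, OrderQty=1) occurs 3 times, (2, 2) twice, (3, 1) once.
rows = [(1, 1), (1, 1), (1, 1), (2, 2), (2, 2), (3, 1)]
conn.executemany("INSERT INTO SalesOrderDetail VALUES (?, ?)", rows)

# MAX over a derived table: standard SQL, no TOP/LIMIT needed.
max_count = conn.execute("""
    SELECT MAX(cnt) FROM (
        SELECT COUNT(*) AS cnt
        FROM SalesOrderDetail
        GROUP BY OrderQty, ProductID
    )
""").fetchone()[0]
print(max_count)  # -> 3

# Equivalent of T-SQL's TOP 1 in SQLite: ORDER BY ... LIMIT 1.
top1 = conn.execute("""
    SELECT COUNT(*) AS cnt
    FROM SalesOrderDetail
    GROUP BY OrderQty, ProductID
    ORDER BY cnt DESC
    LIMIT 1
""").fetchone()[0]
print(top1)  # -> 3
```

Both queries return the same maximum; the derived-table form avoids the (invalid) nested aggregate MAX(COUNT(*)) that the asker ran into.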

Update 1

WITH results 
AS
(
  select COUNT(*) as [Number of times a product is sold at same quantity],
         DENSE_RANK() OVER (ORDER BY COUNT(*) DESC) rank_no 
  from   Sales.SalesOrderDetail 
  group by OrderQty, ProductID
)
SELECT [Number of times a product is sold at same quantity]
FROM   results
WHERE  rank_no = 2
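The DENSE_RANK approach generalizes to the N-th highest count. The same pattern can be checked with sqlite3, which also supports window functions (SQLite 3.25+); the table and data below are illustrative stand-ins for the AdventureWorks table.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE SalesOrderDetail (ProductID INT, OrderQty INT)")
# Three groups with counts 3, 2, and 1.
rows = [(1, 1)] * 3 + [(2, 2)] * 2 + [(3, 1)]
conn.executemany("INSERT INTO SalesOrderDetail VALUES (?, ?)", rows)

# Rank the grouped counts, then pick the second-highest one.
second = conn.execute("""
    WITH results AS (
        SELECT COUNT(*) AS cnt,
               DENSE_RANK() OVER (ORDER BY COUNT(*) DESC) AS rank_no
        FROM SalesOrderDetail
        GROUP BY OrderQty, ProductID
    )
    SELECT cnt FROM results WHERE rank_no = 2
""").fetchone()[0]
print(second)  # -> 2
```

Changing rank_no in the WHERE clause selects any other rank; DENSE_RANK (unlike ROW_NUMBER) also handles ties, returning one row per tied group at that rank.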

Comments:

@AnubhavSaini - if that wasn't your actual goal, why ask the question as "I'm interested in getting 4279 as the output"? Answerers first have to answer the question you actually asked, and then change it to answer the question you meant to ask; that just wastes their time.

You can do this with a CTE and window functions. See here: use this answer as a subquery, but change the 1 to a 2, then select MIN().

@MartinSmith My reasoning was that if I could get the top count, I could also find the next one.

@AnubhavSaini - judging from your follow-up questions, that clearly wasn't the case. Next time, please ask your actual question.