C# jQuery DataTables server-side filtering causes an EF timeout?

Tags: c#, linq, entity-framework, datatables, jquery-datatables

I have the method below, which filters about 2 million records, but it usually causes an Entity Framework timeout when I request the last page. Is there any way to improve the code below so it runs faster?

public virtual ActionResult GetData(DataTablesParamsModel param)
{
    try
    {
        int totalRowCount = 0;
        // Generate data
        var allRecords = _echoMediaRepository.GetMediaList();

        // Apply search criteria to the data
        var predicate = PredicateBuilder.True<MediaChannelModel>();

        if (!String.IsNullOrEmpty(param.sSearch))
        {
            var wherePredicate = PredicateBuilder.False<MediaChannelModel>();
            int i;
            if (int.TryParse(param.sSearch, out i))
            {
                wherePredicate = wherePredicate.Or(m => m.ID == i);
            }
            wherePredicate = wherePredicate.Or(m => m.Name.Contains(param.sSearch));

            predicate = predicate.And(wherePredicate);
        }

        if (param.iMediaGroupID > 0)
        {
            var wherePredicate = PredicateBuilder.False<MediaChannelModel>();

            var mediaTypes = new NeptuneRepository<Lookup_MediaTypes>();
            var mediaGroups = mediaTypes.FindWhere(m => m.MediaGroupID == param.iMediaGroupID)
                .Select(m => m.Name)
                .ToArray();

            wherePredicate = wherePredicate.Or(m => mediaGroups.Contains(m.NeptuneMediaType) || mediaGroups.Contains(m.MediaType));
            predicate = predicate.And(wherePredicate);
        }

        var filteredRecord = allRecords.Where(predicate);

        var columnCriteria = param.sColumns.Split(',').ToList();
        if (!String.IsNullOrEmpty(columnCriteria[param.iSortCol_0]))
        {
            filteredRecord = filteredRecord.ApplyOrder(
                columnCriteria[param.iSortCol_0],
                param.sSortDir_0 == "asc" ? QuerySortOrder.OrderBy : QuerySortOrder.OrderByDescending);
        }

        totalRowCount = filteredRecord.Count();

        var finalQuery = filteredRecord.Skip(param.iDisplayStart).Take(param.iDisplayLength).ToList();

        // Create response
        return Json(new
        {
            sEcho = param.sEcho,
            aaData = finalQuery,
            iTotalRecords = allRecords.Count(),
            iTotalDisplayRecords = totalRowCount
        }, JsonRequestBehavior.AllowGet);
    }
    catch (Exception ex)
    {
        Logger.Error(ex);
        throw;
    }
}

Your code and queries already look optimized, so the problem is most likely a missing index in the database, which degrades the performance of the ORDER BY that Skip depends on.

Using test code very similar to yours, I ran some tests against a five-million-row table (with all XML-type columns populated) in a local test database. As expected, queries ordered by an indexed column were very fast, but on unindexed columns they could take a very, very long time.
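To find out which columns the dynamic Where and Order clauses actually hit, EF6's `Database.Log` hook can dump the generated SQL. This is only a sketch: `MediaContext` is a hypothetical stand-in for whatever context `_echoMediaRepository` uses.

```csharp
using System;
using System.Data.Entity;

// Hypothetical DbContext standing in for the one behind _echoMediaRepository.
public class MediaContext : DbContext
{
}

public static class QueryLogging
{
    public static void EnableSqlLogging(MediaContext context)
    {
        // EF6: every command text sent to the server is passed to this delegate,
        // including the ORDER BY / paging SQL produced by ApplyOrder + Skip/Take.
        context.Database.Log = Console.WriteLine;
    }
}
```

Running the slow DataTables request with logging enabled shows exactly which WHERE and ORDER BY columns appear in the generated SQL, which are the candidates for indexing.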


I suggest you analyze which columns the dynamic Where and Order functions use most often, create the corresponding indexes, and run some performance tests.
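With EF6.1+ code-first, such indexes can be declared directly on the model. A sketch, assuming the columns worth indexing are `Name` and `MediaType` (an assumption based on the predicates in the question); note that string columns need a bounded length, since `nvarchar(max)` columns cannot be indexed:

```csharp
using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;

// Sketch of the entity with indexes on the filtered/sorted columns.
public class MediaChannelModel
{
    public int ID { get; set; } // primary key, indexed by default

    [MaxLength(450)] // keep within SQL Server's index key size limit
    [Index("IX_MediaChannel_Name")]
    public string Name { get; set; }

    [MaxLength(450)]
    [Index("IX_MediaChannel_MediaType")]
    public string MediaType { get; set; }

    public string NeptuneMediaType { get; set; }
}
```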

Does "_echoMediaRepository.GetMediaList()" return an IQueryable result? – Yes, it is IQueryable.
Do you know which query gives the timeout? Could it be "filteredRecord.Skip(param.iDisplayStart).Take(param.iDisplayLength).ToList()"? – Yes, that is the most time-consuming query.
It sounds like a SQL execution timeout. If possible (not easy with dynamic queries), the optimization should happen on the database side. A hack is to use a larger CommandTimeout value.
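For completeness, raising the command timeout in EF6 looks like the sketch below. This only hides the symptom; indexing the sorted and filtered columns is the real fix. `MediaContext` is again a hypothetical context type.

```csharp
using System.Data.Entity;

// Sketch: raise the SQL command timeout for all queries on this context (EF6).
public class MediaContext : DbContext
{
    public MediaContext()
    {
        // Default is typically 30 seconds; allow up to 3 minutes here.
        Database.CommandTimeout = 180;
    }
}
```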