C# Elasticsearch scroll with NEST: fetching all records takes too much time


This is my search query filter.

The parameters ScrollPageSize and ScrollTimeOutMinutes are defined here:

public static string scrollTimeoutMinutes = "2m";
public static int scrollPageSize = 10000;
The problem I am running into: with ScrollTimeOutMinutes = 2m, running the same query in Kibana returns 315,000 records and the scroll takes almost 7 seconds, but when fetching with NEST from C# it takes much longer than expected and the ScrollId ends up expiring.

filters.Add(new TermsQuery
    {
        Field = new Field("MERCHANTNO"),
        Terms = MERCHANTNO,
    }
    && new TermsQuery
    {
        Field = new Field("NumericFileDate"),
        Terms = UploadedFileData,
    }
);

var SearchRequest = new SearchRequest<MISTransactionResponseElastic>(l_SendRequest.idxName)
{
    From = 0,
    Scroll = scrollTimeoutMinutes,
    Size = scrollPageSize,
    Query = new BoolQuery
    {
        Must = filters,
        Filter = filterClause
    }
};

Here I use the scroll id and fetch all the records:

var searchResponse = _elasticClient.SearchAsync<MISTransactionResponseElastic>(SearchRequest);
          
List<MISTransactionResponseElastic> results = new List<MISTransactionResponseElastic>();
if (searchResponse.Result.Documents.Any())
    results.AddRange(searchResponse.Result.Documents);

string scrollid = searchResponse.Result.ScrollId;
bool isScrollSetHasData = true;

while (isScrollSetHasData)
{
    ISearchResponse<MISTransactionResponseElastic> loopingResponse = _elasticClient.Scroll<MISTransactionResponseElastic>(scrollTimeoutMinutes, scrollid);
    if (loopingResponse.IsValid)
    {
        results.AddRange(loopingResponse.Documents);
        scrollid = loopingResponse.ScrollId;
    }
    isScrollSetHasData = loopingResponse.Documents.Any();
}

var records = results;

Perhaps you should first check whether Kibana is actually returning paginated data. Next, why is isScrollSetHasData initialized to true? It could instead be assigned from whether the first scroll query returned any data.
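
To illustrate that suggestion, here is a minimal sketch of the fetch loop (not the poster's exact code): the initial search is awaited rather than blocked on with .Result, the loop flag is seeded from the first batch instead of being hard-coded to true, and the scroll context is released once the loop finishes. It reuses the _elasticClient, scrollTimeoutMinutes and SearchRequest names from the question; the final ClearScrollAsync call is an addition, not part of the original code.

var results = new List<MISTransactionResponseElastic>();

// Await the initial search instead of blocking on .Result
var response = await _elasticClient.SearchAsync<MISTransactionResponseElastic>(SearchRequest);
results.AddRange(response.Documents);

string scrollId = response.ScrollId;
// Seed the flag from the first batch rather than hard-coding true
bool hasMoreData = response.Documents.Any();

while (hasMoreData)
{
    var scrollResponse = await _elasticClient.ScrollAsync<MISTransactionResponseElastic>(scrollTimeoutMinutes, scrollId);
    if (!scrollResponse.IsValid)
        break; // e.g. the scroll context has expired; stop instead of looping on an error

    results.AddRange(scrollResponse.Documents);
    scrollId = scrollResponse.ScrollId;
    hasMoreData = scrollResponse.Documents.Any();
}

// Release the scroll context so it does not hold cluster resources until the 2m timeout
await _elasticClient.ClearScrollAsync(c => c.ScrollId(scrollId));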