Elasticsearch partial mapping with spaces

My partial mapping and query work perfectly until a space is involved. For example, the term Jon Doe gets its term vector broken into:
"terms": {
  "j": {
    "term_freq": 1
  },
  "jo": {
    "term_freq": 1
  },
  "jon": {
    "term_freq": 1
  },
  "d": {
    "term_freq": 1
  },
  "do": {
    "term_freq": 1
  },
  "doe": {
    "term_freq": 1
  }
}
But what I want is:
"terms": {
  "j": {
    "term_freq": 1
  },
  "jo": {
    "term_freq": 1
  },
  "jon": {
    "term_freq": 1
  },
  "jon ": {
    "term_freq": 1
  },
  "jon d": {
    "term_freq": 1
  },
  "jon do": {
    "term_freq": 1
  },
  "jon doe": {
    "term_freq": 1
  }
}
Here are my mapping and settings:

Mapping:
name: {
  type: 'string',
  term_vector: 'yes',
  analyzer: 'ngram_analyzer',
  search_analyzer: 'standard',
  include_in_all: true
}
Settings:
settings: {
  index: {
    analysis: {
      filter: {
        ngram_filter: {
          type: 'edge_ngram',
          min_gram: 1,
          max_gram: 15
        }
      },
      analyzer: {
        'ngram_analyzer': {
          filter: [
            'lowercase',
            'ngram_filter'
          ],
          type: 'custom',
          tokenizer: 'standard'
        }
      }
    },
    number_of_shards: 1,
    number_of_replicas: 1
  }
};
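The behavior in the first term vector follows from the order of the analysis chain: the standard tokenizer splits "Jon Doe" on whitespace first, and the edge_ngram filter then runs on each token separately. Here is a minimal Python sketch simulating that chain (this is illustrative only, not Elasticsearch's implementation; the function names are made up):

```python
def edge_ngrams(token, min_gram=1, max_gram=15):
    """Leading n-grams of a single token, like the edge_ngram filter."""
    return [token[:n] for n in range(min_gram, min(max_gram, len(token)) + 1)]

def analyze_standard(text):
    """Simulate: standard tokenizer -> lowercase -> edge_ngram filter."""
    terms = []
    for token in text.lower().split():  # each word becomes its own token
        terms.extend(edge_ngrams(token))
    return terms

print(analyze_standard("Jon Doe"))
# ['j', 'jo', 'jon', 'd', 'do', 'doe']
```

Because the filter never sees the two words together, no n-gram can span the space, which is exactly what the first term vector above shows.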
How can I do that?

You simply need to use a different tokenizer in your custom analyzer:
"analyzer": {
  "ngram_analyzer": {
    "filter": [
      "lowercase",
      "ngram_filter"
    ],
    "type": "custom",
    "tokenizer": "keyword"
  }
}
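With the keyword tokenizer, the entire input is emitted as a single token, so the edge_ngram filter produces prefixes of the whole phrase, space included. A sketch of the same simulation with that one change (again illustrative, not Elasticsearch code):

```python
def analyze_keyword(text, min_gram=1, max_gram=15):
    """Simulate: keyword tokenizer (whole input = one token) -> lowercase -> edge_ngram."""
    token = text.lower()
    return [token[:n] for n in range(min_gram, min(max_gram, len(token)) + 1)]

print(analyze_keyword("Jon Doe"))
# ['j', 'jo', 'jon', 'jon ', 'jon d', 'jon do', 'jon doe']
```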
Works as expected, thanks Stefen! This approach has a side effect, though: when you search, no matches are found with the keyword tokenizer. Why is that?
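One plausible explanation, sketched under the assumption that the mapping above is unchanged: the search_analyzer is still 'standard', so a query like "Jon Doe" is split into the tokens jon and doe, but with the keyword tokenizer the index only contains prefixes of the full phrase, and doe is not one of them:

```python
# Terms indexed by the keyword-tokenizer analyzer for "Jon Doe"
indexed = {'j', 'jo', 'jon', 'jon ', 'jon d', 'jon do', 'jon doe'}

# The standard search analyzer splits the query on whitespace
query_terms = "jon doe".split()

print([term in indexed for term in query_terms])
# [True, False] -- 'doe' never matches, so a query requiring both terms fails
```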