Elasticsearch: Root mapping definition has unsupported parameters

I am building a way to search Thai words through Elasticsearch and Kibana, and I am running into a problem with the mapping:
PUT test
{
  "settings": {
    "analysis": {
      "analyzer": {
        "trigrams": {
          "tokenizer": "trigram_tokenizer",
          "filter": [
            "lowercase"
          ]
        }
      },
      "tokenizer": {
        "trigram_tokenizer": {
          "type": "ngram",
          "min_ngram": 3,
          "max_ngram": 3,
          "token_chars": []
        }
      }
    }
  },
  "mappings": {
    "true_name": {
      "properties": {
        "correct": { "type": "text", "analyzer": "trigrams" }
      }
    }
  }
}
It fails with an error like this:
{
  "error" : {
    "root_cause" : [
      {
        "type" : "mapper_parsing_exception",
        "reason" : "Root mapping definition has unsupported parameters: [true_name : {properties={correct={analyzer=trigrams, type=text}}}]"
      }
    ],
    "type" : "mapper_parsing_exception",
    "reason" : "Failed to parse mapping [_doc]: Root mapping definition has unsupported parameters: [true_name : {properties={correct={analyzer=trigrams, type=text}}}]",
    "caused_by" : {
      "type" : "mapper_parsing_exception",
      "reason" : "Root mapping definition has unsupported parameters: [true_name : {properties={correct={analyzer=trigrams, type=text}}}]"
    }
  },
  "status" : 400
}
Mapping types are deprecated. See the "Removal of mapping types" article for more details:

Indices created in Elasticsearch 6.0.0 or later may only contain a single mapping type. Indices created in 5.x with multiple mapping types will continue to function as before in Elasticsearch 6.x. Types were deprecated in the APIs in Elasticsearch 7.0.0 and are removed completely in 8.0.0.

Your request nests the mapping under a custom type name (true_name), which Elasticsearch 7+ rejects. If your JSON documents look like this:
{
  "true_name": {
    "correct": "mapping types deprecated"
  }
}
then the index should be created with true_name as an ordinary object field under "properties", not as a type name:
PUT test
{
  "settings": {
    "analysis": {
      "analyzer": {
        "trigrams": {
          "tokenizer": "trigram_tokenizer",
          "filter": [
            "lowercase"
          ]
        }
      },
      "tokenizer": {
        "trigram_tokenizer": {
          "type": "ngram",
          "min_gram": 3,   // the ngram tokenizer parameters are min_gram/max_gram, not min_ngram/max_ngram
          "max_gram": 3,
          "token_chars": []
        }
      }
    }
  },
  "mappings": {
    "properties": {      // note this: no custom type name at the root
      "true_name": {
        "properties": {
          "correct": {
            "type": "text",
            "analyzer": "trigrams"
          }
        }
      }
    }
  }
}
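As an aside, here is a minimal, self-contained sketch of what the trigrams analyzer does to a string: an ngram tokenizer with min_gram = max_gram = 3 emits every run of three consecutive characters, and the lowercase filter normalizes case. This is an illustration of the idea only, not Elasticsearch's implementation.

```python
def trigram_tokens(text, n=3):
    """Emit every n-character substring of text, lowercased."""
    text = text.lower()
    return [text[i:i + n] for i in range(len(text) - n + 1)]

# Thai script has no spaces between words, so character n-grams give
# searchable units without needing a Thai word segmenter.
print(trigram_tokens("สวัสดี"))   # ['สวั', 'วัส', 'ัสด', 'สดี']
print(trigram_tokens("Mapping"))  # ['map', 'app', 'ppi', 'pin', 'ing']
```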
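Once the index is created this way, a match query against the field analyzes the query string with the same trigram analyzer (the index and field names test and correct are taken from the example above):

GET test/_search
{
  "query": {
    "match": {
      "correct": "สวัสดี"
    }
  }
}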