
Aggregating an array of values in Elasticsearch


I need to aggregate an array as follows.

Two sample documents:

{
    "_index": "log",
    "_type": "travels",
    "_id": "tnQsGy4lS0K6uT3Hwzzo-g",
    "_score": 1,
    "_source": {
        "state": "saopaulo",
        "date": "2014-10-30T17",
        "traveler": "patrick",
        "registry": "123123",
        "cities": {
            "saopaulo": 1,
            "riodejaneiro": 2,
            "total": 2
        },
        "reasons": [
            "Entrega de encomenda"
        ],
        "from": [
            "CompraRapida"
        ]
    }
},
{
    "_index": "log",
    "_type": "travels",
    "_id": "tnQsGy4lS0K6uT3Hwzzo-g",
    "_score": 1,
    "_source": {
        "state": "saopaulo",
        "date": "2014-10-31T17",
        "traveler": "patrick",
        "registry": "123123",
        "cities": {
            "saopaulo": 1,
            "curitiba": 1,
            "total": 2
        },
        "reasons": [
            "Entrega de encomenda"
        ],
        "from": [
            "CompraRapida"
        ]
    }
},
I want to aggregate the cities array to find all the cities the traveler has been to. I want something like this:

{
    "traveler":{
        "name":"patrick"
    },
    "cities":{
        "saopaulo":2,
        "riodejaneiro":2,
        "curitiba":1,
        "total":3
    }
}
where total is the length of the cities array minus 1. I tried terms and sum aggregations, but couldn't produce the desired output.


Changes to the document structure are possible, so if doing that would help, I'd be happy to hear it.
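For reference, the desired result can also be computed client-side by merging the cities objects of the matching documents. A minimal Python sketch (field names taken from the sample documents above; total is recomputed as the number of distinct cities):

```python
from collections import Counter

def merge_cities(docs):
    """Merge the per-document `cities` objects, summing visit counts
    and recomputing `total` as the number of distinct cities."""
    merged = Counter()
    for doc in docs:
        for city, count in doc["_source"]["cities"].items():
            if city != "total":  # `total` is derived, not a city
                merged[city] += count
    result = dict(merged)
    result["total"] = len(merged)  # number of distinct cities visited
    return result

docs = [
    {"_source": {"cities": {"saopaulo": 1, "riodejaneiro": 2, "total": 2}}},
    {"_source": {"cities": {"saopaulo": 1, "curitiba": 1, "total": 2}}},
]
print(merge_cities(docs))
# → {'saopaulo': 2, 'riodejaneiro': 2, 'curitiba': 1, 'total': 3}
```

This matches the desired output above, but of course pulls every document back to the client instead of aggregating inside Elasticsearch.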

In the documents posted above, "cities" is not a JSON array but a JSON object. If the document structure can be changed, I would change cities to an array of objects.

Example document:

"cities": [
   {
     "city": "saopaulo",
     "count": 2
   },
   {
     "city": "riodejaneiro",
     "count": 1
   }
]
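Converting an existing cities object into this array-of-objects shape is mechanical. A hypothetical Python helper (field names chosen to match the mapping below; the derived total key is dropped):

```python
def to_nested(cities_obj):
    """Turn a flat {"saopaulo": 1, "total": 2} object into the
    nested array-of-objects form, dropping the derived `total` key."""
    return [
        {"city": name, "count": count}
        for name, count in cities_obj.items()
        if name != "total"
    ]

print(to_nested({"saopaulo": 1, "riodejaneiro": 2, "total": 2}))
# → [{'city': 'saopaulo', 'count': 1}, {'city': 'riodejaneiro', 'count': 2}]
```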
Then cities needs to be declared as a nested type in the index mapping:

   "mappings": {
         "<type_name>": {
            "properties": {
               "cities": {
                  "type": "nested",
                  "properties": {
                     "city": {
                        "type": "string"
                     },
                     "count": {
                        "type": "integer"
                     },
                     "value": {
                        "type": "long"
                     }
                  }
               },
               "date": {
                  "type": "date",
                  "format": "dateOptionalTime"
               },
               "registry": {
                  "type": "string"
               },
               "state": {
                  "type": "string"
               },
               "traveler": {
                  "type": "string"
               }
            }
         }
      }

Do you have an index mapping you can share?

But doesn't this double-count "saopaulo", since I visited it on both the 30th and the 31st?

@PatrickVillela Yes, I misread the question. I've edited the answer to use a cardinality aggregation, which gives the required count of distinct cities. You may have to subtract 1 to account for total; arguably total shouldn't be part of the cities object at all, but a separate field outside it.

Sorry for the delay, I had to work out a solution for another problem. I may get back to this later in the week. I think this might work, though...
{
   "query": {
      "match": {
         "traveler": "patrick"
      }
   },
   "aggregations": {
      "city_travelled": {
         "nested": {
            "path": "cities"
         },
         "aggs": {
            "citycount": {
               "cardinality": {
                  "field": "cities.city"
               }
            }
         }
      }
   }
}
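Assuming the standard aggregation response shape, the distinct-city count can then be read out of the result. A sketch with a hand-written stand-in for the Elasticsearch response body:

```python
# Stand-in for the JSON body Elasticsearch would return for the query
# above: a nested aggregation wrapping a cardinality aggregation.
response = {
    "aggregations": {
        "city_travelled": {
            "doc_count": 4,  # total nested city entries across matched docs
            "citycount": {"value": 3}  # distinct values of cities.city
        }
    }
}

distinct_cities = response["aggregations"]["city_travelled"]["citycount"]["value"]
print(distinct_cities)  # → 3
```

Note that cardinality is approximate for high-cardinality fields, but for a handful of cities per traveler it is exact in practice.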