I have an Elasticsearch index with documents like these:
{
  "_source": {
    "category": 1,
    "value": 10,
    "utctimestamp": "2020-10-21T15:32:00.000+00:00"
  }
}
In Grafana, I'm able to retrieve the value of the most recent event with the following query:

Now, I would like to get the MAX value of the most recent documents for each distinct value of category in the given time range.
This means that if I have the 3 following documents in my index:
{
  "_source": {
    "category": 1,
    "value": 10,
    "utctimestamp": "2020-10-21T10:30:00"
  }
},
{
  "_source": {
    "category": 2,
    "value": 20,
    "utctimestamp": "2020-10-21T10:20:00"
  }
},
{
  "_source": {
    "category": 2,
    "value": 30,
    "utctimestamp": "2020-10-21T10:10:00"
  }
}
I would like the query to return MAX(10, 20), which is 20: the most recent document for category 1 has the value 10, and the most recent document for category 2 has the value 20. (If there were a third category, its latest value would also be included in the MAX.)
Is it possible?
Thanks to @val for his brilliant query in Sum over top_hits aggregation, your query would be something like this (note: on Elasticsearch 7+, replace params._agg/params._aggs with state/states in the scripted metric):
{
  "size": 0,
  "aggs": {
    "category": {
      "terms": {
        "field": "category",
        "size": 10
      },
      "aggs": {
        "latest_quantity": {
          "scripted_metric": {
            "init_script": "params._agg.quantities = new TreeMap()",
            "map_script": "params._agg.quantities.put(doc.utctimestamp.date, [doc.utctimestamp.date.millis, doc.value.value])",
            "combine_script": "return params._agg.quantities.lastEntry().getValue()",
            "reduce_script": "def maxkey = 0; def qty = 0; for (a in params._aggs) {def currentKey = a[0]; if (currentKey > maxkey) {maxkey = currentKey; qty = a[1]} } return qty;"
          }
        }
      }
    },
    "max_quantities": {
      "max_bucket": {
        "buckets_path": "category>latest_quantity.value"
      }
    }
  }
}
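To see why this returns 20 for the sample documents, the logic of the scripted metric plus the max_bucket pipeline can be sketched in plain Python (using the three documents from the question as input):

```python
# Simulate the aggregation: per category, keep the value of the most
# recent document (terms + scripted_metric), then take the MAX over
# those per-category latest values (max_bucket).
docs = [
    {"category": 1, "value": 10, "utctimestamp": "2020-10-21T10:30:00"},
    {"category": 2, "value": 20, "utctimestamp": "2020-10-21T10:20:00"},
    {"category": 2, "value": 30, "utctimestamp": "2020-10-21T10:10:00"},
]

# map/combine/reduce step: latest (timestamp, value) per category.
# ISO-8601 timestamps compare correctly as strings.
latest = {}
for doc in docs:
    cat = doc["category"]
    if cat not in latest or doc["utctimestamp"] > latest[cat][0]:
        latest[cat] = (doc["utctimestamp"], doc["value"])

# max_bucket step: MAX over the per-category latest values.
result = max(value for _, value in latest.values())
print(result)  # 20
```

Category 1's latest value is 10 and category 2's latest value is 20 (the 30 is older), so the final result is 20, as required.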
I ended up creating a middleware service with a REST API between Elasticsearch and Grafana that makes all the custom requests to Elasticsearch (like the one given in @saeednasehi's answer), and I query the middleware from Grafana with the JSON data source plugin.
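As a rough sketch of what such a middleware could do, here is a hypothetical helper (the function name, index name, and time parameters are assumptions, not from the original post) that builds the Elasticsearch request body from the answer above; an HTTP endpoint would then POST it to Elasticsearch and reshape the response for the Grafana JSON data source:

```python
# Hypothetical middleware helper: builds the request body for the
# "latest value per category, then MAX" aggregation. The time range
# parameters and defaults are illustrative assumptions.
def build_latest_max_query(time_from="now-1h", time_to="now", max_categories=10):
    return {
        "size": 0,
        "query": {
            "range": {"utctimestamp": {"gte": time_from, "lte": time_to}}
        },
        "aggs": {
            "category": {
                "terms": {"field": "category", "size": max_categories},
                "aggs": {
                    "latest_quantity": {
                        "scripted_metric": {
                            "init_script": "params._agg.quantities = new TreeMap()",
                            "map_script": "params._agg.quantities.put(doc.utctimestamp.date, [doc.utctimestamp.date.millis, doc.value.value])",
                            "combine_script": "return params._agg.quantities.lastEntry().getValue()",
                            "reduce_script": "def maxkey = 0; def qty = 0; for (a in params._aggs) {def currentKey = a[0]; if (currentKey > maxkey) {maxkey = currentKey; qty = a[1]} } return qty;",
                        }
                    }
                },
            },
            "max_quantities": {
                "max_bucket": {"buckets_path": "category>latest_quantity.value"}
            },
        },
    }

body = build_latest_max_query("now-6h", "now")
```

Keeping the query construction in a pure function like this makes the middleware easy to unit-test without a live Elasticsearch cluster.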