
I have a collection with nested arrays as below:

[
   {
      "title":"Foo",
      "arr":[
         [
            {
               "value":"2021-11-13T00:00:00.000Z",
               "baz":10
            },
            {
               "value":"2021-11-12T00:00:00.000Z",
               "baz":0
            }
         ]
      ]
   },
   {
      "title":"Bar",
      "arr":[
         [
            {
               "value":"2021-12-03T00:00:00.000Z",
               "baz":10
            },
            {
               "value":"2021-12-07T00:00:00.000Z",
               "baz":0
            }
         ]
      ]
   }
]

For each document, I want to extract the largest value (i.e., the latest date) from the nested arrays, so that the result is:

[
    {
        "title": "Foo",
        "value": "2021-11-13T00:00:00.000Z"
    },
    {
        "title": "Bar",
        "value": "2021-12-07T00:00:00.000Z"
    }
]

How can this query be written?
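For anyone who wants to try the answers below, a minimal mongosh setup might look like this (a sketch: the collection name collection is a placeholder, and the dates are kept as the ISO-8601 strings shown above, which compare correctly under $max because they sort lexicographically):

// Load the sample documents into a scratch collection for testing.
// "collection" is a placeholder name, not part of the question.
db.collection.insertMany([
  { title: "Foo", arr: [[ { value: "2021-11-13T00:00:00.000Z", baz: 10 },
                          { value: "2021-11-12T00:00:00.000Z", baz: 0 } ]] },
  { title: "Bar", arr: [[ { value: "2021-12-03T00:00:00.000Z", baz: 10 },
                          { value: "2021-12-07T00:00:00.000Z", baz: 0 } ]] }
])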

2 Answers


Query1

  • find the local max in the inner arrays
  • find the global max across the outer array


db.collection.aggregate([
  // for each inner array take its max "value", then take the max of those maxima
  { "$set":
      { "value":
          { "$max": { "$map": { "input": "$arr", "in": { "$max": "$$this.value" } } } } } },
  // drop the fields that are not part of the expected output
  { "$unset": ["_id", "arr"] }
])

Query2

  • flatten the nested array into a single, non-nested array, then take the max
  • $reduce with $concatArrays turns the nested array into one flat array
  • take the max of the flattened array's value field
  • $project keeps only the two fields from the expected output


db.collection.aggregate([
  // flatten the array-of-arrays into one flat array of documents
  { "$set":
      { "arr":
          { "$reduce":
              { "input": "$arr",
                "initialValue": [],
                "in": { "$concatArrays": ["$$value", "$$this"] } } } } },
  // take the max "value" over the flattened array
  { "$set": { "value": { "$max": "$arr.value" } } },
  // keep only the two fields from the expected output
  { "$project": { "_id": 0, "title": 1, "value": 1 } }
])

2 Comments

Good example of the "$reduce with an initialValue of empty array and using concatArrays" pattern. Very powerful.
I use Clojure, so those are common idioms, but in MongoDB $concatArrays is slow: you can't realistically do this for arrays of 1k+ members, e.g. running 1k $concatArrays calls inside a $reduce will be slow if the collection is big. Test it yourself sometime; I did benchmarks in the past. For arrays that aren't very big it works fine.

Instead of $reduce you can also use $filter and $map:

db.collection.aggregate([
   {
      $set: {
         arr: {
            $map: {
               input: "$arr",
               as: "item",
               in: {
                  $filter: {
                     input: "$$item",
                     cond: { $eq: ["$$this.value", { $max: "$$item.value" }] }
                  }
               }
            }
         }
      }
   }
])
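Note that this pipeline leaves the winning element wrapped inside the nested array. To reach the exact { title, value } shape from the question, two more stages could be appended (a sketch: the field name top is made up here, and the $first array operator requires MongoDB 4.4+):

db.collection.aggregate([
   {
      $set: {
         arr: {
            $map: {
               input: "$arr",
               as: "item",
               in: {
                  $filter: {
                     input: "$$item",
                     cond: { $eq: ["$$this.value", { $max: "$$item.value" }] }
                  }
               }
            }
         }
      }
   },
   // unwrap both array levels; "top" is a made-up intermediate field name
   { $set: { top: { $first: { $first: "$arr" } } } },
   { $project: { _id: 0, title: 1, value: "$top.value" } }
])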

It is not clear why you have an array arr with just one element. If your collection always has just one element in arr, then you can also bypass the $map:

db.collection.aggregate([
   {
      $set: {
         arr: {
            $let: {
               vars: { item: { $first: "$arr" } },
               in: {
                  $filter: {
                     input: "$$item",
                     cond: { $eq: ["$$this.value", { $max: "$$item.value" }] }
                  }
               }
            }
         }
      }
   }
])
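As an aside, the $first array operator used above only exists since MongoDB 4.4; on older servers, $arrayElemAt gives the same result (a sketch):

db.collection.aggregate([
   {
      $set: {
         arr: {
            $let: {
               // $arrayElemAt: [..., 0] behaves like $first on MongoDB < 4.4
               vars: { item: { $arrayElemAt: ["$arr", 0] } },
               in: {
                  $filter: {
                     input: "$$item",
                     cond: { $eq: ["$$this.value", { $max: "$$item.value" }] }
                  }
               }
            }
         }
      }
   }
])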

