
I'm declaring the following array:

let arr = [
  { id: "aa", vals: [1, 2, 3] },
  { id: "cc", vals: [3, 4, 5] },
  { id: "bb", vals: [2, 3, 4] }
];

Printing it in the console confirms the expectation: each element has two fields, a string id and a number array vals. Then I call reduce(...) on it like this.

let reduction = arr.reduce((a,b) => a[b.id] = b, {});

My expectation is a single object with the fields corresponding to the ID values of the iterated array, i.e. aa, bb and cc, where each of those three fields contains the corresponding element from the original array.

What I get is only the last element matching the expectation, while the first two get mangled somehow, acquiring additional fields. Please note that I'm talking about changes in the original array! According to the docs: The reduce() method does not change the original array.

What am I missing?

The end goal is simply to convert an array to a dictionary by reducing it to a single object with dynamically added fields. There are plenty of resources out there to draw inspiration from, so this question concerns only the unexpected behavior.
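For reference, here is a runnable reproduction of the observed behavior, re-declaring the same arr as above:

```javascript
// Same array as above.
let arr = [
  { id: "aa", vals: [1, 2, 3] },
  { id: "cc", vals: [3, 4, 5] },
  { id: "bb", vals: [2, 3, 4] }
];

// The reduce call from the question.
let reduction = arr.reduce((a, b) => a[b.id] = b, {});

// Only the last element comes back as the result...
console.log(reduction === arr[2]); // true
// ...and the first two elements of arr have gained extra fields.
console.log("cc" in arr[0]); // true
console.log("bb" in arr[1]); // true
```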

  • Your callback function is returning the last element instead of the accumulator value (a). Commented Apr 20, 2024 at 13:42
  • (a, b) => ((a[b.id] = b), a) Commented Apr 20, 2024 at 13:45
  • Also it's true that .reduce() does not modify the original array, but the callback can modify the component elements of the original array. Commented Apr 20, 2024 at 13:46
  • @Pointy Thanks to your comments, I've figured out that I was returning the assignment's value, not the accumulator, so I should have arr.reduce((a,b)=>{a[b.id]=b; return a;},{}), which seems to work. Still, I can't see how on earth that happened before I corrected it. I see that it was wrong. But how? I was, if anything, returning void, right?! Commented Apr 20, 2024 at 13:52
  • Your code returned b, one of the component objects in the original array. So on the next iteration, b was the first argument to the callback. Commented Apr 20, 2024 at 13:53
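The corrected callback from the comments can be sketched like this, re-declaring the same arr; the comma-operator one-liner suggested above is equivalent:

```javascript
let arr = [
  { id: "aa", vals: [1, 2, 3] },
  { id: "cc", vals: [3, 4, 5] },
  { id: "bb", vals: [2, 3, 4] }
];

// Return the accumulator explicitly instead of the assignment's value,
// which evaluates to b and would become the next accumulator.
let reduction = arr.reduce((a, b) => { a[b.id] = b; return a; }, {});

console.log(reduction.bb === arr[2]); // true: values are the original elements
console.log("cc" in arr[0]);          // false: the originals stay untouched
```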

1 Answer


Arrays in JavaScript are objects, and objects are handled by reference: a variable holds a reference to the object, not a copy of it. In other words, whenever you alter any property on your object, all references to it point to the same object, with the changed property.

In your case, you can consider returning a new object each time, like this:

let reduction = arr.reduce((a,b) => ({...a, [b.id]: b}), {});
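A quick check, assuming the same arr as in the question, that this spread version builds the dictionary without touching the original elements:

```javascript
let arr = [
  { id: "aa", vals: [1, 2, 3] },
  { id: "cc", vals: [3, 4, 5] },
  { id: "bb", vals: [2, 3, 4] }
];

// A fresh accumulator object is created on every iteration, so the
// elements of arr are only referenced, never written to.
let reduction = arr.reduce((a, b) => ({ ...a, [b.id]: b }), {});

console.log(Object.keys(reduction)); // ["aa", "cc", "bb"]
console.log(reduction.aa === arr[0]); // true
console.log("cc" in arr[0]);          // false
```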

Comments

Yeah, now I get it. I received helpful comments, too. Just one follow-up question: using the spread operator was claimed to be an anti-pattern (see the link in my question). Is that the case? Or is that poster confused?
Thanks for the clarification. It was abundantly clear where you stand in that regard. I asked that poster to provide some more info; perhaps they were thinking of a different case or some such. I like giving the benefit of the doubt and will gladly push for an informative and rewarding discussion on such topics. Since there's a chance they'll read your comment, you may wish to rephrase it so as not to cause offense. It's your call. I wouldn't care: if someone points out a mistake, I fix it, never taking offense. Others may react differently.
@KonradViltersten Yes, using spread to repeatedly create new objects and copy the old is an antipattern, it has quite bad time complexity. It'll already be a problem with a thousand elements in the array. Really the best solution for this particular case is Object.fromEntries(arr.map(b => [b.id, b])), which is both performant and readable.
@Bergi any proof that it would make any difference with a thousand-element array?
@RoboRobok Sure, try it! A simple test with an array of a thousand objects with random ids and console.time() yields 3.2ms for reduce when returning the accumulator, 3.7ms for Object.fromEntries+map, and over 200ms for reduce with spread syntax! (Not saying these numbers, from a very badly done microbenchmark, are representative, but they clearly show a difference.) I wouldn't think of trying it with a million elements given the quadratic time complexity.
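The Object.fromEntries alternative from the comment above, sketched with the same arr shape as in the question:

```javascript
let arr = [
  { id: "aa", vals: [1, 2, 3] },
  { id: "cc", vals: [3, 4, 5] },
  { id: "bb", vals: [2, 3, 4] }
];

// map builds [key, value] pairs; Object.fromEntries assembles them
// into a single object in linear time, without mutating arr.
let reduction = Object.fromEntries(arr.map(b => [b.id, b]));

console.log(reduction.aa.vals); // [1, 2, 3]
console.log("cc" in arr[0]);    // false
```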
