
Let's say I've got the following array of objects in JavaScript:

const requests = [
  {
    id: 1,
    person: {
      id: 1
    }
  },
  {
    id: 2,
    person: {
      id: 1
    }
  },
  {
    id: 3,
    person: {
      id: 2
    }
  },
  {
    id: 4,
    person: {
      id: 3
    }
  },
  {
    id: 5,
    person: {
      id: 2
    }
  }
]

And what I've written below will go over each item in the array and create a new array containing just the person ids.

const requestsPeopleIds = []
for (const request of requests) {
  requestsPeopleIds.push(request.person.id)
}

I then take that new array and create another new array using Set to remove the duplicate ids:

const uniquePeopleIds = Array.from(new Set(requestsPeopleIds))

The final result is as I'd expect:

console.log(uniquePeopleIds) // [1, 2, 3]

where these are the unique ids of the people who made a request. So out of the 5 requests, these were made by 3 people.

There must be a more efficient way of doing this, so I'm reaching out to you stack overflow JS gurus.

Thanks in advance.

  • Define "efficient". In regard to CPU cycles? Memory usage? Unless we are talking about hundreds of thousands of requests, or some really strict memory restrictions, this is another case of micro-optimization which is probably not useful or even necessary. Commented Mar 19, 2020 at 14:07
  • This is pretty much as efficient as you can get. It's an O(n) solution that goes over the input three times max. You can shave off one iteration if you skip making the final array and just consume the iterator of the Set and there might be some slightly more efficient iteration methods in some circumstances but overall from complexity angle it seems as good as it gets. Commented Mar 19, 2020 at 14:07
  • @VLAZ - I disagree. It's always going to be O(n log n) because you will always have to iterate through each item in the original array and then search through the set of unique values that you are composing. Commented Mar 19, 2020 at 14:17
  • @mankowitz there is no search, though. Adding to a Set is O(1) operation, iteration over set/array is O(n). This is doing iteration and addition, for a total of O(n). Commented Mar 19, 2020 at 14:21
  • I am sorry that I misled you. Please see my updated answer. It looks like the fastest solution can vary and depends on many factors. Commented Mar 22, 2020 at 18:33
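
The suggestion in the comments about shaving off an iteration by skipping the final array and consuming the Set directly can be sketched like this (variable names beyond the question's own are mine):

```javascript
const requests = [
  { id: 1, person: { id: 1 } },
  { id: 2, person: { id: 1 } },
  { id: 3, person: { id: 2 } },
  { id: 4, person: { id: 3 } },
  { id: 5, person: { id: 2 } }
];

// Build the Set in a single pass, with no intermediate array of ids:
const ids = new Set();
for (const request of requests) {
  ids.add(request.person.id);
}

// A Set is iterable, so it can be consumed directly
// (insertion order is preserved):
for (const id of ids) {
  console.log(id); // 1, then 2, then 3
}

// Only spread into an array if one is actually needed:
const uniquePeopleIds = [...ids];
```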

4 Answers

4

I think you got the basics. Here's a way to tighten the code:

const ids = new Set();
requests.forEach(i => ids.add(i.person.id));

2 Comments

Why not just const ids = new Set(requests.map(i => i.person.id));?
@StepUp - thanks for the performance tests, much appreciated :)
1

You could also do this with map method and spread syntax ....

const requests = [{"id":1,"person":{"id":1}},{"id":2,"person":{"id":1}},{"id":3,"person":{"id":2}},{"id":4,"person":{"id":3}},{"id":5,"person":{"id":2}}]
const result = [...new Set(requests.map(({ person: { id }}) => id))]
console.log(result)

2 Comments

Is this more efficient? Because I suspect it'd be about the same performance-wise.
And in which way is this more "efficient"? Lines of code, because you've replaced the for...of + .push() with a .map()?
0

You can do it by building an object that uses each person's id as a key, then taking the keys of that object.

const requests = [{"id":1,"person":{"id":1}},{"id":2,"person":{"id":1}},{"id":3,"person":{"id":2}},{"id":4,"person":{"id":3}},{"id":5,"person":{"id":2}}]


// Take an empty object
const uniques = {};

// Iterate through the requests array, use each person's id as a
// key of the object, and store any value under that key (here I use 1).
requests.forEach(request => (uniques[request.person.id] = 1));

// Finally get the keys of the unique object.
console.log(Object.keys(uniques));
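
One caveat worth noting: object keys are always strings, so Object.keys returns ["1", "2", "3"] rather than numbers. A minimal sketch of converting back, if numeric ids are needed:

```javascript
const requests = [{"id":1,"person":{"id":1}},{"id":2,"person":{"id":1}},{"id":3,"person":{"id":2}},{"id":4,"person":{"id":3}},{"id":5,"person":{"id":2}}];

const uniques = {};
requests.forEach(request => (uniques[request.person.id] = 1));

// Object.keys yields strings, so map them back to numbers:
const uniqueIds = Object.keys(uniques).map(Number);
console.log(uniqueIds); // [1, 2, 3]
```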

4 Comments

And is this more efficient than a Set?
The Set answer is nice. I don't know the cost of adding a new item to a Set, but this approach takes O(n) time. It's good, I think.
OP asked for a more efficient method. And the code by OP is already O(n).
There is no way to reduce the time complexity below O(n). I think what he wanted is code-style efficiency.
-1

I've done some research and inferred some interesting facts:

  1. It looks like when we have very varied data and a larger array, Set does not show the best results. Set is a highly optimized collection; however, in my view, it must always check whether an element has already been added, and I assumed that check could be costly. A plain JavaScript object can be used instead: checking whether an object contains a key is O(1), so the object may have an advantage over Set.

  2. The forEach arrow function is very convenient; however, a simple for loop is faster.

  3. Adding console.log makes Set the fastest solution; without console.log, the fastest solution is the combination of a for loop and an object.
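
For what it's worth, a rough micro-benchmark along these lines can be run directly in Node; the synthetic data below is my own illustration, and timings will vary by engine, data size, and data shape:

```javascript
// Synthetic data: 100,000 requests made by 1,000 distinct people (illustrative).
const requests = Array.from({ length: 100000 }, (_, i) => ({
  id: i,
  person: { id: i % 1000 }
}));

// Approach 1: Set
console.time('Set');
const ids = new Set();
for (const request of requests) {
  ids.add(request.person.id);
}
console.timeEnd('Set');

// Approach 2: plain object used as a hash map
console.time('object');
const hashMap = {};
const uniques = [];
for (let index = 0; index < requests.length; index++) {
  const personId = requests[index].person.id;
  if (!hashMap.hasOwnProperty(personId)) {
    hashMap[personId] = 1;
    uniques.push(personId);
  }
}
console.timeEnd('object');

// Both approaches should find the same 1,000 unique ids.
console.log(ids.size, uniques.length); // 1000 1000
```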

So the most performant code without console.log() looks like this:

const hashMap = {};
const uniques = [];
for (let index = 0; index < requests.length; index++) {
  const personId = requests[index].person.id;
  if (!hashMap.hasOwnProperty(personId)) {
    hashMap[personId] = 1;
    uniques.push(personId);
  }
}

However, the most performant code with console.log() looks like this (I cannot understand why this happens; it would be really great to know why):

const ids = new Set();
requests.forEach(i => ids.add(i.person.id));
console.log(ids)

Tests: (benchmark screenshots were attached in the original answer)

2 Comments

Is this more efficient than using a Set?
@downvoter what's the reason to downvote? It would be really helpful to know; it will help me improve my answer.
