
So I'm just wondering whether assigning value to a variable takes away from the run-time efficiency in this simple function below:

const biggestNumberInArray = (arr) => {
  let biggest = 0;
  for (const item of arr) {
    biggest = (item > biggest) ? item : biggest;
  }
  return biggest;
}

Inside the for loop, every iteration assigns a value to the variable biggest. So if I wrote instead:

if (biggest < item) { biggest = item; }

would the function become more efficient? I don't really have any big arrays; this question is mostly theoretical. I want to understand how the mechanics work.

thank you!

  • Math.max would be more efficient. Commented Jan 5, 2020 at 16:31
  • The assignment will take a trivial amount of time. Unless you have actually benchmarked it and have reason to suspect that it could be faster, go with whatever reads better. Commented Jan 5, 2020 at 16:32
  • Iterating over the array once is N. Comparing each element is N. Assigning in each iteration is N. So you have a maximum of 3N operations and a minimum of 2N+1. It's a good compromise; I don't see anything wrong with your code. Commented Jan 5, 2020 at 16:39
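The first comment's Math.max suggestion could be sketched like this (not from the question itself; note the caveat in the comments):

```javascript
// Sketch of the Math.max approach suggested in the comments.
// Caveat: spread passes every element as a separate argument, so a
// very large array can exceed the engine's argument-count limit; a
// loop or reduce is safer for huge inputs. Unlike initializing the
// accumulator to 0, this also handles all-negative arrays correctly.
const biggestNumberInArray = (arr) => Math.max(...arr);
```

For an empty array this returns -Infinity, which may or may not be what you want.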

2 Answers


Considering two points of view:

  1. From a theoretical point of view, i.e., considering asymptotic complexity, it does not make any difference. Both algorithms run in linear time, O(n), since you have to iterate over the array anyway. An assignment takes constant time. The if check also takes constant time. One constant-time operation per iteration or two is still constant time per iteration.

  2. From a practical point of view, it may make some difference, but it probably wouldn’t be relevant, especially for big arrays. For small arrays, the relative difference may be more relevant, but since the time is low, we usually don't care much about it.
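If you want to see the practical difference yourself, here is a rough micro-benchmark sketch (not from the answer; timings vary by engine and run, and performance.now() is available in browsers and modern Node):

```javascript
// Rough micro-benchmark: ternary reassignment vs. conditional
// assignment. Numbers are illustrative only; JIT warm-up and
// engine differences can easily swamp any gap between the two.
const arr = Array.from({ length: 1_000_000 }, () => Math.random());

const withTernary = (xs) => {
  let biggest = -Infinity;
  for (const x of xs) biggest = x > biggest ? x : biggest;
  return biggest;
};

const withIf = (xs) => {
  let biggest = -Infinity;
  for (const x of xs) {
    if (x > biggest) biggest = x;
  }
  return biggest;
};

const time = (fn, xs) => {
  const t0 = performance.now();
  const result = fn(xs);
  return { result, ms: performance.now() - t0 };
};

console.log('ternary:', time(withTernary, arr).ms.toFixed(2), 'ms');
console.log('if:     ', time(withIf, arr).ms.toFixed(2), 'ms');
```

Both functions return the same result; only the assignment pattern differs.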




I benchmarked the following 5 functions on jsbench.me:

const data = [1,2,3,4,5,6,7,8,9,0,99,88,77,66,55,44,33,22,11]

const fn1 = xs => xs.reduce((acc, x) => acc > x ? acc : x, -Infinity)

const fn2 = xs => {
  let res = -Infinity
  for (const x of xs) res = x > res ? x : res
  return res
}

const fn3 = xs => {
  let res = -Infinity
  for (const x of xs) {
    if (x > res) res = x
  }
  return res
}

const fn4 = xs => xs.reduce((acc, x) => Math.max(acc, x), -Infinity)

const fn5 = ([x, ...xs], acc = -Infinity) =>
  x == undefined ? acc : fn5(xs, x > acc ? x : acc)

The fastest function, by far, is #1, based on Array.prototype.reduce and avoiding any reassignment.

Functions #2 - #4 (including your original function and the one with the potential perf improvement you suggested) are all consistently 40% to 55% slower than #1. #4 (reduce with Math.max) seems to be consistently faster than #2 and #3 by around 20%.

The slowest function (which depends on recursion) is #5, clocking in at around 80-90% slower than #1.
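One reason #5 is so slow (my own sketch, not part of the original benchmark): the parameter pattern [x, ...xs] copies the remaining array on every call, so the recursion does O(n²) copying overall, and deep recursion can also overflow the call stack on large inputs.

```javascript
// fn5 from the benchmark above: each call destructures [x, ...xs],
// which allocates a fresh copy of the tail. For an n-element array
// that is n + (n-1) + ... + 1 element copies, i.e. O(n^2) work,
// on top of one stack frame per element.
const fn5 = ([x, ...xs], acc = -Infinity) =>
  x == undefined ? acc : fn5(xs, x > acc ? x : acc);

console.log(fn5([4, 8, 2])); // fine for small inputs
// fn5(Array(1e6).fill(1))   // risks a RangeError (stack overflow)
```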

Note that within reduce, Math.max(x, y) is slower than x > y ? x : y (compare #1 with #4).

I tested on a 2019 MacBook Pro running macOS 10.15.1 (Catalina) and Chrome v79.0.3945.88 (64bit). You may end up with different results with a different test setup, but I'm reasonably confident that a solution based on reduce and avoiding reassignment will be the fastest option in most modern browsers.

1 Comment

Interesting. On my Android smartphone the 'reduce Math.max' version is consistently the fastest, with the 'reduce ternary' only 3.2% slower. +1
