I benchmarked the following 5 functions on jsbench.me (link goes to saved tests):
const data = [1,2,3,4,5,6,7,8,9,0,99,88,77,66,55,44,33,22,11]
const fn1 = xs => xs.reduce((acc, x) => acc > x ? acc : x, -Infinity)
const fn2 = xs => {
  let res = -Infinity
  for (const x of xs) res = x > res ? x : res
  return res
}
const fn3 = xs => {
  let res = -Infinity
  for (const x of xs) {
    if (x > res) res = x
  }
  return res
}
const fn4 = xs => xs.reduce((acc, x) => Math.max(acc, x), -Infinity)
const fn5 = ([x, ...xs], acc = -Infinity) =>
  x == undefined ? acc : fn5(xs, x > acc ? x : acc)
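
Before timing anything, a quick sanity check shows that all five implementations agree on the test data:

// All five functions should return the same maximum for the test array
console.log([fn1, fn2, fn3, fn4, fn5].map(fn => fn(data)))
// => [99, 99, 99, 99, 99]
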
The fastest function, by far, is #1, which uses Array.prototype.reduce and avoids any reassignment.
Functions #2-#4 (including your original function and the one with the potential perf improvement you suggested) are all consistently 40-55% slower than #1. Of those, #4 (reduce with Math.max) is consistently around 20% faster than #2 and #3.
The slowest function is #5, which relies on recursion, clocking in at around 80-90% slower than #1; each call copies the remainder of the array via [x, ...xs], on top of the function-call overhead itself.
Comparing #1 and #4 (the two reduce-based versions), Math.max(x, y) turns out to be slower than x > y ? x : y.
I tested on a 2019 MacBook Pro running macOS 10.15.1 (Catalina) and Chrome v79.0.3945.88 (64-bit). You may well see different results with a different test setup, but I'm reasonably confident that a reduce-based solution that avoids reassignment will be the fastest option in most modern browsers.
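
If you want to reproduce this outside jsbench.me, here's a rough sketch using performance.now() in the browser console (it also works in recent Node.js, where performance is a global). The iteration count is an arbitrary choice of mine, and micro-benchmarks like this are very sensitive to JIT warm-up, so treat the output as ballpark only:

// Rough, self-contained timing sketch; assumes data and fn1-fn5 from above are in scope.
// ITERATIONS is arbitrary; results depend heavily on the engine and warm-up.
const ITERATIONS = 1e6
const fns = [fn1, fn2, fn3, fn4, fn5]
fns.forEach((fn, i) => {
  const start = performance.now()
  for (let j = 0; j < ITERATIONS; j++) fn(data)
  console.log(`fn${i + 1}: ${(performance.now() - start).toFixed(1)} ms`)
})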