Time complexity (Big O) for JavaScript Array methods, with examples

Luis Castillo on June 03, 2020

Hello everyone, some weeks ago I started to study some computer science algorithms using JavaScript as the programming language, and normally after...
Mike Talbot ⭐

Like it - just a point of clarification - a sliced array is a shallow copy, and changing the original array won't modify it as you seem to suggest:

    const a = [1,2,3,4]
    const b = a.slice(-2)
    a[3] = 5
    console.log(a) // -> [1,2,3,5]
    console.log(b) // -> [3,4]

If it's an array of objects, it's still a shallow copy, so changing an object will change the one referenced by both arrays.
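
A minimal sketch of that case (the objects are hypothetical, just for illustration):

    const a = [{ n: 1 }, { n: 2 }]
    const b = a.slice()
    // both arrays hold references to the same objects,
    // so mutating through one is visible through the other:
    a[0].n = 99
    console.log(b[0].n) // -> 99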

Luis Castillo • Edited

You're right! Thank you for sharing this clarification. 👍

Stepan Zubashev

Thanks for the article. I'm sure it's very important for the frontend community.

People these days often use .reduce with the spread operator (...), so it becomes O(n^2). There are lots of articles (even on dev.to) that showcase such an approach. My experience interviewing tells me that people don't understand that there's a problem.

Also, I think the Big O of .splice depends on the arguments. I don't think, for example, that a 1-for-1 replacement would cost O(n), because avoiding that is a pretty simple optimization.
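
A sketch of that distinction (whether an engine actually special-cases the 1-for-1 replacement is implementation-dependent, so treat the O(1) case as an assumption):

    const arr = [1, 2, 3, 4, 5]

    // 1-for-1 replacement: no elements need to shift,
    // so an engine could plausibly do this in O(1):
    arr.splice(2, 1, 99) // arr -> [1, 2, 99, 4, 5]

    // removing from the front forces every later element
    // to shift down one slot, which is O(n):
    arr.splice(0, 1) // arr -> [2, 99, 4, 5]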

Mike

> people don't understand that there's a problem

I would update this to "can be a problem." Often, time/space complexity only matters at scale, and in many cases it will never be a problem; for instance, when populating a select dropdown. In other cases, such as big data, time/space complexity may always be a consideration, especially for an iterative process like graphing or machine learning.

> I think the Big O of .splice depends on the arguments

Absolutely. That's often the case with Big O. Big O is the worst-case scenario, but the worst case may rarely be reached. Big O means f(n) is bounded above by a constant multiple of g(n). So in the worst case, splice may have to touch every element, but on average it won't, and in the best case it would be zero or one element (if zero isn't practical) at the end of the array.

For the best case, big-omega is used (e.g., Ω(n)), and for the average case, big-theta is used (e.g., Θ(n)). Notice the internal ligature of the big-theta versus that of a big O, as it may be easy to mistake one for the other.
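
For reference, the standard textbook definitions behind those symbols are:

    f(n) = O(g(n))  iff  there exist c > 0 and n0 such that f(n) <= c * g(n) for all n >= n0
    f(n) = Ω(g(n))  iff  there exist c > 0 and n0 such that f(n) >= c * g(n) for all n >= n0
    f(n) = Θ(g(n))  iff  f(n) is both O(g(n)) and Ω(g(n))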

Victor Cheeney

> People these days often use .reduce with the spread operator (...), so it becomes O(n^2). There are lots of articles (even on dev.to) that showcase such an approach. My experience interviewing tells me that people don't understand that there's a problem.

I'm not sure I understand what you mean. Do you have an example that would illustrate this kind of situation? Thanks!

Mike

I think he means something like:

const users = [
  {name: 'foo',  admin: false, dateRegistered: '2022-02-01', },
  {name: 'bar',  admin: false, dateRegistered: '2010-05-30', },
  {name: 'baz',  admin: true,  dateRegistered: '2009-12-25', },
  {name: 'spaz', admin: true,  dateRegistered: '2020-06-30', },
];

const admins = users.reduce((agg, user) => (
  !user.admin ? agg : {
    // ...agg re-copies every key accumulated so far,
    // turning each iteration into an O(n) step:
    ...agg,
    [user.name] : user.dateRegistered
  }
), {});

Here, ...agg is potentially re-spread on every iteration over users. Instead of that internal traversal, use any kind of loop for O(n) time instead of O(n^2).

const users = [
  {name: 'foo',  admin: false, registered: '2022-02-01', },
  {name: 'bar',  admin: false, registered: '2010-05-30', },
  {name: 'baz',  admin: true,  registered: '2009-12-25', },
  {name: 'spaz', admin: true,  registered: '2020-06-30', },
]

const admins = {}
users.forEach((user)=>{
  if (user.admin) { admins[user.name]=user.registered }
})

reduce could still be used, but it was abandoned because it's unnecessary here. for..of (or any loop structure) could be used instead of .forEach; the latter was chosen purely as a matter of preference.

Note: it's actually something closer to O(n(n+1)/2) + O(n), which, once coefficients and lower-order terms are dropped, simplifies to O(n^2).
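
For completeness, reduce itself stays O(n) if the accumulator is mutated instead of re-spread; a minimal sketch using the same users array:

    const admins = users.reduce((agg, user) => {
      // mutate the one accumulator object instead of copying it each time
      if (user.admin) { agg[user.name] = user.registered }
      return agg
    }, {})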

Magne • Edited

What reference did you lean on to find their time complexity?
If their complexities are assumed or inferred, I'm not sure the assumptions hold, since each engine implements JS arrays differently. See: stackoverflow.com/a/61713477/380607

PS: I think it would be wise to mention that .some() is only O(n) in the worst case; normally it's O(k), where k is the index at which the match is found, which on average will be the middle (median) index, so k = n/2. While still a factor of n, that's different from other O(n) methods, which necessarily go through the entire array and thus incur the full n.
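
A small sketch of that short-circuiting (the array and predicates here are made up for illustration):

    const arr = [1, 2, 3, 4, 5, 6]

    // .some() returns as soon as the callback is truthy,
    // so only the first k = 3 elements are visited here:
    arr.some((x) => x === 3) // -> true

    // only a miss has to scan all n elements:
    arr.some((x) => x === 99) // -> false, after 6 callbacks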

Roland Kalmogo

Big O is already the worst case. Otherwise, we have Omega and Theta.

sreekar

Big O is not the worst case. Big O tells you the upper-bound growth-rate function; that's it. The worst case describes a specific input that makes the algorithm run the longest. Big O can be used to describe the upper bound of the worst case, the average case, or the best case. The two are orthogonal.

Mike

This may be accurate, but for all practical purposes of conceptual understanding, upper bound ≈ worst case.

It's simpler to describe Big O vs. Big Theta vs. Big Omega and to stay away from little-o. Again, this is in simplistic terms; anyone who isn't a data scientist or seasoned engineer should probably stick to the simplified understanding as a colloquial term.

This targets entry-level devs, and I think it does the topic a service in how it describes it:
educative.io/blog/a-big-o-primer-f...

Saloni Yadav

Crisp and to-the-point post. Well done!

Mirzayev Orzu • Edited

What about the .join() method?

Thamaraiselvam

It should be O(n), since .join() has to visit every element to build the resulting string.
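
For instance:

    // every element is converted and concatenated once
    // (strictly, the cost also grows with the final string length):
    [1, 2, 3].join('-') // -> '1-2-3'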

Irina Kats

Very useful. I had roughly the same idea about the complexities, but it's good to know that you took the time to summarize this data.

Aghiles Lounis

@luiscastillokc For the "indexOf" method, I think it's O(log n) time, because it uses something like binary search.

Konstantin Kulikov

Binary search can only be executed on a sorted collection, so .indexOf() has to do a linear scan: O(n).
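
A quick illustration of why, using a deliberately unsorted array:

    const xs = [3, 1, 2]

    // a binary search would inspect the middle element (1), conclude that
    // 3 must lie to the right since 3 > 1, and never find it at index 0;
    // indexOf therefore scans left to right, which is O(n):
    xs.indexOf(3) // -> 0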

Victor Cheeney

Very nice article. Straight and to the point. Thanks for sharing!