DEV Community

Discussion on: Time complexity Big 0 for Javascript Array methods and examples.

Stepan Zubashev

Thx for the article. I'm sure it's very important for the frontend community.

People these days often use .reduce with spread (...), which turns it into O(n^2). There are lots of articles (even on dev.to) that showcase this approach. My experience interviewing tells me that people don't understand that there's a problem.

Also, I think the Big-O of .splice depends on the arguments. I doubt, for example, that a 1-to-1 replacement costs O(n), because skipping the shift in that case is a pretty simple optimization.
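To illustrate this point, here is a sketch of the two splice shapes. (Whether an engine actually optimizes the 1-to-1 case is an implementation detail and an assumption here, not something the spec guarantees.)

```javascript
const arr = ['a', 'b', 'c', 'd'];

// 1-to-1 replacement: remove one element at index 1, insert one in its place.
// Nothing after index 1 needs to move, so an engine *can* do this without
// shifting the tail of the array.
arr.splice(1, 1, 'B');
console.log(arr); // ['a', 'B', 'c', 'd']

// Pure insertion: remove nothing, insert one element at index 1.
// Every element after index 1 must shift right, which is O(n).
arr.splice(1, 0, 'x');
console.log(arr); // ['a', 'x', 'B', 'c', 'd']
```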

Victor Cheeney

People these days often use .reduce with spread (...), which turns it into O(n^2). There are lots of articles (even on dev.to) that showcase this approach. My experience interviewing tells me that people don't understand that there's a problem.

I'm not sure I understand what you mean. Do you have an example that would illustrate this kind of situation? Thanks!

Mike

I think he means something like:

const users = [
  {name: 'foo',  admin: false, dateRegistered: '2022-02-01', },
  {name: 'bar',  admin: false, dateRegistered: '2010-05-30', },
  {name: 'baz',  admin: true,  dateRegistered: '2009-12-25', },
  {name: 'spaz', admin: true,  dateRegistered: '2020-06-30', },
];

const admins = users.reduce((agg, user) => (
  !user.admin ? agg : {
    ...agg,
    [user.name] : user.dateRegistered
  }
), {});

Where ...agg is spread on each iteration over users, the whole accumulator is copied every time. Instead of that internal traversal, use any kind of loop to get O(n) instead of O(n^2):

const users = [
  {name: 'foo',  admin: false, registered: '2022-02-01', },
  {name: 'bar',  admin: false, registered: '2010-05-30', },
  {name: 'baz',  admin: true,  registered: '2009-12-25', },
  {name: 'spaz', admin: true,  registered: '2020-06-30', },
]

const admins = {};
users.forEach((user) => {
  if (user.admin) { admins[user.name] = user.registered; }
});

reduce could still be used, but it was dropped here because it's unnecessary. for..of (or any other loop structure) could be used instead of .forEach; the latter is just a matter of preference.

Note: it's actually something closer to n(n+1)/2 + n operations, which, once coefficients and lower-order terms are removed, simplifies to O(n^2).
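That arithmetic can be checked directly by counting how many keys each spread copies. (The counter below is instrumentation added for illustration, not part of any API.)

```javascript
// 100 items with unique keys k0..k99.
const items = Array.from({ length: 100 }, (_, i) => ({ key: `k${i}` }));

let copies = 0;
const merged = items.reduce((agg, item) => {
  // Each spread copies every key accumulated so far: 0, then 1, ... then n-1.
  copies += Object.keys(agg).length;
  return { ...agg, [item.key]: true };
}, {});

// Spreading alone costs 0 + 1 + ... + 99 = n(n-1)/2 = 4950 copies;
// adding the one insert per round gives the n(n+1)/2 figure above.
console.log(copies); // 4950
```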

Mike

people don't understand that there's a problem

I would update this to can be a problem. Time/space complexity often only matters at scale, and in many cases it will never be a problem; for instance, when populating a select dropdown. In other cases, such as big data, it may always be a consideration, especially for an iterative process like graphing or machine learning.

I think that BigO of .splice depends on the arguments

Absolutely. That's often the case with Big-O. Big-O is an upper bound: f(n) = O(g(n)) means f(n) is bounded above by a constant multiple of g(n) for sufficiently large n. It's typically quoted for the worst case, even though the worst case may rarely be reached. So in the worst case splice may shift every element, on average it shifts fewer, and in the best case (removing or inserting at the end of the array) it shifts none.

For lower bounds big-omega is used (e.g., Ω(n)), and when the upper and lower bounds match, big-theta (e.g., Θ(n)). Notice the horizontal bar inside the big-theta versus the plain big O, as it's easy to mistake one for the other.
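For reference, the standard definitions, as they are usually stated:

```latex
f(n) = O(g(n))      \iff \exists\, c > 0,\ n_0 : 0 \le f(n) \le c \cdot g(n) \ \ \forall n \ge n_0
f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 : 0 \le c \cdot g(n) \le f(n) \ \ \forall n \ge n_0
f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \text{ and } f(n) = \Omega(g(n))
```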