re: Quick Tip: Transform an Array into an Object using .reduce()


Honestly, I really don't like naming the first parameter of the callback function acc. It's ambiguous, at least to me. I prefer to name it prev because it's much clearer that prev contains the result of the previous iteration (or the initial value if it's the first iteration).
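For what it's worth, here's a minimal sketch of the array-to-object pattern from the post with the first parameter named prev; the pets data is made up for illustration:

```javascript
// A made-up array of items to index by id
const pets = [
  { id: "dog", name: "Rex" },
  { id: "cat", name: "Whiskers" }
]

// `prev` holds the object built up so far (or the initial {} on the
// first call); each iteration returns the next version of it.
const petsById = pets.reduce(
  (prev, pet) => ({ ...prev, [pet.id]: pet }),
  {}
)
```

Reading it this way, prev always names "what the previous step returned", which is the distinction being argued for.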


That's the imperative name for it though, no?
You're not supposed to know that iteration is taking place, just that the values are being absorbed into the accumulator :v


That is indeed the true and only technical name for it, but semantically speaking, I prefer naming it prev. To each their own, I suppose.

Yeah, I'm totally just being a smartass.
Many, most even, reducers we write are not commutative.
This one could actually be parallelized if we add a merging function, but even then we start with an empty object for acc/prev, not one of the items.

@_bigblind You've seen people write something like

const allStr = strings.reduce((acc, next) => acc + next, '')
// instead of
const allStr = strings.reduce((acc, next) => acc + next)
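A quick sketch of why the explicit initial value matters: without it, .reduce() uses the first element as the accumulator, and on an empty array it throws:

```javascript
const strings = ["a", "b", "c"]

const withSeed = strings.reduce((acc, next) => acc + next, "")  // "abc"
const withoutSeed = strings.reduce((acc, next) => acc + next)   // also "abc"

// On an empty array only the seeded version is safe:
const empty = []
const safe = empty.reduce((acc, next) => acc + next, "")        // ""

let threw = false
try {
  empty.reduce((acc, next) => acc + next)  // TypeError: no initial value
} catch (e) {
  threw = true
}
```

So the two lines above agree on non-empty input, but only the seeded one handles every input.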


Excuse my lack of knowledge on the subject, but what does it mean for a reducer to be "commutative" and "parallelizable"? And what do you mean by "merging function"?

Oh, now I understand your point about not thinking about the fact that iteration is being used! If they're done in parallel, you don't get the previous value :).

If we don't care about the order of the incoming ids, and just want to get the sets of ids of each article, we could split the counting between multiple threads or even machines.
Something like this silly thing:

const posts = [
  { id: 0, category: "fairy tales", title: "Gommunist Manifesto" },
  { id: 1, category: "frontend", title: "All About That Sass" },
  { id: 2, category: "backend", title: "Beam me up, Scotty: Apache Beam tips" },
  { id: 3, category: "frontend", title: "Sanitizing HTML: Going antibacterial on XSS attacks" },
  { id: 4, category: "frontend", title: "All About That Sass" },
  { id: 5, category: "backend", title: "Beam me up, Scotty: Apache Beam tips" },
  { id: 6, category: "frontend", title: "Sanitizing HTML: Going antibacterial on XSS attacks" },
  { id: 7, category: "frontend", title: "All About That Sass" },
  { id: 8, category: "backend", title: "Beam me up, Scotty: Apache Beam tips" },
  { id: 9, category: "frontend", title: "Sanitizing HTML: Going antibacterial on XSS attacks" }
]

const idsByCategory = posts => {
  const categories = new Map()
  for (const { category, id } of posts) {
    const existing = categories.get(category)
    if (!existing) categories.set(category, [id])
    else existing.push(id)
  }
  return categories
}

const mergingFunction = ([result, ...results]) => {
  for (const other of results)
    for (const [category, ids] of other) {
      const existing = result.get(category)
      if (!existing) result.set(category, ids)
      else existing.push(...ids)
    }
  return result
}

const parallel = posts => {
  const { length } = posts
  const results = []
  // Pretend each two-post chunk is handled by a separate worker
  for (let i = 0; i < length; i += 2)
    results.push(idsByCategory(posts.slice(i, i + 2)))
  return results
}

const results = parallel(posts)
const categoryPosts = mergingFunction(results)

And by "not commutative" I mean that if you pushed an array into a number you'd get an error, and that 'a' + 'b' and 'b' + 'a' give you different strings.
Whereas integer addition without overflow is commutative: 1 + 2 gives the same result as 2 + 1, and const s = new Set(); s.add(1); s.add(2) builds the same set in either order.
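To make those two cases concrete, here's a tiny sketch: string concatenation is order-sensitive, while summing numbers (like adding to a Set) gives the same result either way:

```javascript
// Order-sensitive: concatenation in the two orders differs
const concatAB = "a" + "b"  // "ab"
const concatBA = "b" + "a"  // "ba"

// Order-insensitive: the sum is the same regardless of element order
const sumForward = [1, 2, 3].reduce((acc, n) => acc + n, 0)
const sumBackward = [3, 2, 1].reduce((acc, n) => acc + n, 0)
```

Only reducers like the second kind can safely have their work split up and merged.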

Oh, wow. You're right about calling it "silly". 😂

It's silly in the sense that we have only 10 items instead of billions, they're all in memory at once, and it doesn't actually spawn threads or workers.


That makes sense for .map(), but for .reduce() the previous value is also the accumulated value which will eventually be returned. Making that distinction in the naming convention is a nice visual cue imo.
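A small sketch of that distinction: in .map() each callback sees only the current item, while in .reduce() the first parameter is both the previous result and the value that is eventually returned:

```javascript
const nums = [1, 2, 3]

// .map(): each call is independent, no accumulated state
const doubled = nums.map(n => n * 2)  // [2, 4, 6]

// .reduce(): `prev` is the running result, and its final
// value is what the whole expression evaluates to
const total = nums.reduce((prev, n) => prev + n, 0)  // 6
```

Which is exactly why naming it prev (or acc) is where the argument lives.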
