Why you should use reduce instead of loops -- Part I

Babak
Twitter @babakness https://twitter.com/babakness

Here is a common scenario: you want to iterate over all the items in a list to produce new data. In this article, we'll discuss reduce and how and why you'll likely want to use it instead of loop constructs like for-of or while in situations like this. The examples will be in JavaScript and TypeScript. First, let's compare what the sight of each tells you when spotted in code:

Reduce

Reduce is about data transformations. At a glance, when you see a reduce, it communicates five key things:

  1. That data will be transformed into another type
  2. What type the final data will be
  3. What the i/o of the transforming function will be
  4. That no side-effects will occur here
  5. That no mutations will occur here

That's a lot of communication!

Loops

Loops are general-purpose constructs. They don't communicate that any kind of transformation is happening. Literally anything can happen in a loop; it's all fair game. Change data, don't change data, launch rockets into outer space... whatever!


Show me the reduce!

You might be familiar with the method Array.prototype.reduce. Yet in JavaScript you might be working with many iterable things, not just arrays. Some examples of iterables include strings, Maps, and asynchronous streams!

I'm going to write down a general-purpose reduce as an abstraction of the for-of loop: one that works not only with arrays but with anything iterable in JavaScript. For good measure, I'll write down both a TypeScript version and a pure JS version.

Here is the TypeScript version. It's typed, so you'll get all that IntelliSense goodness with this one.

type Reducer<V, D> = ( acc: V, item: D, count: number ) => V

function reduce<V, D>(
    initialValue: V,
    reducer: Reducer<V, D>,
    data: Iterable<D>,
  ): V {
    let acc = initialValue
    let count = 0
    for ( const item of data ) {
      acc = reducer( acc, item, count++ )
    }
    return acc
}

Here is the plain old JS version.

function reduce(
    initialValue,
    reducer,
    data,
  ) {
    let acc = initialValue
    let count = 0
    for ( const item of data ) {
      acc = reducer( acc, item, count++ )
    }
    return acc
}
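Since this reduce accepts any iterable, it works on strings and Maps just as well as arrays. Here is a quick sketch using the JS version above (the `vowelCount` and Map-summing examples are my own illustrations, not from the article):

```javascript
// The generic reduce from above, repeated so this snippet is self-contained.
function reduce(initialValue, reducer, data) {
  let acc = initialValue
  let count = 0
  for (const item of data) {
    acc = reducer(acc, item, count++)
  }
  return acc
}

// Count vowels in a string -- strings iterate character by character.
const vowelCount = reduce(
  0,
  (total, char) => total + (/[aeiou]/i.test(char) ? 1 : 0),
  'hello world',
)
// vowelCount is 3 ("e", "o", "o")

// Sum the values of a Map -- each item is a [key, value] pair.
const totalValue = reduce(
  0,
  (sum, [, value]) => sum + value,
  new Map([['a', 1], ['b', 2]]),
)
// totalValue is 3
```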

As you can see, our iterable reduce is just an abstraction of the for-of loop. It's also an abstraction over mutation: our reduce implementation does the dirty work of mutating the accumulator as it walks over our data.

So, how does it work?

initialValue
  First, you set the initial value, which determines the final type. If you set initialValue to 0, then the return type will be a number. If you set it to [], the final type will be an array.

reducer
  A callback function that takes three parameters.
  • The first parameter is called the "accumulator". On the first call to our callback it is set to initialValue; after that, it is the value our reducer callback returned the previous time it was called.
  • The second parameter is set to the next item from the iterable. So, in the case of a string, it will start with the first character, then move to the second, third, and so on.
  • The third parameter is simply the current position in iterating through our iterable. On the first call the value will be zero, then one, and so on.

data
  This is the data we want to process.

Now let's solve some problems using both for loops and reduce

Write a function that returns the length of the longest word in a string.

First up, the way of the loop


function longestWordLength( str ) {
  const words = str.split( /\W+/g )
  let longestLength = 0
  for ( const word of words ) {
    longestLength = Math.max( longestLength, word.length )
  }
  return longestLength
}

Now let's look at how you would do this using reduce. First, we need to write down our reducer.

const longestWordLengthReducer = ( longestLength, word ) => {
  return Math.max( longestLength, word.length )
}

Then we provide our solution by declaring our initial value, reducer, and data.

const longestWordLength = str => reduce( 
    0, 
    longestWordLengthReducer, 
    str.split( /\W+/g )
)
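To see the composed function in action, here is a self-contained sanity check (the sample sentence is my own; everything else matches the definitions above):

```javascript
// The generic reduce plus the composed solution, repeated so this runs standalone.
function reduce(initialValue, reducer, data) {
  let acc = initialValue
  let count = 0
  for (const item of data) {
    acc = reducer(acc, item, count++)
  }
  return acc
}

const longestWordLengthReducer = (longestLength, word) =>
  Math.max(longestLength, word.length)

const longestWordLength = str =>
  reduce(0, longestWordLengthReducer, str.split(/\W+/g))

// "jumped" is the longest word, at 6 characters.
longestWordLength('the quick brown fox jumped') // → 6
```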

Notice how the reduce API gives us the ability to quickly understand what this function will do. We know right away that the initialValue is set to a number, so we know the final data type is a number. Of course, anything is possible in JS, but using the TypeScript version will help ensure this.

Also note that we've extracted the "business logic" of the loop, the part about how we find the largest word given the previous word length, into a separate, testable, function.

Using reduce, we've solved our problem by combining our reduce function with a reducer and a function that splits the string into words. We didn't explicitly have to write a loop. We can easily swap parts in and out to solve different problems.
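For instance, swapping in a different reducer solves a different problem with the same machinery. Here is one hedged sketch (the `totalLetters` name and reducer are my own illustration): counting the total characters across all words instead of finding the longest one.

```javascript
// The same generic reduce, repeated so this snippet runs standalone.
function reduce(initialValue, reducer, data) {
  let acc = initialValue
  let count = 0
  for (const item of data) {
    acc = reducer(acc, item, count++)
  }
  return acc
}

// A different reducer: accumulate total word length instead of the max.
const totalLettersReducer = (total, word) => total + word.length

// Same shape as longestWordLength -- only the reducer was swapped out.
const totalLetters = str =>
  reduce(0, totalLettersReducer, str.split(/\W+/g))

totalLetters('lorem ipsum dolor') // → 15
```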

With the for-of loop, we think about the solution iteratively.

With reduce, we think about the solution declaratively. We're writing more maintainable code.

Performance

Update: Thanks to Krzysztof Miemiec, I was able to catch an error in my loop implementation. The results are in fact neck-and-neck.

Let's dispel a few myths about the performance of reduce. This kind of programming is not only more maintainable, but it can be just as fast or faster! Our reduce here is just an abstraction over the for-of loop. Here you can see the benchmark results for two different runs. Very close.



Comparing our reduce and for-of examples
https://jsperf.com/loop-vs-iterator-reduce-corrected/1

Generally speaking, composing re-used and well tested functions is safer. Our functions are centralized--so if we improve them, our entire application improves with them. Functional programming promotes re-using your code.

So, using our example here, consider that if at some point in the future, instead of Math.max we find a faster way to determine the larger of two values. If we do, then all functions that compose this function also benefit.
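As one hypothetical example of such a swap (a plain two-argument ternary in place of the variadic Math.max; whether this is actually faster depends on the engine), nothing else in the composition needs to change:

```javascript
// The generic reduce, repeated so this snippet runs standalone.
function reduce(initialValue, reducer, data) {
  let acc = initialValue
  let count = 0
  for (const item of data) {
    acc = reducer(acc, item, count++)
  }
  return acc
}

// Hypothetical stand-in for Math.max: a plain ternary over two values.
const max2 = (a, b) => (a > b ? a : b)

// Only this one line changed; callers of longestWordLength are untouched.
const longestWordLengthReducer = (longestLength, word) =>
  max2(longestLength, word.length)

const longestWordLength = str =>
  reduce(0, longestWordLengthReducer, str.split(/\W+/g))
```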

Stay tuned

In the next article we'll develop these ideas further. Stay tuned, subscribe, and find me on Twitter at @babakness.

Discussion (3)

Krzysztof Miemiec

The big performance difference is because you basically compared for in loop with for of. I came up with another performance test set here: jsperf.com/loop-vs-iterator-reduce/5

We have for of loops for reduce and loop tests and "plain-old simple for loops" for simple reduce and simple loop tests. Now we can see that plain-old loops are faster than for of iterators, but at the same time reducer isn't really any slower than regular loop, despite the fact that it's a function call inside a loop. Looks like JS is really good at calling functions 🙃

Babak Author

Hi Krzysztof Miemiec, thanks for your feedback, you are correct! I've made the correction to the post. The results are comparable. We can also create a reduce function that utilizes better algorithms depending on the data type given. For example:

function reduceImproved(
    initialValue,
    reducer,
    data,
  ) {
    let acc = initialValue
    if (data.length) {
      for ( let count = 0, length = data.length; count < length; count++ ) {
        acc = reducer( acc, data[count], count )
      }
    } else {
      let count = 0
      for ( const item of data ) {
        acc = reducer( acc, item, count++ )
      }
    }
    return acc
}

I've not tested this, but the idea is that one could create a general-purpose reduce for iterables in general while maintaining performance. As mentioned in the article, the reducers themselves can also be improved. We can even reach for WebAssembly, where applicable, and improve our application without having to change a single line of code.
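[Editor's note: since the snippet above is described as untested, here is a quick sanity check that both branches agree; the `sum` reducer and sample inputs are illustrative.]

```javascript
// reduceImproved from the comment above, repeated so the check is runnable.
function reduceImproved(initialValue, reducer, data) {
  let acc = initialValue
  if (data.length) {
    // Indexed fast path for array-likes with a numeric length.
    for (let count = 0, length = data.length; count < length; count++) {
      acc = reducer(acc, data[count], count)
    }
  } else {
    // Fallback for other iterables (Sets, Maps, generators).
    let count = 0
    for (const item of data) {
      acc = reducer(acc, item, count++)
    }
  }
  return acc
}

const sum = (a, b) => a + b

// An array takes the indexed branch...
const fromArray = reduceImproved(0, sum, [1, 2, 3])
// ...while a Set (which has .size, not .length) falls through to for-of.
const fromSet = reduceImproved(0, sum, new Set([1, 2, 3]))
// Both are 6.
```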

In future parts on this topic, I will go on to discuss how our iterable reduce can be enhanced to handle asynchronous data, such as streams, with ease!

Thanks again for your corrections and feedback!

Krzysztof Miemiec

I think that your utility may end up looking like rxjs 😉 Nevertheless, I get the point that internally optimized, general-purpose reducers can be a cool internal utility for executing loops in a clean & easy-to-read manner. As a performance freak, I'm quite surprised that this approach is that performant. I'll try to use something similar in one of the products I'm working on! Thanks for the article 💪🏻