These past few weeks I've seen some stirrings on Twitter about how people should avoid using Array.prototype.reduce, mainly because it's difficult to read. At first I thought that was rubbish and that it wasn't difficult at all. But the more I thought about it, the more I realised: I've been writing JavaScript for years. I've led teams and projects, and I've been the guy people come to for help with their JS. I'm an experienced developer.
What was missing from my initial dismissal of those tweets was empathy. I'll come back to that.
When I first came across reduce, it took a while for it to stick in my brain. I studied it and practiced it, and eventually I had the muscle memory to bend code to my will using reduce.
...but that's the crux of the issue isn't it? It took time and focus to understand.
Writing software is challenging enough - especially in large codebases where dragons are lurking around every corner - without fighting to understand the basic language.
My brain on reduce (at first)
Here's my inner dialogue when I first started coming across reduce:
// Okay so we're going to do something here
// It takes data as an argument and returns something
function doSomething(data){
// Nice a one liner this should be simple
// So I'm guessing 'data' is an array, I know that reduce is for arrays. (TypeScript helps with this!)
// Wait what is acc? What is curr?
// Nice argument names doofus.
// acc + curr.val, okay so is it concatenating strings?
// Oh there's a second argument to reduce
// *reads MDN docs*
// Oh that's the initial value
// Sweet so it's just calculating a total
// So on first pass acc = 0
// Ahh and then for each element of data we add the elements `val` property
return data.reduce((acc, curr) => acc + curr.val, 0)
}
That's a simple reduce, and it's the primary example that's given for when to use reduce. But reduce is also good for other stuff, such as grouping data by a given key or combining map and filter in a single iteration:
// filter + map: two passes over items
const activeIds = items
  .filter((item) => item.active === true)
  .map((item) => item.id)
// reduce: one pass over items
const activeIds = items.reduce((result, item) => {
  if (!item.active) return result;
  return [...result, item.id];
}, [])
The filter + map version loops over items twice, whereas reduce does it once. But which of the snippets above is easier to read? It's filter + map. Yes, if you're familiar with reduce and your muscle memory kicks in, it's not so bad. But when we're building software we want to focus on the business logic and on adding features, not waste that focus deciphering language features.
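For completeness, here's roughly what the grouping-by-key use case mentioned above could look like. This is a sketch; the `groupBy` name, the sample data, and the `status` field are all hypothetical. I've written it as a plain loop rather than a reduce, in keeping with the point of this post:

```javascript
// Group an array of objects by the value of a given key.
// A reduce version would be equivalent, just denser to read.
function groupBy(items, key) {
  const groups = {};
  for (const item of items) {
    const groupKey = item[key];
    if (!groups[groupKey]) groups[groupKey] = [];
    groups[groupKey].push(item);
  }
  return groups;
}

const byStatus = groupBy(
  [
    { id: 1, status: 'open' },
    { id: 2, status: 'done' },
    { id: 3, status: 'open' },
  ],
  'status'
);
// byStatus.open holds two items, byStatus.done holds one
```

The reduce equivalent would cram the "create the bucket if missing, then push" logic into a single callback, and a reader has to untangle the accumulator on top of that logic.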
Empathy
There's a common quote:
Code is read much more than it is written.
This is 100% true.
- You write some code
- It works
- You ship it
- Time passes
- Requirements change
- You or some other poor soul needs to update the code
- They study it and all code around it
- They decide if this is the place the change needs to be made
- They make the change
Repeat as many times as requirements change or bugs are fixed. Most of the time spent in that process is reading and understanding.
Now imagine someone on your team less experienced at JavaScript or software development in general comes along and not only do they need to understand the requirements of the task, but also break down the language. They won't have the muscle memory that you do.
So tell me, why is this:
function calculateTotalValue(data){
return data.reduce((result, item) => result + item.val, 0)
}
better than this:
function calculateTotalValue(data){
let sum = 0;
// This could also easily be a straight up for loop
for(let item of data){
sum += item.val;
}
return sum;
}
Anyone can read the loop; you don't get points for being succinct. This isn't code golf.
Performance
I very briefly touched on performance when I mentioned iterating a collection once with reduce vs twice with filter + map.
What if I told you that a regular for loop or a for..of loop is faster than reduce anyway? (Marginally, for realistic examples.)
Take a look at these snippets:
function reduceData(data){
return data.reduce((acc, curr) => acc + curr.val, 0)
}
function forOfData(data){
  let sum = 0;
  for(let item of data){
    sum += item.val;
  }
  return sum;
}
function forLoopData(data){
let sum = 0;
for(let i = 0, len = data.length; i < len; i++){
sum += data[i].val;
}
return sum;
}
You can view the benchmark here.
Yes, these are really contrived, and for realistic sample sizes the difference is extremely marginal.
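If you want to get a feel for the comparison yourself, here's a rough micro-benchmark sketch. The `timeIt` helper and the generated data are my own additions, and absolute timings will vary wildly by engine, hardware, and warm-up, so treat the numbers as indicative at best:

```javascript
// One million objects with a small numeric property.
const data = Array.from({ length: 1_000_000 }, (_, i) => ({ val: i % 100 }));

// Crude timing helper: runs fn once and logs elapsed wall-clock time.
function timeIt(label, fn) {
  const start = Date.now();
  const result = fn();
  console.log(`${label}: ${Date.now() - start}ms`);
  return result;
}

const viaReduce = timeIt('reduce', () =>
  data.reduce((acc, curr) => acc + curr.val, 0)
);

const viaLoop = timeIt('for..of', () => {
  let sum = 0;
  for (const item of data) sum += item.val;
  return sum;
});
// Both compute the same total
```

A proper benchmark would repeat each function many times and discard warm-up runs, which is why a dedicated benchmarking site is the better tool here.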
Conclusion
Writing code isn't just about telling a computer what to do. It's about telling other people what you want the computer to do. You don't get points for succinct code unless you're doing code golf. You don't get points for smashing out a one-liner that takes 5 minutes to understand.
You get points from writing readable code, your team will thank you.
I wrote this post as a reminder to myself. I'm guilty of doing some pretty arcane stuff with reduce
when I should reach for a simpler solution. I'm trying to be better.
So, use filter+map+whatever or regular loops:
- ✅ More readable
- ✅ Faster (marginally)
- ✅ Anyone with a basic understanding of JavaScript understands what's going on