Why not reduce?
- https://dev.to/ycmjason/writing-cleaner-code-with-the-rule-of-least-power-rolp-4kkk
- https://twitter.com/jaffathecake/status/1213077702300852224
This list is intended to be an ever-growing collection of typical `reduce` patterns to avoid. Please feel free to suggest more examples!
This post is not about the space / time performance gained by avoiding `reduce`. It is all about readability.
🔴 Do not

```js
faces.reduce((acc, face) => {
  return [...acc, mask(face)]
}, [])
```

✅ Do

```js
faces.map(mask)
```
🔴 Do not

```js
bags.reduce((acc, bag) => {
  return [...acc, ...bag.apples]
}, [])
```

✅ Do

```js
bags.flatMap(bag => bag.apples)
```
🔴 Do not

```js
phones.reduce((acc, phone) => {
  return isNew(phone) ? [...acc, phone] : acc
}, [])
```

✅ Do

```js
phones.filter(isNew)
```
🔴 Do not

```js
dogs.reduce((acc, dog) => {
  return isHappy(dog) ? acc + 1 : acc
}, 0)
```

✅ Do

```js
dogs.filter(isHappy).length
```
🔴 Do not

```js
people.reduce((acc, person) => ({
  ...acc,
  [person.dna]: person
}), {})
```

✅ Do

```js
Object.fromEntries(
  people.map(person => [person.dna, person])
)
```
🔴 Do not

```js
people.reduce((acc, person) => {
  return Math.max(acc, person.age)
}, -Infinity)
```

✅ Do

```js
Math.max(...people.map(person => person.age))
```
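As a quick sanity check, each "Do" form produces the same result as its reduce-based "Do not" counterpart. A minimal sketch for the `filter` pair, with sample data I've made up for illustration:

```js
// Sample data (my own, not from the post) to show the two forms agree.
const phones = [
  { model: 'A', year: 2023 },
  { model: 'B', year: 2015 },
  { model: 'C', year: 2024 },
];
const isNew = phone => phone.year >= 2020;

// "Do not": rebuild the accumulator array on every iteration.
const viaReduce = phones.reduce(
  (acc, phone) => (isNew(phone) ? [...acc, phone] : acc),
  []
);

// "Do": say what you mean.
const viaFilter = phones.filter(isNew);

console.log(JSON.stringify(viaReduce) === JSON.stringify(viaFilter)); // true
```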
Top comments (10)
Hmmm, I totally agree with the first few - but for the last 3, your "do not" cases use less memory and will very probably be faster - making and spreading arrays for things like a max or a "lookup maker" seems like a bad idea to me.
jsperf.com/check-the-max
In this one I've made your "Do not" case do the same as the "Do" case
jsperf.com/create-index-using-from...
this post is not about space / time performance. a do-while / for-loop would probably be best if you wish to talk about speed / storage.
please check out the links attached at the beginning of the post. they explain why reduce is to be avoided.
Seems like deciding to be 20% to 70% slower for readability is the bigger anti-pattern.
Surely this is fast and readable
I disagree. `Map` is different from building an object, so it is not a fair test. If you remove the `Map` test case, using spread and map is just 15% slower. jsperf.com/create-index-using-from...

Having said that, these are all opinions. Performance vs readability seems to have been an everlasting war. I think it comes down to what you are trying to achieve with your code.
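For reference, a sketch of what the `Map`-based variant under discussion might look like (this is my assumption of the benchmarked shape, with made-up sample data, not the actual jsperf code):

```js
// Sample data (hypothetical, for illustration).
const people = [
  { dna: 'AAT', name: 'Alice' },
  { dna: 'GCC', name: 'Bob' },
];

// Map-based index: same lookups, different data type.
const byDnaMap = new Map(people.map(person => [person.dna, person]));

// Object-based index from the post's "Do" example.
const byDnaObj = Object.fromEntries(people.map(person => [person.dna, person]));

console.log(byDnaMap.get('GCC').name); // "Bob"
console.log(byDnaObj['GCC'].name);     // "Bob"
```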
Please ignore the following, just taking the piss 😛
If speed is all you care about and you do not need to care about maintenance / readability, then go ahead and wrap everything in one big for-loop / while-loop, and construct no functions, because functions take up memory and invoking functions has overhead. Ah, perhaps you can consider writing WebAssembly too. It is probably the fastest.
So in my first example of making an index, I used a map to create a multi-key array look-up and it was 70% slower. Then I realised DNA was probably "unique" HAHA. So yeah, 70% faster if there is repetition.
In the map version: your "Do" is still 20% slower than the object version and, let's face it, not that much more readable. The Map version is way faster of course.
I can and do write highly readable code that is fast. I wrap things in well-named functions that work efficiently :). Yes, those functions will often hide away performance, the same way as those core native functions do.
I just care about UX and UX tries not to be janky etc.
Readability vs performance is a hard thing to argue. Everyone has their own interpretation of what readability is, what performance is, and why one is more important than the other, etc.
I hope you have read the article I linked in the beginning.
`.reduce` is a very, very powerful function. Simply using it incurs a readability cost, because the reader needs to figure out what it does. As opposed to `Object.fromEntries`, which the reader knows immediately is constructing an object; there is nothing to "read into" to understand what it does.

I won't try to convince you because I don't think I can. But please go and try writing code this way. Maybe you will gain some new insights.
Thank you for the post. I struggle with understanding how important readability is, coming from a 'use descriptive variables' and 'comment complex code' background. You have given me stuff to think about, but I feel more comfortable explaining what the code does and advocating for the more efficient and maintainable code. My gut reaction to the term 'readability', in the absence of those two axioms, makes me feel less like a solution architect and more paint-by-numbers.
Try it out. Try to write code in a way so that "it explains itself" and doesn't require comments. Optimise only the complexity (big-O) of the code; not the performance perks of using for-loops / some other methods.
So if you are doing something that is `O(n*m)`, we can often improve it to make it `O(n + m)`. That is the kind of thing I optimise.
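The code blocks from this comment did not survive extraction, so here is a hypothetical sketch of the kind of transformation described (the data and names are mine; the later reply's mention of `Set` suggests this shape):

```js
// Hypothetical example: the original comment's code was lost.
const invited = ['AAT', 'GCC', 'TTA'];
const people = [
  { dna: 'AAT', name: 'Alice' },
  { dna: 'CGG', name: 'Carol' },
];

// O(n * m): for each of the n people, scan the whole m-element list.
const slow = people.filter(person => invited.includes(person.dna));

// O(n + m): build a Set once (O(m)), then do O(1) membership checks.
const invitedSet = new Set(invited);
const fast = people.filter(person => invitedSet.has(person.dna));

console.log(JSON.stringify(slow) === JSON.stringify(fast)); // true
```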
P.S. I use comments primarily for things that are uncommon. A good example I encountered recently: in jest, `expect(-0).toBe(0)` will fail the test, so I needed to do `expect(-0 === 0).toBe(true)`. I add comments for these kinds of things, explaining why I wrote it that way.

I believe you are right, though the implementation of Set in javascript should give O(1) or at most sublinear lookups with a hashing function. Based on that, I assume the following would be equivalent without the new data type.
ya, except I'd not use a reduce.