You will get a noticeable performance hit if your array has more than 1K elements. In the worst case you will iterate twice over those 1K elements. By the way, what makes you think that the reducer is not readable?
```typescript
type User = { name: string; city: string; birthYear: number };
declare const users: User[];

const currentYear = new Date().getFullYear();

// flatMap-style predicate: [user] if older than 25, [] otherwise
const olderThan25 = (user: User) =>
  user.birthYear && currentYear - user.birthYear > 25 ? [user] : [];

const getName = ({ name }: User) => name;

// Single pass: concat the (possibly empty) result instead of testing it,
// since both [user] and [] are truthy and would break a ternary check
const userNames = users.reduce(
  (acc, user) => acc.concat(olderThan25(user).map(getName)),
  [] as Array<User['name']>
);
```
You can chain `map` and `filter` in functional languages, like for example F#, because there is no intermediate collection.
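You can get a similar effect in TypeScript with generators, which yield one element at a time so the chain never materializes an intermediate array. A minimal sketch; the `lazyFilter`/`lazyMap` helpers are hypothetical, not from any library:

```typescript
// Hypothetical lazy helpers: each stage pulls one element at a time,
// so no intermediate array is ever allocated between filter and map.
function* lazyFilter<T>(xs: Iterable<T>, pred: (x: T) => boolean): Generator<T> {
  for (const x of xs) if (pred(x)) yield x;
}

function* lazyMap<T, U>(xs: Iterable<T>, f: (x: T) => U): Generator<U> {
  for (const x of xs) yield f(x);
}

type User = { name: string; city: string; birthYear: number };

const users: User[] = [
  { name: "Ada", city: "London", birthYear: 1980 },
  { name: "Bob", city: "Berlin", birthYear: new Date().getFullYear() - 1 },
];

const currentYear = new Date().getFullYear();

// One pass over users; spreading at the end drives the whole pipeline
const names = [
  ...lazyMap(
    lazyFilter(users, (u) => currentYear - u.birthYear > 25),
    (u) => u.name
  ),
];

console.log(names); // ["Ada"]
```

The same pipeline shape works with any `Iterable`, which is the closest JavaScript analogue to F#'s lazy `Seq`.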
That's a good point with a great code presentation. 👍🏻