If you've worked with JavaScript arrays, you've surely used some of the functional utilities built into them: filter, map, reduce, and others. ...
Cool project, though I would argue that you could also improve the performance of the native methods by merging filter, map, etc. to a single reduce method call.
It should even be possible to automate that kind of optimization using Babel or a similar transpiler.
To be honest, I'm baffled that JavaScript runtimes don't already do this kind of optimisation.
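To illustrate the kind of fusion this comment describes, here's a minimal sketch (the data and callbacks are made up for the example): a `filter` followed by a `map` allocates an intermediate array, while a single `reduce` produces the same result in one pass.

```javascript
// Two passes: filter allocates an intermediate array, then map allocates another.
const nums = [1, 2, 3, 4, 5];
const twoPass = nums.filter(n => n % 2 === 0).map(n => n * 10);
// → [20, 40]

// One pass: the same result with a single reduce and no intermediate array.
const onePass = nums.reduce((acc, n) => {
  if (n % 2 === 0) acc.push(n * 10);
  return acc;
}, []);
// → [20, 40]
```

In principle this is exactly the rewrite a Babel plugin could perform mechanically, since the merged `reduce` is derivable from the `filter` and `map` callbacks.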
I think they aimed to optimize the individual methods first rather than the combinations, for a simple reason: improving the performance of a single method directly improves every combination it appears in, whereas there are far more combinations to optimize before you see comparable gains.
That's an interesting take, thanks for sharing! Out of curiosity, why did you choose chaining methods instead of piping functions? (approach taken by Rambda and RxJS 6, for instance)
Here's an example of what I mean.
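The example itself wasn't captured in this thread, so here's a minimal sketch of what piping functions (in the spirit of Ramda and RxJS 6) might look like, with hypothetical `pipe`, `map`, and `filter` helpers rather than any real library API:

```javascript
// Piping: operations are free functions composed left to right,
// instead of methods chained on a wrapper object.
const pipe = (...fns) => input => fns.reduce((value, fn) => fn(value), input);

// Hypothetical standalone operators that each take an array and return a new one.
const map = fn => arr => arr.map(fn);
const filter = pred => arr => arr.filter(pred);

const result = pipe(
  filter(n => n % 2 === 0),
  map(n => n * 10)
)([1, 2, 3, 4, 5]);
// → [20, 40]
```

The chained equivalent would be `wrap([1, 2, 3, 4, 5]).filter(...).map(...)` on some wrapper type; the piped form needs no wrapper, only plain functions.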
Hm... I hadn't thought about this, and it's interesting to me. The main reason to use chainable methods was to mimic arrays, as extending array methods to generators is how this idea started.
For what you propose,
Fair enough, familiarity is always a bonus :)
Either take MojiScript's idea and make all pipelines async anyway, or provide a `pipeAsync` function that would be equivalent to your `.async()`.

I'd say there wouldn't be any intermediate stages. The pipe would be created with a list of functions; it would then be given an array and apply all operations to each element at once, effectively iterating over the array only once.
But I think that's more a matter of API ergonomics than actual differences in behaviour, you can probably achieve similar results either way.
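A rough sketch of that single-pass pipe, with hypothetical names (a `SKIP` sentinel stands in for "this element was filtered out"):

```javascript
// Each stage is a per-element transform; a sentinel signals "drop this
// element", so the whole pipeline traverses the input array exactly once.
const SKIP = Symbol('skip');
const mapStep = fn => x => fn(x);
const filterStep = pred => x => (pred(x) ? x : SKIP);

const pipeOnce = (...steps) => arr => {
  const out = [];
  for (const item of arr) {          // single traversal of the input
    let value = item;
    for (const step of steps) {
      value = step(value);
      if (value === SKIP) break;     // element filtered out; skip later stages
    }
    if (value !== SKIP) out.push(value);
  }
  return out;
};

const doubledEvens = pipeOnce(
  filterStep(n => n % 2 === 0),
  mapStep(n => n * 2)
)([1, 2, 3, 4, 5]);
// → [4, 8]
```

As noted above, the same one-pass behaviour is achievable with chained methods over a lazy generator; the difference is mostly in how the API is packaged.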