When building applications, we often end up chaining the result of one function into another. For example, the following is not an uncommon thing to see:
function purchasePriceReducer(previous, current) {
previous.push(current.price);
return previous;
}
function withTaxMapper(item) {
const withTax = item * 1.2;
return Math.round(withTax * 100) / 100;
}
function costReducer(previous, current) {
return previous + current;
}
// dummy express handler example
app.post('/pay', async (req, res, next) => {
// [
// { product_id: 103, price: 1.30 },
// { product_id: 3, price: 20.40 },
// { product_id: 29, price: 14.76 }
// ]
const purchases = req.body.purchases;
const prices = reduce(purchases, purchasePriceReducer, []);
const withTax = map(prices, withTaxMapper);
const totalCost = reduce(withTax, costReducer, 0);
await paymentService.init(totalCost);
return res.status(200).send('OK');
});
In the example above we make use of the Array Map and Array Reduce functions that we built in previous articles.
This is a common pattern in the wild: we import some helpers, take some content from a request and do something to that content before sending back some sort of message or status to the requester. This kind of chaining is what we will address in today's post by looking at functional pipes and how they can help us write clearer code in the grand scheme of things.
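For readers who have not seen the earlier articles, here is a minimal sketch of what those map and reduce helpers might look like. These are stand-ins written for illustration; the signatures (collection first, callback second) are assumed from how they are used in this post, and the real implementations live in the earlier articles.

```javascript
// Minimal stand-ins for the map/reduce helpers built in the earlier articles.
// Assumed signature from usage in this post: collection first, callback second.
function map(collection, mapper) {
  const output = [];
  for (const item of collection) {
    output.push(mapper(item));
  }
  return output;
}

function reduce(collection, reducer, initialValue) {
  let accumulator = initialValue;
  for (const item of collection) {
    accumulator = reducer(accumulator, item);
  }
  return accumulator;
}
```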
Tests
describe("Pipe", () => {
it("Should throw for invalid parameters", () => {
expect(() => pipe("string")).toThrowError(TypeError);
});
it("Should throw even if a sub array of functions is provided", () => {
expect(() =>
pipe(
() => "first function",
[
() => "second function, inside array",
() => "third function, inside array"
]
)
).toThrowError(TypeError);
});
it("Should allow functions to be passed by reference", () => {
const addOne = number => number + 1;
const double = number => number * 2;
const result = pipe(
addOne,
double
)(5);
expect(result).toBe(12);
});
it("Should allow anonymous functions to be passed", () => {
const result = pipe(
number => number + 1,
number => number * 2
)(5);
expect(result).toBe(12);
});
it("Should return correctly when values are generated from sub pipes", () => {
const addOne = number => number + 1;
const double = number => number * 2;
const result = pipe(
addOne,
double,
number => pipe(
addOne
)(number)
)(5);
expect(result).toBe(13);
});
});
describe("PipeWith", () => {
it("Should return as expected", () => {
const addOne = number => number + 1;
const double = number => number * 2;
expect(pipeWith(5, addOne, double)).toBe(12);
});
});
Our tests check that parameter validation runs, that pipes return as expected and that sub-pipes execute correctly. We also define pipeWith as a helpful proxy to the pipe function: it takes a value and then the functions to run in order over that value, just as pipe would, but invokes the pipeline immediately to return the resulting value instead of delaying execution.
You might be asking why the signature of pipeWith is not just the default behaviour of pipe. In short, in some scenarios you will want to set up your pipeline in advance, especially if it is reused in multiple areas of your application, and put a value through that pipeline later. The two functions existing alongside one another adds an extra layer of flexibility to match your needs and preferred coding style.
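As a small sketch of that "set up in advance" style (the helper names here are made up, and pipe is inlined with native Array.prototype.reduce so the snippet runs on its own):

```javascript
// Inlined version of pipe, using native reduce so this snippet is standalone.
const pipe = (...fns) => input => fns.reduce((prev, fn) => fn(prev), input);

// Build the pipeline once...
const addOne = number => number + 1;
const double = number => number * 2;
const addOneThenDouble = pipe(addOne, double);

// ...and apply it later, wherever a value becomes available.
addOneThenDouble(5);  // (5 + 1) * 2 = 12
addOneThenDouble(10); // (10 + 1) * 2 = 22
```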
Implementation
/**
* @function pipe
* @description A function pipeline to apply over a given value
* @param {Function[]} fns - The functions to call when a value is provided
* @returns {Function} The function where the value to call the pipeline on is provided
*/
function pipe(...fns) {
if(fns.every(fn => typeof fn === "function") === false) {
throw new TypeError("All parameters should be of type Function. At least one parameter does not meet with this criteria.");
}
return input => reduce(fns, (prev, fn) => fn(prev), input);
}
/**
* @function pipeWith
* @description A function to apply a pipeline of functions to a given value
* @param {*} value - The value to apply the pipeline to
* @param {Function[]} fns - The functions to call when a value is provided
* @returns {*} The result of the pipeline
*/
function pipeWith(value, ...fns) {
return pipe(...fns)(value);
}
In the example above we make use of the Array Reduce function that we built in a previous article.
The implementation above provides us with two helper functions.
The first is the pipe function, which takes a list of functions and returns another function that expects an input. When that input is provided, it runs all the functions in order over the value, passing the result of each function to the next via a reducer.
The second is pipeWith, which takes a value and the functions to apply to that value, and simply returns the end result of the pipeline. I think this is a nicer interface to use, but if you prefer to use pipe directly, that is totally fine too.
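To make the reducer step concrete, here is a small trace of a value flowing through the pipeline, showing that pipe and pipeWith produce the same result (pipe is inlined with native reduce so the snippet is self-contained):

```javascript
const pipe = (...fns) => input => fns.reduce((prev, fn) => fn(prev), input);
const pipeWith = (value, ...fns) => pipe(...fns)(value);

const addOne = number => number + 1;
const double = number => number * 2;

// The reducer threads each result into the next function:
// 5 -> addOne -> 6 -> double -> 12
pipe(addOne, double)(5);     // 12 - pipeline built, then invoked
pipeWith(5, addOne, double); // 12 - same pipeline, invoked immediately
```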
Taking our opening example, we could alter it to do the following by piping smaller more manageable functions together:
function purchasePriceReducer(previous, current) {
previous.push(current.price);
return previous;
}
function withTaxMapper(item) {
const withTax = item * 1.2;
return Math.round(withTax * 100) / 100;
}
function costReducer(previous, current) {
return previous + current;
}
function getPricesFromPurchases(purchases) {
return reduce(purchases, purchasePriceReducer, [])
}
function applyTaxes(prices) {
return map(prices, withTaxMapper);
}
function sum(prices) {
return reduce(prices, costReducer, 0);
}
// dummy express handler example
app.post('/pay', async (req, res, next) => {
const totalCost = pipeWith(
req.body.purchases,
getPricesFromPurchases,
applyTaxes,
sum
);
await paymentService.init(totalCost);
return res.status(200).send('OK');
});
More than likely these helpers would live in external files rather than alongside the router itself. To show this, I have created a project with an example setup for the above code. With the helpers extracted, the handler is much cleaner and looks like so:
const { pipeWith } = require("./helpers/pipe");
const { sum } = require("./helpers/sum");
const { getPricesFromPurchases } = require("./helpers/purchase-prices");
const { applyTaxes } = require("./helpers/apply-taxes");
// dummy express handler example
app.post('/pay', async (req, res, next) => {
const totalCost = pipeWith(
req.body.purchases,
getPricesFromPurchases,
applyTaxes,
sum
);
await paymentService.init(totalCost);
return res.status(200).send('OK');
});
In my opinion pipes are useful in a variety of cases and, since they just take functions to apply to values, you can have pipes call other pipes, which makes them a very powerful abstraction to make use of.
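As a quick sketch of that composability (the names here are hypothetical, and pipe is inlined so the snippet runs standalone): because a pipe is itself just a function, it can be used as a step inside another pipe.

```javascript
const pipe = (...fns) => input => fns.reduce((prev, fn) => fn(prev), input);

const trim = text => text.trim();
const lowercase = text => text.toLowerCase();
const normalise = pipe(trim, lowercase);

// A pipe is just a function, so it can sit inside another pipe.
const slugify = pipe(normalise, text => text.replace(/\s+/g, "-"));

slugify("  Functional Pipes  "); // "functional-pipes"
```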
By using functional composition and our Array Map and Array Reduce to help with immutability, we are able to create a cleaner, simpler and more understandable structure to our code.
Conclusions
Pipes are cool because, in functional programming, we look at our code as the computation of actions running together. A pipeline like this shows how powerful the humble function really is, and how functions can feed one into the other to resolve the computation of any programme.
I hope this was an interesting article for you and that you found some value in it. How will you use pipes in your code? Let me know in the comments below!
Comments
Tiny note; the pipe function behavior is not entirely as described here:
With the current implementation, parameter 2 (or 3, or 4, ...) could also be a Function[].
For example, the following is valid, while it shouldn't be according to the above error:
I had some fun setting up my first CodeSandbox to demo this with 1 additional test (open the sandbox page separately, because the dev.to embed doesn't allow the tests to be shown...)
The reason is that the concat trick
const parameters = fns.reduce((output, value) => output.concat(value), []);
will flatten just any value it gets, not only the first one.
I'd say this behavior is desirable. It could be useful! But it's not as described.
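For reference, a runnable demonstration of that behaviour: Array.prototype.concat spreads any array argument one level deep, which is what flattened the sub-arrays of functions here.

```javascript
// concat spreads an array argument one level deep, so an array of
// functions passed as a single argument gets merged into the list.
const fns = [() => 1, [() => 2, () => 3]];
const flattened = fns.reduce((output, value) => output.concat(value), []);
flattened.length; // 3 - every entry is now an individual function
```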
I updated the check because the jsdoc actually expects just an array of functions to be in the fns variable, so I updated the code accordingly. Open to feedback, but I think this is cleaner either way. I also added your extra test, with some modifications, to the article.

Looks good to me! Now it's consistent.
The spread operator can also be used when calling pipe, like
pipe(...[f1, f2], ...[f3, f4], f5)
, so if you know for sure there will be an array (or arrays) of functions to input, that's still possible this way.

Exactly. Appreciate the input though!

Exactly, which still matches the initial intention anyway and keeps things more consistent too.
pipe(()=>1, ()=>2, "nope");
The phrasing suggests that the first argument must be an array of functions, which is not so. Should be something like: "All arguments must be functions"
Side note 1. Error reporting is typically phrased in terms of arguments, not parameters, as in TypeError.
Side note 2. Please consider using TypeError, which reflects the situation more precisely.
I am aware of the common TypeError usage and I am also aware of the phrasing being slightly off, but it is being used as a spread parameter and would always be an array when the parameters are provided, which is why I ended up choosing this wording. I will update the text to be a little clearer nevertheless. Thanks for the comment!

Your wording causes me to suspect you are not clear on the difference between parameters and arguments.
Arguments will be provided.
See stackoverflow.com/questions/156767...
Again, "All arguments must be functions" or something of that sort would be more correct.
No, since the function does accept an array of functions, or all arguments can be functions. The wording is correct.
Pass a Function[] and then all parameters as individual Functions and it works both ways; heed the wording.

Furthermore, parameters and arguments are the same thing in English and have been used interchangeably between people in every role I've ever had, so such elitism over terminology helps no one. The point of the article is to be educational and introduce people to a new concept, and it does that. The tests are there, the implementation works exactly as it should and the terminology is accurate. Such pedantry is unwelcome and wholly unnecessary from your side in this case.
Correct! My mistake. The first element of fns can be an array of functions. You see? Easy. "My mistake". No shame in this. Everybody makes mistakes. I did miss the functionality of
const parameters = fns.reduce((output, value) => output.concat(value), []);
.... and as sad as it is, all of them were wrong. How much evidence do you need to be convinced that this is incorrect? @param in jsdoc.

Instead of correcting your colleagues, you are ignoring an opportunity to learn. Stop for a minute and think about this.
Being correct and using terminology consistent with MDN in an article that people are supposed to learn from does help ... a lot.
Being incorrect and defending that definitely does not help.
Doesn't mean it should ignore terminology.
Not covering fns[0] being an array, which is not a very good example to learn from or reference in an argument.

Not true. How can I convince you otherwise?
My aim is for programmers in general to be more professional. I think it will be a better world. Articles that people are supposed to learn from are the place to use correct terminology.
Learn from Kabir: "Uptil now, I thought that arguments and parameters were the same. But I just looked them up on the web again and understand the difference now. Shall correct it."
Your argument about arguments vs parameters is whatever. I know the difference, I learned it years ago as did most developers, but the difference between you and I is that I accept language evolves and is used differently to how it is "standardised" every day. People are not technical books, nor are they pedants about it like you seem to be. If people want to say one thing or another, so long as it is understood clearly by all parties, I couldn't care less what terms they use, and neither do most reasonable people.
"Being correct and using terminology consistent with MDN" - sure thing, tell you what, I'll just copy-paste everything from there next time.
"Not covering fns[0] being an array, which is not a very good example to learn from or reference in an argument." - Firstly, the code is covered on all branches and lines so I don't care; even if it wasn't at 100% I'd be fine with 80%+, and even in TDD the aim isn't 100% coverage, as you should know, being such a proficient reader of technical documents. Secondly, I could add an example for illustration but I chose not to, although I may change that stance in a future update if I do one.
"My aim is for programmers in general to be more professional. I think it will be a better world. Articles that people are supposed to learn from are the place to use correct terminology." - correct to you, coming from a formal and standardised world, but that's not the world we live in, and to be accessible to the most people possible, simple and understandable language used on the ground will always be preferable to me.
I'm finished with this conversation. I appreciate the comment, but I'm finished talking about such pedantic things.
Very good! I've been doing similar techniques for a long time, just putting the functions in an array. One technique I sometimes use is something similar to a reduce, where the output of a previous function is the input of the next one. I'm glad to see developers with this same line of thinking.
Glad you liked the article, and nice to hear that you've been trying this style for some time; I find it super helpful in many situations too! Thanks for stopping by and leaving a comment.