My job keeps me very busy and I don't get a chance to write as much about what I am doing (compared to actually doing it). So this post is an interlude from the other series I started on pluggable APIs.
The company I work for does a lot of work with data and prepares all sorts of tables and charts. An internal project we are looking at has a need to be able to replicate these charts in a more "live", dashboard-style presentation.
Inspiration
I work a lot with MongoDB and have been intrigued by how the JSON query language it provides offers a lot of power without resorting to actual procedural code. Sometimes you need that kind of power and flexibility after you have left the database behind and are in plain JavaScript land. So I started wondering how this could be done.
I did look at JS modules that mimic the MongoDB approach, but I wanted something that might be a little more "consumable" by developers with different language skills.
But, I am probably getting ahead of myself and we should talk about what a data pipeline is first.
Collection Pipelines
Martin Fowler describes Collection Pipelines as follows:
> Collection pipelines are a programming pattern where you organize some computation as a sequence of operations which compose by taking a collection as output of one operation and feeding it into the next.
So it is a bit like this....
```
collection => function => output => function => ......
```
One of the ways we can do this in JavaScript is with a `Promise`. Promises can have an initial state and can pass the output of one operation into the next via the `.then` chaining function.
```javascript
Promise.resolve(someData)
  .then(result => {
    // do something here
    const newResult = // details omitted
    return newResult
  })
  .then(previousResult => {
    return // something else
  })
```
NOTE: The above example is pseudocode, and Promises also come in the async/await flavour. The importance of the particular choice for the example will become apparent soon.
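To make the pattern concrete, here is a runnable version of the same idea; the steps (doubling, then summing) are invented purely for illustration:

```javascript
// A concrete chain: each .then receives the previous step's output.
// The doubling and summing steps are made up for illustration.
Promise.resolve([1, 2, 3])
  .then(numbers => numbers.map(n => n * 2))            // [2, 4, 6]
  .then(doubled => doubled.reduce((a, b) => a + b, 0)) // 12
  .then(total => console.log(total))                   // logs 12
```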
This mechanism is great, but it means we have to know when we write the code what each of the operations are.
If we look at Martin Fowler's description, one thing that stands out is the word *sequence*. The main mechanism in JavaScript for a sequence is an `array`. What if we could use an array to give us the sequence we need, but also leverage the power of the Promise/then pattern?
Combining Promises and Reduce
As it just so happens, someone has already thought about this and over on the MDN site there is an article that includes the following code:
```javascript
[func1, func2, func3].reduce((p, f) => p.then(f), Promise.resolve())
  .then(result3 => { /* use result3 */ });
```
You will have to scroll down to the Composition section to find it.
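To see what that one-liner actually does, here is a small runnable demo; the three step functions are invented for illustration:

```javascript
// Three simple steps, invented for illustration.
const addOne = n => n + 1;
const double = n => n * 2;
const square = n => n * n;

const steps = [addOne, double, square];

// reduce threads the value through each step via .then
steps.reduce((p, f) => p.then(f), Promise.resolve(3))
  .then(result => console.log(result)); // ((3 + 1) * 2)^2 = 64
```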
We can make a nice `sequence` function to abstract this:
```javascript
const sequence = (operations, collection) => {
  return operations.reduce((p, f) => {
    return p.then(f)
  }, Promise.resolve(collection))
}
```
Now we can write:

```javascript
const collection = ..... // array or object
const finalResult = sequence([
  func1,
  func2,
  func3,
  ......
], collection)
```
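As a quick sanity check, here is the `sequence` helper applied to a concrete collection; the filter and map steps are invented for illustration:

```javascript
const sequence = (operations, collection) => {
  return operations.reduce((p, f) => p.then(f), Promise.resolve(collection));
};

// Invented steps for illustration: keep even numbers, then double them.
const keepEven = xs => xs.filter(x => x % 2 === 0);
const doubleAll = xs => xs.map(x => x * 2);

sequence([keepEven, doubleAll], [1, 2, 3, 4])
  .then(result => console.log(result)); // [4, 8]
```

Note that `sequence` returns a Promise, so the final result is consumed with `.then` (or `await`).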
BUT - this only gets us part of the way there.
Next (yes, running out of time again)
The next article in this series will wrap it all up with a way to use something inspired by MongoDB... more like this:
```javascript
const collection = // array of objects (or arrays for records)
const transformed = transformer([
  ["group", ["year", "state"]],
  ["mutate", {total: "`${row.men + row.women}`"}],
  ["rename", {total: 'Total'}]
], collection)
```
Stay tuned....