We have already discussed several approaches to managing concurrency in JS: callbacks, thunks (callbacks with a few extra bits), and promises. In this article we will talk about generator functions; conceptually speaking, this pattern is the foundation of the fancy async functions.
Have you ever imagined a JS function that you can pause and resume? It would be pretty cool, wouldn't it? Unfortunately, with all that hurry of the JS engine to execute the fastest and easiest instructions first, it seems quite impossible.
Luckily, some nice folks already imagined it and made it possible within the JS engine. Unlike the previous approaches, we will now manage concurrency by pausing and resuming the execution of instructions inside a special function ... the generator function.
If you haven’t heard about generator functions, this article from MDN can help you understand them better.
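As a minimal sketch of the idea (the `counter` generator here is just a hypothetical example), a generator pauses at every `yield` and only moves forward when we call `next()` on its iterator:

```javascript
// A generator pauses at each `yield`; calling next() resumes it
function* counter() {
  yield 1
  yield 2
  yield 3
}

const it = counter()
console.info(it.next().value) // 1 — runs until the first yield, then pauses
console.info(it.next().value) // 2 — resumes and pauses at the next yield
```

Nothing after a `yield` runs until somebody asks for the next value: that is the pause/resume superpower we are going to exploit.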
Let’s recall the previous example, where we need to fetch some data (mocked with setTimeout) in a sequential, ordered manner.
function fetchAPI(rq, time) {
  const apiResponses = {
    'file1': 'First File',
    'file2': 'Second file'
  }
  setTimeout(function () {
    console.info(apiResponses[rq])
  }, time)
}

fetchAPI('file1', 3000)
fetchAPI('file2', 100)
// Second file
// First File
The output of the previous code is not what we initially wanted; we really need to force 😤 the JS engine to think sequentially and in an ordered manner.
Let’s code an intuitive example of the previous scenario using generator functions.
function fetchAPI(rq, time) {
  const apiResponses = {
    'file1': 'First File',
    'file2': 'Second file'
  }
  setTimeout(function () {
    console.info(apiResponses[rq])
    iterator.next({ rq: "file2", time: 500 })
    // Send the new data and pause again
  }, time)
}
function* generator() {
  while (true) {
    const object = yield 'objectReceived'
    yield fetchAPI(object.rq, object.time)
  }
}

const iterator = generator()

iterator.next()
// Position the iterator at the first 'yield', then pause
iterator.next({ rq: "file1", time: 2000 })
// Send the data, then pause
iterator.next()
// Position the iterator again, ready to receive new data, then pause
// First File
// Second file
Visually speaking, we can already see some differences from conventional functions. First of all, when we call a generator function we get back an iterator, and each call to next() returns an object much like the following:
{value: currentValue, done: false/true}
That lets us step over each yield, blocking in a sequential, ordered fashion and then pausing.
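To make that shape concrete, here is a small sketch (the `greet` generator is a hypothetical example) showing what each next() call returns, and how next(arg) sends data back into the paused function:

```javascript
function* greet() {
  // Pauses here; `name` is filled in by the argument of the next next(arg) call
  const name = yield 'ready'
  yield `hello ${name}`
}

const it = greet()
console.info(it.next())        // { value: 'ready', done: false }
console.info(it.next('world')) // { value: 'hello world', done: false }
console.info(it.next())        // { value: undefined, done: true }
```

Notice the two-way channel: `yield` sends a value out as `value`, and `next(arg)` sends a value back in, which is exactly what the fetchAPI example above relies on.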
In the previous snippet we basically pause the function so that it waits until file1 (with its 2000 ms delay) has been fetched, and then we resume it so that it starts fetching file2 (with its 500 ms delay, the easiest and preferred by the JS engine).
Indeed, we could have solved this in different ways, as we did in the previous articles; some of them could have been even shorter and more elegant. But the whole idea behind this example is to show how to use generators, so that you can decide which approach is the perfect fit for your use case. Every strategy for managing concurrency in JS shines more in some scenarios than in others; we need to try them first and then choose.
Let’s do a powerful mix of generator functions + promises so that we approach this problem in a more sophisticated way.
function fetchAPI(rq, time) {
  const apiResponses = {
    'file1': 'First File',
    'file2': 'Second file'
  }
  return new Promise((resolve) => {
    setTimeout(function () {
      resolve(apiResponses[rq])
    }, time)
  })
}
function runGenerator(g) {
  let it = g(), ret;
  (function iterate(val) {
    ret = it.next(val);
    if (!ret.done) {
      // Check whether the yielded value is a promise
      if ("then" in ret.value) {
        // Wait for the promise to resolve
        ret.value.then((data) => {
          // The data has been fetched: the yielded promise is fulfilled,
          // so move the pointer in the generator to the next yield
          iterate(data)
        });
      }
    }
    // done
  })();
}
runGenerator(function* main() {
  var result1 = yield fetchAPI('file1', 2000);
  console.info(result1)
  var result2 = yield fetchAPI('file2', 500);
  console.info(result2)
});
// First File
// Second file
The idea behind this is to handle several asynchronous calls (returned as promises) in an orchestrated way, using the runGenerator() function. We are fundamentally doing the same thing as before, but this time we let runGenerator() iterate over the generator function main() itself, checking whenever those async calls are fulfilled.
Conceptually speaking, generator functions are the basis of async functions. This new fancy API does what we saw in the previous examples, but hides some of their complexity in a pretty awesome manner.
Now let’s code the same example but this time using async functions.
async function fetchAPI(rq, time) {
  const resp = await new Promise((resolve) => {
    const apiResponses = {
      'file1': 'First File',
      'file2': 'Second File'
    }
    setTimeout(function () {
      resolve(apiResponses[rq])
    }, time)
  })
  return resp
}
fetchAPI('file1', 3000)
  .then(resp => {
    console.info(resp)
    return fetchAPI('file2', 100)
  })
  .then(resp => {
    console.info(resp)
  })
// First File
// Second File
That’s it. The fancy async function fetchAPI() waits until a promise resolves or rejects, pausing the execution of the next instruction (in this case the return resp). When the promise resolves, the function resumes and continues, behaving as synchronously, blocking, and reasonably as we all love. Keep in mind that this works only for promise-returned values.
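For completeness, here is a sketch of the same sequence driven entirely with await inside an async main() (the main name is just an assumption), reusing a promise-returning fetchAPI like the one above:

```javascript
function fetchAPI(rq, time) {
  const apiResponses = {
    'file1': 'First File',
    'file2': 'Second File'
  }
  return new Promise((resolve) => {
    setTimeout(() => resolve(apiResponses[rq]), time)
  })
}

async function main() {
  // await pauses main() until each promise resolves,
  // so the logs come out in order despite the timer durations
  const result1 = await fetchAPI('file1', 2000)
  console.info(result1) // First File
  const result2 = await fetchAPI('file2', 500)
  console.info(result2) // Second File
}

main()
```

Compare this with the runGenerator() version: `await` plays the role of `yield`, and the engine itself plays the role of our orchestrator.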
Finally, to end this Async JS Patterns series, let’s remember where we have come from: we initially started in the callback world, went deeper revising thunks, then jumped to promises, and finally entered the awesome world of generator functions and async functions. I hope you have enjoyed this adventure; it was a real pleasure for me to share this knowledge with you. I would have loved to go further, with a more technical look at these approaches in terms of performance. If you would like that too, please write me, and even if you are not interested, write me as well so we can get in touch 😎.