
How to resume the flow inside a failed try block computation without algebraic effects

Andrea Simone Costa ・ 7 min read

Introduction

After reading Dan Abramov's wonderful article about algebraic effects, one question, maybe the wrong one, stuck in my head: how far can we go in trying to recover a failed try block computation?

Due to their nature, I believe the answer is inextricably linked to generators: an exotic type of function that can be paused and resumed as many times as we need, without losing the results of the previous steps. This is the key point, because the stack unwinding that follows a thrown exception would otherwise destroy that state.
Obviously, if an exception goes unhandled, I don't know of any JS magic that could help us. But thanks to a nice trick that I'm going to explain shortly, we at least get the possibility to retry the failed computation and, if need be, to replace its value with a fallback one.

Before starting, let me stress the fact that the final result won't be as powerful as algebraic effects would be if they were present in the language.

 

Promises + generators

Let's talk about tj/co:

co(function* () {
  var result = yield Promise.resolve(true);
  return result;
})
.then(console.log); // print 'true'

The main idea behind the co function was to execute a generator that yields out Promises. When a generator yields out something, it pauses.
The co function takes care of each yielded Promise, resolving it and inserting its result back into the generator. This way the generator is able to continue its flow with the future value that was previously wrapped in the Promise.
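To make the mechanism concrete, here is a minimal co-like runner I've sketched for illustration. It is not tj/co's actual implementation: the real library supports more yieldables and forwards rejections into the generator via it.throw, both omitted here for brevity.

```javascript
// minimal co-like runner: drives a generator that yields Promises
function run(genFn) {
  const it = genFn();

  return new Promise((resolve, reject) => {
    function step(prev) {
      // resume the generator, feeding it the previous Promise's result
      const { value, done } = it.next(prev);
      if (done) return resolve(value);

      // wait for the yielded Promise, then resume with its value
      Promise.resolve(value).then(step, reject);
    }

    step();
  });
}

run(function* () {
  const result = yield Promise.resolve(true);
  return result;
}).then(console.log); // prints 'true'
```

The whole trick lives in `step`: each yielded Promise schedules the next `it.next` call, so the generator's body reads like straight-line code even though it pauses at every yield.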

I'm sure that nowadays you don't write this kind of code anymore, preferring the async/await syntax:

;(async function () {
  let result = await Promise.resolve(true);
  return result;
})()
.then(console.log); // print 'true'

Because of the possibility to write asynchronous code in a synchronous fashion, the generators + Promises pattern was so appreciated that it got a dedicated syntax!


But, I hear you ask, how does that relate to my goal?
Well, what if, instead of Promises, we yielded out pieces of the computation?

 

The idea

I have always been fascinated by the cleverness of the generators + Promises pattern. It is not so difficult to understand, nor to recreate, but I admit it's something I would never have thought of myself.
It inspired me while I was thinking about how to solve the failed try block computation problem.

I'm going to show you broadly how I've solved the problem and the main idea behind it. Then I'll discuss it in as much detail as possible.

Let's transform the following block:

let value = null;

try {
    const res1 = itMayThrow();
    const res2 = itMayThrowToo(res1);

    value = res2 / res1;
} catch {}

using a generator that yields the problematic pieces of the main computation:

let value = null;

function* mainComputation() {
    const res1 = yield itMayThrow;
    const res2 = yield () => itMayThrowToo(res1);

    value = res2 / res1;
}

When the generator is executed, it yields out what could go wrong.
Whoever handles the generator, a co-like function, will be able to execute each yielded computation, inserting its result back into the generator if no exception was thrown. Otherwise, it could not only retry the failed computation one or more times, but also substitute it with a fallback value.
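Here is a minimal, self-contained sketch of such a runner. All the names (runResumable, the { work, retries, fallback } descriptor shape) are mine and purely illustrative, not the API of any package: the generator yields descriptors, and the runner executes each unit of work, retrying and falling back as requested.

```javascript
// illustrative sketch: the generator yields { work, retries, fallback }
// descriptors; the runner executes each unit of work, retrying it on
// failure and resorting to the fallback (if any) when attempts run out
function runResumable(genFn) {
  const it = genFn();
  let input;

  while (true) {
    const { value: step, done } = it.next(input);
    if (done) return step; // the generator finished: return its value

    const { work, retries = 0 } = step;

    for (let attempt = 0; ; attempt++) {
      try {
        input = work(); // on success, feed the result back in
        break;
      } catch (err) {
        if (attempt < retries) continue; // try the failed computation again
        if ("fallback" in step) {        // out of attempts:
          input = step.fallback;         // use the fallback value
          break;
        }
        throw err; // no fallback: let the exception rise
      }
    }
  }
}

// usage: a unit of work that fails twice before succeeding
let calls = 0;
const flaky = () => {
  if (++calls < 3) throw new Error("boom");
  return 7;
};

const value = runResumable(function* () {
  const res = yield { work: flaky, retries: 2 };
  return res * 2;
});

console.log(value); // 14
```

Because the generator is merely paused at each yield, a failed unit of work does not unwind the generator's own frame: the runner can retry as many times as it likes and then resume the flow exactly where it stopped.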

 

First attempt

You can find my first working solution here. I'm not going to spend much time on it because, yes, it was powerful, but each try-catch block would have been transformed into a poem. The heaviness of the resulting code would have overshadowed the benefits of using the package itself.
Another mark against it was the need to stop using the try-catch syntax in favour of a dedicated API.

 

Second attempt

Focusing only on what is really needed, that is, the possibility of retrying a failed computation or providing a fallback value, I've written a simpler package that can be used in conjunction with the well-known try-catch syntax.

Here is the result:

  • sync version
const { performSync, computeSync } = require("resumabletcf");

let value = null;

try {
    value = performSync(function*() {
        // computeSync(unitOfWork, howManyTimesToRetry, fallbackValue)
        const res1 = yield computeSync(itMayThrow, 5, 0);
        const res2 = yield computeSync(() => itMayThrowToo(res1), 5);

        return res2 / res1;
    });

} catch(e) {
    console.log(e);
}
  • async version
const { performAsync, computeAsync } = require("resumabletcf");

;(async () => {
    let value = null;

    try {
        value = await performAsync(async function*() {
            // computeAsync(unitOfWork, howManyTimesToRetry, fallbackValue)
            const res1 = yield computeAsync(itMayThrow, 5, 0);
            const res2 = yield computeAsync(() => asyncItMayThrowToo(res1), 5);

            return res2 / res1;
        });

    } catch(e) {
        console.log(e);
    }
})();

Let me explain it.

The perform functions are to this pattern what the co function is to the generators + Promises pattern.
Both performSync and performAsync take a generator, a sync one and an async one respectively, and have the task of handling what it yields out. Only one particular type of function, which wraps the problematic piece of computation, may be yielded out to be properly managed by the generator runner, and we can create it thanks to the compute helpers.
If the generator reaches the end, its return value will be given back by the perform functions: as a plain value in the performSync case, or wrapped in a Promise in the performAsync case.

These helpers take three arguments: the unit of work to perform, how many times to retry it in case of failure (the default is 0), and a fallback value to be used if we run out of attempts.
If you don't want the perform runner to use a fallback value for a specific computation, preferring to rethrow the exception that caused the unit of work to fail, simply do not pass the third argument.
Be aware that passing undefined as the third argument is not the same as passing only two arguments; this ensures you can use undefined as a fallback value.

Three more points to keep in mind:

  • performAsync always returns a Promise that will be fulfilled only if the async generator reaches the end; otherwise it will be rejected with the exception that caused the interruption as the reason
  • the function resulting from calling computeAsync always awaits the unit of work you have passed to the helper
  • you are not forced to return something from the generators
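The undefined-versus-omitted distinction above can be implemented by looking at the call's arity rather than at the value of the argument. Here is a simplified stand-in I wrote to illustrate the principle; this `compute` is not the package's actual implementation.

```javascript
// simplified stand-in showing how a helper can tell "no fallback"
// apart from "fallback is undefined": it checks how many arguments
// were passed, not their values
function compute(work, retries = 0, ...rest) {
  const hasFallback = rest.length > 0; // true even for compute(fn, 0, undefined)

  return () => {
    for (let attempt = 0; ; attempt++) {
      try {
        return work();
      } catch (err) {
        if (attempt < retries) continue; // try again
        if (hasFallback) return rest[0]; // out of attempts: use the fallback
        throw err;                       // no fallback at all: rethrow
      }
    }
  };
}

const alwaysThrow = () => { throw new Error("nope"); };

console.log(compute(alwaysThrow, 0, undefined)()); // undefined: fallback used

try {
  compute(alwaysThrow, 0)(); // only two arguments: the error is rethrown
} catch (e) {
  console.log(e.message); // "nope"
}
```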

 

An example

Now we'll see an example of an async computation where two different remote APIs come into play. Each HTTP request could fail or take too long to respond, so the possibility of trying again will shine.
Moreover, the user's position will be requested and, in the worst case, the Greenwich coordinates will be used as the default value.

import { performAsync, computeAsync } from "resumabletcf";

const httpClient = Object.freeze({
    async delay(ms, v) {
        return new Promise(ok => setTimeout(ok, ms, v));
    },
    async get(url, ms = 1000) {
       const res = await Promise.race([fetch(url), this.delay(ms)]);

       if(res === void 0) {
           throw new Error("Out of time");
       }

       // only successful statuses (2XX) are allowed
       if(res.status < 200 || res.status > 299) {
           throw new Error(res);
       }

       return res;
    },
    async post(url, { headers, body, ...otherConfigs }, ms = 1000) {
       const config = {
           ...otherConfigs,
           method: "POST",
           headers,
           body,
       }
       const res = await Promise.race([fetch(url, config), this.delay(ms)]);

       if(res === void 0) {
           throw new Error("Out of time");
       }

       // only successful statuses (2XX) are allowed
       if(res.status < 200 || res.status > 299) {
           throw new Error(res);
       }

       return res;
    },
    async toJSON(res) {
        return await res.json();
    }
});

// wrapping the getCurrentPosition API
const getPosition = function (options) {
  return new Promise(function (resolve, reject) {
    navigator.geolocation.getCurrentPosition(resolve, reject, options);
  });
}


;(async () => {

    try {
        await performAsync(async function* () {

            // ask the user for their location only once
            // use the Greenwich coordinates as a default
            const position = yield computeAsync(getPosition, 0, {
               coords: { latitude: 51.47, longitude: 0 }
            });

            const lat = position.coords.latitude;
            const lon = position.coords.longitude;


            const wrappedGetRequestAboutWeather = () => httpClient.get(`
                https://api.openweathermap.org/data/2.5/weather?lat=${lat}&lon=${lon}&APPID=0a80c24ce405d5481c3c5a9c41b9d45c
            `);

            // try to get info about the weather 10 times in total
            // the fallback value is 'null'
            let weatherRes = yield computeAsync(wrappedGetRequestAboutWeather, 9, null);

            if(weatherRes === null) {
                // try to get weather info from another API
                // ...
            }


            // if the 'toJSON' method fails, it means that a badly formatted
            // JSON response was returned by the server
            // we are not able to do anything: let the exception rise
            const { weather } = await httpClient.toJSON(weatherRes);


            const wrappedPostRequestAboutWeather = () => httpClient.post(`
                https://5d457dedd823c30014771ebb.mockapi.io/resumabletcf/weather
            `, { body: JSON.stringify(weather[0]) }, 2000);

            // try to store info about the weather 10 times in total
            // here it does not make sense to provide a fallback value,
            // so if every attempt fails the exception will be thrown
            yield computeAsync(wrappedPostRequestAboutWeather, 9);

        });

    } catch(e) {
        console.log(e);
    }

})();

 

Epic fail

As I've already said, this solution is far from the power of algebraic effects.
Let me show an example; I'll borrow it from Dan:

function getName(user) {
  let name = user.name;
  if (name === null) {
    throw new Error('A girl has no name');
  }
  return name;
}

function makeFriends(user1, user2) {
  user1.friendNames.add(getName(user2));
  user2.friendNames.add(getName(user1));
}

const arya = { name: null, friendNames: new Set() };
const gendry = { name: 'Gendry', friendNames: new Set() };

// here the main part
try {
  // this is going to throw because 'arya.name' is 'null'
  makeFriends(arya, gendry);
} catch (err) {
  console.log("Oops, that didn't work out: ", err);
}

It could be transformed into:

const { performSync, computeSync } = require("resumabletcf");

function getName(user) {
  let name = user.name;
  if (name === null) {
    throw new Error('A girl has no name');
  }
  return name;
}

function makeFriends(user1, user2) {
  user1.friendNames.add(getName(user2));
  user2.friendNames.add(getName(user1));
}

const arya = { name: null, friendNames: new Set() };
const gendry = { name: 'Gendry', friendNames: new Set() };

// here the main part
try {
  performSync(function*() {
    yield computeSync(() => makeFriends(arya, gendry), ...);
  });
} catch (err) {
  console.log("Oops, that didn't work out: ", err);
}

but this isn't going to help us much. That is because getName and makeFriends are normal functions: after an exception is raised, their stack is unwound. We are still able to replace the result of calling makeFriends inside the generator, but at this point it is pretty useless.

The computeSync helper could be modified to take a fallback computation as well, but that seems an incomplete solution.
I need to think about it. What's your opinion? Do you have any ideas?

 

Conclusion

I have to admit that I'm not completely satisfied with resumabletcf.
I think it could find its place in the JavaScript world, but it seems a bit limited and limiting. The fight against stack unwinding is hard; maybe generators are not the right answer.
How far can we go with a dedicated API before the cost-benefit ratio becomes too unfavourable?
Or maybe the solution is out there somewhere, but I'm not able to see it.
