
Limit concurrent asynchronous calls

YCM Jason on September 11, 2018

Although JavaScript is designed to be single-threaded, you can still do things concurrently. For example, we can read multiple files concurrently...
 
Shubham Mishra

I implemented this whole thing with working code; let me know if I am doing something incorrectly.

Note: I am using setTimeout for the async calls, so promise failures will not happen.

const ASYNC_LIMIT = 2;

// resolves after `delay` ms, calling cb(id) just before resolving
function scheduler(cb, id, delay) {
    return new Promise((resolve, reject) => {
        setTimeout(() => {
            cb(id);
            resolve();
        }, delay);
    });
}

const asyncLimit = (fn, n) => {
    let promiseArray = [];

    return async function(...args) {
        // if we are at the limit, wait for the first pending promise to settle
        if (promiseArray.length >= n) {
            await Promise.race(promiseArray);
        }

        let p = fn.call(this, ...args);
        promiseArray.push(p);

        // drop p from the pending list once it resolves
        p.then(() => {
            promiseArray = promiseArray.filter(pending => p !== pending);
        });

        return p;
    };
};

let cb = id => {
    console.log(id + " task completed", Date.now() % 10000);
};

let modifiedScheduler = asyncLimit(scheduler, ASYNC_LIMIT);

modifiedScheduler(cb, 1, 5000);
modifiedScheduler(cb, 2, 2000);
modifiedScheduler(cb, 3, 1500);
modifiedScheduler(cb, 4, 3000);
modifiedScheduler(cb, 5, 4000);
modifiedScheduler(cb, 6, 1000);
modifiedScheduler(cb, 7, 2500);

tconrado

Hey, I tried your code and it did not work...

I assigned the same delay to all of the tasks, like 2 seconds. I expected it to process 2, then schedule 2 more, so the result would be observable every 2 seconds, two tasks at a time.
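
Roughly, the test was this (reusing scheduler, cb, ASYNC_LIMIT and asyncLimit from your snippet):

let modifiedScheduler = asyncLimit(scheduler, ASYNC_LIMIT);

// every task gets the same 2-second delay
modifiedScheduler(cb, 1, 2000);
modifiedScheduler(cb, 2, 2000);
modifiedScheduler(cb, 3, 2000);
modifiedScheduler(cb, 4, 2000);
modifiedScheduler(cb, 5, 2000);
modifiedScheduler(cb, 6, 2000);
modifiedScheduler(cb, 7, 2000);

// expected: completions in pairs, roughly every 2 seconds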

tconrado

Using the corrections from @kusha and the data and function from your code, it did work as expected!
Hurray!

An exceptional implementation for dealing with REST APIs.

Shubham Mishra

Can you post your working snippet here?

tconrado • Edited

Hey, I'm terrible with markdown, but the example below is from a lib. Basically, it limits the number of active connections to an HTTP REST API service to a defined number (8 in this case). As soon as one HTTP request resolves, it starts another connection, always keeping 8 active connections to the API server. This completely hides the connection/handshake delay while guaranteeing no 429 errors (too many requests). In my experience it is the fastest safe approach, as long as you know the maximum number of calls per second the API service allows.


// this is the asyncLimit adjusted 
const asyncLimit = (fn, n) => {
  const pendingPromises = new Set();
  return async function(...args) {
    while (pendingPromises.size >= n) {
      await Promise.race(pendingPromises);
    }
    const p = fn.apply(this, args);
    const r = p.catch(() => {});
    pendingPromises.add(r);
    await r;
    pendingPromises.delete(r);
    return p;
  };
};

// native node.js https module to connect to shopify servers
const https = require('https')
exports.httpRequest = function(method, path, body = null) {
  const reqOpt = { 
    method: method,
    path: '/admin' + path,
    hostname: 'xxxxxxxxxxxxxxxxxxxx.myshopify.com', 
    headers: {
      "Content-Type": "application/json",
      "X-Shopify-Access-Token": "xxxxxxxxxxxxxxxxxxxx",
      'Cookie': '',
      "Cache-Control": "no-cache"
    }
  }
  if (body) reqOpt.headers['Content-Length'] = Buffer.byteLength(body);
  return new Promise((resolve, reject) => {

      const clientRequest = https.request(reqOpt, incomingMessage => {
          let response = {
              statusCode: incomingMessage.statusCode,
              headers: incomingMessage.headers,
              body: []
          };
          let chunks = ""
          incomingMessage.on('data', chunk => { chunks += chunk; });
          incomingMessage.on('end', () => {
              if (chunks) {
                  try {
                      response.body = JSON.parse(chunks);
                  } catch (error) {
                      reject(error);
                      return;
                  }
              }
              resolve(response);
          });
      });
      clientRequest.on('error', error => { reject(error); });
      if (body) { clientRequest.write(body)  }  
      clientRequest.end();

  });
}


// the number 8 below can be changed to match the REST API service limits
// assume that this many calls will hit the service at once and will be replaced dynamically as they finish, hence
// if the service limit is 20 calls per second, be aware that 8 calls will hit the service at once
// (using 40% of the maximum; avoid going higher)

exports.ratedhttpRequest = asyncLimit(exports.httpRequest, 8);
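
For example, calling it from other code could look something like this (the require path, the fetchProducts helper, and the product endpoint are just placeholders):

// placeholder path to wherever the lib above lives
const { ratedhttpRequest } = require('./shopifyClient');

async function fetchProducts(ids) {
  // every request goes through the limiter, which keeps at most 8 in flight
  const responses = await Promise.all(
    ids.map(id => ratedhttpRequest('GET', `/products/${id}.json`))
  );
  return responses.map(res => res.body);
}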

Kushan Joshi

Great article! Some minor improvements.

const asyncLimit = (fn, n) => {
  const pendingPromises = new Set();
  return async function(...args) {
    // re-check after every race: several callers can wake up from the same one
    while (pendingPromises.size >= n) {
      await Promise.race(pendingPromises);
    }

    const p = fn.apply(this, args);
    // track a caught copy so a rejection of p never makes Promise.race throw
    const r = p.catch(() => {});
    pendingPromises.add(r);
    await r;
    pendingPromises.delete(r);
    return p;
  };
};
 
YCM Jason

This is nice! 👍👍👍

Nhat Khanh • Edited

Thank you so much for this post; I learned a lot from this brilliant idea. I also added TypeScript types for this function in case someone needs them:

export function asyncLimit<T extends (...args: any) => Promise<any>>(fn: T, n: number): T {
  let pendingPromises = [] as Promise<ReturnType<T>>[];

  return async function limitedFunction(this: ThisType<T>, ...args: Parameters<T>) {
    while (pendingPromises.length >= n) {
      await Promise.race(pendingPromises);
    }

    const p = fn.apply<ThisType<T>, Parameters<T>, Promise<ReturnType<T>>>(this, args);

    pendingPromises.push(p);
    await p;
    pendingPromises = pendingPromises.filter(promise => promise !== p);

    return p;
  } as T;
}
 
Benjamin Black • Edited

As written, the wrapper returned by asyncLimit will not resolve its promise until the underlying async call completes, because of the await p.catch(() => {});.

Instead:

const p = fn.apply(this, args);
pendingPromises.push(p);
p.finally(() => {
    pendingPromises = pendingPromises.filter(pending => pending !== p);
});
return p;

YCM Jason

But don't we need the async function fn to complete before resolving the async function (...args) { that we return?

Benjamin Black • Edited

It's difficult to mentally follow the chain of promises here, because you have an async function (const asyncLimit = async...) which returns an async function (return async function) which itself awaits at least two promises (one in a loop) before resolving.

Note that asyncLimit does not have to be async, as it does not use await; removing async from the function signature would help comprehension a bit, since you would reduce one level of Promise-ception and stay out of limbo.

I guess it doesn't really matter, because the promise returned by asyncLimit can be used by the calling code. But attaching a finally clause to p instead of awaiting it allows the function to return immediately, instead of waiting for p to resolve.
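
Putting those two points together, a sketch could look like this (the extra catch handlers are just there so a rejected fn call doesn't break the limiter or trigger unhandled-rejection warnings):

// note: asyncLimit itself stays a plain function; only the returned wrapper is async
const asyncLimit = (fn, n) => {
  let pendingPromises = [];

  return async function (...args) {
    while (pendingPromises.length >= n) {
      // wait for any pending call to settle before starting another
      await Promise.race(pendingPromises).catch(() => {});
    }

    const p = fn.apply(this, args);
    pendingPromises.push(p);

    // clean up once p settles, without delaying the return below
    p.finally(() => {
      pendingPromises = pendingPromises.filter(pending => pending !== p);
    }).catch(() => {});

    return p;
  };
};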

YCM Jason

oh, I mistyped haha! Thanks for catching this!

asyncLimit shouldn't be an async function.

Benjamin Black • Edited

No, because the function returns p itself, not the chained promise. The caller can attach its own .catch() clauses to p.

As in,

function foo() {
    let p = Promise.reject();

    p.catch(() => console.log('gotcha'));

    return p;
}


let p = foo();

p.catch(() => console.log('gotcha again'))
 
YCM Jason

Thanks!! Benjamin's reply is accurate! :)