Kailash Sankar

Build a concurrency limiter

Like the memoizer and the auto-completer, the concurrency limiter is another interesting interview question to build.

Assume you have a function that performs an async action, such as calling an API, and you want to make sure it runs at most x times in parallel. The goal is to write a function that adds this concurrency-limiting capability to any such async function.

Let's start with a test case.

// mock api, resolves after 1 second
function api(params) {
  return new Promise((resolve, reject) => {
    setTimeout(()=>{
      const res = JSON.stringify(params);
      resolve(`Done: ${res}`);
    }, 1000);
  });
}

// accepts a function and a concurrency limit to apply to it
function concurrencyLimiter(fn, limit) {
 // TODO
 return fn;
}

// tests
function test() {
  const testApi = concurrencyLimiter(api, 3);

  // for logging response
  const onSuccess = (res) => console.log(`response ${res}`);
  const onError = (res) => console.log(`error ${res}`);

  // multiple calls to our concurrency-limited function
  testApi('A').then(onSuccess).catch(onError);
  testApi('B').then((res) => {
    onSuccess(res);
    testApi('B.1').then(onSuccess).catch(onError);
  }).catch(onError);
  testApi('C').then(onSuccess).catch(onError);
  testApi('D').then(onSuccess).catch(onError);
  testApi('E').then(onSuccess).catch(onError);
}

test();

With the stubbed-out limiter, the log will look like this: A to E print together after one second, and B.1 prints a second later.

response Done: "A"
response Done: "B"
response Done: "C"
response Done: "D"
response Done: "E"
response Done: "B.1"

After implementing the concurrency limiter, we'll see A to C after one second, and then D, E, and B.1 a second later.
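
With a limit of 3, the same test should log roughly the following: the first three lines after about a second, the remaining three a second after that.

response Done: "A"
response Done: "B"
response Done: "C"
response Done: "D"
response Done: "E"
response Done: "B.1"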

Breaking down the requirement, we need:

  • a counter to track the number of active calls
  • a queue to hold pending calls
  • a wrapper around the original call whose then and catch dispatch the next item in the queue
  • a returned promise, so the contract stays the same

function concurrencyLimiter(fn, limit) {
  let activeCalls = 0;
  const callQueue = [];

  // decrement count and trigger next call
  const next = () => {
    activeCalls--;
    dispatch();
  }

  // add function to queue
  const addToQueue = (params, resolve, reject) => {
    callQueue.push(() => {
      // dispatch next in queue on success or on error
      fn(...params).then((res)=> {
        resolve(res);
        next();
      }).catch((err) => {
        reject(err);
        next();
      });
    });
  };

  // if within limit trigger next from queue
  const dispatch = () => {
    if(activeCalls < limit) {
      const action = callQueue.shift();
      if (action) {
        action();
        activeCalls++;
      }
    }
  }

  // adds function call to queue
  // calls dispatch to process queue
  return (...params) => {
    const res = new Promise((resolve, reject)=> {
      addToQueue(params, resolve, reject);
    });
    dispatch();
    return res;
  }
}

Rerun the test and you'll notice the difference in timing. Change the concurrency limit to 1 and you'll see only one message per second in the log.
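
For example, only the setup line in the test needs to change:

// a limit of 1 serializes the calls: at most one runs at a time
const testApi = concurrencyLimiter(api, 1);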

Modify the test to see how exceptions are handled.

// generate a random integer between min (inclusive) and max (exclusive)
const getRandomNumber = (min = 1, max = 10) =>
  Math.floor(Math.random() * (max - min) + min);

// in the mock api, update the promise to reject roughly half the calls
setTimeout(() => {
  const res = JSON.stringify(params);
  if (getRandomNumber() <= 5) {
    reject(`Something went wrong: ${res}`);
    return;
  }
  resolve(`Done: ${res}`);
}, 1000);


This test verifies that promise rejections (or exceptions thrown inside an async function, which become rejections) don't stop the concurrency limiter from dispatching the next action.
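
One edge case the random-rejection test doesn't cover: if the wrapped function throws synchronously instead of returning a rejected promise, the catch in the limiter never runs and the returned promise never settles. A minimal sketch of one way to guard against that, changing only addToQueue and leaving the rest of the limiter as is:

// wrap the call in Promise.resolve().then(...) so a synchronous throw
// becomes a rejection and flows into the same catch
const addToQueue = (params, resolve, reject) => {
  callQueue.push(() => {
    Promise.resolve()
      .then(() => fn(...params))
      .then((res) => {
        resolve(res);
        next();
      })
      .catch((err) => {
        reject(err);
        next();
      });
  });
};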

That's all folks :)
