DEV Community

Krishna Kurtakoti

Designing a backend for performance by reducing DB calls

We need to generate a report based on each user's activities for each day. A wallet is created for each user at the time of registration. The user can top up their wallet and use it to make purchases on the application's platform. All these activities are accounted for in the reports generated daily.
Code (Part-I):

```javascript
const userWalletService = require("../../services/shop/userWallet");

// Report window boundaries for the day being reported,
// e.g. dd/mm/yyyy 00:00:00 to dd/mm/yyyy 23:59:59
let from = reportGenerationStartDate;
let to = reportGenerationEndDate;

for (let user of users) {
  // Fetch the user's activities from the DB (database) -- one call per user
  const filter = {
    updatedAt: {
      $gte: from,
      $lt: to,
    },
    userId: user._id,
  };
  const projection = {};
  const option = {};
  const userWallet = await userWalletService.getAllUserWallets({
    filter,
    projection,
    option,
  });
  logger.info(`Got all userWallet in ${dateRange} = ${userWallet}`);
  openingBalance = userWallet.walletBalance;
}
```

Here, we fetch each user's wallet data by user Id, making one DB call per user. If the user count grows with application usage, say to 1,000 or 10,000 users, we would need 10,000 calls to the DB to fetch each user's information for the report. This is not a feasible or efficient way to fetch data from the database.

From the code above, we can see that DB calls inside a for loop can make the database unresponsive (in our case, MongoDB raised a pool-draining error and closed the connection without processing the in-flight DB calls).

This is undesirable and must be avoided. The approach suggested by one of our project leads is to fetch, in a single query, all the data that the separate calls inside the for loop would have fetched.
Once we have the data, we can do the processing and filtering with modern JavaScript array functions such as filter(), find(), and map().
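As a minimal sketch of that idea, the per-user work can run entirely in memory over one pre-fetched result set. The wallet shape (`userId`, `walletBalance`) follows the fields used in this article; the helper name and the default balance of 0 for users with no activity are assumptions for illustration:

```javascript
// Build each user's opening balance from a single pre-fetched wallet list,
// instead of querying the DB once per user.
function buildOpeningBalances(users, userWallets) {
  return users.map((user) => {
    const wallet = userWallets.find(
      (w) => w.userId.toString() === user._id.toString()
    );
    return {
      userId: user._id,
      // Assumed default when a user had no wallet activity in the window.
      openingBalance: wallet ? wallet.walletBalance : 0,
    };
  });
}

// Example with in-memory stand-ins for the DB documents:
const sampleUsers = [{ _id: "u1" }, { _id: "u2" }];
const sampleWallets = [{ userId: "u1", walletBalance: 150 }];
console.log(buildOpeningBalances(sampleUsers, sampleWallets));
// → [ { userId: 'u1', openingBalance: 150 }, { userId: 'u2', openingBalance: 0 } ]
```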

In any application, DB calls can be a bottleneck when processing API requests.
If the app runs in the cloud, compute is rarely the limiting factor: large volumes of data are handled efficiently because resources scale automatically when required.
Our application runs on the AWS cloud using the serverless framework; its Lambda functions have enough processing power to perform computations on large data sets efficiently.

We should always treat DB calls as constrained by database resources and limit the number of calls made to the database.
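One way to honor that constraint even for very large user lists is to batch the ids, so a report run issues a handful of bounded queries rather than one per user. This is a sketch, not the article's code: the fetch function stands in for a wrapper around `userWalletService.getAllUserWallets`, and the batch size of 1000 is an assumed tuning knob:

```javascript
// Split an array into batches of at most `size` elements.
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Fetch wallets for all userIds in a few bounded $in queries.
// `fetchWallets` is an injected async function ({ filter, projection, option }) => [wallets].
async function fetchWalletsInBatches(userIds, from, to, fetchWallets, batchSize = 1000) {
  const results = [];
  for (const batch of chunk(userIds, batchSize)) {
    const filter = {
      updatedAt: { $gte: from, $lt: to },
      userId: { $in: batch },
    };
    // One DB round trip per batch instead of one per user.
    results.push(...(await fetchWallets({ filter, projection: {}, option: {} })));
  }
  return results;
}

console.log(chunk([1, 2, 3, 4, 5], 2));
// → [ [ 1, 2 ], [ 3, 4 ], [ 5 ] ]
```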

The above code is modified as shown:

We declare an array containing all the userIds for which the users' activities should be fetched, and query with the $in operator:
userId: { $in: userIds }

Code (Part-II):

```javascript
const userWalletService = require("../../services/shop/userWallet");

// One query fetches every user's wallet activity for the report window
const filter = {
  updatedAt: {
    $gte: from,
    $lt: to,
  },
  userId: { $in: userIds },
};
const projection = {};
const option = {};
const userWallets = await userWalletService.getAllUserWallets({
  filter,
  projection,
  option,
});
logger.info(`Got all userWallets in ${dateRange} = ${userWallets.length}`);

for (let i = 0; i < users.length; i++) {
  const userWallet = userWallets.find((w) => {
    return w.userId.toString() === users[i]._id.toString();
  });
  // Guard against users with no wallet activity in the window
  openingBalance = userWallet ? userWallet.walletBalance : 0;
}
```
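One further refinement worth noting: `find()` inside the loop scans the wallets array once per user, which is O(users × wallets). Indexing the wallets by userId in a Map first makes each lookup O(1). The helper below is a sketch under the same assumed document shape as above:

```javascript
// Index wallet documents by their userId for constant-time lookup.
function indexWalletsByUserId(userWallets) {
  const byUserId = new Map();
  for (const wallet of userWallets) {
    byUserId.set(wallet.userId.toString(), wallet);
  }
  return byUserId;
}

// Example: one pass to build the index, then O(1) lookups per user.
const walletIndex = indexWalletsByUserId([
  { userId: "u1", walletBalance: 150 },
  { userId: "u2", walletBalance: 40 },
]);
console.log(walletIndex.get("u2").walletBalance); // → 40
```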
