While working on a Node.js project, I needed to pull certain data from my database and represent the values in a CSV file. This guide aims to simplify the process of creating a CSV file and making it downloadable from the client side.
Prerequisites
- Working knowledge of Express.js.
- A working database setup.
For this tutorial, I will be using PostgreSQL with the Sequelize ORM (the database/ORM type does not actually matter). The json2csv npm package is my package of choice for parsing JSON to CSV data because of its simplicity and ease of use. Other npm packages that can be used include fast-csv and csv-parse.
Objective
Downloading users' data (first name, last name, email) as a CSV file, using an API endpoint /download.
Steps
- Utility function
- Using the helper function in the controller
Utility Function
Install the package using yarn add json2csv or npm i json2csv. Once complete, we will make use of the synchronous parser API provided by json2csv.
util.js
import { Parser } from 'json2csv';

export const downloadResource = (res, fileName, fields, data) => {
  // Build a parser that only picks out the specified fields
  const json2csv = new Parser({ fields });
  // Convert the array of objects into a CSV string
  const csv = json2csv.parse(data);
  // Tell the client the payload is CSV and should be saved as a file
  res.header('Content-Type', 'text/csv');
  res.attachment(fileName);
  return res.send(csv);
};
The downloadResource function parameters:
- res: the response object from the API endpoint.
- fileName: the intended name for the generated CSV file to be downloaded.
- fields: an array of objects with a number of functions, including specifying which values should be parsed from the data sent, giving a custom header to each column, etc. Note: for this tutorial we are going to supply fields, but fields are optional; the parser can be created with no arguments (i.e. new Parser()). The keys of each object (i.e. label, value) are used to give structure to the CSV file.
- data: an array of objects representing the data to be populated in the CSV file. Note: this should match the specified fields.
In the utility function, we set the Content-Type header to text/csv so the client can recognize what type of data is being sent and how to handle it, set the attachment to the file name of choice, and finally return the response using res.send with the parsed CSV data as its parameter. We now have a ready-to-use utility function.
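To make the structure concrete, here is a minimal sketch of what the parser produces for a given fields/data pair (the sample records are made up for illustration):

import { Parser } from 'json2csv';

const fields = [{ label: 'Email Address', value: 'email_address' }];
const data = [
  { email_address: 'ada@example.com', role: 'admin' }, // role is not in fields, so it is dropped
  { email_address: 'alan@example.com' }
];

console.log(new Parser({ fields }).parse(data));
// "Email Address"
// "ada@example.com"
// "alan@example.com"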
Using the helper function in the controller
Creating a new route for the download
route.js
import express from 'express';
import userController from './controllers/userController';

const router = express.Router();

// GET /download responds with the generated CSV file
router.get('/download', userController.download);

// rest of express configs
export default router;
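For the testing step below to work, the router also needs to be mounted on the app. A minimal sketch, assuming an app.js entry point and the /api prefix used later in this guide:

import express from 'express';
import router from './route';

const app = express();

// Mounting under /api makes the endpoint /api/download
app.use('/api', router);

app.listen(3000, () => console.log('Server running on port 3000'));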
Creating a new controller download method
userController.js
import { users } from '../models';
import { downloadResource } from '../util';

const controller = {};

controller.download = async (req, res) => {
  // Each field maps a CSV column header (label) to a key on the data objects (value)
  const fields = [
    {
      label: 'First Name',
      value: 'first_name'
    },
    {
      label: 'Last Name',
      value: 'last_name'
    },
    {
      label: 'Email Address',
      value: 'email_address'
    }
  ];
  // Fetch all user records from the database
  const data = await users.findAll();
  return downloadResource(res, 'users.csv', fields, data);
};

export default controller;
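Note that this works because Sequelize model instances expose their attributes through getters, which json2csv can read like plain properties. If you prefer to hand the parser plain objects limited to exactly the columns the CSV needs, a small variation (the attributes list is an assumption based on the fields above):

// raw: true returns plain objects instead of model instances
const data = await users.findAll({
  raw: true,
  attributes: ['first_name', 'last_name', 'email_address']
});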
Testing Implementation
Run your local server and open the download endpoint in the client browser (e.g. http://localhost:3000/api/download); the file gets downloaded by your local download manager.
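You can also verify the endpoint from the command line, e.g. with curl (assuming the server listens on port 3000 and the router is mounted under /api):

# -O saves the body to a file, -J names it from the Content-Disposition header (users.csv)
curl -OJ http://localhost:3000/api/download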
Conclusion
Fun fact: this is my first article on dev.to 🎉. I am David Okonji, a software engineer based in Lagos, Nigeria. Feedback would be greatly appreciated. Gracias!
Top comments (16)
Actually, I'm not a front-end dev, and my knowledge of JS and Node is quite limited, but I can help you improve your solution.
Postgres has a very nice feature called COPY. It can copy any data out of your database in CSV format, with the header as well.
So basically you don't need a JSON-to-CSV converter or any ORM at all.
Just using COPY ... TO STDOUT and JS streams, you are able to build a very powerful and nice solution, which will let you get any data from the database no matter how large it is. You will be limited only by your bandwidth.
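For illustration, a minimal sketch of that idea, assuming the pg and pg-copy-streams packages and a users table with the columns used above:

import { Pool } from 'pg';
import { to as copyTo } from 'pg-copy-streams';

const pool = new Pool();

export const downloadUsersCsv = async (req, res) => {
  const client = await pool.connect();
  res.header('Content-Type', 'text/csv');
  res.attachment('users.csv');
  // Postgres writes the CSV (header included) straight into the HTTP response
  const stream = client.query(
    copyTo('COPY (SELECT first_name, last_name, email_address FROM users) TO STDOUT WITH CSV HEADER')
  );
  stream.on('end', () => client.release());
  stream.on('error', () => { client.release(); res.end(); });
  stream.pipe(res);
};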
Thank you, Alex. Yeah, sure, an ORM is not necessary; I will surely try out the alternative solution you suggested.
You should also consider streaming, especially if the CSV is large.
Maybe by piping a CSV stream.
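For illustration, a minimal sketch of that, assuming json2csv's streaming Transform API and a readable object stream of user records (the usersStream parameter is hypothetical):

import { Transform } from 'json2csv';

const fields = [
  { label: 'First Name', value: 'first_name' },
  { label: 'Last Name', value: 'last_name' },
  { label: 'Email Address', value: 'email_address' }
];

export const streamUsersCsv = (res, usersStream) => {
  res.header('Content-Type', 'text/csv');
  res.attachment('users.csv');
  // objectMode lets the transform accept one record object at a time
  const json2csv = new Transform({ fields }, { objectMode: true });
  // Rows are flushed to the response as they are converted
  usersStream.pipe(json2csv).pipe(res);
};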
Hello, I love the article, but I do have some questions.
In a situation where I have gotten the data I need from MongoDB and I applied .populate, I would like to get access to a certain field in the populated field. How do I do that?
Hello Joel, hope I'm not too late. If you use populate, then you will have a nested object, and the library supports dot notation. For example, if you have a user object with an id key, it would be value: 'user.id'.
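For illustration, a minimal sketch of that dot notation (the nested shape is hypothetical):

import { Parser } from 'json2csv';

const fields = [
  // Dot notation reaches into the populated sub-document
  { label: 'User ID', value: 'user.id' },
  { label: 'User Email', value: 'user.email_address' }
];

const data = [{ user: { id: 1, email_address: 'ada@example.com' } }];
const csv = new Parser({ fields }).parse(data);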
This was very helpful, David, thanks. Just wondering how it will be handled on the frontend.
From the frontend, it can be sent as a FormData payload.
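For illustration, one common way to trigger the download from browser code (a minimal sketch; only the /api/download URL comes from the article, the rest is an assumption):

const downloadUsers = async () => {
  const response = await fetch('/api/download');
  // Read the CSV body as a Blob and hand it to the browser as a file
  const blob = await response.blob();
  const url = URL.createObjectURL(blob);
  const link = document.createElement('a');
  link.href = url;
  link.download = 'users.csv';
  link.click();
  URL.revokeObjectURL(url);
};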
Thank you!
This is awesome! Thanks David 😁
Nice read!!
Thank you this is really helpful!
This is beautiful... thanks
Much needed one for me. Thanks :)
Thanks, it was really helpful