
Generating and Downloading CSV Files Using Express.js

David Okonji ・ 3 min read

While working on a Node.js project, I needed to pull certain data from my database and present the values in a CSV file. This guide aims to simplify the process of creating a CSV file and making it downloadable from the client side.

Prerequisites

  • Working knowledge of Express.js.
  • A working database setup.

For this tutorial, I will be using PostgreSQL with the Sequelize ORM (the database/ORM type does not actually matter). json2csv is my npm package of choice for parsing JSON to CSV data because of its simplicity and ease of use. Other npm packages that can be used are fast-csv and csv-parse.

Objective

Downloading user data (first name, last name, email) as a CSV file, using an API endpoint /download.

Steps

  • Writing the utility function
  • Using the utility function in the controller

Utility Function

Install the package using yarn add json2csv or npm i json2csv. Once complete, we will make use of the synchronous Parser API that json2csv provides.

util.js

import { Parser } from 'json2csv';

export const downloadResource = (res, fileName, fields, data) => {
  // Parse the JSON data into a CSV string using the supplied fields
  const json2csv = new Parser({ fields });
  const csv = json2csv.parse(data);
  // Tell the client a CSV file is coming, and under what file name
  res.header('Content-Type', 'text/csv');
  res.attachment(fileName);
  return res.send(csv);
};

The downloadResource function parameters:

  • res: The response object from the API endpoint.
  • fileName: The name the generated CSV file should be downloaded as.
  • fields: An array of objects with a number of functions, including specifying which values should be parsed from the data sent and giving a custom header to each column. Note: for this tutorial we are going to supply fields, but fields are optional; the parser instance can be defined with no arguments (i.e. new Parser()). The keys of each object (label, value) give structure to the CSV file.
  • data: An array of objects representing the data to be populated in the CSV file. Note: the object keys must match the specified fields.

In the utility function, we set the Content-Type header to text/csv so the client can recognize what type of data is being sent and how to handle it, set the attachment to the file name of choice, and finally return the response using res.send with the parsed CSV data as its argument. We now have a ready-to-use utility function.
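To make the mapping between fields and data concrete, here is a quick standalone sketch; the sample row is made up purely for illustration:

import { Parser } from 'json2csv';

// Hypothetical row, shaped like the objects our query returns
const sample = [
  { first_name: 'Ada', last_name: 'Lovelace', email_address: 'ada@example.com' },
];

const fields = [
  { label: 'First Name', value: 'first_name' },
  { label: 'Last Name', value: 'last_name' },
  { label: 'Email Address', value: 'email_address' },
];

const csv = new Parser({ fields }).parse(sample);
console.log(csv);
// "First Name","Last Name","Email Address"
// "Ada","Lovelace","ada@example.com"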

Using the utility function in the controller

Creating a new route for the download
route.js

 import express from 'express';
 import userController from './controllers/userController';

 const router = express.Router();

 router.get('/download', userController.download);

 // rest of express configs
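For completeness, here is a minimal sketch of how this router could be mounted; the /api prefix and port 3000 are assumptions chosen to match the test URL used later:

app.js

import express from 'express';
import router from './route';

const app = express();
app.use('/api', router); // assumed mount point, giving /api/download
app.listen(3000, () => console.log('Listening on port 3000'));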

Creating a new controller download method
userController.js

 import { users } from '../models';
 import { downloadResource } from '../util';

 const controller = {};

 controller.download = async (req, res) => {
  const fields = [
    {
      label: 'First Name',
      value: 'first_name'
    },
    {
      label: 'Last Name',
      value: 'last_name'
    },
    {
      label: 'Email Address',
      value: 'email_address'
    }
  ];
  const data = await users.findAll();

  return downloadResource(res, 'users.csv', fields, data);
 };

 export default controller;
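One caveat worth noting: users.findAll() resolves to Sequelize model instances rather than plain objects. json2csv usually reads properties off them fine, but if the generated CSV contains unexpected columns or metadata, asking Sequelize for plain rows is a safe tweak (a sketch assuming the same users model):

const data = await users.findAll({
  // only fetch the columns the CSV needs, as plain objects
  attributes: ['first_name', 'last_name', 'email_address'],
  raw: true,
});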

Testing Implementation

Run your local server and open the download endpoint in the browser (e.g. http://localhost:3000/api/download), and the file gets downloaded by your browser's download manager.
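You can also verify from the command line: curl -O -J http://localhost:3000/api/download saves the file under the name supplied by the Content-Disposition header that res.attachment sets.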

Conclusion

Fun fact: this is my first article on dev.to πŸŽ‰. I am David Okonji, a software engineer based in Lagos, Nigeria. Feedback would be greatly appreciated. Gracias!

Discussion

Alex Solonenko

Actually, I'm not a front-end dev, and my knowledge of JS and Node is quite limited, but I can help you improve your solution.
Postgres has a very nice feature called COPY. It can export any data from your database in CSV format, with a header as well.
So basically you don't need a JSON-to-CSV converter or any ORM at all.
Just using "COPY TO STDOUT" and JS streams, you can build a very powerful and nice solution, which will let you get any data from the database no matter how large it is. You will be limited only by your bandwidth.

David Okonji (Author)

Thank you, Alex. Yeah, sure, an ORM is not necessary; I will surely try out the alternative solution suggested.

Pacharapol Withayasakpunt

You should also consider streaming, especially if the CSV is large.

Maybe by piping a CSV stream.
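A rough sketch of what that could look like with json2csv's streaming Transform; object mode is assumed here, and for truly large exports you would feed it a database cursor instead of an in-memory array:

import { Transform } from 'json2csv';
import { Readable } from 'stream';

export const streamCsv = (res, fileName, fields, rows) => {
  res.header('Content-Type', 'text/csv');
  res.attachment(fileName);
  // Transform accepts plain objects when objectMode is enabled
  const json2csv = new Transform({ fields }, { objectMode: true });
  Readable.from(rows).pipe(json2csv).pipe(res);
};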

This Dot Media

This is awesome! Thanks David 😁

Akanbi Lawal Tunde

Nice read!!

Donald Ng

Thank you, this is really helpful!

Cavdy

This is beautiful... thanks

keystroke99

Much needed one for me. Thanks :)

Luan Mota Coleto

Thanks, it was really helpful

eliyahuKriel

Thanks!! Very helpful.
How can I set the file to support the Hebrew language? When I download the file, it looks like
%$#@$R@F

thanks
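A likely fix, assuming the file is being opened in Excel: Excel misreads UTF-8 CSVs that lack a byte order mark, and json2csv's Parser accepts a withBOM option for exactly this:

const json2csv = new Parser({ fields, withBOM: true }); // prepends a UTF-8 BOM so Excel detects the encoding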