If you're running an application in production, it's critical to have an automated backup system for your database. Such a system could dump a database backup and upload it to the cloud every few hours.
In this tutorial, we'll create this script using Node.js. The script will run the mysqldump command as a Node.js child process. First, let's examine this command.
mysqldump -u <username> -p<password> <database-name>
Running this command generates a record of the table structure and the data from the specified database, in the form of a list of SQL statements.
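To make that concrete, here's an abridged, hypothetical excerpt of a dump for a database containing a single users table (a real dump includes additional setup and locking statements):

-- Hypothetical excerpt of mysqldump output
DROP TABLE IF EXISTS `users`;
CREATE TABLE `users` (
  `id` int NOT NULL AUTO_INCREMENT,
  `name` varchar(255) DEFAULT NULL,
  PRIMARY KEY (`id`)
);
INSERT INTO `users` VALUES (1,'alice'),(2,'bob');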
Let's create the Node.js backup script that will run this command as a child process. First, we'll import all the modules we need.
const fs = require('fs')
const spawn = require('child_process').spawn
- We need the `fs` module to write the dump content to the dump file.
- We need the `spawn` method from the `child_process` module to run the `mysqldump` command.
The `spawn` method runs the command and returns a stream. It does not wait for the command to finish before it starts handing us output. This is very important because dumping a large database can take many hours, and we don't want to buffer the entire dump in memory while we wait.
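For contrast, here's a minimal sketch of the approach we're avoiding. The `exec` method buffers the command's entire output in memory before invoking its callback, and fails once the output grows past its `maxBuffer` limit:

// NOT suitable for large dumps: exec buffers all output in memory.
const { exec } = require('child_process')

exec('mysqldump -u <username> -p<password> <database-name>', (err, stdout) => {
  if (err) return console.error(err)
  // stdout holds the entire dump at once
})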
Next, we'll need a unique name for the database dump.
const dumpFileName = `${Math.round(Date.now() / 1000)}.dump.sql`
This uses JavaScript's `Date.now()` to get the current epoch time in milliseconds, converts it to seconds, and appends `.dump.sql`, producing a name like `1712345678.dump.sql`. We'll use this as the dump file name.
Next, let's create a write stream. When we stream output from the spawn method, we'll pass each chunk to the write stream, which will write it to a file.
const dumpFileName = `${Math.round(Date.now() / 1000)}.dump.sql`
const writeStream = fs.createWriteStream(dumpFileName)
The write stream will create a file with the specified file name.
Next, let's create the child process using `spawn`.
const dump = spawn('mysqldump', [
  '-u',
  '<username>',
  // note: no space between -p and the password
  '-p<password>',
  '<database-name>',
])
The first argument to the `spawn` method is the command, and the second is an array of arguments to pass to that command. As seen above, we're passing the same arguments we used on the command line.
This method returns a child process, and we can now stream every chunk of output the child process emits.
dump
  .stdout
  .pipe(writeStream)
  .on('finish', function () {
    console.log('Completed')
  })
  .on('error', function (err) {
    console.log(err)
  })
Here, we are piping the output from the dump into the `writeStream`. As the child process runs, every new chunk of output is written to the file.
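Under the hood, this is roughly equivalent to forwarding each chunk by hand (`pipe` also handles backpressure, which this hand-rolled sketch omits):

// A rough equivalent of dump.stdout.pipe(writeStream)
dump.stdout.on('data', (chunk) => {
  writeStream.write(chunk)
})
dump.stdout.on('end', () => {
  writeStream.end() // close the file once the dump output ends
})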
We can also listen to the `finish` and `error` events and pass callbacks to handle them (since `pipe` returns the destination stream, these handlers are attached to `writeStream`). In this case, we just log a message.
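One caveat: those handlers only cover the write stream itself. mysqldump reports its own problems on stderr and through its exit code, so a more defensive version of the script (not included below) might also add:

// Surface errors from mysqldump itself, which the writeStream
// events above won't catch.
dump.stderr.on('data', (chunk) => {
  console.error(chunk.toString())
})

dump.on('close', (code) => {
  if (code !== 0) {
    console.error(`mysqldump exited with code ${code}`)
  }
})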
Here's the complete script:
const fs = require('fs')
const spawn = require('child_process').spawn

const dumpFileName = `${Math.round(Date.now() / 1000)}.dump.sql`
const writeStream = fs.createWriteStream(dumpFileName)

const dump = spawn('mysqldump', [
  '-u',
  'ghost',
  '-pghost',
  'ghost',
])

dump
  .stdout
  .pipe(writeStream)
  .on('finish', function () {
    console.log('Completed')
  })
  .on('error', function (err) {
    console.log(err)
  })
To automate this process, you can create a cron job that executes this script at whatever interval suits your backup needs.
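For example, a hypothetical crontab entry that runs the backup every six hours (the paths are assumptions; adjust them to wherever node and your script actually live):

# Run the backup script at minute 0 of every 6th hour
0 */6 * * * /usr/bin/node /path/to/backup.js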