
Supun Kavinda

Originally published at developer.hyvor.com

How to automatically back up MySQL to Amazon S3

Most companies deal with data loss at some point. That's why we should keep database backups, so that we can recover at any time. I won't be talking about data loss much here; if you want to learn more about data loss and the importance of backups, read this.

If you need to motivate yourself to create backups, read this discussion on Reddit.

Why Amazon S3?

Until now, I saved my MySQL backups on the server itself. But I decided it was time to use cloud storage for backups. Why? If both the database and the backups on the server get deleted, you will need a copy somewhere else, and the cloud is the best place for that.


Among cloud services, I chose Amazon S3 because of its powerful CLI tool and reliability. (Also, Amazon S3 is simple to understand if you are new to Amazon services.)

Our Target

Here's what we are going to do:

  • Dump the MySQL databases into a folder (/var/local/backups).
  • Sync that folder with an S3 folder using the AWS CLI.
  • Finally, create a shell script to automate the process with the help of cron jobs.

Dumping MySQL

I use mysqldump, which is easy to use, to dump MySQL databases.

mysqldump -u [user] -p[password] my_database > /where/to/dump

Setting up Amazon S3

  • If you haven't already, create an AWS account and log in to the console.
  • Then, go to Amazon S3.
  • Create a bucket with the default configuration. (I named it company-backups. Use your company name or your own name to make it unique, since bucket names must be globally unique.)
  • Create a folder inside the bucket (company-backups/mysql). A CLI alternative is shown below if you prefer the command line.
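If you prefer the command line, here's a rough equivalent using the AWS CLI (configured in the next section). The bucket name company-backups is just the example name from above; yours must be globally unique.

# create the bucket (the name must be globally unique)
aws s3 mb s3://company-backups

# the mysql/ "folder" is only a key prefix; sync will create it automatically,
# but you can add an empty placeholder object if you want it visible right away
aws s3api put-object --bucket company-backups --key mysql/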

Now you are done! Let's set up the AWS CLI.

Setting up AWS CLI

As I mentioned earlier, AWS CLI is a powerful tool. It allows us to sync folders between our server and Amazon S3.

Here's the guide to setting up AWS CLI.
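For example, on many Linux systems the version 1 CLI (the version used in this post) can be installed with pip; the exact steps depend on your platform, so follow the official guide if in doubt.

pip install awscli --upgrade --user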

After installing AWS CLI, make sure it's working.

aws --version

This should show something like: aws-cli/1.16.170 Python/3.7.3 Linux/4.14.123-111.109.amzn2.x86_64 botocore/1.12.160

Then, configure AWS CLI with your credentials.
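Running aws configure will prompt you for an access key, secret key, default region, and output format. The values below are placeholders; use the credentials of an IAM user that has access to your bucket.

aws configure
# AWS Access Key ID [None]: <your access key ID>
# AWS Secret Access Key [None]: <your secret access key>
# Default region name [None]: us-east-1
# Output format [None]: json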

Testing Synchronization

Let's do some testing before we create the shell script.

Run the following command to dump a database into a temporary directory called backups in your home directory for testing. (Make sure you enter your password correctly when prompted.)
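If the directory doesn't exist yet, create it first (this assumes you run the commands from your home directory):

cd ~
mkdir -p backups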

mysqldump -u [user] -p my_database > backups/my_database.sql

Let's try synchronization.

aws s3 sync backups s3://company-backups/mysql/

Now, visit the S3 console and check if the file is uploaded to the folder. If yes, great!
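You can also check from the command line; aws s3 ls lists the objects under a prefix:

aws s3 ls s3://company-backups/mysql/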

Writing the Shell Script

This is the interesting part! Let's write the shell script, which does the following:

  • Dump databases
  • Sync with S3
#!/bin/bash

# day of the week (1 to 7), used in the file name so each day's dump
# overwrites the one from the same day last week
DAY_OF_WEEK=$(date +%u)

# database username
USER="myusername"

# database user's password
PASSWORD="mypassword"

# the directory to save the dumped .sql files
DUMP_DIR="/var/local/backups"

# list of databases to dump
DATABASES="database_1 database_2 database_3" 

# where to sync in S3
S3_SYNC_URL="s3://company-backups/mysql"

# loop through each database and dump
for db in $DATABASES; do
    echo "Dumping database: $db"
    # dump the database
    # filename ~= /var/local/backups/database_1.1.sql
    mysqldump -u "$USER" -p"$PASSWORD" --databases "$db" > "$DUMP_DIR/$db.$DAY_OF_WEEK.sql"
done

# sync the dump directory with the directory in the S3 bucket
aws s3 sync "$DUMP_DIR" "$S3_SYNC_URL"

I have added comments in the bash script to explain each step.

Save the above script as backup-databases.sh in any directory (I use /var/www/shell for this kind of script) and then run it.

/bin/bash /path/to/backup-databases.sh

Then check whether everything works. You should see the dumped .sql files in your S3 folder. If yes, you win!
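To check the server side as well, list the dump directory (the path matches DUMP_DIR in the script):

ls -lh /var/local/backups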

Setting up Cron Jobs

So, are you going to run the above command manually every day to back up your databases?

Nope, that's what cron is for.

Run this:

crontab -e

Add this at the end of the file:

# back up databases
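# fields: minute hour day-of-month month day-of-week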
00 00 * * * /bin/bash /path/to/backup-databases.sh

Save and exit.
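You can confirm the entry was saved by listing your cron jobs:

crontab -l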

Now, cron will run the script every day at midnight. Since each dump file includes the day of the week in its name, you will keep 7 backups per database, on your server as well as in Amazon S3.

If you have any questions feel free to comment below.

Thank you for reading!

Top comments (3)

Filip Ilić

Hey,

I like this very much. But what if someone were to delete files in the local backup folder? Would the sync delete the S3 backup?

Supun Kavinda

This is something I never thought about before.

However, this tutorial says,

A question that needs answering here is what happens with any files existing under the specified prefix and bucket but not existing in the local directory or vice versa? The answer is that they are not deleted unless a --delete parameter is added to the command.

It seems that deleting a file locally doesn't delete the file on S3.

I'll check this myself and post a comment in a few days.

Filip Ilić

If it's so, then it's great, because that would be a security issue.