DEV Community

Sambhav Jain

Backup Postgres to S3

This article describes an easy way to back up a Postgres database to Amazon S3 using s3cmd and crontab in a Linux environment.

Install S3cmd

S3cmd is a command-line utility for managing data in AWS S3.

For Debian/Ubuntu

apt-get install s3cmd

For RedHat/CentOS

yum install s3cmd

For OSX

brew install s3cmd

Set your AWS access key ID and secret key using the --configure option. Enable encryption for data transfers as well as HTTPS.

s3cmd --configure
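The configure wizard writes your settings to ~/.s3cfg. The relevant keys look roughly like this (all values are placeholders):

```ini
[default]
access_key = YOUR_ACCESS_KEY_ID
secret_key = YOUR_SECRET_ACCESS_KEY
use_https = True
gpg_passphrase = YOUR_GPG_PASSPHRASE
```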

Backup the database

pg_dump -v --format=c -h localhost -U YOUR_USER YOUR_DB > backup.dump
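Since the dump uses the custom format (--format=c), a later restore goes through pg_restore rather than psql. A sketch, reusing the same placeholders (the target database must already exist):

```shell
# Restore the custom-format dump into an existing database.
pg_restore -v -h localhost -U YOUR_USER -d YOUR_DB backup.dump
```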

Push to S3

s3cmd put backup.dump s3://YOUR_BUCKET_NAME --encrypt

Automate the process

Create a new bash file in the root directory.

#!/usr/bin/env bash

DB_NAME=$1
DB_USER=$2
DB_PASS=$3

BUCKET_NAME=<YOUR_NAME_HERE>

TIMESTAMP=$(date +%F_%T | tr ':' '-')
TEMP_FILE=$(mktemp tmp.XXXXXXXXXX)
S3_FILE="s3://$BUCKET_NAME/backup-$TIMESTAMP"

# Quote the variables so passwords and names with spaces survive.
PGPASSWORD="$DB_PASS" pg_dump -Fc --no-acl -h localhost -U "$DB_USER" "$DB_NAME" > "$TEMP_FILE"
s3cmd put "$TEMP_FILE" "$S3_FILE" --encrypt
rm "$TEMP_FILE"
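A note on the timestamp line above: date +%F_%T produces YYYY-MM-DD_HH:MM:SS, and tr ':' '-' replaces the colons so the name is safe to use as an S3 key or filename. You can check the format in isolation:

```shell
# Reproduces the timestamp logic from the script above.
TIMESTAMP=$(date +%F_%T | tr ':' '-')
echo "backup-$TIMESTAMP"   # e.g. backup-2024-01-15_00-00-00
```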

Make the file executable with chmod +x and test it:

./FILE_NAME.sh DATABASE_NAME USER_NAME DATABASE_PASSWORD

It should upload the backup to the S3 bucket configured earlier.

Attach to a cron job

crontab -e

Edit the file to execute the script at, say, every Sunday at midnight:

0 0 * * 0 /home/ubuntu/FILE_NAME.sh DATABASE_NAME USER_NAME DATABASE_PASSWORD
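As a variant (the log path here is just an example), you can redirect the script's output to a file, which is often easier to inspect than syslog:

```shell
0 0 * * 0 /home/ubuntu/FILE_NAME.sh DATABASE_NAME USER_NAME DATABASE_PASSWORD >> /home/ubuntu/backup.log 2>&1
```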

Save the file and check the cron logs in /var/log/syslog.

Top comments (6)

Utkarsh Kumar Raut

Thanks for the share!
Same way one can automate the backups for MySQL — using mysqldump.
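For reference, a minimal sketch of that MySQL analogue, reusing the placeholders from the article's script (mysqldump streamed through gzip, then uploaded with s3cmd):

```shell
# MySQL version of the backup step; DB_USER, DB_PASS, DB_NAME and
# BUCKET_NAME are the same placeholders as in the Postgres script.
TIMESTAMP=$(date +%F_%T | tr ':' '-')
mysqldump -u "$DB_USER" -p"$DB_PASS" "$DB_NAME" | gzip > "backup-$TIMESTAMP.sql.gz"
s3cmd put "backup-$TIMESTAMP.sql.gz" "s3://$BUCKET_NAME/" --encrypt
```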

Max Borysov

Yep, I do exactly the same :) I have a bash script with mysqldump and then upload to S3.

Utkarsh Kumar Raut

Our Complete Automation Template

  1. Cron runs every 12th hour (crontab)
  2. Run a script (mysqldump) to make the backup
  3. Trigger a script (PHP) to sync the backup with Google Cloud Storage
  4. If all went well, we trigger another script (PHP) to send a Slack notification to our dev team
Max Borysov

Hi, thank you for the article.

Are there any benefits of using s3cmd over the standard aws-cli utility, which has the aws s3 cp command?
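For comparison, the upload step with aws-cli might look like the following. Note the semantics differ: s3cmd's --encrypt encrypts the file client-side with GPG before upload, while aws-cli's --sse flag requests server-side encryption in S3:

```shell
# aws-cli equivalent of the upload step (server-side encryption,
# not client-side GPG encryption like s3cmd --encrypt).
aws s3 cp backup.dump s3://YOUR_BUCKET_NAME/ --sse AES256
```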

slidenerd

Can you add a post on how to back up ElastiCache Redis using the same technique?

Akhil Naidu

I installed Postgres through CapRover (I have a Docker-based Postgres), so what do you suggest?