For administrative, legal or political reasons, not everybody can run their applications on cloud infrastructure, and not everybody has suitable off-the-shelf software for every need that arises. When that happens, we system administrators need to get creative and write our own scripts.
The particular need I'll be addressing here is backups for applications and databases. In a lot of places, for lots of different reasons, you can't run applications in the cloud, but you can store your backups there or somewhere else. For that, I created a simple script that saves everything we need to run our application and makes it possible to restore it from the cloud exactly as it was.
In this example, we're going to back up a simple LAMP application that uses a database to save and list information, sending everything to AWS S3. The goal is to save the source code and the database automatically, in a way that can easily be retrieved, installed on a new server and run as if nothing had happened.
To make this script work, we'll need the following:
- Path(s) for the application files
- Database name(s), the database server IP(s) and their credentials
- The same credentials used across all of the database servers
- AWS CLI installed and configured
- AWS S3 destination bucket already created (a quick sanity check for these last two items is sketched right after this list)
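Before the first run, it's worth confirming those last two requirements. A minimal sanity check, where my-backup-bucket is a placeholder for your own bucket name:

#Configure AWS credentials interactively (skip if already done)
aws configure
#Create the destination bucket if it doesn't exist yet
aws s3 mb s3://my-backup-bucket
#Confirm the bucket is reachable
aws s3 ls s3://my-backup-bucket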
Note: Take into consideration that you'll probably need to update the source code after a restore, because the database server IP and/or credentials may change. This script is designed to be run on the server that hosts the applications.
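Since that usually means editing the application's connection settings by hand, a one-liner can speed it up. This is just a sketch; config.php and both IPs are placeholders for wherever your application stores its database host:

#Point the restored app at the new database server
sed -i 's/192.168.1.10/10.0.0.20/g' /var/www/html/config.php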
To create the backup, we'll use the following script:
#!/bin/bash
#Initializing our variables
#db_username, db_passwd and aws_bucket are single values, not arrays
paths=()
ips=()
databases=()
db_username=""
db_passwd=""
aws_bucket=""
#Getting our parameters and creating our lists
while [[ $# -gt 0 ]]; do
  case "$1" in
    -p)
      paths+=("$2")
      shift 2;;
    -i)
      ips+=("$2")
      shift 2;;
    -n)
      databases+=("$2")
      shift 2;;
    -u)
      db_username="$2"
      shift 2;;
    -w)
      db_passwd="$2"
      shift 2;;
    -b)
      aws_bucket="$2"
      shift 2;;
    *)
      echo "Unknown flag: $1"
      exit 1;;
  esac
done
#Capture the date once so every file uses the same name, then create a directory to save the current backup
backup_dir="backup$(date +"%d%m%Y")"
mkdir -p "$backup_dir"
#Create a log file
touch "$backup_dir"/logs.txt
#Iterate through all paths and create a backup for each one of them
for path in "${paths[@]}"; do
  tar -cvpzf "$backup_dir/$(basename "$path").tar.gz" "$path"
  echo "Backup for $path saved as $(basename "$path").tar.gz" >> "$backup_dir"/logs.txt
done
#Iterate through all database server IPs, check if each database name exists there and create a backup
for ip in "${ips[@]}"; do
  for database in "${databases[@]}"; do
    if mysql -u "$db_username" -p"$db_passwd" -h "$ip" -e "USE $database;" 2>/dev/null; then
      #The -h flag matters here: without it, mysqldump would dump from localhost instead of $ip
      mysqldump --no-tablespaces -u "$db_username" -p"$db_passwd" -h "$ip" "$database" > "$backup_dir/$database$ip.sql"
      echo "Database $database from $ip saved as $database$ip.sql" >> "$backup_dir"/logs.txt
    else
      echo "Database $database doesn't exist on server $ip" >> "$backup_dir"/logs.txt
    fi
  done
done
#Create a tar of the whole folder
tar -cvpzf "$backup_dir".tar.gz "$backup_dir"
#Upload the file to the AWS bucket
aws s3 cp "$backup_dir".tar.gz s3://"$aws_bucket"/
#Log the upload to stdout: the folder's logs.txt was already packed into the tar above
echo "File $backup_dir.tar.gz uploaded to $aws_bucket"
#Delete local folder and file
rm -rf "$backup_dir"
rm "$backup_dir".tar.gz
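For reference, a typical invocation, assuming the script was saved as backup.sh (every path, IP and credential below is a placeholder):

./backup.sh -p /var/www/html -i 192.168.1.10 -n appdb -u backupuser -w 'S3cr3t!' -b my-backup-bucket

The -p, -i and -n flags can each be repeated to back up several paths and check several database servers in a single run.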
In this example, we're basically saving all of the files under the different paths passed as parameters into tar.gz files, backing up each of the databases in the list, then creating a tar of everything and deleting the intermediate files. After that, the tar file is uploaded to an S3 bucket and deleted from disk to save space. We also log every path and the IP of the database server each database was backed up from, so you know where everything lived before restoring and can take action if needed.
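For the restore side, here's a rough sketch, assuming a backup named backup01012024.tar.gz built from /var/www/html with the placeholder values used above. Note that GNU tar strips the leading / from member names, so the application archive extracts relative to /:

#Fetch and unpack the backup
aws s3 cp s3://my-backup-bucket/backup01012024.tar.gz .
tar -xzf backup01012024.tar.gz
#Restore the application files (members are stored as var/www/html/...)
tar -xzpf backup01012024/html.tar.gz -C /
#Recreate and load the database on the new server
mysql -u backupuser -p'S3cr3t!' -h 10.0.0.20 -e "CREATE DATABASE IF NOT EXISTS appdb;"
mysql -u backupuser -p'S3cr3t!' -h 10.0.0.20 appdb < backup01012024/appdb192.168.1.10.sql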
This script can be executed as frequently as needed, taking into consideration transfer costs and the size of the files; a cron entry works well, as sketched below.
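As an example, a daily run at 2 a.m. (the script location, log file and every argument below are placeholders):

#crontab -e entry: run the backup every day at 02:00 and keep the output for troubleshooting
0 2 * * * /opt/scripts/backup.sh -p /var/www/html -i 192.168.1.10 -n appdb -u backupuser -w 'S3cr3t!' -b my-backup-bucket >> /var/log/app-backup.log 2>&1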
Alternatively, you can send your backups to a different destination, such as an FTP server; a sketch of that swap follows.
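To do that, the aws s3 cp line in the script could be replaced with a curl upload. This is only a sketch; ftp.example.com and the credentials are placeholders:

#Upload the finished tar to an FTP server instead of S3
curl -T "$backup_dir".tar.gz ftp://ftp.example.com/backups/ --user ftpuser:ftppassword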
You can find the usage details for this script in my GitHub repository.
I hope this helps someone back up applications and databases in a simple but effective way.