Here's the issue: I'm always messing up my server, but now I'm running more production services on it than on my other server, so I need to figure out how not to lose all of this blog data if something catastrophic happens.
Let's get started.
In my case, I would like to back up both the text posts and the other data covered by Ghost's interface, as well as photos and videos that I may upload in the future. This guide covers backing up both to Google Drive.
Boom! Easy solution, just gotta copy one directory, zip it up, and send it to Google! Boom boom boom done.
No. Don't just do this. Copying the directory alone won't capture your posts, because in production mode Ghost stores them in a MySQL database, not in files. Let's figure out how to do it properly.
"Two steps" here, just saving the SQL database and just copying the ./content/ folder in Ghost.
First, create a dedicated MySQL user for backups. This is good practice so you can backtrack things if needed, and it lets you give the user only the permissions it requires, no more, no less.
mysql> CREATE USER 'backup'@'localhost' IDENTIFIED BY '###';
mysql> GRANT SELECT, LOCK TABLES ON ghost_prod.* TO 'backup'@'localhost';
mysql> FLUSH PRIVILEGES;
Modify the file ~/.my.cnf to include the following, then restrict its permissions with chmod 600.
[client]
user=backup
password="###"

# chmod 600 ~/.my.cnf
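If you want to sanity-check the permissions step, here's a quick sketch that uses a throwaway file as a stand-in for ~/.my.cnf, so it won't touch your real config:

```shell
# Stand-in for ~/.my.cnf; the real file lives in your home directory
cnf=$(mktemp)
printf '[client]\nuser=backup\npassword="###"\n' > "$cnf"
chmod 600 "$cnf"
stat -c '%a' "$cnf"   # prints 600 (GNU stat, as on Ubuntu)
```

With 600, only your user can read the credentials; MySQL clients will refuse to use a world-readable password file anyway.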
To back up the MySQL database, we will use mysqldump and save the output as a gzip file with the date in its name.
Let's build up the command step by step.
mysqldump ghost_prod > ghost_prod.sql
The above command would technically work, but the output isn't compressed, so for a large blog the file will be large too. Let's compress the output and date the filename.
mysqldump ghost_prod | gzip > ghost_prod-$(date +%Y%m%d).sql.gz
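As a quick sanity check that this pipeline round-trips cleanly, here's a sketch that uses a one-line SQL file as a stand-in for the real dump:

```shell
# Build the dated filename the same way the backup command does
fname="ghost_prod-$(date +%Y%m%d).sql.gz"
# Stand-in for mysqldump output; the real command pipes the full dump
printf 'CREATE TABLE posts (id INT);\n' | gzip > "$fname"
gunzip -c "$fname"   # prints the original SQL back out
```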
That takes care of compressing the dump and saving it with a date in the name. Now that the database backup is set up, let's test it (or at least I will, to double-check my work).
- Run the above command
- Make a Ghost backup through the UI
- Drop the ghost_prod database
"DROP DATABASE ghost_prod;"
- Exit MySQL
- Recreate the database using
mysql -e "CREATE DATABASE ghost_prod;"
- Decompress the backup
- Restore the database
mysql --one-database ghost_prod < ghost_prod-(date).sql
- Success (hopefully)
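Putting the restore steps together, here's a consolidated sketch. The dump filename is a made-up example, and the block creates a stand-in dump so the decompression step is demonstrable; the mysql lines are commented out since they need a live server:

```shell
DUMP=ghost_prod-20240101.sql.gz                             # hypothetical backup name
printf 'CREATE TABLE posts (id INT);\n' | gzip > "$DUMP"    # stand-in for a real dump
gunzip "$DUMP"                                              # leaves ghost_prod-20240101.sql
# mysql -e "CREATE DATABASE ghost_prod;"                      # recreate the empty database
# mysql --one-database ghost_prod < ghost_prod-20240101.sql   # load the dump back in
```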
Backing up the content folder is simpler, since tar handles both the archiving and the compression:

tar -zcvf ./content-$(date +%Y%m%d).tar.gz /var/www/ghost/content/
Notice that in the script I changed from having the date in the file name to having it in the folder name, so each run gets its own dated folder.
#!/bin/bash
now=$(date +'%Y-%m-%d_%H-%M')

echo "Making backup folder for $now"
mkdir "/home/kenton/Backups/ghost/$now"

echo "Saving ghost_prod Database Backup $now"
mysqldump ghost_prod | gzip > "/home/kenton/Backups/ghost/$now/ghost_prod.sql.gz"

echo "Compressing content folder"
tar -zcvf "/home/kenton/Backups/ghost/$now/content.tar.gz" --absolute-names /var/www/ghost/content/ > /dev/null
- Install rclone
sudo apt-get install rclone
- Setup rclone
rclone config (I set the remote name to google-drive)
- Change the script to include the rclone command (make sure to change (USER) to your user)
echo "Sending to Drive" /usr/bin/rclone copy --update --verbose --transfers 30 --checkers 8 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --stats 1s "/home/(USER)/Backups/ghost/$now/" "google-drive:ServerBackups/ghost/$now/"
- Add a cron job with crontab -e
- For this example, we will back up the data every day at midnight:
0 0 * * * /home/(USER)/Backups/backup.sh
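If the job ever fails overnight, it helps to have a log to look at. Here's a variant of the same crontab entry; the log path is my assumption, adjust to taste:

```shell
# Same daily schedule, but append stdout and stderr to a log file
0 0 * * * /home/(USER)/Backups/backup.sh >> /home/(USER)/Backups/backup.log 2>&1
```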
While I can't take any liability for lost data, you're in good shape as long as the backups keep running: this setup keeps multiple copies of your data across Google Drive and your local machine, so you should be able to recover if you ever need to reset.