Here's the issue: I'm always messing up my server, but I'm now running more production services on it than on my other server, so I need to figure out how to avoid losing all of this blog data if something catastrophic happens.

Let's get started.

Step 1: Figure out what you need to back up.

In my case, I would like to back up both the text posts and the other data Ghost normally exports (through Ghost's interface), as well as photos and videos that I may upload in the future. This guide will cover backing up both to Google Drive.

Take 1 (DON'T (just) do this): Backing up the root Ghost directory

Boom! Easy solution, just gotta copy one directory, zip it up, and send it to Google! Boom boom boom done.

No. Don't just do this. When Ghost runs in production mode, the posts themselves live in a MySQL database, so copying the directory alone will miss all of that blog data. Let's figure out how to fix it.
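If you want to confirm this for your own install, the database Ghost writes to is named in its production config (a quick check, assuming the default install path of /var/www/ghost):

# The "database" block names the MySQL database that holds your posts.
grep -A 6 '"database"' /var/www/ghost/config.production.json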

Take 2: Save the SQL database AND Ghost content directory

"Two steps" here, just saving the SQL database and just copying the ./content/ folder in Ghost.

Step 1: Create a backup MySQL user

This is good practice just so you can backtrack things if needed. It also keeps the user's permissions scoped to exactly what the backup job touches, the ghost_prod database: no more, no less.

mysql> CREATE USER 'backup'@'localhost' IDENTIFIED BY '###';
mysql> GRANT ALL ON ghost_prod.* TO 'backup'@'localhost';
mysql> FLUSH PRIVILEGES;
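Before wiring this user into anything, it's worth a quick sanity check (the password prompt wants whatever you put in place of '###'):

mysql -u backup -p -e "SHOW TABLES;" ghost_prod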

Step 2: Create ~/.my.cnf file

Modify the file ~/.my.cnf to include the following, then lock it down with chmod 600 so only your user can read the password.

[client]
user=backup
password="###"

chmod 600 ~/.my.cnf
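With that file in place, both the mysql and mysqldump clients read the credentials automatically, so none of the commands below need -u or -p. A quick way to confirm (this should print 1 without prompting for a password):

mysql -e "SELECT 1;"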

Step 3: Create backup scripts

To back up the MySQL database, we will use mysqldump and save the output as a gzipped file stamped with the date.

Let's build up this command.

mysqldump ghost_prod > ghost_prod.sql

The above command would technically work, but the output is uncompressed plain SQL, so for a large blog the file will be large too. Let's figure out how to date and compress the file.

mysqldump ghost_prod | gzip > ghost_prod-$(date +%Y%m%d).sql.gz

That gives us a compressed dump with the date in the file name. Now that we have the database backup set up, let's test it (or at least I will, to double-check my work).

Step 3.5 (optional): Test backup commands

  1. Run the above mysqldump command
  2. Make a Ghost backup through the UI, just in case
  3. Drop the ghost_prod database: mysql> DROP DATABASE ghost_prod;
  4. Exit MySQL
  5. Recreate the database: mysql -e "CREATE DATABASE ghost_prod;"
  6. Decompress the backup: gunzip ghost_prod-(date).sql.gz
  7. Restore the database: mysql --one-database ghost_prod < ghost_prod-(date).sql
  8. Success (hopefully); the full sequence is sketched below
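For reference, here is the whole round trip as one block (a sketch of the same steps, assuming the credentials in ~/.my.cnf and a dump made today):

# DANGER: this drops the live database; only run it once the dump exists.
mysql -e "DROP DATABASE ghost_prod;"
mysql -e "CREATE DATABASE ghost_prod;"
gunzip ghost_prod-$(date +%Y%m%d).sql.gz
mysql --one-database ghost_prod < ghost_prod-$(date +%Y%m%d).sql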

Step 4: Compress content

tar -zcvf ./content-$(date +%Y%m%d).tar.gz /var/www/ghost/content/
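One caveat: when handed an absolute path, tar strips the leading / and prints a warning about it. If you would rather the archive store clean relative paths (which makes restoring to a different location easier), a -C variant is an alternative sketch that does the same job:

# Change into /var/www/ghost first, then archive just the content folder.
tar -zcvf ./content-$(date +%Y%m%d).tar.gz -C /var/www/ghost content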

Step 5: Putting it all together in a script

Notice that in the script I moved the date out of the file names and into the folder path: `Backups/ghost/(date)/<files>`

#!/bin/bash
set -euo pipefail  # stop immediately if any step (or piped command) fails

now=$(date +'%Y-%m-%d_%H-%M')
echo "Making backup folder for $now"
mkdir -p "/home/kenton/Backups/ghost/$now"

echo "Saving ghost_prod Database Backup $now"
mysqldump ghost_prod | gzip > "/home/kenton/Backups/ghost/$now/ghost_prod.sql.gz"

echo "Compressing content folder"
tar -zcvf "/home/kenton/Backups/ghost/$now/content.tar.gz" --absolute-names /var/www/ghost/content/ > /dev/null
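At one backup per day, local copies pile up quickly. If you want to cap disk usage, an optional pruning line at the end of the script works; the 14-day window here is an arbitrary choice, so adjust to taste:

# Delete local backup folders older than 14 days (Drive keeps its own copies).
find /home/kenton/Backups/ghost -mindepth 1 -maxdepth 1 -type d -mtime +14 -exec rm -rf {} +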

Step 6: Keep backups in Google Drive

  1. Install rclone: sudo apt-get install rclone
  2. Set up rclone with rclone config (I set the remote name to google-drive)
  3. Extend the script with the rclone upload below (make sure to change (USER) to your user):
echo "Sending to Drive"
/usr/bin/rclone copy --update --verbose --transfers 30 --checkers 8 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --stats 1s "/home/(USER)/Backups/ghost/$now/" "google-drive:ServerBackups/ghost/$now/"
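After the first run, you can confirm the upload landed using rclone's listing commands with whatever remote name you chose during rclone config:

# List the dated backup folders that made it to Drive.
rclone lsd google-drive:ServerBackups/ghost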

Step 7: Cronjob

  1. Open your crontab with crontab -e
  2. For this example, we will back up the data every day at midnight (make sure the script is executable first with chmod +x): 0 0 * * * /home/(USER)/Backups/backup.sh
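Cron swallows the script's output by default, so a failed run can go unnoticed. One optional tweak (the log path here is just an assumption) is to append everything to a log file instead:

0 0 * * * /home/(USER)/Backups/backup.sh >> /home/(USER)/Backups/backup.log 2>&1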

Step 8: Congrats! You're now fully backed up!

While I take no liability for any lost data, I'm fairly confident you won't lose any as long as the backups keep running. This solution keeps multiple copies of your data, in Google Drive and on the local machine, so you should be good if you ever need to reset.