I’ve been using Bash scripts with cron jobs to schedule daily backups, then syncing the backup to Backblaze B2 once the local backup completes.
However, it’s a bit messy and has a failure point: a locally corrupted repo could be uploaded over the remote one. It also doesn’t send emails on success or failure.
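The cron side is nothing special; a hypothetical crontab that runs the three scripts in sequence might look like this (times and install path are illustrative, not what I actually use):

```crontab
# Hypothetical schedule: back up at 01:00, verify at 03:00, sync at 05:00
0 1 * * * /opt/backups/backup.sh
0 3 * * * /opt/backups/validate.sh
0 5 * * * /opt/backups/sync.sh
```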
Here’s mine:
shared.sh
```bash
#!/bin/bash

# Sources to be backed up
SOURCES=(
    "/srv/dev-disk-by-uuid-a57d5696-00fk-4d1c-9885-095ad5cf71ba/Stuff/"
)

# Restic configuration
RESTIC_CONFIG_LOCATION="/srv/dev-disk-by-uuid-a77de696-04f9-4d1c-9875-055advcf71ba/Stuff/Projects/Setup/Backups/restic-config"
RESTIC_REPO="/srv/dev-disk-by-uuid-25EE65533BB23C37/restic/repo"
RESTIC_PASSWORD_FILE="${RESTIC_CONFIG_LOCATION}/config/password.txt"
RESTIC_EXCLUDE_FILE="${RESTIC_CONFIG_LOCATION}/config/excludes.txt"
RESTIC_LOG_LOCATION="${RESTIC_CONFIG_LOCATION}/logs"
RESTIC_LOCK_LOCATION="${RESTIC_REPO}/locks"
RESTIC_MOUNT_LOCATION="/srv/dev-disk-by-uuid-28EE65538BB23J37/restic/mount"

# RClone configuration
RCLONE_CONFIG="${RESTIC_CONFIG_LOCATION}/config/rclone.conf"
RCLONE_REPO="bucket:restic-backup"

# Removes stale locks from the repository, if any are present
function unlock-repo() {
    if [[ $(ls -A "${RESTIC_LOCK_LOCATION}") ]]; then
        restic -r "${RESTIC_REPO}" unlock --password-file="${RESTIC_PASSWORD_FILE}"
    fi
}
```
backup.sh
```bash
#!/usr/bin/env bash

# Get the directory this script is in, then source shared.sh for the
# shared variables and functions
BASEDIR=$(dirname "$0")
source "${BASEDIR}/shared.sh"

RESTIC_LOG_FILE="${RESTIC_LOG_LOCATION}/log-backup.txt"

# Backs up the sources to the local Restic repository
function backup() {
    # Get the latest saved snapshot ID to use as the parent
    local short_id
    short_id=$(restic -r "${RESTIC_REPO}" --password-file="${RESTIC_PASSWORD_FILE}" --json snapshots | jq -r 'max_by(.time) | .short_id')

    # Back up and log the results
    (
        echo
        echo
        date
        echo
        echo
        restic -r "${RESTIC_REPO}" backup "${SOURCES[@]}" \
            --exclude-file="${RESTIC_EXCLUDE_FILE}" \
            --password-file="${RESTIC_PASSWORD_FILE}" \
            --parent "${short_id}" \
            --compression max
    ) >>"${RESTIC_LOG_FILE}"
}

function main() {
    unlock-repo
    backup
}

main "$@"
```
validate.sh
```bash
#!/usr/bin/env bash

BASEDIR=$(dirname "$0")
source "${BASEDIR}/shared.sh"

RESTIC_LOG_FILE="${RESTIC_LOG_LOCATION}/log-validate.txt"

# Validates the local Restic repository
function validate-locally() {
    (
        echo
        echo
        date
        echo
        echo
        restic -r "${RESTIC_REPO}" check --read-data --password-file="${RESTIC_PASSWORD_FILE}"
    ) >>"${RESTIC_LOG_FILE}"
}

function main() {
    unlock-repo
    validate-locally
}

main "$@"
```
sync.sh
```bash
#!/usr/bin/env bash

BASEDIR=$(dirname "$0")
source "${BASEDIR}/shared.sh"

RESTIC_LOG_FILE="${RESTIC_LOG_LOCATION}/log-sync.txt"

# Syncs the local Restic repository to the cloud provider, under the
# 'Restic' folder
function cloud-sync() {
    (
        echo
        echo
        date
        echo
        echo
    ) >>"${RESTIC_LOG_FILE}"

    rclone -v sync "${RESTIC_REPO}" "${RCLONE_REPO}" --config="${RCLONE_CONFIG}" --log-file="${RESTIC_LOG_FILE}" --b2-hard-delete
    rclone cleanup "${RCLONE_REPO}" --config="${RCLONE_CONFIG}" --log-file="${RESTIC_LOG_FILE}"
}

function main() {
    cloud-sync
}

main "$@"
```
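One way to close the corrupted-repo gap would be to gate the sync on a successful check and notify either way. This is only a sketch: `sync_if_healthy` is a hypothetical helper, the `true`/`false` demo calls stand in for the real `restic check` and `rclone sync` commands from the scripts above, and the `echo` calls are placeholders for an actual email notification (e.g. `mail -s`):

```shell
#!/usr/bin/env bash

# Hypothetical gating helper: run the sync only if the repo check passes.
#   $1: command that validates the repo (e.g. the restic check from validate.sh)
#   $2: command that performs the sync (e.g. the rclone sync from sync.sh)
# The echo calls are placeholders for an email notification.
sync_if_healthy() {
    if $1; then
        if $2; then
            echo "sync OK"
        else
            echo "sync FAILED"
        fi
    else
        echo "check FAILED - sync skipped"
    fi
}

# Demo with true/false standing in for the real commands:
sync_if_healthy true true    # prints "sync OK"
sync_if_healthy false true   # prints "check FAILED - sync skipped"
```

The key point is that a failed `restic check` would skip the `rclone sync` entirely, so a corrupted local repo never reaches B2, and every run produces a notification rather than failing silently.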
Thanks for posting this. I’ve been meaning to write something similar for a while but never got around to it. I wish Restic would ship an official client application for managing this stuff!
You’re welcome! I’m not great with Bash scripting, though, and it’s quite messy; it could prove fatal if something goes wrong locally and the corruption syncs to the secondary backup.