Sun 13th June 2021 By David T. Sadler.
Below is a quick and dirty way in which I back up all the repositories hosted at git.davidtsadler.com.
#!/bin/sh
# Prefix each archive with today's date, e.g. 20210613-repositories.tar.gz.
DATE_PREFIX=$(date +%Y%m%d)
BACKUP_DIRECTORY=/tmp
BACKUP_FILE="${BACKUP_DIRECTORY}/${DATE_PREFIX}-repositories.tar.gz"
# Glob matching every dated archive, used when pruning old backups.
BACKUP_FILES="${BACKUP_DIRECTORY}/*-repositories.tar.gz"
# Bare repositories to back up (left unquoted below so the glob expands).
REPOSITORIES=/home/git/*.git
# Create a gzipped tarball containing all of the repositories.
tar -czf "$BACKUP_FILE" $REPOSITORIES
# Delete any archives older than three days.
find $BACKUP_FILES -mtime +3 -delete
exit 0
All it does is tar and gzip any .git directories found under /home/git. It also removes any backups that are more than three days old.
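Restoring is just a matter of extracting one of the archives. A minimal sketch, assuming a backup taken on 13th June and GNU tar's default behaviour of stripping the leading / from stored paths when the archive is created:

# Extract from the root directory so the repositories end up back
# under /home/git (the date in the filename is only an example).
tar -xzf /tmp/20210613-repositories.tar.gz -C /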
This script has been saved as /usr/bin/backup_repositories and is run daily via cron.
0 3 * * * /usr/bin/backup_repositories > /dev/null 2>&1
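For completeness, this is roughly how the script could be put in place and scheduled. The local filename backup_repositories.sh is just a placeholder for wherever you have saved the script above; the cron line is the one shown.

# Copy the script into place and make it executable.
sudo install -m 755 backup_repositories.sh /usr/bin/backup_repositories
# Append the entry to the current user's crontab.
(crontab -l 2>/dev/null; echo "0 3 * * * /usr/bin/backup_repositories > /dev/null 2>&1") | crontab -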
It is important to know that this backup strategy is far from ideal for repositories that are heavily used, as you run a high risk of backing up a repository while users are pushing to it. Since git updates a repository in two phases, this can lead to a backup that doesn't contain all the data and so won't be suitable for restoring. However, it's fine for my purposes since I'm the only user and it's unlikely that I'll be making changes while the backup is running.
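If consistency mattered more, one option would be to let git produce the snapshot itself rather than copying the files directly, since git reads the repository through its own object database. A rough sketch (not part of the script above) that writes one bundle per repository:

# Hypothetical alternative: a self-contained bundle of every ref in
# each repository, which can later be cloned or fetched from.
for repo in /home/git/*.git; do
  git -C "$repo" bundle create "/tmp/$(date +%Y%m%d)-$(basename "$repo").bundle" --all
done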
I don't have comments as I don't want to manage them. You can, however, contact me at the address below if you want to.
Email david@davidtsadler.com
Copyright © 2021 David T. Sadler.