I was able to get all of my requirements via scripts and cron jobs. Each of the three Ubuntu servers can SSH into the other two using shared keys. This is so handy for quickly jumping onto another machine to check a log file or to copy files across.
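The key setup is roughly the following on each machine (the usernames, hostnames, and paths here are placeholders for illustration, not the real ones):

```
ssh-keygen -t ed25519 -N ''          # generate a key pair once per machine
ssh-copy-id admin@server2            # append the public key to server2's authorized_keys
ssh-copy-id admin@server3
ssh admin@server2 'tail -50 /var/log/syslog'     # passwordless log check
scp backup.tar.gz admin@server3:/path/archive/   # passwordless file copy
```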
It took a lot of man-page reading, Googling, and testing in my down time. Rerouting my backups to run between the two fastest machines first increased the speed 9.6x.
I ended up writing backup, archive, and cleanup scripts. Maybe there is a better way, but through crontab I run the archive backup compression script on the 1st, 5th, 10th, 15th, 20th, 25th, and 30th of each month. That works out to seven scheduled entries in crontab for archival purposes.
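Those seven entries could also be collapsed into one, since cron's day-of-month field accepts a comma-separated list. A sketch (the run time and script path are placeholders):

```
# m h dom                 mon dow  command
0   2 1,5,10,15,20,25,30 *   *    /path/to/archive_backup.sh
```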
The cleanup script, which goes back three months and deletes that month's compressed backups except the first one of the month, ended up like this:
#!/bin/bash
# Store the month to clean up (three months back, formatted MM-YYYY)
CLEANMONTH=$(date -d "-3 month" +%m-%Y)
# Store cleanup directory
CLEANUP_DIR="/path/archive/$CLEANMONTH"
# Delete everything except the first-of-the-month backup.
# Quote the variable so the path survives spaces; -name tests only the
# filename, so the unquoted -path glob is not needed.
find "$CLEANUP_DIR" -type f -name '*_backup.tar.gz' ! -name '*-01-*_backup.tar.gz' -delete
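Before trusting -delete, the expression can be dry-run with -print against throwaway files to confirm only the right ones match. A sketch, assuming MM-DD-YYYY filenames (the real naming scheme may differ):

```shell
#!/bin/bash
# Dry-run the cleanup match against temporary demo files.
set -e
DEMO_DIR=$(mktemp -d)
mkdir -p "$DEMO_DIR/03-2024"
touch "$DEMO_DIR/03-2024/03-01-2024_backup.tar.gz" \
      "$DEMO_DIR/03-2024/03-15-2024_backup.tar.gz" \
      "$DEMO_DIR/03-2024/03-30-2024_backup.tar.gz"
# Same find expression, with -print instead of -delete to preview matches
MATCHES=$(find "$DEMO_DIR/03-2024" -type f -name '*_backup.tar.gz' \
               ! -name '*-01-*_backup.tar.gz' -print)
echo "$MATCHES"
rm -rf "$DEMO_DIR"
```

Only the mid-month and end-of-month files are listed; the first-of-the-month backup is skipped, so it would survive the real -delete run.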
Prior to this month, I'd done very little scripting other than the "hello world" kind of stuff. This was a great practical challenge.
The only thing I can't automate is my encrypted removable-drive "sneakernet" off-site backup, which I keep in case of catastrophic on-site destruction.
Update: All backup activities are functioning automatically. My initial directory share copy runs 15.7x faster, and the archival and cleanup scripts complete in the same amount of time it used to take just to copy the files across the network.