Posted: 4/5/2021 3:58:57 PM EDT
[Last Edit: 5/11/2021 10:34:22 AM EDT by Tangotag]
I'm looking at better automating the backup of my file server.

I've got 3 old PCs set up as Ubuntu Servers with Samba for Win machine access (LARRY - main file share machine, MOE - backup machine, CURLY - archive machine).
To give Windows users file access, Samba needs a user and password loaded in addition to a non-sudo, non-bash Linux user account set up on each Linux machine.

My current backup method is horrible for efficiency, and I know there is a much better way. To run backups, my Win machine runs a batch file nightly.

I currently run full backup of the share folder or directory every night from LARRY to MOE via the batch file run on a Win machine.

The last 6-10 daily full backups are kept on MOE in a backup directory.
From here on I SSH into my Linux machines.
I also make tarball archives on the 1st, 5th, 10th, 15th, 20th, 25th, and 30th of each month inside an archive directory on MOE.
After 3 months pass, I keep only the 1st-of-the-month backup; the rest of the tarballs from that month are deleted.
In addition I will rsync the MOE archive folder to CURLY every time a new tarball is created for a second tarball storage location. I also keep the last 2 most current tarballs on an Encrypted USB drive off-site.

I'd prefer to keep doing this full backup as opposed to incremental backup. Once this is done I will dabble with incremental options. Working with these full backup files I can use a Raspberry Pi and a USB drive to get the file structure back up and running in a short period of time in the event of catastrophic destruction to the Ubuntu Servers.


I've been using the same handful of Linux commands for a number of years to keep this running, but I know the majority of it can be run via cron, so I'd rarely have to remote into the machines apart from handling the encrypted USB drive copies.

Suggestions for running this in cron to automate?

Thanks T
Link Posted: 4/5/2021 4:23:29 PM EDT
I would look at configuring duplicity ( http://duplicity.nongnu.org/ ) to handle the backup rotation that you want and then have a cron job run it nightly.
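If you go the duplicity route, the cron side can be as small as two crontab entries. A rough sketch, not a tested setup: hostnames, paths, and the schedule below are placeholders, and it assumes key-based SSH auth to MOE is already in place.

```
# Nightly full backup of the share, pushed to MOE over SFTP at 02:00
0 2 * * * duplicity full /srv/share sftp://backup@moe//backups/larry
# Weekly cleanup: keep only the 6 most recent full backup chains
0 3 * * 0 duplicity remove-all-but-n-full 6 --force sftp://backup@moe//backups/larry
```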

HTH
Link Posted: 4/5/2021 4:25:55 PM EDT
I use Veeam's free Linux agent for stuff like that : https://www.veeam.com/linux-backup-free.html. It does machine-level backups (includes the OS, partition layout, configs, etc.) and stores them as differential backups on external hard drives or NFS/SMB shares.
Link Posted: 4/5/2021 7:13:10 PM EDT
Take the commands you run by hand now, and make a shell script out of them. Create a credentials file for your samba shares so you don't have to supply username and password on the commandline or in the shell script or as part of the options line in /etc/fstab. Test and debug your script, then create a cron job to execute that script when you want it run.
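To illustrate the credentials-file approach: the username, share, and mount point below are made up for the example, but the mechanism is standard cifs mounting.

```
# /root/.smbcredentials  (chmod 600 root:root so only root can read it)
username=backupuser
password=ChangeMe

# matching /etc/fstab line -- no credentials on the options line itself:
//larry/share  /mnt/larry  cifs  credentials=/root/.smbcredentials,uid=1000,gid=1000  0  0
```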
Link Posted: 4/9/2021 3:18:59 PM EDT
A little update. I made a bash script that takes my shared folders (drives) and compresses them into a single daily full-backup tarball using a "(date)_backup" naming format. A cron job on LARRY runs the script, which makes the backup and also keeps the last 6 full backups, deleting older files.

Next up will be the ssh keys for my rsync between LARRY, MOE, and CURLY maintaining spaced archival of the backups.

I ran into a setback moving scripts written in a Windows environment into the Unix environment. When I redid them in vi on the Linux machine, everything worked fine. The issue was something to do with Windows-to-Unix text formatting. I'll just write the rest of my scripts in vi in the future.

Thanks for the suggestions.
Link Posted: 4/10/2021 3:56:03 AM EDT
If you think Windows is harshing your ASCII files, try "dos2unix".
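For reference, either of these cleans up the Windows CRLF line endings in place (the filename is a placeholder; the sed line is a stock-tools fallback if dos2unix isn't installed):

```shell
# convert a script saved with Windows CRLF endings, in place:
dos2unix backup.sh

# same result with stock tools, if dos2unix isn't installed:
sed -i 's/\r$//' backup.sh
```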
Link Posted: 4/25/2021 10:30:27 PM EDT
[Last Edit: 4/25/2021 10:32:17 PM EDT by viktor]
I denote files as "long term", in which case they go to a FreeNAS box via SMB. Very little movement of these. TrueNAS, rather. https://www.truenas.com/

Short term files similar to what would be found in "my documents" are synced via syncthing hosted on that freenas box. https://syncthing.net

Might help, might not. Regardless of how files get moved to storage, they are protected with zfs and able to be restored if I have a "whoops".
Link Posted: 4/28/2021 3:46:32 PM EDT
[Last Edit: 5/11/2021 10:40:29 AM EDT by Tangotag]
I was able to get all of my requirements via scripts and cron jobs. Each of the 3 Ubuntu Servers can SSH, using shared keys, into the other 2 machines. This is so handy for quickly jumping onto another machine to check a log file or copy files across.

It took lots of man page reviewing, Googling, and testing in my down time. Rerouting my backups to run between the 2 fastest machines first increased the speed 9.6x.

I ended up writing backup, archive, and cleanup scripts. Maybe there is a better way, but through crontab I run the archive backup compression script on the 1st, 5th, 10th, 15th, 20th, 25th, and 30th of each month. So that's 7 scheduled entries in crontab for archival purposes.
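Incidentally, cron's day-of-month field accepts comma-separated lists, so those seven entries can collapse into a single line (the script path here is a placeholder):

```
# minute hour day-of-month month day-of-week command
0 3 1,5,10,15,20,25,30 * * /home/tango/bin/archive_backup.sh
```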

Cleanup script going back 3 months of compressed backups leaving only the first one of that month ended up like this:  

#!/bin/bash

# Store cleanup month (3 months back), e.g. 01-2021
CLEANMONTH=$(date -d "-3 month" +%m-%Y)

# Store cleanup directory
CLEANUP_DIR="/path/archive/$CLEANMONTH"

# Find and delete files other than the first-of-the-month backup
# (variables quoted so paths with spaces don't word-split or glob)
find "$CLEANUP_DIR" -type f -name '*_backup.tar.gz' ! -path "$CLEANUP_DIR/*-01-*_backup.tar.gz" -delete


Prior to this month, I'd done very little scripting other than the "hello world" kind of stuff. This was a great practical challenge.
The only thing I can't automate is my encrypted removable drive "Sneakernet" off-site backup that I do in the event of on-site catastrophic destruction.

Update
All backup activities are functioning automatically. My initial directory share copy runs 15.7x faster. Archival and cleanup scripts run within the same amount of time that it took before to just copy files across the network.
Link Posted: 4/28/2021 4:03:02 PM EDT
Originally Posted By 2ANut:
I use Veeam's free Linux agent for stuff like that : https://www.veeam.com/linux-backup-free.html. It does machine-level backups (includes the OS, partition layout, configs, etc.) and stores them as differential backups on external hard drives or NFS/SMB shares.

There are too many free software packages that do backups to keep track of from work. You ought to have an offsite backup as well.
Link Posted: 4/28/2021 9:16:50 PM EDT
Originally Posted By Tangotag:
Prior to this month, I'd done very little scripting other than the "hello world" kind of stuff. This was a great practical challenge.
The only thing I can't automate is my encrypted removable drive "Sneakernet" off-site backup that I do in the event of on-site catastrophic destruction.
http://duplicity.nongnu.org/ to a cloud based service depending on transfer sizes/bandwidth caps