Your Linux backup and disaster recovery solution

etcetera

I run some mission-critical stuff (to me, anyway), and I once (almost) had to deal with data loss, so I take data integrity and availability very seriously.

This is my backup and disaster recovery solution. It's not pretty but has worked for me for a number of years.

I run a box with several SSDs and periodically clone the primary OS disk to a secondary disk. I do a full disk clone with Clonezilla. I think it can handle Windows disks as well, IIRC. I used to use Macrium Reflect for Win10 boxes, but now I think I will fully move to Clonezilla.
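For reference, a raw block-for-block copy of the whole disk boils down to something like the following (device names are hypothetical, so check with lsblk first; Clonezilla does the same job but can skip unused blocks, and either way you run it from a live environment so nothing is mounted):

# /dev/sda = primary SSD, /dev/sdb = standby clone target (hypothetical names)
sudo dd if=/dev/sda of=/dev/sdb bs=64M status=progress conv=fsync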

It's kind of slow and expensive in the sense that you have to dump 2 TB of data once a month, and if you need something, it may not be there given the one-month time frame. However, the delay is intentional: if I pick up a virus or install something bad, the backup disk is not infected, and a month is plenty of time to figure it out. I can back out and then clone the other way, i.e. clone --> primary.

I also make a third clone, and that disk is taken out of the computer, because nothing prevents a system error or a virus from hosing both disks, primary and standby, even when the standby OS is not active/mounted. Both disks are still attached, and fdisk -l can see every disk in the computer.

An ideal disaster recovery solution involves taking a clone completely off site, so that if you have to deal with a natural disaster, a fire, or the like, the data is preserved in a different state/site. That takes care of data availability, but there is also the security aspect: I encrypt the off-site disks, so that if one falls into the wrong hands it is completely useless, a brick; it is just seen as a raw partition. I use old 3.5" HDDs for that purpose; they can hold data unpowered for a long time, and 2 TB ones are cheap.
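For what it's worth, the encryption step can be as simple as putting LUKS on the off-site disk before the first copy goes on. A minimal sketch (the device name /dev/sdc is hypothetical, and this wipes the disk):

sudo cryptsetup luksFormat /dev/sdc       # set a passphrase; an unopened disk now looks like raw data
sudo cryptsetup open /dev/sdc offsite     # unlock it as /dev/mapper/offsite
sudo mkfs.ext4 /dev/mapper/offsite        # the filesystem lives inside the encrypted container
sudo cryptsetup close offsite             # lock it again before pulling the disk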
 


I keep rotating backups of my important data, using the 3-2-1 method (three copies, on two different types of media, with one copy off-site).


I don't tend to back up the system files; that doesn't interest me. I test my backups fairly regularly, but not every time. I should do so every time.
 
@KGIII
Quote from your article:
AN UNTESTED BACKUP IS NOT A BACKUP!

I like that you mentioned this; it's as important as the 3-2-1 method, because without it 3-2-1 might as well be 2-1, or even worse!

A method I use myself to "test" a backup is to run the slowest possible file-system check on the disk soon after a new disk is formatted for use as a backup drive.
I used to format disks the quick way and then bulk-copy my backup data over, but after quite a while I figured out that some of the data was no longer readable, resulting in loss.

In other words, there is a big difference between:
1.) mkfs.ext4 (no bad-block check)
2.) mkfs.ext4 -c (fast, read-only bad-block test)
3.) mkfs.ext4 -cc (slower read-write bad-block test)

For a backup drive you want option 3. It will take hours (e.g. ~12 h for a 1 TB HDD), but it's a must for a backup drive, because the bad sectors get mapped out and the backup data ends up on good sectors only.
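Concretely, preparing a new backup drive then looks something like this (the partition /dev/sdb1 is hypothetical, so double-check the device before running anything):

sudo mkfs.ext4 -cc -L backup /dev/sdb1    # doubled -c = slow read-write bad-block test; expect many hours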
 
I like that you mentioned this; it's as important as the 3-2-1 method, because without it 3-2-1 might as well be 2-1, or even worse!

Yup. If you haven't tested to verify the backup works then it's not really a backup. Without testing, it's just a bunch of bits that might work.
 
Speaking of backups, here's a prompt to get folks motivated:

I'm retiring a bunch of old hard drives, since I currently don't have much use for 3.5" PATA 1 GB drives, and I'm opening them up to retrieve those really cool magnets from them. Here's a picture of one of them...

[Image: ST5108A_clean.jpg]



But here's a picture of another:

[Image: ST5108A_head-crashed.jpg]


That black shtuff used to be someone's data but now it's just dirt on my clothing, keyboard and hands.

These were both Seagate ST5108A drives - identical except for the head crash.
 
Cool. They are headed to where CRTs are now. What a nostalgic pic.
I wonder what uses can be gotten out of these platters.
 
Puppy backups are crazy simple.

Once a week, I copy the 'save-folders' from every Puppy in the 'kennels' to an external USB 3.0 HDD.....deleting the previous week's before doing so. The 'save' is the only part that needs backing-up; the system files look after themselves, since they load into RAM from read-only 'squash' files. These can't get corrupted.....and I have back-ups of these, even so.

ATM, I do this manually. One of these days, I will get around to automating all this via a script, which will be a one-click 'launcher' for the back-up operation; plug the external HDD in, fire up the script, let it run overnight, and set it to shut-down automatically when finished.....
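(Roughly, the script would only need to be something along these lines - the mount point and the save-folder names are made up here:)

#!/bin/sh
# One-click back-up launcher (sketch): wipe last week's copies, copy this week's, power off.
DEST=/mnt/backup_hdd                          # hypothetical mount point of the external USB HDD
rm -rf "$DEST"/puppy-saves
mkdir -p "$DEST"/puppy-saves
cp -a /mnt/home/*save* "$DEST"/puppy-saves/   # every Puppy save-folder in the 'kennels'
sync
poweroff                                      # Puppy runs as root, so this shuts the box down when finished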

One of these days (when I can kick my a**e into gear)..! :p


Mike. :D
 
Cool. They are headed to where CRTs are now. what a nostalgic pic.
I wonder what uses can be gotten out of these platters.
The platters are kind of pretty and I have -bunch- of them in a box somewhere - maybe some day I'll do something artistic with them. :)
 
Puppy backups are crazy simple.

Once a week, I copy the 'save-folders' from every Puppy in the 'kennels' to an external USB 3.0 HDD.....deleting the previous week's before doing so. The 'save' is the only part that needs backing-up; the system files look after themselves, since they load into RAM from read-only 'squash' files. These can't get corrupted.....and I have back-ups of these, even so.

ATM, I do this manually. One of these days, I will get around to automating all this via a script, which will be a one-click 'launcher' for the back-up operation; plug the external HDD in, fire up the script, let it run overnight, and set it to shut-down automatically when finished.....

One of these days (when I can kick my a**e into gear)..! :p


Mike. :D
Tiny Core backups are simple, too, which is a good thing since you make one at least before every reboot. That's not quite the same thing, though, since it's really just files getting backed up from the RAM disk to "persistent storage" (usually the boot device). That "backed up" data survives a reboot but not a media failure (nor an ID-ten-T issue, though it's hard to really protect against those), so I regularly copy the backup file (which is just a tarball) to external media.
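That copy is a one-liner; assuming the default mydata.tgz name and location on the boot device, and a made-up mount point for the external drive, it's just:

cp /mnt/sda1/tce/mydata.tgz /mnt/sdb1/tc-backups/mydata-$(date +%F).tgz   # dated copy on external media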

My own policy on backups, in addition to "before rebooting" (which sometimes doesn't happen for a year), is that every time I have a feeling of accomplishment or "I'm glad that's taken care of" - that's the time to make a backup.

I have a bunch of large, (mostly) static data (photos, music, etc etc) that's not part of the regular backup routine but gets backed up separately and much less often.

The OS itself, and the applications, never get "backed up" per se because it's so absurdly easy to reinstall and all the configs and tweaks are in the regular backup. While I do have copies of the OS and apps around - for instance, an installation on another machine - there's really no good reason not to just download it fresh when needed. That way, I pick up any application updates I may have missed since the previous install.
 
So little changes on my Linux systems that backups of them make little sense to me. The most that really changes is the OS itself. What makes the most sense to me is cloning the entire SSD, so if you screw up an install you can always go back easily. As far as I understand, incremental backups take care of user data, but not the stuff that lives in /usr, /var, etc.
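(For the curious, the kind of incremental user-data backup I mean looks roughly like this with rsync - the paths are made up, and each run hard-links unchanged files against the previous one, so only the changes take up space:)

rsync -a --delete --link-dest=/mnt/backup/latest /home/ /mnt/backup/$(date +%F)/
ln -sfn /mnt/backup/$(date +%F) /mnt/backup/latest    # point "latest" at the snapshot just made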
 
The OS itself, and the applications, never get "backed up" per se because it's so absurdly easy to reinstall and all the configs and tweaks are in the regular backup. While I do have copies of the OS and apps around - for instance, an installation on another machine - there's really no good reason not to just download it fresh when needed. That way, I pick up any application updates I may have missed since the previous install.
To be honest, I bother very little with the apps, either. Reason being that I don't "install" stuff at all these days; so much stuff I regularly use - including browsers, natch! - has been re-built into self-contained, "portable" format. At most, I 'link' launcher scripts and .desktop files into Puppy; I only install a handful of small, Puppy-specific utilities.......and these are so tiny they take up next to no space anyway.

I have around 200+ "portable" apps on a partition of my secondary internal 'data' drive. In this way, one application - along with its config files - can be shared between multiple Puppies; after all, where's the point in installing the same application into a dozen different distros and having to set it up every time?

You end up with a dozen copies of the same software, a dozen copies of the same configuration files.....talk about wasted space that could always be put to better use.

The portable format is, I feel, uniquely suited to Puppy's mode of operation. I know of many in our wee community that would agree with me.


Mike. ;)
 
I have used Timeshift, but these days I use BackupPC because it comes with my distro.
I care about my data more than I do about my apps. I use kickstart to build my PCs, so everything gets built and
installed how I want from the start. My "data" is always on a second disk, separate from the OS disk.
One big advantage of desktops over laptops is that I can have multiple internal drives.

I then back up my data to another drive, sometimes Blu-ray discs, sometimes USB drives (I have a whole drawer full of about 30).
I also have a high-speed USB-to-M.2 SATA adapter that handles SSDs, and three external SSDs that I use
as backup drives as well.
 
I use Timeshift exclusively, and yes, I am aware that it is not exactly a backup solution, although with my settings I use it effectively as one. It also covers everything I need from a restore and system-restore solution.

I also use it to move a distro from one drive to another, and one computer to another.

I have my personal data stored on another drive.
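It can also be driven from the command line, which is handy for scripting; roughly like this (verify the exact flags with timeshift --help on your version):

sudo timeshift --list                                   # show existing snapshots
sudo timeshift --create --comments "before an update"   # take a snapshot now
sudo timeshift --restore                                # pick a snapshot to roll back to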

I have installed Timeshift on each and every one of my 80+ distros, covering four (4) "families" - Debian/Ubuntu/Mint, RPM-based, Arch-based, and Gentoo-based.

I have used it since 2014 when it was authored, and probably closer to 15,000 times than 10,000 times.

I have had maybe 3 hiccups with it, and none that could not be remedied.

A 4th was as recently as October 2022, but that was not the fault of Timeshift; the problem lay with the upstream releasers of libglib.

One of the Members here mentioned it at Manjaro, and a Manjaro Forum Member notified Tony George (the author of Timeshift), who stepped up with a workaround.

Just my somewhat more than 2 cents.

Wizard
 
Do you mean a specific sub forum, or just one thread?

I will likely respond further tomorrow, as I am signing off shortly.

Your top link, my Timeshift thread, is in

Forums > linux.org Articles and Tutorials > Linux Original Content > Linux Articles > Linux Other

... which is an area that only Rob, Jarret and I have access to.

Mine has been there over 6 years and 4 months, and won't likely be moving any time soon.

But I do take your point.

If one takes the time to use our Search facility at top right with the keywords

backup recovery

the results reveal 9 pages with about 170 Threads covering related issues.

Cheers
 
Do you mean a specific sub forum, or just one thread?
Just one thread to discuss backup tools and solutions. Otherwise we get someone creating a new topic about backups every few months.
 
dnf info backuppc
Last metadata expiration check: 2:38:42 ago on Sat 27 Apr 2024 06:57:16 AM PDT.
Available Packages
Name : BackupPC
Version : 4.4.0
Release : 10.fc39
Architecture : x86_64
Size : 455 k

Fedora has it. Alma and Rocky also have it, but since I have the EPEL repos enabled, I don't know if it's in the vendor repos.
 
