It is unrealistic that in a stable software release there is suddenly, after you have tested your backup, a hard bug which prevents recovery.
How is it unrealistic? Think of this:
- day 1: you backup your files, test the backup and everything is fine
- day 2: you store a new file that triggers a bug in the compression/encryption algorithm of whatever software you use; now the backup is corrupted, at least for that file

Unless you test every single backup you make (and consequently can't back up fast enough), I don't see how you can predict that future files and situations won't trigger bugs in the software.
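To make the "test every backup" point concrete: the only test that actually catches this kind of corruption is restoring the backup and comparing the restored files against the originals, byte for byte. A minimal sketch in Python (the directory layout and function names are hypothetical, not from any particular backup tool):

```python
import hashlib
from pathlib import Path

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_restore(source_dir: Path, restored_dir: Path) -> list[str]:
    """Compare every file under source_dir with its restored copy.

    Returns the relative paths that are missing from the restore or
    whose contents differ -- i.e. the files the backup round-trip
    silently corrupted."""
    bad = []
    for src in source_dir.rglob("*"):
        if not src.is_file():
            continue
        rel = src.relative_to(source_dir)
        restored = restored_dir / rel
        if not restored.is_file() or file_hash(src) != file_hash(restored):
            bad.append(str(rel))
    return bad
```

Note this only proves the backup from day 1 was fine; the file added on day 2 is not covered until you run the restore-and-compare again, which is exactly the "can't back up fast enough" problem.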
The only good reply in the thread. Thanks for saying this