- Google Cloud accidentally deleted UniSuper’s account and backups, causing a major data loss and downtime for the company.
- UniSuper was able to recover data from backups with a different provider after the incident.
- The incident highlighted the need for safeguards at cloud providers to prevent such catastrophic deletions from occurring.
As the saying goes: if you only have one backup, you have zero backups.
How the fuck does Google of all companies manage to accidentally delete that‽
Everything was tied to the subscription: they deleted the sub, and that automatically deleted all the backups.
Very stupid.
AWS has a holding period after account deletion during which nothing is actually deleted, only made inaccessible, and access can be regained without data loss.
Since first hearing about this I’ve been wondering how TF Google Cloud doesn’t have a similar SOP.
If this is the thing I heard about a few days ago, then Google had multiple backups at different sites, but they managed to delete all of them.
I guess they weren’t paying quite enough to have offline backups? I believe financial institutions can keep stuff stored in caves (think of the records of every mortgage a bank wants repaid - data loss isn’t an option).
My first job was in a Big Iron shop in the late ’80s, where I was in charge of backups. We kept three sets of backups on two different media: one on hand, one in a different part of the main building in a water- and fireproof safe, and one offsite. We had a major failure one day and had to do a restore.
Both in-house copies failed to restore; thankfully the offsite copy worked. We were in a panic. That taught me to keep all my important data in three sets. As the old saying goes: data loss is not an if question, but a when question. Also, remember that “the cloud” simply means someone else’s remote servers over which you have no control.
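The scheme above is basically what’s now called the 3-2-1 rule (three copies, two media, one offsite), plus the hard-won lesson that an untested backup is just a guess. A minimal sketch of both ideas, using temp directories as stand-ins so it runs anywhere (in a real setup the second copy would be different media and the third a remote host; all paths here are made up for illustration):

```shell
#!/bin/sh
# Sketch of a 3-2-1 style backup: three copies, two media, one offsite.
# Temp dirs are stand-ins; in practice copy 2 would live on different
# media (e.g. tape in the fireproof safe) and copy 3 on a remote host.
set -eu

SRC=$(mktemp -d)                 # stand-in for the data being protected
echo "member records" > "$SRC/ledger.txt"

COPY_ONSITE=$(mktemp -d)         # copy 1: on hand
COPY_SAFE=$(mktemp -d)           # copy 2: the fireproof safe
COPY_OFFSITE=$(mktemp -d)        # copy 3: offsite

for dest in "$COPY_ONSITE" "$COPY_SAFE" "$COPY_OFFSITE"; do
    cp -R "$SRC/." "$dest"       # copy the source tree into each set
done

# The other lesson from the thread: verify that each copy actually
# matches the source, rather than discovering a bad set mid-disaster.
for dest in "$COPY_ONSITE" "$COPY_SAFE" "$COPY_OFFSITE"; do
    diff -r "$SRC" "$dest"
done
echo "all three copies verified"
```

A periodic restore drill (restoring into a scratch directory and diffing against production) is what would have caught the two bad in-house sets before the failure, not after.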
And had you ever tested the restore process?