Last night I was writing a script and it created a directory literally named “~” by accident. It being 3am, I ran rm -rf ~ without thinking and destroyed my home directory. Luckily some of the files were mounted in Docker containers which my user didn’t have permission to delete, so I was able to get back to an OK state, but I lost a bit of data.

I now realize I really should be making backups because shit happens. I self-host a PyPI repository and a Docker registry, both in containers, plus some game servers both in and out of containers. What would be the simplest tool to back up to Google Drive and easily restore from?

  • Shimitar@feddit.it · 3 days ago

    Restic or Borg. For restic I use the great Backrest web GUI.

    I mounted a USB drive on one of my OpenWrt access points and back up to that.

    Rclone or a FUSE mount can access Google Drive and serve as the backend for whichever backup tool you choose.

    Simplest backup ever: restic/Borg to a folder on the same PC. Not really recommendable, but indeed a good starting point.
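    For that local-folder starting point, a restic session looks roughly like this (repository path and backup target are placeholders, not anyone's actual setup):

```shell
# Create a repository in a local folder (prompts for a password).
restic -r /srv/restic-repo init

# Back up a directory into it.
restic -r /srv/restic-repo backup /home/me/data

# List snapshots, then restore the latest one somewhere safe.
restic -r /srv/restic-repo snapshots
restic -r /srv/restic-repo restore latest --target /tmp/restored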

    ZFS/btrfs snapshots seem like a complex solution to a simple problem. True is that appetite comes with eating, so maybe worthwhile eventually.

  • atzanteol@sh.itjust.works · 3 days ago

    rclone & restic work okay together to create backups in a Google Drive mount. There are “issues” with backing up to Google Drive since it doesn’t guarantee file names are unique (which is… a choice…), but it should be reliable enough.
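    Rather than going through a mount, restic can also talk to Google Drive directly via its rclone backend. A sketch, assuming a remote named gdrive has already been set up through rclone config:

```shell
# One-time: configure a Google Drive remote (interactive).
rclone config

# Initialise a restic repository on that remote...
restic -r rclone:gdrive:backups init

# ...then back up as usual; restic shells out to rclone for transport.
restic -r rclone:gdrive:backups backup /srv/services
```

    This sidesteps the mount entirely, since restic manages the rclone subprocess itself.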

  • PhilipTheBucket@ponder.cat · 3 days ago

    Borg borg borg

    You can combine it with a FUSE mount of Google Drive; I’m not sure if that works, but I don’t see why it wouldn’t.
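    A minimal Borg session along those lines, assuming (hypothetically) the Google Drive FUSE mount lives at /mnt/gdrive; as the comment says, this combination is untested against Drive:

```shell
# Create an encrypted repository on the mounted drive.
borg init --encryption=repokey /mnt/gdrive/borg-repo

# Create an archive named after the current timestamp.
borg create /mnt/gdrive/borg-repo::backup-{now} /home/me/data

# List archives, then extract one into the current directory.
borg list /mnt/gdrive/borg-repo
borg extract /mnt/gdrive/borg-repo::backup-2024-01-01T03:00:00
```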

  • Atherel@lemmy.dbzer0.com · 3 days ago

    I use Duplicacy. It’s free as a CLI and pretty cheap if you want to manage backups via the GUI. Restoring via the GUI is always free, and I’d recommend it because it’s much easier to navigate the backups when you want to restore single files or folders.
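    For reference, the Duplicacy CLI flow with its Google Drive storage backend (gcd://) looks roughly like this; the repository id and folder name are placeholders:

```shell
cd /srv/services            # the directory you want backed up

# Tie this directory to a storage backend (gcd:// is Google Drive).
duplicacy init my-backups gcd://backups

# Run a backup, then list revisions.
duplicacy backup
duplicacy list

# Restore revision 1 in place (the GUI is nicer for browsing single files).
duplicacy restore -r 1
```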

  • schizo@forum.uncomfortable.business · 3 days ago

    I just uh, wrote a bash script that does it.

    It dumps databases as needed, and then makes a single tarball of each service — or a couple, depending on what’s needed to ensure a full backup of the data.

    Once all the services are backed up, I just push all the data to an S3 bucket, but you could use rclone or whatever instead.

    It’s not some fancy cool toy the kids these days love, like any of the dozens of other backup options, but I’m a fan of simple, and well, a couple of tarballs in an S3 bucket is about as simple as it gets: restoring doesn’t require any tools or configuration or anything, just grab the tarballs you need, unarchive them, done.

    I also use a couple of tools for monitoring the progress, plus a separate script that does a full restore to make sure shit works, but that’s mostly just doing the tarball-and-upload steps in reverse.
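    A minimal sketch of that kind of script, with hypothetical paths and bucket name (the database-dump step is only indicated in a comment, since it depends entirely on the service):

```shell
#!/usr/bin/env bash
# Sketch: one timestamped tarball per service directory, optional S3 push.
set -euo pipefail

backup_services() {
    local services_dir="$1" backup_dir="$2" bucket="${3:-}"
    local stamp name svc
    mkdir -p "$backup_dir"
    stamp="$(date +%Y%m%d-%H%M%S)"

    # Dump databases first so each tarball is a consistent snapshot,
    # e.g.: pg_dump mydb > "$services_dir/myservice/db.sql"

    # One tarball per service directory.
    for svc in "$services_dir"/*/; do
        name="$(basename "$svc")"
        tar -czf "$backup_dir/${name}-${stamp}.tar.gz" -C "$services_dir" "$name"
    done

    # Push everything if a bucket was given and the AWS CLI is available;
    # swap in rclone (or anything else) here if preferred.
    if [ -n "$bucket" ] && command -v aws >/dev/null 2>&1; then
        aws s3 cp "$backup_dir" "$bucket/" --recursive
    fi
}
```

    Restoring is then just tar -xzf on whichever tarball you need.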