I bought a second-hand NUC to have a little more horsepower to run various services. I have it connected to my NAS, but almost all of the Docker volumes reside on the SSD in the NUC.

It would be nice to be able to back up those volumes to my NAS in case the NUC fails. I have Debian 12 running on it.

What are my options? Should I just back up my Docker volumes, or does it make more sense to back up the entire NUC? (I'm less tech-savvy than I might appear. Please be generous with your explanation, I still have a lot to learn.)

  • Ebrithil95@alien.topB · 8 months ago

I don't use volumes at all and instead use bind mounts, so I can just back up those folders.

  • ElevenNotes@alien.topB · 8 months ago

Use XFS as the file system and use `cp --reflink` when you copy the volumes; it creates an instant CoW copy no matter how big the folder is. You can then move, copy, or do whatever with that folder, or use a VM and simply back up the VM.
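As a sketch (the volume path is made up, and this assumes an XFS filesystem created with reflink support, which is the default on recent mkfs.xfs):

```shell
# Instant CoW copy of a volume directory on a reflink-capable filesystem.
# /srv/docker-volumes is a hypothetical path.
# --reflink=always errors out on filesystems without CoW support;
# --reflink=auto silently falls back to a regular (slow) copy instead.
cp -a --reflink=auto /srv/docker-volumes /srv/docker-volumes.snapshot

# The frozen snapshot can then be shipped to the NAS at leisure, e.g.:
# rsync -a /srv/docker-volumes.snapshot/ /mnt/nas/backups/docker/
```

The nice part is that the `cp` returns in a fraction of a second, so the window where files could change under you is tiny even without stopping containers.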

  • MundanePercentage674@alien.topB · 8 months ago

I use a ZFS mirror of 2x SSDs for the data and take snapshots. No need to stop or restart Docker, so zero downtime, and restoring data is super easy and fast: about a second and it's done.
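A rough sketch of that workflow, assuming a hypothetical pool `tank` with the Docker data on dataset `tank/docker` (the names and dates are examples only):

```shell
# Atomic, instant snapshot taken while containers keep running:
zfs snapshot tank/docker@nightly-$(date +%F)

# Rolling back to a snapshot is similarly near-instant:
# zfs rollback tank/docker@nightly-2024-01-01

# If the NAS also runs ZFS, the snapshot can be replicated to it:
# zfs send tank/docker@nightly-2024-01-01 | ssh nas zfs receive backup/docker
```

Because the snapshot is taken atomically at the filesystem level, you avoid the torn-file problem that plain file copies of a live system can hit.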

  • sevlonbhoi1@alien.topB · 8 months ago

I use restic. Stop the container, run the backup, start the container. You can script it to run every night or something.

  • CactusBoyScout@alien.topB · 8 months ago

    Wow people are recommending a lot of things I don’t do and now I’m worried I’m doing something wrong.

I just have a folder on my Ubuntu boot drive called Docker with all of the persistent data from my containers. And I just tell Duplicati to back up that folder to Backblaze. I don't stop the containers to do that. Am I doing something wrong?

    • Big-Finding2976@alien.topB · 8 months ago

Some people would say that you're doing something wrong by using Duplicati, because they've had problems restoring data and it's very slow. If you've never had to restore data before, you should test that to check that it works, and maybe switch to something else like Borg to be safe.

Also, backing up the folder without stopping the containers first might leave any backed-up databases corrupt, so if you're running anything that uses databases, you should stop those containers before backing up the folder.

    • NiftyLogic@alien.topB · 8 months ago

      IMHO not really.

There is a slight chance that DBs end up inconsistent when you back up hot DB files, but in a homelab with minimal load this is usually not an issue. Same for NFS.

      Just make sure you have older backups, too. Just in case the last backup was not good.

  • Minituff@alien.topB · 8 months ago

If you just need to back up Docker volumes, I recommend Nautical.

You can use it to back up to an NFS share if you need to go between machines.

  • BulMaster@alien.topB · 8 months ago

    You can create NFS Docker Volumes that are mounted through docker and reside on the NAS.
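For instance (the NAS address and export path are placeholders):

```shell
# Sketch: a named Docker volume backed directly by an NFS export on the NAS.
# 192.168.1.10 and /export/docker/appdata are hypothetical values.
docker volume create \
  --driver local \
  --opt type=nfs \
  --opt o=addr=192.168.1.10,rw \
  --opt device=:/export/docker/appdata \
  nas-appdata

# Containers then use it like any other volume:
# docker run -v nas-appdata:/data myimage
```

The trade-off is that the containers now depend on the NAS being up, and database workloads over NFS can be slow or flaky, so this suits config and media more than live DB files.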

  • christancho@alien.topB · 7 months ago

Nope, Docker volumes give me anxiety since I have no clue where the files are located. I always go with directory binding: I retain full control, master of my files, captain of my soul.

  • plastrd1@alien.topB · 7 months ago

Those of you backing up bind-mounted volumes: what user does your backup program run as? The data inside bind mounts can have fairly arbitrary user IDs depending on the container, including files owned by root.
Does your backup have to run as root in order to capture all files and retain permissions?
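One common answer is yes: run the backup as root and store numeric owner IDs, so files owned by container-internal users restore exactly. A sketch with tar (paths are examples only):

```shell
#!/bin/sh
# Sketch: archive bind-mount data as root, preserving exact numeric
# UIDs/GIDs so container-owned files restore correctly. Paths are made up.
SRC=/srv/docker-data
OUT=/mnt/nas/backups/docker-data.tar.gz

# --numeric-owner stores raw UID/GID instead of user names, which matters
# because IDs inside containers rarely map to named users on the host.
# -p preserves permissions on extraction.
tar --numeric-owner -czpf "$OUT" -C "$(dirname "$SRC")" "$(basename "$SRC")"

# Restore (again as root) with:
# tar --numeric-owner -xzpf "$OUT" -C /srv
```

Tools like restic and Borg also record numeric ownership when run as root, so the same principle applies there.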