Hi All

I have a self-hosted Gitea server, and I use it for a lot more than coding. I even manage document history for my business with it. I love it.

What I would like to do is use it to back up specific folders on other servers in my homelab.

Take my webdev test server, for example: I would like a daily backup of /etc/, /var/www/one.example.com/, /var/www/two.example.com/, and so on.

Now, my knowledge of Gitea, and Git as a whole, is pretty much limited to clone, add, commit, push and pull.

If I set up a user for the server and add its SSH public key, I would like to know how, from the terminal (via SSH to the server), I can create a new repo for the folder /var/www/one.example.com/ and do an initial commit, so that the .git folder is created locally at /var/www/one.example.com/.git/.
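
Is it essentially something like this? (A rough sketch of my guess; the hostname, the backup account and the repo name are placeholders.)

    # one-time setup on the web server (sketch; names are placeholders)
    cd /var/www/one.example.com/
    git init
    git add -A
    git commit -m "Initial snapshot"
    git remote add origin git@gitea.example.com:backup/one.example.com.git
    # the repo must already exist on Gitea, unless push-to-create is
    # enabled ([repository] ENABLE_PUSH_CREATE_USER = true in app.ini)
    git push -u origin master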

Then I can set up a cronjob to do my daily backups, but still have the magic of full file history.
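
I am picturing the daily job as something like this (again just a sketch; the script path and schedule are whatever suits):

    #!/bin/sh
    # /usr/local/bin/git-snapshot.sh (hypothetical path)
    cd /var/www/one.example.com/ || exit 1
    git add -A
    # git commit exits non-zero when nothing changed; treat that as OK
    git commit -m "Automated snapshot $(date +%F)" || exit 0
    git push origin master

With a crontab entry such as:

    15 3 * * * /usr/local/bin/git-snapshot.sh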

Also, can you configure a repo to only keep changes going back, say, 90 days? (Space saving in the long run.)

I know there are a lot of ways to do this, but I have a very good reason for using Git: mainly, it streamlines restoring files from any point in history, and if I need to fork a website I am developing, I can do it in Git with ease.

Plus it allows me to add other users to a repo, for example, and lets us work with branches, etc.

Currently I am backing everything up using a script I wrote, and I have a dedicated bare-metal machine handling that. I keep a .tar.gz for each of the last 7 days, the last 5 Sundays, and the 1st of the last 3 months, but this is starting to take up a lot of hard drive space.
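
Simplified, the rotation looks like this (paths are illustrative, and pruning of expired archives is omitted):

    #!/bin/sh
    # nightly tarball with weekly/monthly copies (simplified sketch)
    DATE=$(date +%F)
    tar -czf "/backups/daily/site-$DATE.tar.gz" /var/www
    # extra copy on Sundays and on the 1st of the month
    [ "$(date +%u)" = 7 ]  && cp "/backups/daily/site-$DATE.tar.gz" /backups/weekly/
    [ "$(date +%d)" = 01 ] && cp "/backups/daily/site-$DATE.tar.gz" /backups/monthly/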

Any advice would kindly be appreciated.

  • AudioOmen@alien.topB

    Hope you are not doing that for binary files; it can become huge really fast. It is also not a good option for text files whose structure or content changes all the time. It’s better to use backup solutions for backups, and version control systems for their purpose.

  • geek_at@alien.topB

    You can use Gitea Actions to run any command on any server you like: starting backups, running git pull or push… endless possibilities.
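
    A workflow could look roughly like this (a minimal sketch; it assumes a recent Gitea with scheduled workflows and a runner labelled backup-host registered on the target server):

        # .gitea/workflows/backup.yml (sketch)
        name: nightly-backup
        on:
          schedule:
            - cron: '15 3 * * *'
        jobs:
          snapshot:
            runs-on: backup-host
            steps:
              - run: /usr/local/bin/backup.sh   # any command you need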

  • knaak@alien.topB

    I set up a git repo in AWS CodeCommit, and then I set up a script on a backup VM that has an external HDD mounted to it. A cron job pulls from Gitea and then pushes to CodeCommit via a second git remote.

    That gives me a backup of the git data on the external HDD and off-site via AWS, plus my Gitea itself is backed up twice a week to my TrueNAS.
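
    The mirroring step boils down to something like this (sketch; the codecommit remote name and the path are illustrative):

        cd /mnt/external/repos/mysite
        git pull origin master        # fetch the latest from Gitea
        git push codecommit master    # mirror to AWS CodeCommit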

  • lilolalu@alien.topB

    git-annex allows managing large files with git, without storing the file contents in git. It can sync, backup, and archive your data, offline and online. Checksums and encryption keep your data safe and secure. Bring the power and distributed nature of git to bear on your large files with git-annex.

    git-annex is designed for git users who love the command line. For everyone else, the git-annex assistant turns git-annex into an easy to use folder synchroniser.

    https://git-annex.branchable.com/
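
    The basic flow is something like this (a minimal sketch; remotes and sync details are in the docs above):

        cd /var/www/one.example.com
        git init
        git annex init "webserver"
        git annex add assets/       # large files become symlinks; content stays out of git
        git commit -m "Add assets via annex"
        git annex sync --content    # propagate content to configured remotes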

  • OSH@lemmy.ml

    You should consider looking into a proper backup solution like restic or borg, which do a great job at saving space through deduplication.
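
    With restic, for instance, your 7-daily/5-weekly/3-monthly rotation maps almost directly onto its retention flags (sketch; the repository path is illustrative):

        restic init --repo /srv/backups/restic
        restic -r /srv/backups/restic backup /etc /var/www
        restic -r /srv/backups/restic forget \
            --keep-daily 7 --keep-weekly 5 --keep-monthly 3 --prune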

    Depending on your workflow for the web content, maybe a git- or gitops-based workflow would be a better way of doing things: push the changes to git and have actions publish them to your website, the way some static site generators or GitHub/GitLab Pages do it.