Hello selfhosted! Sometimes I have to transfer big files or large amounts of small files in my homelab. I have used rsync, but specifying the IP address, the folders, and everything else is a bit fiddly. I thought about writing a bash script, but before I do that I wanted to ask you about your favourite way to achieve this. Maybe I am missing out on an awesome tool I wasn’t even thinking about.

Edit: I settled on SFTP in my GUI file manager for now. When I have some spare time I will look into the other options too. Thank you for the helpful information.

    • GamingChairModel@lemmy.world · 4 months ago

      Yeah, I mean I do still use rsync for the stuff that would take a long time, but for one-off file movement I just use a mounted network drive in the normal file browser, including on Windows and MacOS machines.

    • theorangeninja@sopuli.xyz (OP) · 4 months ago

Sounds very straightforward. Do you have a Samba docker container running on your server, or how do you do that?

      • drkt@lemmy.dbzer0.com · 4 months ago (edited)

        I just type sftp://[ip, domain or SSH alias] into my file manager and browse it as a regular folder

          • drkt@lemmy.dbzer0.com · 4 months ago (edited)

            Linux is truly extensible and it is the part I both love and struggle to explain the most.
I can sit at my desktop, developing code that physically resides on my server, and interact with it from my laptop. This does not require any strange, janky setup; it’s just SSH. It’s extensible.

            • blackbrook@mander.xyz · 4 months ago (edited)

I love this so much. When I first switched to Linux, being able to just list a bunch of server aliases along with the private key references in my .ssh/config made my life SO much easier than the redundantly maintained, hard-to-manage PuTTY and WinSCP configurations in Windows.
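A sketch of the kind of ~/.ssh/config entries being described (host names, addresses, and key paths are examples):

```
# ~/.ssh/config -- aliases plus key references, so `ssh nas` just works
Host nas
    HostName 192.168.1.50
    User me
    IdentityFile ~/.ssh/id_ed25519

Host seedbox
    HostName seedbox.example.com
    Port 2222
    User me
    IdentityFile ~/.ssh/seedbox_key
```

Every SSH-based tool (ssh, scp, sftp, rsync over SSH) then accepts the alias in place of the full user@host:port spelling.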

        • GamingChairModel@lemmy.world · 4 months ago

          Yeah, if OP has command line access through rsync then the server is already configured to allow remote access over NFS or SMB or SSH or FTP or whatever. Setting up a mounted folder through whatever file browser (including the default Windows Explorer in Windows or Finder in MacOS) over the same protocol should be trivial, and not require any additional server side configuration.

      • Kit@lemmy.blahaj.zone · 4 months ago

        I have two servers, one Mac and one Windows. For the Mac I just map directly to the smb share, for the Windows it’s a standard network share. My desktop runs Linux and connects to both with ease.

      • Lv_InSaNe_vL@lemmy.world · 4 months ago

I don’t have a docker container; I just have Samba running on the server itself.

I do have an ownCloud container running, which is mapped to a directory, and I have that shared out through Samba so I can access it through my file manager. But that’s unnecessary, because ownCloud is kind of trash.

  • e0qdk@reddthat.com · 4 months ago

People have already covered most of the tools I typically use, but one I haven’t seen listed yet that is sometimes convenient is python3 -m http.server, which runs a small web server sharing whatever is in the directory you launched it from. I’ve used it to download files onto my phone when I didn’t have the right USB cables/adapters handy, as well as to get data out of VMs when I didn’t want to bother setting up something more complex.
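A minimal sketch of that workflow (port 8000 is arbitrary):

```shell
# serve the current directory over HTTP; any device on the LAN can then
# browse to http://<this-machine's-IP>:8000/ and download files from it
python3 -m http.server 8000 --bind 0.0.0.0
# Ctrl-C stops sharing when the download is done
```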

    • GamingChairModel@lemmy.world · 4 months ago

      Honestly, this is an easy way to share files with non-technical people in the outside world, too. Just open up a port for that very specific purpose, send the link to your friend, watch the one file get downloaded, and then close the port and turn off the http server.

It’s technically not very secure, so it’s a bad idea to leave it unattended, but you can always encrypt a zip file before sending it and let that file-level encryption make up for the lack of network-level encryption. And since it’s a one-off thing, close up your firewall/port forwarding when you’re done.
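One way to do the file-level encryption mentioned above, sketched with openssl rather than zip’s weaker built-in encryption (filenames and the passphrase are placeholders):

```shell
# sender: bundle and encrypt before serving over plain HTTP
tar czf files.tar.gz ./docs
openssl enc -aes-256-cbc -pbkdf2 -salt -in files.tar.gz -out files.tar.gz.enc -pass pass:changeme

# recipient: decrypt and unpack
openssl enc -d -aes-256-cbc -pbkdf2 -in files.tar.gz.enc -out files.tar.gz -pass pass:changeme
tar xzf files.tar.gz
```

Share the passphrase over a different channel than the download link.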

  • boreengreen@lemm.ee · 4 months ago (edited)

rsync is indeed fiddly. Consider SFTP in the GUI file manager of your choice. I mount the folder I need in my file browser and grab the files I need. No terminal needed, and I can pin the folders as favourites in the sidebar.

    • Lv_InSaNe_vL@lemmy.world · 4 months ago

If you want to use the terminal, though, there is scp, which is supported on both Windows and Linux.

It’s just scp [file to copy] [username]@[server IP]:[remote location]
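A few concrete variants (host, user, and paths are made up); -r recurses into directories and -P selects a non-default SSH port:

```shell
# single file to a remote directory
scp report.pdf alice@192.168.1.10:/home/alice/inbox/
# whole directory, recursively
scp -r ./photos alice@192.168.1.10:/mnt/storage/photos/
# pull from the remote, over a non-standard port
scp -P 2222 alice@192.168.1.10:/var/log/syslog ./syslog-copy
```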

  • Xanza@lemm.ee · 4 months ago

rclone. I have a few helper functions:

    fn mount { rclone mount http: X: --network-mode }
    fn kdrama {|x| rclone --multi-thread-streams=8 --checkers=2 --transfers=2 --ignore-existing --progress copy http:$x nas:Media/KDrama/$x --filter-from ~/.config/filter.txt }
    fn tv {|x| rclone --multi-thread-streams=8 --checkers=2 --transfers=2 --ignore-existing --progress copy http:$x nas:Media/TV/$x --filter-from ~/.config/filter.txt }
    fn downloads {|x| rclone --multi-thread-streams=8 --checkers=2 --transfers=2 --ignore-existing --progress copy http:$x nas:Media/Downloads/$x --filter-from ~/.config/filter.txt }

So I download something to my seedbox, then use rclone lsd http: to get the exact name of the folder/files, and run tv "filename". It pulls all the files (based on filter.txt) using multiple threads to the correct folder on my NAS. Works great, and maxes out my connection.
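The functions above reference two rclone remotes, http: and nas:. A hypothetical ~/.config/rclone/rclone.conf defining remotes of that shape (the backend types, host, and URL are guesses for illustration, not the commenter’s actual config):

```
[http]
type = http
url = https://seedbox.example.com/files/

[nas]
type = smb
host = 192.168.1.20
user = me
```

rclone’s http backend is read-only (good for pulling from a seedbox’s web index), while the smb backend writes to a normal network share.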

  • magic_smoke@lemmy.blahaj.zone · 4 months ago (edited)

• sftp for quick shit like config files off a random server, because it’s easy and on by default with sshd in most distros
    • rsync for big one-time moves
    • smb for client-facing network shares
    • NFS for SAN usage (mostly storage for virtual machines)
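For the NFS case, a sketch of a server export plus a client fstab entry (paths and subnet are examples):

```
# server, /etc/exports (then run `exportfs -ra`):
/srv/vmstore  192.168.1.0/24(rw,sync,no_subtree_check)

# client, /etc/fstab:
192.168.1.5:/srv/vmstore  /mnt/vmstore  nfs  defaults  0  0
```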
  • melroy@kbin.melroy.org · 4 months ago

Yeah, I also use SFTP via FileZilla. Or, like everybody including yourself mentioned, rsync to sync files across computers. Or even scp.

  • neidu3@sh.itjust.works · 4 months ago (edited)

rsync if it’s a from/to I don’t need very often.

More common transfer locations are handled via NFS.

  • hendrik@palaver.p3x.de · 4 months ago (edited)

I’d say use something like zeroconf(?) for local computer names. Or give them names in your DNS forwarder (router), hosts file, or SSH config. Along with shell autocompletion, that might do the job. I use scp and rsync, and I have an NFS share on the NAS plus some bookmarks in GNOME’s file manager, so I just click on those or type scp or rsync with the target computer’s name.
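A hosts-file sketch of the naming idea (addresses and names are examples):

```
# /etc/hosts on the client
192.168.1.50  nas
192.168.1.60  pihole
```

Once the name resolves, every tool accepts it, e.g. rsync -av ./data/ nas:/mnt/storage/ or scp file nas:~/ instead of remembering IPs.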