• aard@kyu.de · 27 days ago

    Nowadays it matters whether you use a compression algorithm that can utilize multiple cores for packing/unpacking larger data. For a multi-gigabyte archive that can be the difference between “I’ll grab a coffee until this is ready” and “I’ll go for lunch and hope it is done when I come back”.
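
    A rough sketch of the idea in Python (everything here is illustrative, using only the standard bz2 module): compress fixed-size chunks on separate cores and concatenate the results. Concatenated bzip2 streams are themselves a valid bzip2 file, which is what makes parallel packing possible at all.

    ```python
    import bz2
    from concurrent.futures import ProcessPoolExecutor

    CHUNK = 8 * 1024 * 1024  # 8 MiB per block -- an assumed, tunable size

    def parallel_compress(src_path: str, dst_path: str) -> None:
        """Compress independent chunks on all cores, then concatenate.

        Concatenated bzip2 streams form a valid bzip2 file, so the output
        is readable by any ordinary bzip2 decompressor.
        """
        with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
            chunks = iter(lambda: src.read(CHUNK), b"")
            with ProcessPoolExecutor() as pool:
                # map() submits every chunk up front, so this sketch holds
                # the whole file in memory -- fine for a demo only.
                for block in pool.map(bz2.compress, chunks):
                    dst.write(block)

    if __name__ == "__main__":
        parallel_compress("big.tar", "big.tar.bz2")  # invented file names
    ```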

      • aard@kyu.de · 26 days ago

        I personally prefer bzip2 - but it needs to be packed with pbzip2, not the regular bzip2, to generate archives that can be extracted on multiple cores. Not a good option if you have to think about Windows users, though.
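
        For what it’s worth, pbzip2’s output is just a series of concatenated bzip2 streams, so anything that understands multi-stream bzip2 can read it (plain bunzip2 included, just single-threaded). A quick sketch with Python’s standard bz2 module; the file name is made up:

        ```python
        import bz2

        # bz2.open transparently handles multi-stream files, which is
        # what pbzip2 produces.
        with bz2.open("archive.tar.bz2", "rb") as f:
            data = f.read()
        print(len(data), "bytes decompressed")
        ```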

        • quicksand@lemm.ee · 26 days ago

          Ah, I have to use Windows for work, and that’s the source of most of my compression needs. Thanks for the info though, I’ll look into this.

  • azimir@lemmy.ml · 27 days ago

    How about when people’s websites would list the sizes of linked images and files so you could estimate how long a given download would take? Basically anything 30 KB and above had a size warning attached.

    • kratoz29@lemm.ee · 27 days ago

      What the hell, how so?

      Now that I think about it, not much software comes in RAR nowadays.

      • Björn Tantau@swg-empire.de · 27 days ago

        Because it’s a garbage proprietary format that needs extra software on every OS. But for some inane reason it’s become the standard for piracy stuff. I think that’s the only reason it’s still alive.

        • shalafi@lemmy.world · 27 days ago

          Windows opens RAR files right out of the box. Just tested.

          And if you need a separate unzipper for whatever reason, 7-Zip opens all the things.

          • SquigglyEmpire@lemmy.world · 27 days ago

            Windows now handles 7z files natively too (at least as of the upcoming Windows 11 24H2 version). I’m glad they’ve at least added some legit new features for File Explorer.

        • AwkwardLookMonkeyPuppet@lemmy.world · 27 days ago

          It’s not garbage. It’s used in the pirate community and elsewhere because back in the day things were shared on the Usenet before they were shared anywhere else. There’s a limit for file size on the Usenet, so we needed to be able to break compressed files into multiple parts and have an easy way to put them back together when decompressing. WinZip did not have that functionality. You can thank WinRAR for powering the entire sharing scene for decades.

          When torrents were becoming popular, NO distributors shared on torrent. They shared on the Usenet. Then someone would take a Usenet share and post it to the torrent network. Torrents wouldn’t have had much success, or would have taken much longer to catch on, if it wasn’t for WinRAR and the Usenet.
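
          The split-and-rejoin part is conceptually simple. A hypothetical Python sketch of what multi-part archives do (part size and naming scheme are made up):

          ```python
          import glob

          PART_SIZE = 50 * 1024 * 1024  # 50 MB volumes; pick what the service allows

          def split(path: str) -> None:
              """Write path.001, path.002, ... fixed-size pieces, RAR-volume style."""
              with open(path, "rb") as src:
                  for n, chunk in enumerate(iter(lambda: src.read(PART_SIZE), b""), 1):
                      with open(f"{path}.{n:03d}", "wb") as part:
                          part.write(chunk)

          def join(path: str) -> None:
              """Concatenate the numbered parts back into the original file."""
              with open(path, "wb") as dst:
                  for name in sorted(glob.glob(f"{path}.[0-9][0-9][0-9]")):
                      with open(name, "rb") as part:
                          dst.write(part.read())
          ```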

          • BigDanishGuy@sh.itjust.works · 26 days ago

            > There’s a limit for file size on the Usenet

            No, there is no limit on the file size on usenet. There’s a limit on the individual article size, but larger files just require more articles.

            The reason files were split on usenet was completion and corruption, and probably also media size originally. Say you need to post a 700MB file to alt.binaries.erotica.grannies.diapers; you could just split those 700MB into 477867 articles of 1.5kB each, but if a single article is corrupted or dropped, nobody can get the file. If you instead split the 700MB into 35 files of 20MB each, and each 20MB file into 13654 articles, then a dropped article only corrupts a single file. Add to that that completion issues often occurred (or is it occurs? it’s been a long while since I got my Linux iso files from usenet) close to each other. So there might be a bunch of corruption in a single file, but everything else is fine. This was useful if your main provider was your ISP’s complimentary usenet server and you only got the rest from a pay-by-download service.

            About the media comment earlier, I can’t be sure. I wasn’t around in the early days, but I know that the 700MB file size for movies came from the capacity of CDs. Splitting files quite possibly stems from similar restrictions on some removable media.

            > You can thank WinRAR for powering the entire sharing scene for decades

            And the saints behind WinRAR for only bugging you to pay. TBH, the first time installing 7-Zip instead of WinRAR on a fresh Windows install felt a bit sad.

        • frezik@midwest.social · 27 days ago

          RAR has internal file checking and redundancy that allow it to recover from a degree of transmission errors. Some of the more clandestine means pirate teams use to transfer things aren’t totally reliable, so this is very important. BitTorrent uploaders tend to take the file exactly as they get it, so there you go.

          BitTorrent has more sophisticated ways of checking correctness than RAR, so the redundancy isn’t really necessary there. Repacking is just more effort than uploaders will bother with.
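
          That per-piece checking is easy to picture. A minimal sketch of BitTorrent-style verification (piece size and paths are assumptions; real clients read them from the .torrent metainfo):

          ```python
          import hashlib

          PIECE = 256 * 1024  # piece length; real torrents record this in the metainfo

          def piece_hashes(path: str) -> list[bytes]:
              """SHA-1 of each fixed-size piece, like the 'pieces' field of a .torrent."""
              with open(path, "rb") as f:
                  return [hashlib.sha1(p).digest()
                          for p in iter(lambda: f.read(PIECE), b"")]

          def bad_pieces(path: str, expected: list[bytes]) -> list[int]:
              """Indices of pieces whose hash no longer matches; only those
              need re-downloading."""
              return [i for i, (got, want) in enumerate(zip(piece_hashes(path), expected))
                      if got != want]
          ```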

  • ssm@lemmy.sdf.org · 26 days ago

    > Now we have so much bandwidth it doesn’t matter

    *Squints eyes*

    Now we just don’t care about even the slightest modicum of efficiency

  • db2@lemmy.world · 27 days ago

    For a file of a few hundred kilobytes, sure, the difference is like pocket change. For a larger one you’d choose the right tool for the job, though, especially for things like a split archive or a database.

    • Im_old@lemmy.world · 27 days ago

      Username checks out! Also, you’re absolutely right; just last month I was looking for the best compression algorithm/package to archive a 70 GB DB.

    • dustyData@lemmy.world · 27 days ago

      Because gzip and bz2 exist. 7z is almost always a plugin, addon, or extra application, while the first two work out of the box pretty much everywhere. It also depends on frequency of access, frequency of addition, size, type of data, etc. If you have an archive to which you frequently add new files, 7z is gonna start grating on you with the compression times. But it is OK if you are going to extract very frequently from an archive that will never change. Overall, gz and bz2 are the “good enough for every use case” formats.
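
      That “out of the box” point shows up even inside languages. Python, for instance, ships gzip and bz2 in its standard library, while 7z needs a third-party package. A small illustration (file names made up):

      ```python
      import bz2
      import gzip

      # gzip and bz2 come with Python itself; no extra install needed.
      with gzip.open("notes.txt.gz", "wt") as f:
          f.write("hello gzip")

      with bz2.open("notes.txt.bz2", "wt") as f:
          f.write("hello bzip2")

      # 7z support, by contrast, means pulling in a third-party package
      # such as py7zr: the "extra application" problem in miniature.
      ```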

        • JasonDJ@lemmy.zip · 26 days ago

          Windows having tar.gz support is great.

          I have scripts for generating log bundles on user computers and sending them to a share. tar.gz is great for compressing ~2.5GB of text to send over VPN, and then I can open the .tar.gz directly from the network drive with minimal additional delay when opening a 500MB text file inside.
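
          For anyone curious, the shape of such a script is roughly this (paths and the .log filter are invented; Python’s standard tarfile does the work):

          ```python
          import tarfile
          from pathlib import Path

          def bundle_logs(log_dir: str, out_path: str) -> None:
              """Pack every .log file under log_dir into one .tar.gz bundle."""
              with tarfile.open(out_path, "w:gz") as tar:
                  for log in Path(log_dir).rglob("*.log"):
                      tar.add(log, arcname=str(log.relative_to(log_dir)))

          def read_member(archive: str, name: str) -> str:
              """Read one member straight out of the archive, no full extraction."""
              with tarfile.open(archive, "r:gz") as tar:
                  member = tar.extractfile(name)  # assumes the member exists
                  return member.read().decode("utf-8", errors="replace")

          if __name__ == "__main__":
              bundle_logs(r"C:\ProgramData\MyApp\logs", "logs.tar.gz")  # invented paths
              print(read_member("logs.tar.gz", "app.log")[:200])
          ```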

    • Fonzie!@ttrpg.network · 21 days ago

      For archiving/backing up *NIX files, tar.whatever still wins, as it preserves permissions while 7z, zip, and rar don’t.

      Oh, and while 7z is FOSS and supported out of the box on most Linux desktop OSes and on macOS, Windows users will complain they need to install stuff to open your archive. Somehow, tar.gz is supported out of the box on Linux, macOS, and, yes, Windows 10 and 11!
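
      The permissions point is easy to verify: tar records each member’s Unix mode bits (plus owner and mtime) in its header, which is why they survive a round trip. A quick sketch with Python’s standard tarfile module (names are illustrative):

      ```python
      import tarfile

      # Pack a directory; mode, owner and mtime go into each member's header.
      with tarfile.open("backup.tar.gz", "w:gz") as tar:
          tar.add("project/")

      # Inspect what was preserved, e.g. 0o755 on scripts, 0o644 on plain files.
      with tarfile.open("backup.tar.gz", "r:gz") as tar:
          for member in tar.getmembers():
              print(f"{oct(member.mode)}  {member.uname or member.uid}  {member.name}")
      ```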

  • Swarfega@lemm.ee · 27 days ago

    In the early days of the internet, WinZip was a must-have tool. My college had a fast internet connection. I say fast, but I bet it was less than 1Mb shared between everyone. Still way faster than the 33k modem I had at home.

    I used my college connection to download so much, and then took it home on floppy disks. For files larger than 1MB I’d use WinZip to split them up.