Not sure why this doesn’t exist. I don’t need 12 TB of storage. When I had a Google account I never even crossed 15 GB. 1 TB should be plenty for myself and my family. I want to use NVMe since it is quieter and smaller. 2230 drives would be ideal. But I want 1 boot drive and 2x storage drives in RAID. I guess I could potentially just have 2x NVMe and have the boot partition in RAID also? Bonus points if I can use it as a wireless router also.

  • helenslunch@feddit.nlOP · 7 months ago

    You can get those machines second hand for the price range you specified.

    Maybe but they also presumably consume much more power?

    • TCB13@lemmy.world · 7 months ago

      Maybe but they also presumably consume much more power?

      If you pick one of those machines with a “T” CPU you won’t even notice them. They’ll downscale on idle to probably around the same power an N100 would. The real difference is that they’ll use more power if you demand more resources, but even at that point, do you really care about a few watts?

      Before anyone loses their minds, imagine you get the i3-8300T model that will peak at 25W, that’s about 0.375$ a month to run the thing assuming a constant 100% load that you’ll never have.

      Even the cheapest cloud service out there will be more expensive than running that unit at 100% load. People like to freak out about power consumption, yet it’s certainly not a small mini PC that ruins their power bill.

      • 486@kbin.social · 7 months ago

        Before anyone loses their minds, imagine you get the i3-8300T model that will peak at 25W, that’s about 0.375$ a month to run the thing assuming a constant 100% load that you’ll never have.

        Not sure how you came to that conclusion, but even in places with very cheap electricity, it does not even come close to your claimed $0.375 per month. At 25 W you would obviously consume about 18 kWh per month. Assuming $0.10/kWh you’d pay $1.80/month. In Europe you can easily pay $0.30/kWh, so you would already pay more than $5 per month or $60 per year.
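The arithmetic above is easy to verify; a minimal sketch, where the two rates are just the illustrative $0.10 and $0.30 figures from this comment, not real tariffs:

```python
# Monthly electricity cost for a device drawing constant power.
# 720 hours ~ one 30-day month; rates are illustrative assumptions.

def monthly_cost(watts: float, price_per_kwh: float, hours: float = 720.0) -> float:
    """Cost for `watts` sustained over `hours`."""
    kwh = watts * hours / 1000.0
    return kwh * price_per_kwh

print(round(monthly_cost(25, 0.10), 2))  # cheap electricity: 1.8
print(round(monthly_cost(25, 0.30), 2))  # European-style rate: 5.4
```

At 25 W that is 18 kWh/month, so $1.80/month at $0.10/kWh and $5.40/month at $0.30/kWh, matching the numbers above.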

          • 486@kbin.social · 7 months ago

            Well, what they are stating is obviously wrong then. No need to use some website for that anyway, since it is so easy to calculate yourself.

            • TCB13@lemmy.world · 7 months ago

              Okay, you’re onto something: they’re assuming 8 hours/day at $0.25 per kWh and 25% CPU load by default. Still, if we tweak those into reasonable numbers, or even take your $60/year figure, it will still be cheaper than a cloud service… and either way, those machines won’t run at 25 W on idle, more like 7 W.

              • 486@kbin.social · 7 months ago

                Sure, cloud services can get quite expensive, and I agree that using used hardware for self-hosting, if it is at least somewhat modern, is a viable option.

                I just wanted to make sure the actual cost is understood. I find it rather helpful to calculate this for the systems I use. Sometimes it can actually make sense to replace some old hardware with newer stuff, simply because of the electricity cost savings of using newer hardware.

                • TCB13@lemmy.world · 7 months ago

                  Sometimes it can actually make sense to replace some old hardware with newer stuff, simply because of the electricity cost savings of using newer hardware.

                  Yes, and we usually see that with very old server-grade hardware vs. new-ish consumer hardware. Once the price difference is around $100, we’re talking about years before break-even, and it may not make much sense.
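That break-even logic can be sketched like this; the wattages and the price are illustrative assumptions, not measurements of any specific hardware:

```python
# Years until a hardware upgrade pays for itself through electricity savings.
# All figures below are illustrative assumptions.

def break_even_years(price_diff: float, old_watts: float, new_watts: float,
                     price_per_kwh: float) -> float:
    """price_diff: extra cost of the newer hardware; watts: average draw."""
    saved_kwh_per_year = (old_watts - new_watts) * 8760 / 1000.0  # 8760 h/year
    return price_diff / (saved_kwh_per_year * price_per_kwh)

# e.g. a $100 price gap, old server at 60 W vs. mini PC at 10 W, $0.10/kWh:
print(round(break_even_years(100, 60, 10, 0.10), 1))  # ~2.3 years
```

With cheap electricity even a 50 W saving takes over two years to recover $100; at European rates the same gap closes in well under a year, which is why the local price per kWh dominates this decision.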

          • TCB13@lemmy.world · 7 months ago

          Yes, I’m in Europe, and with this Ukrainian/Russian mess, whatever you’re paying, I can assure you I’m paying more than most people reading this, and you don’t see me freaking out about a mini PC. Even if you multiply everything above by 4 (and that will certainly exceed whatever anyone is paying right now), you’ll still be talking about very little money compared to everything else running in your house.

            • rambos@lemm.ee · 7 months ago

            While I agree 25 W is not much, I pay around €1 per watt per year (Croatia), and I know there are countries that pay way more than that. Still, we are talking about power that is close to SBC consumption; you can’t go much lower. I think the number of devices (drives, etc.) is more important than the actual CPU idle power.
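The “€1 per watt per year” rule of thumb can be sanity-checked; the implied price per kWh below is an inference from the rule, not a quoted Croatian tariff:

```python
# Sanity-check the "1 W continuous ~ 1 EUR/year" rule of thumb.
HOURS_PER_YEAR = 24 * 365  # 8760

def yearly_kwh(watts: float) -> float:
    """Energy consumed per year by a constant draw of `watts`."""
    return watts * HOURS_PER_YEAR / 1000.0

# Price per kWh at which 1 W costs exactly 1 EUR/year (an inference, not a tariff):
implied_price = 1.0 / yearly_kwh(1)
print(round(yearly_kwh(1), 2))   # 8.76 kWh/year per watt
print(round(implied_price, 3))   # ~0.114 EUR/kWh
```

So the rule corresponds to roughly €0.11/kWh; at €0.30/kWh it becomes closer to €2.60 per watt per year, which is why the rule varies so much between countries.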

              • TCB13@lemmy.world · 7 months ago

              I think the number of devices (drives, etc.) is more important than the actual CPU idle power

              Yes, or even the BIOS setup; some of those machines let you disable CPU cores and unused hardware.