I wonder if my system is good or bad. My server needs 0.1kWh.

  • MentalEdge@ani.social · 28 days ago

    You might have your units confused.

    0.1kWh over how much time? Per day? Per hour? Per week?

    Watt-hours refer to total energy used to do something, from a starting point to an ending point. It makes no sense to say that a device needs a certain amount of Wh, unless you’re talking about something like charging a battery to full.

    Power being drawn by a device (like a computer) is just watts.

    Think of the difference between speed and distance. Watts is how fast energy is being used; watt-hours is how much has been used, or will be used.

    If you have a 500 watt PC, for example, it uses 500Wh per hour, or 12kWh in a day.
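The speed/distance analogy can be sketched in a few lines of Python (the 500W figure is the example from above):

```python
# Energy is power times time: watt-hours = watts x hours.
# Power is the rate; watt-hours are the accumulated total.
def energy_wh(power_w: float, hours: float) -> float:
    return power_w * hours

print(energy_wh(500, 1))           # 500 Wh in one hour
print(energy_wh(500, 24) / 1000)   # 12 kWh in a day
```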

    • fool@programming.dev · 27 days ago

      I forgive 'em cuz watt hours are a disgusting unit in general

      idea                 what                                         unit
      speed                change in position over time                 meters per second, m/s
      acceleration         change in speed over time                    m/s/s = m/s²
      force                acceleration applied to each unit of mass    kg · m/s²
      work                 force applied along a distance,              kg · m/s² · m = kg · m²/s²
                           which transfers energy
      power                work over time                               kg · m²/s³
      energy expenditure   power level during units of time             (kg · m²/s³) · s = kg · m²/s²

      Work over time, × time, is just work! kWh are just joules (J) with extra steps! Screw kWh, I will die on this hill!!! Raaah
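The table can be double-checked mechanically. This is a toy sketch, not a real units library: it tracks only SI base-dimension exponents, and confirms that watts times hours lands back on the dimensions of the joule:

```python
# Units as exponents over the SI base dimensions (kg, m, s).
# Multiplying two quantities adds their exponents, so W x h
# reduces to kg*m^2/s^2 -- the joule, just as the table shows.
def mul(a: dict, b: dict) -> dict:
    out = {d: a.get(d, 0) + b.get(d, 0) for d in set(a) | set(b)}
    return {d: e for d, e in out.items() if e != 0}

watt  = {"kg": 1, "m": 2, "s": -3}   # power: kg*m^2/s^3
hour  = {"s": 1}                     # time (dimension only; the scale factor is 3600 s)
joule = {"kg": 1, "m": 2, "s": -2}   # energy: kg*m^2/s^2

assert mul(watt, hour) == joule      # watt-hours are dimensionally joules
```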

    • overload@sopuli.xyz · 27 days ago

      I was really confused by that, and by the fact that the chosen units weren’t just in W (0.1 kW is pretty weird even)

        • sugar_in_your_tea@sh.itjust.works · 27 days ago

          At least in the US, the electric company charges in kWh, computer parts are advertised in terms of watts, and batteries tend to be in amp hours, which is easy to convert to watt hours.

          Joules just overcomplicates things.

            • BigDanishGuy@sh.itjust.works · 26 days ago

              Wow, the US education system must be improved.

              I pay my electric bill by the kWh too, and I don’t live in the US. When it comes to household and EV energy consumption, kWh is the unit of choice.

              > 1J is 3600Wh.

              No, if you’re going to lecture people on this, at least be right about facts. 1W is 1J/s. So multiply by an hour and you get 1Wh = 3600J

              > That’s literally the same thing,

              It’s not literally the same thing. The two units are linearly proportional to each other, but they’re not the same. If they were the same, then this discussion would be rather silly.

              > but the name is less confusing because people tend to confuse W and Wh

              Finally, something I can agree with. But that’s only because physics is so undervalued in most educational systems.

            • sugar_in_your_tea@sh.itjust.works · 26 days ago

              Do you regularly divide/multiply by 3600? That’s not something I typically do in my head, and there’s no reason to do it when everything is denominated in watts. What exactly is the benefit?

            • overload@sopuli.xyz · 26 days ago

              I did a physics degree and am comfortable with Joules, but in the context of electricity bills, kWh makes more sense.

              All appliances are advertised in terms of their Watt power draw, so estimating their daily impact on my bill is as simple as multiplying their kW draw by the number of hours in a day I expect to run the thing (multiplied by the cost per kWh by the utility company of course).
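That estimate is a one-liner; the 100W draw, 24h runtime, and $0.30/kWh price below are made-up example figures, not anyone's actual numbers:

```python
# Estimate the daily bill impact of one appliance from its watt rating.
def daily_cost(power_w: float, hours_per_day: float, price_per_kwh: float) -> float:
    kwh_per_day = power_w / 1000 * hours_per_day   # W -> kW, times hours = kWh
    return kwh_per_day * price_per_kwh

# e.g. a hypothetical 100W server running 24h/day at $0.30/kWh:
print(round(daily_cost(100, 24, 0.30), 2))  # 0.72 per day
```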

    • Valmond@lemmy.world · 27 days ago

      Wasn’t it stated as the usage during November? 60kWh for November. Seems logical to me.

      Edit: forget it, he’s saying his server needs 0.1kWh, which is bonkers ofc

      • B0rax@feddit.org · 27 days ago

        Only one person here has posted their usage for November. The OP has not mentioned November or any timeframe.

        • Valmond@lemmy.world · 27 days ago

          Yeah, mixed up posts; I thought one depended on another because it was under it. Again, forget my post :-)

    • GravitySpoiled@lemmy.ml (OP) · 28 days ago

      Computer with GPU and 50TB of drives. I will measure the computer on its own in the next couple of days to see where the power consumption comes from.

      • Ulrich@feddit.org · 27 days ago

        Which GPU? How many drives?

        Put a Kill A Watt meter on it and see what it says for consumption.

  • elmicha@feddit.org · 28 days ago

    Do you mean 0.1kWh per hour, so 0.1kW or 100W?

    My N100 server needs about 11W.

    • chunkystyles@sopuli.xyz · 28 days ago

      The N100 is such a little powerhouse and I’m sad they haven’t managed to produce anything better. All of the “upgrades” are either not enough of an upgrade for the money, or just more power hungry.

      • d_k_bo@feddit.org · 28 days ago

        It’s the other way around. 0.1 kWh means 0.1 kW times 1 h. So if your device draws 0.1 kW (100 W) of power for an hour, it consumes 0.1 kWh of energy. If a device draws 360 000 W for one second, it consumes the same 0.1 kWh of energy.
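A quick sketch of those two equivalent scenarios:

```python
import math

# Energy in kWh = power (kW) x time (h). Very different power/duration
# pairs can carry the same energy: 100 W for an hour and 360 000 W for
# one second are both 0.1 kWh.
def kwh(power_w: float, seconds: float) -> float:
    return (power_w / 1000) * (seconds / 3600)

assert math.isclose(kwh(100, 3600), 0.1)      # 100 W for one hour
assert math.isclose(kwh(360_000, 1), 0.1)     # 360 kW for one second
```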

        • GravitySpoiled@lemmy.ml (OP) · 27 days ago

          Thank you for explaining it.

          My computer uses 1kWh per hour.

          It does not yet make sense to me. It just feels wrong. I understand that you may normalize 4Wh in 15 minutes to 16Wh because it would use 16Wh per hour if it ran that long.

          Why can’t you simply assume that I mean 1kWh per hour when I say 1kWh? And not 1kWh per 15 minutes.

          • 486@lemmy.world · 27 days ago

            kWh is a unit of energy consumed. It doesn’t say anything about time, and you can’t assume any time period; that wouldn’t make any sense. If you want to say how much power a device draws, just state how many watts (W) it draws.

          • __nobodynowhere@startrek.website · 27 days ago

            A watt is 1 joule per second (1 J/s), i.e. every second, your device draws 1 joule of energy. This energy per unit of time is called “power” and is a rate of energy transfer.

            A watt-hour is (1 J/s) * (1 hr)

            This can be rewritten as (3600 J/hr) * (1 hr). The “per hour” and “hour” cancel themselves out which makes 1 watt-hour equal to 3600 Joules.

            1 kWh is 3,600 kJ or 3.6 MJ
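The cancellation above, spelled out:

```python
# 1 W = 1 J/s, so over one hour (3600 s) a 1 W load uses 3600 J.
JOULES_PER_WATT_HOUR = 1 * 3600            # (1 J/s) x (3600 s/h)
JOULES_PER_KWH = 1000 * JOULES_PER_WATT_HOUR

assert JOULES_PER_WATT_HOUR == 3600        # 1 Wh = 3600 J
assert JOULES_PER_KWH == 3_600_000         # 1 kWh = 3600 kJ = 3.6 MJ
```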

      • elmicha@feddit.org · 27 days ago

        0.1kWh per hour can be written as 0.1kWh/h, which is the same as 0.1kW.

  • acockworkorange@mander.xyz · 27 days ago

    Idles at around 24W. It’s amazing that your server only needs .1kWh once and keeps on working. You should get some physicists to take a look at it, you might just have found perpetual motion.

  • Karna@lemmy.ml · 27 days ago

    I came here to say my tiny Raspberry Pi 4 consumes ~10 watts, but then after noticing the home server setups of some people and the associated power consumption, I feel like a child in a crowd of adults 😀

    • trolololol@lemmy.world · 26 days ago

      Quite the opposite. Look at what they need to get a fraction of what you do.

      Or use the old quote, “they’re compensating for small pp”

  • walden@sub.wetshaving.social · 28 days ago

    9 spinning disks and a couple of SSDs - right around 190 watts, but that also includes my router and 3 PoE WiFi APs. PoE consumption is reported as 20 watts, and the router should use about 10 watts, so I think the server is about 160 watts.

    Electricity here is pretty expensive, about $0.33 per kWh, so by my math I’m spending $38/month on this stuff. If I didn’t have lots of digital media it’d probably be worth it to get a VPS. $38/month is still cheaper than Netflix, HBO, and all the other junk I’d have to subscribe to.
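Checking that math (assuming a 30-day month):

```python
# ~160W continuous at $0.33/kWh over a 30-day month.
watts, price_per_kwh = 160, 0.33
kwh_per_month = watts / 1000 * 24 * 30    # ~115 kWh
cost = kwh_per_month * price_per_kwh
print(round(cost, 2))  # ~$38/month, matching the estimate above
```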

    • GravitySpoiled@lemmy.ml (OP) · 28 days ago

      That’s true. And the children of my family see no ads, which is priceless. Yet I am looking into ways to cut costs in half by using an additional lower-powered mini PC which is always on, with the main computer only running in the evening - maybe.

    • Billygoat@catata.fish · 27 days ago

      Same here. 300W with 12 disks, switches, and router. But electricity only costs $0.12/kWh. I wouldn’t trust having terabytes of data in the cloud.

  • GreenKnight23@lemmy.world · 28 days ago

    last I checked with a kill-a-watt I was drawing an average of 2.5kWh after a week of monitoring my whole rack. that was about three years ago and the following was running in my rack.

    • R610, dual 1kW PSUs
    • homebuilt server, Gigabyte 750W PSU
    • homebuilt Asus gaming rig, 650W PSU
    • homebuilt Asus retro (XP) gaming/testing rig, 350W PSU
    • HP laptop as dev env/warm site, ~200W PSU
    • Amcrest NVR, 80W (I guess?)
    • HP T610, 65W PSU
    • Terramaster F5-422, 90W PSU
    • TP-Link TL-SG2424P, 180W PSU
    • Brocade ICX6610-48P-E, dual 1kW PSUs
    • Misc routers, RPis, PoE APs, modems (cable & 5G), ~700W combined (cameras not included, the Brocade powers them directly)

    I also have two battery systems split between high priority and low priority infrastructure.

      • GreenKnight23@lemmy.world · 28 days ago

        would love to add more fiber to my diet! if I had the time and money. The next four years are going to get pricey, so I’m solidifying my stack now with backup hardware and planning for failures.

        the brocade is running my pvt lan since it’s the most important. physically cut off from public access. just upgraded most of my servers to use 10GbE and would love to run fiber to my office about 60-70 feet away.

        the brocade I’m using was unlocked by the eBay seller I got it from, so it can theoretically transfer up to 40G. would be great for my AI rig I keep in the office.

    • Atemu@lemmy.ml · 27 days ago

      > I was drawing an average of 2.5kWh after a week of monitoring my whole rack

      That doesn’t seem right; that’s only ~15W on average. Each one of those systems alone will exceed that at idle running 24/7. I’d expect 1-2 orders of magnitude more.
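Back-of-envelope for that figure:

```python
# If 2.5 kWh really were the total over a week, the average draw would
# be tiny: kWh -> Wh, divided by the hours in a week.
avg_watts = 2.5 * 1000 / (7 * 24)
print(round(avg_watts, 1))  # about 15 W on average
```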

      • GreenKnight23@lemmy.world · 27 days ago

        IDK, after a week of runtime it told me 2.5kWh average. could be average per hour?

        Highest power bill I ever saw was summer of 2022. $1800. Temps outside were in the 110-120 range, and it was the hottest ever here.

        maybe I’ll hook it back up, but I’ve got different (newer) hardware now.

        • Atemu@lemmy.ml · 26 days ago

          > after a week of runtime it told me 2.5kWh average. could be average per hour

          If it gives you kWh as a measure for power, you should toss it because it’s obviously made by someone who had no idea what they were doing.

  • Lemmchen@feddit.org · 28 days ago

    I’m idling at 120W with eight drives, but I’m currently looking into how to lower it.

  • rumba@lemmy.zip · 27 days ago

    Running an old 7th-gen Intel. It has a 2070 and a 1080 in it, six mechanical hard drives, and 3 SSDs. Then I have an 8th-gen laptop with a 1070 Ti Mobile, but the laptop’s a camera server so it’s always running balls to the wall. Also running a UniFi Dream Machine Pro, a 24-port PoE, a 16-port PoE, and an 8-port PoE switch.

    Because of the overall workload and the age of the CPU, it burns about 360 watts continuous.

    I can save a few watts by putting the disks to sleep, but I’m in the camp where the spin-up and spin-down of the disks cause more wear than continuous running.

    Edit: cleaned up the slaughter from the dictation, after I cleaned up my physical space from Christmas festivities.

  • mtoboggan@feddit.org · 28 days ago

    Idle: 30 Watts

    Starting all docker containers after reboot: 140 Watts

    It needs around 28 kWh per month.
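Working backwards, 28 kWh per month corresponds to an average draw of roughly 39W (assuming a 30-day month), which fits between the idle and peak figures above:

```python
# kWh -> Wh, divided by the hours in a 30-day month.
avg_watts = 28 * 1000 / (30 * 24)
print(round(avg_watts, 1))  # about 39 W on average
```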

  • cmnybo@discuss.tchncs.de · 28 days ago

    My server with 8 hard drives uses about 60 watts and goes up to around 80 under heavy load. The firewall, switch, access points and modem use another 50-60 watts.

    I really need to upgrade my server and firewall to something about 10 years newer; it would reduce my power consumption quite a bit and I would have a lot more runtime on UPS.

  • daniskarma@lemmy.dbzer0.com · 27 days ago

    Around 18-20 Watts on idle. It can go up to about 40 W at 100% load.

    I have an Intel N100; I’m really happy with the performance per watt, to be honest.

  • bier@lemmy.blahaj.zone · 28 days ago

    My whole setup, including 2 Pis and one fully specced-out AM4 system with 100TB of drives, an Intel Arc, and 4x 32GB ECC RAM, uses between 280W and 420W. I live in Germany and pay 25ct per kWh. My whole apartment uses 600W at any given time, approximately 15kWh per day 😭