• ayyy@sh.itjust.works
    2 months ago

    PSUs are waaaaay more efficient when operating closer to their rated capacity. Pulling 200W through a 1kW power supply is like making a marathon runner breathe through a straw.

    • bitwaba@lemmy.world
      2 months ago

      The sweet spot is the 40-60% load.

      But it doesn’t make that much of a difference. The efficiency swing is maybe 10%: an 80 Plus Bronze rated PSU will have a minimum efficiency of 80%, but even at the 50% load mark it won’t be over 90% efficient.

      The main point (to me anyways) is that it’s dumb to pay more for a power supply just so you can pay *more* on your power bill. If your idle load is 100W and your gaming load is 300W, you’ve got no reason to run more than a 600W PSU.
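
      A quick sanity check on that sizing, using the numbers from the comment (Python, purely for illustration):

      ```python
      # Where do the example loads land on a 600W PSU?
      idle_w, gaming_w = 100, 300   # loads taken from the comment above
      psu_w = 600                   # proposed PSU size

      for load_w in (idle_w, gaming_w):
          pct = load_w / psu_w * 100
          print(f"{load_w}W on a {psu_w}W PSU = {pct:.0f}% load")
      ```

      The 300W gaming load lands at exactly 50%, right in the 40-60% sweet spot; idle sits below it, but that’s unavoidable with any reasonably sized PSU.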

      • Naz@sh.itjust.works
        2 months ago

        I’ve got a 850W power supply, which I bought 2-3 years ago in anticipation of the RTX 4000 series. My usual load with a GTX 1080 was 150W and now my entire system uses 520W completely loaded. Do I count? :)

    • SkunkWorkz@lemmy.world
      2 months ago

      While true, how much would it actually save you in electricity? If you upgrade every year, wouldn’t it be cheaper to just buy the bigger PSU outright and eat the extra electricity cost, so you don’t have to buy another PSU when you get more power-hungry components?
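
      Back-of-the-envelope, with assumed numbers (4 hours/day at a 300W load, $0.15/kWh, and a 5-point efficiency gap between the two PSUs):

      ```python
      # Rough annual electricity cost difference between two PSU efficiencies.
      # All inputs are assumptions for illustration, not measurements.
      load_w = 300          # DC load while gaming
      hours_per_day = 4
      price_per_kwh = 0.15  # USD, assumed

      def annual_cost(efficiency):
          wall_w = load_w / efficiency      # power actually drawn at the wall
          kwh = wall_w * hours_per_day * 365 / 1000
          return kwh * price_per_kwh

      diff = annual_cost(0.85) - annual_cost(0.90)
      print(f"~${diff:.2f} per year")
      ```

      That works out to only a few dollars a year, so it takes many years of efficiency savings to cover the price gap between PSU tiers.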