• silence7@slrpnk.netOPM · 7 days ago

      LLMs are a huge new energy use and part of why bills are going up; big data has pushed the cost onto the rest of us.

      • Womble@piefed.world · 7 days ago

        LLMs are a tiny energy use: 100 queries to ChatGPT-type models use about as much energy as a hair dryer running for a few minutes. At current UK retail electricity prices (after tax, so significantly more than datacentres pay), one query costs somewhere between £0.00015 and £0.001 in power usage (see the rough sketch at the end of this comment).

        I can't see that being a significant factor in the prices power companies charge, compared with things like moving away from cheap but dirty sources of power or fluctuations in the price of natural gas.
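
        A rough sketch of that cost arithmetic; the per-query energy range and the ~£0.30/kWh retail price are assumptions chosen to reproduce the figures above, not measured values:

        ```python
        # Back-of-envelope cost per LLM query (illustrative assumptions, not measurements)
        UK_RETAIL_PRICE_GBP_PER_KWH = 0.30              # assumed post-tax domestic rate
        WH_PER_QUERY_LOW, WH_PER_QUERY_HIGH = 0.5, 3.0  # assumed energy range per query

        for wh in (WH_PER_QUERY_LOW, WH_PER_QUERY_HIGH):
            cost_gbp = (wh / 1000) * UK_RETAIL_PRICE_GBP_PER_KWH
            print(f"{wh} Wh/query -> £{cost_gbp:.5f} per query")
        ```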

        • silence7@slrpnk.netOPM · 6 days ago

          The median query is small. But there are a LOT of queries, because they're being generated by machines, not just people, and the average energy use per query is likely far larger than the median, hence Google's decision to publish the median rather than the mean (toy illustration below).
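
          A toy illustration of why a heavily right-skewed distribution makes the mean far exceed the median; all numbers here are invented:

          ```python
          # Toy example: 95 small chat queries plus 5 long agentic/batch jobs (invented numbers)
          queries_wh = [0.3] * 95 + [50.0] * 5

          median_wh = sorted(queries_wh)[len(queries_wh) // 2]
          mean_wh = sum(queries_wh) / len(queries_wh)
          print(f"median = {median_wh} Wh, mean = {mean_wh:.2f} Wh")  # mean is ~9x the median
          ```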

          • Womble@piefed.world · 5 days ago

            Sure, maybe that's all true, but even if you make insane assumptions: that every single person in the UK makes 100 queries per week, that the true average cost is 10 times higher than the 3 Wh I used as my upper limit (far more than independent research suggests), and that data centres pay retail price plus taxes, it still only comes out to around 5% of the UK domestic electricity market, so it's hardly going to be responsible for huge shifts in prices (rough arithmetic sketched below).
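
            A minimal sketch of that worst-case arithmetic; the population and total-demand figures are rough assumptions, and the result lands in the same few-percent ballpark:

            ```python
            # Worst case: every UK resident makes 100 queries a week at 10x the 3 Wh upper bound.
            # Population and annual UK electricity demand are rough assumed figures.
            UK_POPULATION = 68e6
            QUERIES_PER_PERSON_PER_WEEK = 100
            WH_PER_QUERY = 3.0 * 10            # 10x the 3 Wh upper estimate used above
            UK_ANNUAL_DEMAND_TWH = 290         # assumed total UK electricity demand

            annual_twh = UK_POPULATION * QUERIES_PER_PERSON_PER_WEEK * 52 * WH_PER_QUERY / 1e12
            share = 100 * annual_twh / UK_ANNUAL_DEMAND_TWH
            print(f"{annual_twh:.1f} TWh/year, about {share:.1f}% of assumed UK demand")
            ```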

    • usernamesAreTricky@lemmy.ml · 7 days ago

      Datacenters for AI are delaying, but not stopping, the closure of fossil fuel plants. They are still only around 5% of total US electricity demand and are forecast to reach perhaps 10% by 2030. Sure, that increase is certainly not great (data center power demand was flat until recently), but it's also not something that's going to make progress impossible.