• wizardbeard@lemmy.dbzer0.com
    3 months ago

    Querying the LLM is not where the dangerous energy costs have ever been. It’s the cost of training the model in the first place.

    • pixxelkick@lemmy.world
      3 months ago

      The training costs effectively enter a “divide by infinity” argument given enough time.

      While new models are still being trained today, you eventually hit a point where a given model can be used in perpetuity.

      The training cost is paid once, whereas the usable life of that model stretches on effectively forever.

      So you end up with a one-time energy cost to make the model, amortized over an effectively infinite timescale of use.
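      The amortization argument above can be sketched with a toy calculation. All the numbers here are made up for illustration (neither the training cost nor the per-query cost comes from any real model):

```python
def energy_per_query(training_wh: float, inference_wh: float, n_queries: int) -> float:
    """Amortized energy per query: the one-time training cost spread
    over n_queries, plus the fixed per-query inference cost."""
    return training_wh / n_queries + inference_wh

# Hypothetical figures, purely illustrative:
TRAINING_WH = 10_000_000_000   # assumed one-time training cost (10 GWh)
INFERENCE_WH = 0.3             # assumed energy per query (0.3 Wh)

for n in (10**6, 10**9, 10**12):
    print(n, round(energy_per_query(TRAINING_WH, INFERENCE_WH, n), 4))
```

      As the query count grows, the training term shrinks toward zero and the amortized cost approaches the per-query inference cost alone; that is the "divide by infinity" point.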

      • Auth@lemmy.world
        3 months ago

        Costs to train are going up exponentially, though. In a few years corps are going to want a return on that investment, and they're going to squeeze consumers to get it.