• GeneralInterest@lemmy.world · 8 days ago

    Maybe it’s like the dotcom bubble: there is genuinely useful tech that has recently emerged, but too many companies are trying to jump on the bandwagon.

    LLMs do seem genuinely useful to me, but of course they have limitations.

  • datelmd5sum@lemmy.world · 8 days ago

      We’re hitting logarithmic returns on scaling up model training: each 10x in compute buys a smaller improvement. GPT-5 is going to cost roughly 10x more than GPT-4 to train, but are people going to pay $200/month for a GPT-5 subscription?
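      To see what that 10x training bill means per user, here's a rough amortization sketch. Every number below (training cost, payback window, subscriber count) is a made-up assumption for illustration, not a real figure:

```python
# Back-of-envelope: spreading a one-time training cost over a subscriber base.
# All inputs are hypothetical; the point is the shape of the arithmetic.

def monthly_cost_per_subscriber(training_cost_usd: float,
                                amortization_months: int,
                                subscribers: float) -> float:
    """Training cost spread evenly over a payback window and the user base."""
    return training_cost_usd / (amortization_months * subscribers)

# Assume a GPT-4-class run cost ~$100M and a successor costs 10x that.
gpt4_cost = 100e6
gpt5_cost = 10 * gpt4_cost  # $1B, purely illustrative

# Spread over a 24-month payback window and 10M paying subscribers:
per_user = monthly_cost_per_subscriber(gpt5_cost, 24, 10e6)
print(f"${per_user:.2f} per subscriber per month to recoup training alone")
```

      Under these assumptions the training run itself amortizes to only a few dollars per subscriber per month; the recurring inference (serving) cost is the part a subscription price mostly has to cover.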

    • Skates@feddit.nl · 8 days ago

        Is it necessary to pay more, or is it enough to just pay for more time? If the product is good, it will be used.

    • Madis@lemm.ee · 8 days ago

        But wouldn’t it use less energy afterwards, once it’s trained and just running inference? At least that was the claim with the 4o model, for example.

    • GeneralInterest@lemmy.world · 8 days ago

        Businesses might pay big money for LLMs to do specific tasks. And if chip makers invest more in NPUs, then maybe LLMs will become cheaper to train. But I’m just speculating; I don’t have any special knowledge of this area whatsoever.