• ShadowRam@fedia.io
    21 hours ago

    It’s not all hype.

    Nvidia has been doing some SERIOUS R&D in AI for the past 10 years.

    But using AI in the graphics space… upscaling, downscaling, faking lighting, faking physics… this is all very useful in making videogames.

    Then there was a leap in the way AI image generation was done with that same hardware, and that opened up a whole new growing field.

    It’s just that some people took basic language models that have been around for 30 years and scaled them up on that hardware. It was neat, and some of the stuff an LLM would output was surprising. But not reliable.

    And then suddenly a lot of laymen got their hands on LLMs and thought it was the 2nd coming of Jesus, and started throwing big money at it… it will be a surprise to no one who knows how these AIs work that that big money isn’t going anywhere.

    But those first two are no hype. They’re real, viable use cases for AI, and money will be made there.

    • RogueBanana@lemmy.zip
      49 minutes ago

      But the insane growth is because of hype. That doesn’t mean it’s useless or invalid, but they would be nowhere near this big if it weren’t for the AI gold rush going on, with all of their data centre cards selling out immediately despite 50x profit margins and such.