• Ludrol@szmer.info · 7 months ago

    In 2022, AI evolved into AGI, and LLM into AI. Languages are not static, as Old English shows. Get with the times.

    • intensely_human@lemm.ee · 7 months ago

      They didn’t so much “evolve” as AI scared the shit out of us at such a deep level that we changed the definition of AI to remain in denial about the fact that it’s here.

      Since time immemorial, passing a Turing test was the standard. As soon as machines started passing Turing tests, we decided Turing tests weren’t such a good measure of AI.

      But I haven’t yet seen an alternative proposed. Instead of using criteria and tasks to define it, we’re just arbitrarily saying “It’s not AGI so it’s not real AI”.

      In my opinion, it’s more about denial than it is about logic.

    • Fedizen@lemmy.world · 7 months ago

      Changes made to language in order to sell products aren’t really the language adapting; they’re the language being influenced and distorted.

        • randomsnark@lemmy.ml · 7 months ago

          I think the modern pushback comes from people who get their understanding of technology from science fiction. SF has always (mis)used AI to mean sapient computers.

      • Echo Dot@feddit.uk · 7 months ago

        LLMs are one way of developing an AI. There are plenty of conspiracy theories in this world that are real; it’s better to focus on those than to make stuff up.

        There really is an amazing technological development going on, and you’re dismissing it over irrelevant semantics.