• redballooon@lemm.ee
      1 year ago
      We’ll see. To date there’s no locally runnable generative LLM that comes close to the gold standard, GPT-4. Even coming close to GPT-3.5-turbo counts as impressive.

      • kinttach@lemm.ee
        1 year ago

        We only recently got on-device Siri, and if I understand correctly it still isn’t always on-device. So the same level of privacy that applies to in-the-cloud Siri could apply here.

        • BudgetBandit@sh.itjust.works
          1 year ago

          The on-device Siri that lives on my Apple Watch Series 4 is definitely processing everything locally now. She got dumber than I am.