Boston Dynamics turned its robot dog into a talking tour guide using AI, as seen in a somewhat unsettling video the company posted. Boston Dynamics used OpenAI’s ChatGPT API, along with some open-source large language models (LLMs), to carefully train its responses. It then outfitted the bot with a speaker, added text-to-speech capabilities, and made its mouth mimic speech “like the mouth of a puppet.”
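
For context, the setup described above boils down to a simple loop: give the model a short personality prompt, send it something to react to, and pipe the reply through text-to-speech to the speaker. Here is a minimal sketch, assuming the OpenAI Python SDK for the chat side and pyttsx3 as a stand-in for whatever TTS engine Boston Dynamics actually used; the persona prompt and function name are invented for illustration, not their actual setup.

```python
# Hypothetical sketch of the "talking tour guide" loop described above.
# Assumptions: OpenAI's Chat Completions API for the language model, and
# pyttsx3 as a placeholder for whatever text-to-speech engine was really used.
from openai import OpenAI
import pyttsx3

client = OpenAI()      # reads OPENAI_API_KEY from the environment
tts = pyttsx3.init()   # local text-to-speech, plays audio through the speaker

# A short "personality" handed to the model as a system prompt (invented example).
PERSONA = "You are a cheerful robot tour guide. Keep every answer to two sentences."

def say_as_tour_guide(visitor_question: str) -> str:
    # Ask the LLM for a line in character.
    reply = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": PERSONA},
            {"role": "user", "content": visitor_question},
        ],
    ).choices[0].message.content

    # "Speak" the line; the puppet-style mouth animation would be driven separately.
    tts.say(reply)
    tts.runAndWait()
    return reply

say_as_tour_guide("What are we looking at in this lab?")
```

Swapping the personality is then just a matter of changing the system prompt string; the rest of the pipeline stays the same.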

The version speaking in a British accent and the one playing a Shakespearean time traveller had me 😂, but it’s certainly a little unsettling overall.

  • GreyBeard@lemmy.one · 11 months ago

    Boston Dynamics’ YouTube channel has been filled with silly videos. Oftentimes they are dual function: 1. build brand awareness through fun videos, and 2. show the versatility of the onboard systems. In this case they are showing off the ability to navigate a real-world human environment, and the sensors/cameras that can be fed into other systems for advanced decision making and planning.

    • Bebo@literature.cafe (OP) · 11 months ago

      Also, the robots embracing the “personalities” was interesting (for someone like me who doesn’t have technical knowledge of LLMs, etc.) as well as entertaining.

      • kingthrillgore@lemmy.ml · 11 months ago

        They also uploaded this video a day after they showed off new functionality on their “Stretch” robot, which more directly impacts livelihoods, as Stretch isn’t cute. Stretch is meant to replace menial labor.

        Spot is cute and proactive. Stretch is what Boston Dynamics is actually selling.

        • Bebo@literature.cafe (OP) · 11 months ago

          From what I understood, a short prompt describing a personality was provided, based on which the LLM generated the lines, which were then converted into speech and conveyed to the listener through the speakers. (If some technicalities are incorrect, feel free to correct me.) I used “embraced” kind of metaphorically; the robots themselves didn’t literally embrace a personality.