Thanks to rapid advancements in generative AI and a glut of training data created by human actors that has been fed into its AI model, Synthesia has been able to produce avatars that are indeed more humanlike and more expressive than their predecessors. The digital clones are better able to match their reactions and intonation to the sentiment of their scripts—acting more upbeat when talking about happy things, for instance, and more serious or sad when talking about unpleasant things. They also do a better job matching facial expressions—the tiny movements that can speak for us without words.

But this technological progress also signals a much larger social and cultural shift. Increasingly, so much of what we see on our screens is generated (or at least tinkered with) by AI, and it is becoming more and more difficult to distinguish what is real from what is not. This threatens our trust in everything we see, which could have very real, very dangerous consequences.

“I think we might just have to say goodbye to finding out about the truth in a quick way,” says Sandra Wachter, a professor at the Oxford Internet Institute, who researches the legal and ethical implications of AI. “The idea that you can just quickly Google something and know what’s fact and what’s fiction—I don’t think it works like that anymore.”

  • AutoTL;DR@lemmings.world · 5 months ago

    This is the best summary I could come up with:


    “I think we might just have to say goodbye to finding out about the truth in a quick way,” says Sandra Wachter, a professor at the Oxford Internet Institute, who researches the legal and ethical implications of AI.

    “We’re about to take on a topic that’s pretty delicate and honestly hits close to home—dealing with criticism in our spiritual journey,” I read off the teleprompter, simultaneously trying to visualize ranting about something to my partner during the complain-y version.

    Historically, making AI avatars look natural and matching mouth movements to speech has been a very difficult challenge, says David Barber, a professor of machine learning at University College London who is not involved in Synthesia’s work.

    And while anyone can join the platform, many features aren’t available until people go through an extensive vetting system similar to that used by the banking industry, which includes talking to the sales team, signing legal contracts, and submitting to security auditing, says Voica.

    Claire Leibowicz, the head of AI and media integrity at the nonprofit Partnership on AI, says she worries that growing awareness of this gap will make it easier to “plausibly deny and cast doubt on real material or media as evidence in many different contexts, not only in the news, [but] also in the courts, in the financial services industry, and in many of our institutions.” She tells me she’s heartened by the resources Synthesia has devoted to content moderation and consent but says that process is never flawless.

    It really shines when presenting a story I wrote about how the field of robotics could be getting its own ChatGPT moment; the virtual AI assistant summarizes the long read into a decent short video, which my avatar narrates.


    The original article contains 4,026 words, the summary contains 289 words. Saved 93%. I’m a bot and I’m open source!

  • Hello Hotel@lemmy.world · 5 months ago

    It looks like the screen is smeared in Vaseline. To be fair, you have to be primed to notice it, or to hallucinate it. Did the AI decide the movements too? That woman looks possessed: her eyes behave correctly, but she moves her jaw wildly at the beginning of every sentence. She even twitches at one point.

  • sgibson5150@slrpnk.net · 5 months ago

    I had anticipated that there would be an uptick in cryptographic signing to combat the problem as this sort of fakery becomes ubiquitous. In my mind, a signature would assure the recipient of a file that:

    A) the file is unaltered after the date/time of the signing, and
    B) the file was created by the named photographer or videographer.

    This is not proof of authenticity, but with a verifiable source the file recipient could at least judge for themselves based on the reputation of the file creator (say, a notable AP photojournalist vs. some random schmoe).

    Thus far, whenever I have raised this idea in a public forum, it has met with silence or even derision. What am I missing?
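
    For concreteness, here is roughly the workflow I have in mind, sketched with Python's "cryptography" package (Ed25519 signatures). The filenames and the little metadata manifest are just illustrative, not part of any real standard:

```python
import json
import time

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Creator: generate a key pair once; publish the public half somewhere verifiable.
private_key = Ed25519PrivateKey.generate()
public_pem = private_key.public_key().public_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PublicFormat.SubjectPublicKeyInfo,
)

# Sign the file bytes together with a claimed signing time, so the signature
# covers both (A) the content and (B) the creator's date/time assertion.
with open("photo.jpg", "rb") as f:
    file_bytes = f.read()
manifest = json.dumps({"signed_at": int(time.time()), "creator": "Jane Doe"}).encode()
signature = private_key.sign(file_bytes + manifest)

# Recipient: load the published public key and verify. Any change to the file
# or to the manifest after signing makes verification fail.
public_key = serialization.load_pem_public_key(public_pem)
try:
    public_key.verify(signature, file_bytes + manifest)
    print("valid: unchanged since signing, produced by the holder of this key")
except InvalidSignature:
    print("invalid: the file or manifest was altered, or the key does not match")
```

    The recipient still has to trust that the public key really belongs to the named creator, which is the part the replies below get into.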

      • sgibson5150@slrpnk.net · 5 months ago

        You seem to be conflating NFTs and digital signatures. Any file can be signed, unrelated to any sort of blockchain technology. See PGP and related tools for more information.

        Edit: fixed a typo

        • Quereller@lemmy.one · 5 months ago

          Why should I trust the authenticity of your signing key?
          Solution 1: a web of trust, like PGP (impossible for content from sources outside your network).
          Solution 2: trusted certificate authorities (private, state, or UN).
          Solution 3: a blockchain (scaling problems).

          • sgibson5150@slrpnk.net · 5 months ago

            If I generate a key pair, use it to sign a file before distributing it, and then publish the public key somewhere like Facebook, any recipient of the file could be assured that it originated from my Facebook account. A commercial certificate is not required to do this. Whether the Facebook account holder is actually me is another problem, but hopefully major social media platforms require at least a photo ID.

            Edit: Sorry, I said public certificate when I meant commercial certificate.
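
            To make that concrete, here is a rough sketch of the recipient-side check (the filenames and the idea of a posted fingerprint are made up, just to show the mechanism): compute a fingerprint of the public key that arrived with the file and compare it to the one published on the creator's account.

```python
import hashlib

from cryptography.hazmat.primitives import serialization

def key_fingerprint(public_pem: bytes) -> str:
    """SHA-256 fingerprint of a PEM-encoded public key (hashed over its DER form)."""
    key = serialization.load_pem_public_key(public_pem)
    der = key.public_bytes(
        encoding=serialization.Encoding.DER,
        format=serialization.PublicFormat.SubjectPublicKeyInfo,
    )
    return hashlib.sha256(der).hexdigest()

# Fingerprint of the key that arrived alongside the signed file...
with open("creator_pubkey.pem", "rb") as f:
    received_fp = key_fingerprint(f.read())

# ...compared against the fingerprint copied from the creator's profile page.
published_fp = "<fingerprint posted on the creator's account>"
print("key matches published fingerprint:", received_fp == published_fp)
```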

  • TypicalHog@lemm.ee · 5 months ago

    I’m not even gonna read this shit. It’s SO SO OBVIOUS there are many use cases for it.

  • inb4_FoundTheVegan@lemmy.world · 5 months ago

    It’s noble how many of you are willing to get philosophical about the rise of deepfakes freeing us from puritan beliefs and readdressing the concept of truth.

    While completely fucking ignoring the harassment and extortion that deepfakes enable. Y’all want to get high-minded about YOUR right to free speech using OTHER people’s bodies as a gateway to some utopia, while playing dumb that this is just another form of misogynistic abuse. If it truly is just something you are doing in the privacy of your own home, why the fuck do you need other people’s media?

    Your ideals are built upon, YET AGAIN, women taking one for the team. The “truth” is impossible to know, so YOLO, let’s turn any woman who made the mistake of being photographed into porn. Her consent doesn’t matter between the privacy of me and my dataset, even if I do upload it and blackmail her a lil.