• RGB3x3@lemmy.world · 3 months ago

    “Absolutely Terrifying”

    Really? Did he watch the same video I saw on that page? That b-roll was really bad.

    I get that it’ll only get better, but generative AI models will never understand filmmaking techniques, because they can’t. That’s not how these models are built.

    They can create a city skyline and pan right, but they’ll never know why a pan right was appropriate in that scene, or how the lighting fits with the rest of it. They’ll never come up with new ways of “filming” a scene, because it’s all built on what already exists. There’s no style to generative AI.

    • VirtualOdour@sh.itjust.works · 3 months ago

      You say that like the current models are the end of the line, but understanding why filmmaking techniques are used isn’t impossible, even for an LLM-based system. Designing new styles isn’t out of reach for AI either. Sure, you can God-of-the-gaps it and say there’s a mysterious sliver of soul required, but practically it’ll be able to be every bit as original as any human, probably more so, since it has more knowledge to work from.

      I know it’s desirable to hate on AI because it’s scary or popular, but nailing your colors to the argument that it’ll never be able to do certain things is already an exhausting game of moving the goalposts every time a new model emerges, and that’s only going to continue.

        • technocrit@lemmy.dbzer0.com · edited · 3 months ago

        “it’ll be able to be every bit as original as any human, probably more so as it has more knowledge to work from.”

        It’ll be so “original” that it makes no sense, evokes no emotion, and goes nowhere.

        And if the goalpost is a decent movie, then the goalpost hasn’t moved at all. AI is just impossibly far away.

        Perhaps most importantly: does humanity actually want to build bigger and bigger supercomputers, using more and more electricity and resources, just so some AI can make a crappy action movie? What a waste.

  • Viking_Hippie@lemmy.world · 3 months ago

    The video stresses that “content credentials” will “always make transparent whether AI was used”

    That should absolutely be legally required for all commercial and political usage in these hyper-propagandized times, IMO…

    • General_Effort@lemmy.world · 3 months ago

      Needlessly dangerous. The only positive outcome would be making people aware of what is possible. The danger is that unmarked media will appear more credible.

      • CaptainSpaceman@lemmy.world · 3 months ago

        Just assume everything is AI/deepfake, then go back to the real world and remember that AI robots are just around the corner.

        • VelvetStorm@lemmy.world · 3 months ago

          I for one welcome our new robot overlords and will turn on humanity in exchange for not having to work and being able to live a comfortable lifestyle.

      • Jakdracula@lemmy.world · 3 months ago

        Telephones made telegraph operators obsolete. Cars made horses obsolete. Progress moves on.

        • foggenbooty@lemmy.world · 3 months ago

          While this is true to an extent, the human mind is not evolving at the pace of technology. Eventually (not sure when) humans will become unemployable for the majority of jobs and the few that are left will not be enough to go around.

          We need to start taking UBI ideas seriously now, so that in a few decades they’re palatable, because we are heading for a labour collapse.

      • Jakdracula@lemmy.world · 3 months ago

        I’m not trying to be contrarian or rude or a troll. I just am not following why this is a problem. Can you elaborate?

    • BrianTheeBiscuiteer@lemmy.world · 3 months ago

      Not only could it lead to thousands of jobs being cut (it takes more than just actors to make a movie), it also makes it dead simple to put real people into a video showing them doing something illegal. Grainy security-cam LoRA, anyone?

        • BrianTheeBiscuiteer@lemmy.world · 3 months ago

          I couldn’t give a flip about “the industry”, but that doesn’t mean I want to see tens of thousands of talented people out of a job, not all of whom are rich movie stars.

          • VirtualOdour@sh.itjust.works · 3 months ago

            So you’d fight to defend the worst excesses of capitalism? That’s kinda funny to me. I guess it’s American brain rot, where you can literally only imagine people being happy if they’re being exploited by a corporation.

            Lowering the bar to self-expression and creativity is a great thing; those people will live better lives being able to create their own projects and to enjoy and learn from other people’s content.

            Besides, it’s not going to happen overnight. That’s why we need to focus on the transition and on creating solutions with the new tech, rather than covering our eyes and crying until it’s too late.

      • Eager Eagle@lemmy.world · edited · 3 months ago

        As for manipulated videos as legal evidence: if these products push for authenticity measures for security footage at the hardware/capture level, that’s a good thing. Adobe is just commoditizing what’s already been possible for some time now.
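        The capture-level authenticity idea above can be sketched as a camera signing a hash of each frame the moment it is recorded, so later tampering is detectable. This is a minimal illustration using a shared secret (HMAC); real schemes such as C2PA content credentials use per-device public-key certificates instead, and the key and data here are made up for the example.

```python
import hashlib
import hmac

# Hypothetical device secret for illustration only; real capture-level
# schemes embed per-device asymmetric keys at manufacture.
DEVICE_KEY = b"example-device-key"

def sign_capture(frame_bytes: bytes) -> str:
    """Sign the SHA-256 hash of a captured frame at capture time."""
    digest = hashlib.sha256(frame_bytes).digest()
    return hmac.new(DEVICE_KEY, digest, hashlib.sha256).hexdigest()

def verify_capture(frame_bytes: bytes, signature: str) -> bool:
    """Anyone holding the key can later check the frame is unmodified."""
    return hmac.compare_digest(sign_capture(frame_bytes), signature)

original = b"\x00\x01\x02 raw frame data"
tag = sign_capture(original)
print(verify_capture(original, tag))                # True
print(verify_capture(original + b"tampered", tag))  # False
```

        The point of doing this in hardware is that the signing key never leaves the sensor, so a signature is evidence the pixels came off that device unaltered.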

        • BrianTheeBiscuiteer@lemmy.world · 3 months ago

          If trends continue, open-source solutions will be at this level within a year if not months. At that point you’re free to “watermark” the content or not.

      • Grimy@lemmy.world · 3 months ago

        The jobs will be replaced by the indie companies this tech will help foster. A few years ago it was fantasy to put out quality products that could rival Hollywood or AAA game companies. That gap is quickly being bridged.

  • randon31415@lemmy.world · 3 months ago

    The article: Adobe did a thing. It was AI. AI looks real. Fin.

    Are those two paragraphs the entire article? Or is there more buried under the ads?

  • Adalast@lemmy.world · 3 months ago

    Can we take a minute and stop to assess where Adobe is obtaining its training data? Everyone is up in arms about the OpenAI devs scraping DA and such, but Adobe is 100% training on the entirety of Behance and the Adobe Cloud: things that are not public, personal files we never intended others to see, our private albums of our children or our wives/husbands/partners, or parts of NDA-restricted projects that are stored in Adobe Cloud automatically and are supposedly not in violation of our NDAs.

    Where are the pitchforks? Where is the outrage? This is 1000x worse than some desperate AI engineer staring at a publicly visible, publicly available training set, already tagged and described in detail, that was begging to be used. People lost their shit over that one. Why does Adobe get a pass?

    • Echo Dot@feddit.uk · 3 months ago

      Aren’t they training it exclusively on their own data sets, which presumably they already own the licenses for?

      • Adalast@lemmy.world · 3 months ago

        Their Creative Cloud license:

        That little “derivative works” bit in the middle gives them license to use the files stored in Creative Cloud to train AIs. So yes, they are using data sets that they have a license for; it just happens to be our data that they took the license on, and we paid them to do it.

      • Adalast@lemmy.world · 3 months ago

        That’s fun; glad to see they are paying people now. What I didn’t see in there is when, during the years-long process it takes to develop toolsets and train checkpoints, they paid for the rights to create derivative works. The article is dated a few days ago and written in the present tense: they are paying NOW. The AI is trained. The tool is built. It takes tens of thousands of images to train a generative model from scratch; I would expect decades of footage for a video model. So if the model is already trained, and the paying is new…?

        Also, they don’t have to ask, or pay… they already have the rights to all content stored in Creative Cloud (see the Adobe Creative Cloud EULA).

        Legally, AI training is a “derivative work”, so before I would believe that the greedy gaping maw that is Adobe did not simply use the millions of images and thousands of years of footage they have a legal right to use, and that THEY are actually PAID for, I would need a letter from the lead engineers on the AI dev team at Adobe, signed by every dev who worked on it, stating that they used only paid training material at every stage of development of the tools, disseminated separately from any official Adobe channel. They know they can afford to pay now because it’s a drop in the bucket compared to Creative Cloud fees, and it makes for great PR and an even better smokescreen. There is precisely zero chance they will receive enough good, usable footage through this program to train an AI from scratch.