• TropicalDingdong@lemmy.world

    It’s like the least popular opinion I have here on Lemmy, but I assure you, this is the beginning.

    Yes, we’ll see a dotcom-style bust. But it’s not like the world today wasn’t literally invented in that era. Do you remember where image generation was three years ago? It was a complete joke compared to a year ago, and today, fuck, no one here would know.

    When code generation goes through that same cycle, you’ll be able to put out an idea in plain language and get back code that just “does” it.

    I have no idea what that means for the future of my humanity.

    • Grandwolf319@sh.itjust.works

      I agree with you but not for the reason you think.

      I think the golden age of ML is right around the corner, but it won’t be AGI.

      It’ll be image recognition and video upscaling, you know, the boring stuff that isn’t game-changing but is possibly useful.

      • zbyte64@awful.systems

        I feel the same about the code generation stuff. What I really want is a tool that suggests better variable names.
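
        Something like this, roughly: a minimal sketch, assuming the OpenAI Python client purely for concreteness; the model name, prompt wording, and client setup are placeholders for whatever you’d actually wire in.

        ```python
        # Rough sketch of a rename suggester: hand the model a snippet and ask
        # only for identifier suggestions, nothing else.
        # Model name, prompt, and client setup are placeholders.
        from openai import OpenAI

        client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

        def suggest_variable_names(snippet: str) -> str:
            """Return the model's rename suggestions for one code snippet."""
            prompt = (
                "Suggest clearer variable names for this code. "
                "Reply only with 'old_name -> new_name' pairs, one per line.\n\n"
                + snippet
            )
            response = client.chat.completions.create(
                model="gpt-4o-mini",  # placeholder model
                messages=[{"role": "user", "content": prompt}],
            )
            return response.choices[0].message.content

        if __name__ == "__main__":
            print(suggest_variable_names("def f(x, y):\n    z = x * y * 0.0825\n    return z"))
        ```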

    • rottingleaf@lemmy.world

      you can put out an idea in plain language, and get back code that just “does” it

      No, you can’t. Simplifying it grossly:

      They can’t do the lowest-level, dumbest-detail, hair-splitting, “there is no spoon” kind of solution, the “this is just correct no matter how much you blabber in the opposite direction, and this is just wrong no matter how much you blabber to support it” kind.

      And that happens to be the main requirement that makes a task worth a software developer’s time.

      We need software developers to write computer programs because “a general idea”, even in a formalized language, is not sufficient; you need to address the details of actual reality. That is the bottleneck.

      This technology widens the passage in places that were never the bottleneck in the first place.
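
      Concretely: the “general idea” is one line, and the actual requirement is all the hair-splitting underneath it. A toy sketch (the rounding and validation rules here are invented, purely to show where the real work lives):

      ```python
      # The "general idea": round a price to the nearest cent.
      # The actual requirement is every detail underneath it.
      # (The specific rules below are invented for illustration.)
      from decimal import Decimal, ROUND_HALF_UP, InvalidOperation

      def round_price(raw: str) -> Decimal:
          """Round a price string to cents, per rules the one-line idea never states."""
          try:
              value = Decimal(raw.strip().replace(",", ""))  # "1,234.565" shows up in real data
          except InvalidOperation:
              raise ValueError(f"not a price: {raw!r}")
          if value < 0:
              raise ValueError("negative prices are rejected, not silently zeroed")
          # round(float(...), 2) would drift on values like 2.675; say the business wants half-up.
          return value.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

      assert round_price(" 2.675 ") == Decimal("2.68")      # plain float rounding gives 2.67
      assert round_price("1,234.565") == Decimal("1234.57")
      ```

      None of those decisions are in the one-line idea; somebody has to make them.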

      • TropicalDingdong@lemmy.world

        I think you live in a nonsense world. I literally use it every day, and yes, sometimes it’s shit and it’s bad at anything that requires even a modicum of creativity. But 90% of shit doesn’t require a modicum of creativity. And my point isn’t about where we’re at; it’s about how far the same tech has progressed on a domain-adjacent task in three years.

        Lemmy has a “dismiss AI” fetish, and it indulges it at its own peril.

        • rottingleaf@lemmy.world

          Are you a software developer? Or a hardware engineer? EDIT: Or anyone credibly placed to evaluate my nonsense world against yours?

            • rottingleaf@lemmy.world

              So close, but not there.

              OK, you’ll know that I’m right once you expand your expertise somewhat into neighboring areas. It should happen naturally.

            • hark@lemmy.world

              That explains your optimism. Code generation is at a stage where it slaps together Stack Overflow answers and code ripped off from GitHub for you. While that is quite effective at letting even a crappy programmer cobble together something that barely works, it is a far cry from having just anyone put out an idea in plain language and get back code that just does it. A programmer is still needed in the loop.

              I’m sure I don’t have to explain to you that AI development over the decades has often reached plateaus where the approach needed to change significantly for progress to continue, and it could certainly be the case that LLMs (at least as they are developed now) aren’t enough to accomplish what you describe.