• sunbeam60@lemmy.one · 2 points · 9 months ago

        Ugh. You’re probably right. Finally all those idiots who come up to me going “I’ve got a great idea for an app” will actually be able to release their great idea :)

        I used to be able to say “ideas are easy, work is hard”. Soon I won’t be able to.

        • TechNom (nobody)@programming.dev · 2 points · 9 months ago

          I’ve yet to hear anyone claim that ChatGPT can navigate the complex series of design decisions needed to create a cohesive app (unless, of course, it was trained on something exactly the same). Many people report spending an inordinate amount of time rectifying the mistakes these LLMs make. It sounds like glorified autofill (I haven’t used them yet). I shudder to think about the future of the software ecosystem if an entire generation is trained to rely entirely on LLMs to create code.

          • PM_Your_Nudes_Please@lemmy.world · 3 points · 9 months ago

            LLMs are great for writing code in small snippets. I’ve used one for quickly writing batch files, for instance, when I couldn’t be bothered to look up how to format something obscure. So I let an LLM like ChatGPT do the bulk of the work, then I just double-check what it gave me.
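
            To show the kind of snippet I mean, here is a hypothetical C
            example (standing in for an actual batch file, purely for
            illustration):

                #include <stdio.h>

                /* Hypothetical example: zero-padded, fixed-width hex output.
                   Format-specifier trivia like this is the sort of thing worth
                   delegating to an LLM and then verifying by hand. */
                int main(void) {
                    unsigned int addr = 48879;      /* 0xBEEF */
                    printf("0x%08X\n", addr);       /* prints 0x0000BEEF */
                    return 0;
                }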

            I wouldn’t use it for anything over ~100 lines at a time. Just like in long conversations, it has a tendency to “lose the plot” and forget things it said early on: as more gets added to the conversation, it has to parse more and more data, so it drifts off topic the longer the conversation runs.

            It can also be handy for debugging sections of code, because programming is just a form of language with strict grammar/diction/spelling rules, and an LLM is really, really good at spotting stupid grammar mistakes. It’ll instantly notice your missing semicolon and point it out to you, which can save you a ton of frustration.
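
            For example, a toy C snippet (not real code from anywhere, just an
            illustration of the class of mistake):

                #include <stdio.h>

                int main(void) {
                    int total = 42;   /* delete this semicolon and the compiler
                                         complains about the NEXT line; the kind
                                         of mechanical slip an LLM flags at a
                                         glance */
                    printf("total = %d\n", total);
                    return 0;
                }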

            As with any tool, how well it works is entirely up to the user. It will likely progress to the point of managing longer code eventually, but right now it’s still incredibly useful as long as you accept its limitations and work within them.

          • sunbeam60@lemmy.one · 2 points · 9 months ago

            I think you’re right at the minute. Whether you’ll still be right in the future, I’m less certain.

      • Swedneck@discuss.tchncs.de · 2 points · 9 months ago

        and they’re going to be precisely as nonsensical as those AI articles are

        sure, you can get good output from LLMs, but companies are absolutely not going to bother putting in the effort to do so, as not putting in effort is the entire point.

        it’s at least nice to know that corporations will enshittify themselves out of existence, while one guy living in a basement will silently release something they poured their soul into and it’ll sell 5 billion copies within the hour

    • Milk_Sheikh@lemm.ee · 16 points · 9 months ago

      AI for the heavy lifting, some poor overworked freelancer overseas fixes issues and refines, and then maybe, mayyyybe a domestic review team of senior coders for pen/security testing.

      !remindme 2030

    • ForgotAboutDre@lemmy.world · 10 points · 9 months ago

      People wrote software before there were computers for them to grow up with. They’ll be able to develop these skills at universities, colleges, coding courses, or online.

      I grew up prior to the app world. My exposure to computing during high school was Word, Excel, Access, and once we used PowerPoint. Nothing’s changed: people are only taught what their teachers know.

      • TechNom (nobody)@programming.dev · 1 point · 9 months ago

        I started from a similar background in school: learning from books in the library and coding on a sheet of paper. Opportunities to try that on a real computer were hard to come by. Some teachers helped by pitching in to get me a few hours in the school lab. Those who like it start learning well before the resources become available; you don’t need to wait until undergrad to gain those skills.

        That said, how often do you see kids these days using a real general-purpose computer suitable for coding, like a desktop or laptop? Not phones, Chromebooks, or tablets. In fact, it’s bewildering these days to see programming tutorials start by stating that you need such a device; back in the day, that was a given. And the other stories here don’t paint a good picture.

        • ForgotAboutDre@lemmy.world · 2 points · 9 months ago

          It’s probably the same amount as before. Phones and tablets haven’t had a big effect on the number of general-purpose computers, and there are devices today, like the Raspberry Pi and Arduino, that fill the same niche older general-purpose computers did.

          You assume things are different and must be worse. That’s a take as old as time: Socrates complained about the youth no longer taking their studies as seriously as his generation did. The world would have fallen into complete chaos if that were ever true. It’s the conservative myth that things were better and can only get worse.

          Kids today can access websites that tell them a general-purpose computer is needed; in the past, they would have had to rely on textbooks and magazines for the same information. That was a much bigger barrier, since even identifying which books or magazines you needed was hard.