LLMs certainly hold potential, but as we’ve seen time and time again in tech over the last fifteen years, the hype and greed of unethical pitchmen have gotten way out ahead of the actual locomotive. A lot of people in “tech” are interested in money, not tech. And they’re increasingly making decisions based on how to drum up investment dollars, grab press attention, and bump stock prices, not on actually improving anything.

The result has been a ridiculous parade of rushed “AI” implementations focused more on cutting corners, undermining labor, or drumming up sexy headlines than on improving lives. The resulting hype cycle isn’t just building unrealistic expectations and tarnishing brands; it’s also distracting many tech companies from foundational reality and more practical, meaningful ideas.

  • Etterra@lemmy.world · 4 months ago

    A buddy of mine made bacon ice cream once, but um… I think they did it wrong. It was bad. Really really bad.

  • Muffi@programming.dev · 4 months ago

    I think we are about to experience a true Butlerian Jihad. Not because of the fearsome power of AI, but because of the hatred of shitty LLMs.

  • sunzu@kbin.run · 4 months ago

    Damn… I guess they’re gonna have to hire shitty organics still to sell that disgusting slop?!

    Womp womp

  • Belastend@lemmy.world · 4 months ago

    There’s a cute program called Goblin Chef. If you feed it ingredients, along with amounts and the number of people to cook for, it spits out some neat recipes.

    But it specifically warns you that it can’t actually taste things. If you list ice and bacon, it’ll probably combine those two into a dish. (Although now it doesn’t recognize “one fresh kitten” as an ingredient anymore q.q)

    • Wogi@lemmy.world · 4 months ago

      Bacon and ice cream go great together and I refuse to pretend they don’t.

      I still miss midnight snack ice cream. Potato chips covered in chocolate. Delicious

    • SecretSauces@lemmy.world · 4 months ago

      I’m pretty sure some of those fancy restaurants that pop up everywhere already do this. They’ll put bacon on anything

  • AlternateRoute@lemmy.ca · 4 months ago

    McDonald’s already has customer self-serve kiosks and mobile apps with the full menu that limit which items you can add or remove.

    How did they screw this up and leave things open-ended for the LLM?

    I.e., why wasn’t the LLM referencing a list of valid options with every request and then replying with what the possible options are? That’s something LLMs can actually do fairly well. Then layer on top the EXACT same HARD constraints they already have on the kiosk and mobile app to ensure orders are valid.
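A minimal sketch of the hard-constraint layer being described here: the LLM only proposes a structured order, and the same rules the kiosk already enforces validate it before anything reaches the till. The menu, item names, and limits below are all hypothetical.

```python
# Hypothetical menu with the kinds of hard constraints a kiosk enforces.
MENU = {
    "big mac": {"max_qty": 10, "modifiers": {"lettuce", "tomato", "cheese", "onion"}},
    "mcflurry": {"max_qty": 5, "modifiers": {"oreo", "m&m"}},
}

def validate_order(items):
    """Reject anything the kiosk would reject: unknown items,
    absurd quantities, modifiers that don't apply to the item.
    `items` is a list of (name, quantity, modifiers) tuples."""
    errors = []
    for item, qty, mods in items:
        spec = MENU.get(item.lower())
        if spec is None:
            errors.append(f"unknown item: {item}")
        elif not (1 <= qty <= spec["max_qty"]):
            errors.append(f"invalid quantity for {item}: {qty}")
        else:
            bad = set(m.lower() for m in mods) - spec["modifiers"]
            if bad:
                errors.append(f"invalid modifiers for {item}: {sorted(bad)}")
    return errors

# The infamous failure mode never reaches the till:
print(validate_order([("McFlurry", 200, [])]))
# → ['invalid quantity for McFlurry: 200']
```

The point is that the LLM is only doing language understanding; the validity of the final order is decided by the same deterministic rules the apps already use.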

      • AlternateRoute@lemmy.ca · 4 months ago

        The self-serve terminals and apps actually work well. I prefer using them over ordering at the counter.

        So yeah, I’m surprised they rolled this out so poorly.

    • mosiacmango@lemm.ee · 4 months ago

      That wouldn’t even need AI. That’s just a fancy switch statement with a pleasant voice.

      • AlternateRoute@lemmy.ca · 4 months ago

        An LLM can somewhat smooth over variances in language without having to know every possible variant in advance; it only needs the valid options and the raw input.

        • I would like a Big Mac, no lettuce, no tomato, no cheese.
        • I would like a Big Mac, no veggies, hold the cheese.
        • I would like a Big Mac, no veggies, no dairy
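The mapping those three phrasings suggest can be sketched without any LLM at all: loose natural-language removals get normalized onto the fixed modifier list the kiosk already knows about. The category expansions below (“veggies”, “dairy”) are hypothetical assumptions, not McDonald’s actual menu data.

```python
# Hypothetical canonical modifiers and category words that expand to them.
CANONICAL = {"lettuce", "tomato", "cheese", "onion", "pickles"}
CATEGORIES = {
    "veggies": {"lettuce", "tomato", "onion", "pickles"},
    "dairy": {"cheese"},
}

def normalize_removals(phrases):
    """Map phrases like 'no veggies' or 'hold the cheese' onto the
    canonical set of removed ingredients."""
    removed = set()
    for p in phrases:
        word = p.lower().removeprefix("no ").removeprefix("hold the ").strip()
        if word in CANONICAL:
            removed.add(word)
        elif word in CATEGORIES:
            removed |= CATEGORIES[word]
        # Anything else would trigger a clarification question
        # rather than a guess.
    return removed
```

In practice the LLM’s job would just be producing the `phrases` list from messy speech; everything after that stays deterministic.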
          • AlternateRoute@lemmy.ca · 4 months ago

            Natural language is really messy… a request could go through many variants of the same thing. Then you get speech-to-text issues due to audio quality and accents… And you need an engine that can best-guess/best-match based on what it has, or ask for clarification.

            Similarly, you can ask for TWO of a complex thing: “I would like two… meals, with, XXXX”

    • CarbonatedPastaSauce@lemmy.world · 4 months ago

      Because most people, including those implementing this shit, have no idea how LLMs work, or their limitations. I see it every day at my job. I have given up trying to patiently explain why they are having issues.

  • Eager Eagle@lemmy.world · 4 months ago

    Well, what’s the problem? They have bacon and they have ice cr… oh, I see the error now. Just add a generic response that the ice cream machine is broken and move on!

  • helenslunch@feddit.nl · edited · 4 months ago

    AI has a few potentially super useful applications, like an “assistant” such as Apple is making, or customer support/troubleshooting (after being trained on first-party documents), or drafting and replying to emails and such. They could certainly be used to collect orders in a drive-thru or elsewhere. But these companies just keep trying to shoehorn them in everywhere. We are very much in the midst of an AI “bubble,” and it’s absolutely astonishing to me how many people seem unable to see that.

  • db2@lemmy.world · 4 months ago

    But that sounds delicious… completely unbelievable that their ice cream machine was working though.

    • Lost_My_Mind@lemmy.world · 4 months ago

      After I’ve seen videos of how infrequently those machines are cleaned, and the ice machines for their drinks too, I just don’t want anything from these fast food places.

      Basically, mold grows inside the hard-to-clean areas, which never get accessed, and then you trust a bunch of immature teenagers to clean to a proper specification?

      Nope. The end result is mold.

  • FaceDeer@fedia.io · 4 months ago

    You get out ahead of the locomotive knowing that most of the directions you go aren’t going to pan out. The point is that the guy who happens to pick correctly will win big by getting there first. Nothing wrong with making the attempt and getting it wrong, as long as you factored that risk in (as McDonald’s seems to have done, given that this hasn’t harmed them).

    • AnalogyAddict@lemmy.world · edited · 4 months ago

      The thing most companies are missing is designing the AI experience. What happens when it fails? Are we making options available for those who want a standard experience? Do we even have an elegant feedback loop to mark when it fails? Are we accounting for different pitches and accents? How about speech impediments?

      I’m a designer focusing on AI, but a lot of companies haven’t even realized they need a designer for this. It’s like we’re the conscience of tech, and listened to about as often.

  • tal@lemmy.today · 4 months ago

    Five Guys does milkshakes with bacon. I’d think that bacon ice cream would work.

  • bstix@feddit.dk · 4 months ago

    Those mistakes would be easily solved by something that doesn’t even need to think. Just add a filter of acceptable orders, or hire a low-wage human who does not give a shit about the customers’ special orders.

    In general, AI really needs to set some boundaries. “No” is a perfectly good answer, but it doesn’t ever do that, does it?

    • Lvxferre@mander.xyz · 4 months ago

      Those mistakes would be easily solved by something that doesn’t even need to think. Just add a filter of acceptable orders, or hire a low-wage human who does not give a shit about the customers’ special orders.

      That wouldn’t address the bulk of the issue, only the most egregious examples of it.

      For every funny output like “I asked for 1 ice cream, it’s giving me 200 burgers”, there’s likely tens, hundreds, thousands of outputs like “I asked for 1 ice cream, it’s giving 1 burger”, that sound sensible but are still the same problem.

      It’s simply the wrong tool for the job. Using LLMs here is like hammering screws, or screwdriving nails. LLMs are a decent tool for tasks that you can supervise (not the case here), or where a large number of false positives and negatives is not a big deal (not the case here either).