Sjmarf@sh.itjust.works to Lemmy Shitpost@lemmy.world · 12 days ago

How to clean a rescued pigeon (image, sh.itjust.works) · 55 comments
Kecessa@sh.itjust.works · 12 days ago (edited)

Pigeon = edible bird

Cleaning a bird > preparing a bird after killing it (hunting term)

AI figured the “rescued” part was either a mistake or that the person wanted to eat a bird they rescued

If you search for “how to clean a dirty bird” you give it better context and it comes up with a better reply
DannyBoy@sh.itjust.works · 11 days ago (edited)

The context is clear to a human. If an LLM is giving advice to everybody who asks a question on Google, it needs to do a much better job of giving responses.
HighlyRegardedArtist@lemmy.world · 12 days ago

Or, hear me out, there was NO figuring of any kind, just some magic LLM autocomplete bullshit. How hard is this to understand?
Rivalarrival@lemmy.today · 11 days ago

You say this like human “figuring” isn’t some “autocomplete bullshit”.
HighlyRegardedArtist@lemmy.world · 11 days ago

You can play with words all you like, but that’s not going to change the fact that LLMs fail at reasoning. See this Wired article, for example.
Rivalarrival@lemmy.today · 11 days ago (edited)

My point wasn’t that LLMs are capable of reasoning. My point was that the human capacity for reasoning is grossly overrated. The core of human reasoning is simple pattern matching: regurgitating what we have previously observed. That’s what LLMs do well.

LLMs are basically at the toddler stage of development, but with an extraordinary vocabulary.
FlorianSimon@sh.itjust.works · 12 days ago (edited)

I like how you’re making excuses for something that is very clear in context. I thought AI was great at picking up context?
iAmTheTot@sh.itjust.works · 11 days ago

I don’t think they are really “making excuses”, just explaining how the search came up with those steps, which is what the OP is so confused about.
“You’re holding it wrong”
huginn@feddit.it · 11 days ago

Let me take the tag off my bird then snap its wings back together
deleted by creator