Sjmarf@sh.itjust.works to Lemmy Shitpost@lemmy.world · 12 days ago
How to clean a rescued pigeon (sh.itjust.works, image, 55 comments)
HighlyRegardedArtist@lemmy.world · 12 days ago
Or, hear me out, there was NO figuring of any kind, just some magic LLM autocomplete bullshit. How hard is this to understand?
Rivalarrival@lemmy.today · 11 days ago
You say this like human “figuring” isn’t some “autocomplete bullshit”.
HighlyRegardedArtist@lemmy.world · 11 days ago
You can play with words all you like, but that’s not going to change the fact that LLMs fail at reasoning. See this Wired article, for example.
Rivalarrival@lemmy.today · 11 days ago (edited)
My point wasn’t that LLMs are capable of reasoning. My point was that the human capacity for reasoning is grossly overrated.
The core of human reasoning is simple pattern matching: regurgitating what we have previously observed. That’s what LLMs do well.
LLMs are basically at the toddler stage of development, but with an extraordinary vocabulary.