If anyone wants to recreate this, I’m 99% sure the best way to go about it is:
- Buy two identical fluffy toys.
- Take pictures of cute little puppy with fluffy toy.
- Hide one of the toys away for 1-2 years.
- Take pictures of cute doggo with the not-destroyed fluffy toy.
I don’t know… I’ve been using ChatGPT-4. I only use it where the factual accuracy of its output doesn’t matter. It’s good when I need help with language-related things, more as a writing assistant. Creative stuff is also OK, sometimes even impressive.
With facts? On moderately complicated topics? I’d say it gets something subtly wrong about 80% of the time, and something very obviously wrong the other 20%. The latter isn’t the problem.
I don’t understand where the “intelligent” part would even come in. Sure, it takes a fair level of intelligence to understand and generate human language. But, to me, everything I’ve seen fits a simpler description: it generates responses that seem plausible as responses to the input.
If intelligence requires some deeper understanding of the world, of facts and the relationships between them, then I don’t see it. When it looks like that understanding is there, it’s a coincidence. It’s impressive how often that happens, but that’s still all it is.