I use LLMs before search, especially when I'm exploring all possibilities; they usually give me some good leads.
I can usually tell when it's going to be accurate and when it's going to lie to me, and I lean on tools for calculations, time awareness, and web search to counter the lies.
The same can be said about search results. With search results, you have to use your brain to determine what is correct and what is not. Now imagine using those same brain cells to decide whether the AI's answer needs a check.
AI is just another way to process the search results, one that happens to give you the correct answer up front most of the time. If you blindly trust it, that's on you.
So, if it isn’t important, you just want an answer, and you don’t care whether it’s correct or not?
With the search results, you know what the sources are. With AI, you don’t.
If you knew what the sources were, you wouldn't have needed to search in the first place. And just because something is on a reputable website doesn't make it legit. You still have to reason.