Personally, I don't really trust LLMs to synthesize disparate sources.
The #1 best use case for LLMs is using them as extremely powerful fuzzy searchers over very large datasets, for things like hunting down published papers on a topic.
Don't actually use their output as the basis for reasoning; use it to find the original articles.
For example, as a software dev, I often use them to track down the specific documentation I need. I then go read the actual documentation, but the LLM is exceptionally fast at locating the right document for me.
Basically, using them as a lookup tool for finding sources is the key, and it's why I was able to find documentation on my pet's symptoms so fast. It would have taken me ages to find those esoteric published papers on my own: there's so much to sift through, especially when many papers cover huge amounts of info and what I'm looking for is one small piece of info buried in one of them.
But with an LLM I can trim the search space down to a far smaller set instantly, and then go through that by hand. Thousands of papers turn into a couple in a matter of seconds.
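To make the "trim the search space" step concrete, here's a minimal sketch of the kind of loop I mean, assuming the OpenAI Python client and a hypothetical local list of paper titles and abstracts. The model name, prompt wording, and example query are all placeholders, not recommendations:

```python
# Rough sketch: use an LLM only to shortlist candidate papers,
# then read the survivors by hand. Assumes the OpenAI Python client
# and that OPENAI_API_KEY is set in the environment.
from openai import OpenAI

client = OpenAI()

# Hypothetical query and candidate list; in practice these would come
# from whatever index or export of papers you already have.
question = "skin rippling in cats triggered by light touch"
papers = [
    {"title": "A survey of dermatologic signs in cats", "abstract": "..."},
    # ... thousands more entries
]

shortlist = []
for paper in papers:
    prompt = (
        f"Question: {question}\n"
        f"Title: {paper['title']}\n"
        f"Abstract: {paper['abstract']}\n"
        "Does this paper plausibly discuss the question? Answer YES or NO."
    )
    reply = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    if reply.choices[0].message.content.strip().upper().startswith("YES"):
        shortlist.append(paper["title"])

# The shortlist is what I actually read; the model's yes/no judgements
# only narrow the pile, they are never used as the answer itself.
print(shortlist)
```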