- cross-posted to:
- [email protected]
- [email protected]
A “natural language query” search engine is what I need sometimes.
Edit: directly reachable with the !ai bang
You are missing the point. You don’t have to become a subject expert to verify information. Not all sources are equal: some are incorrect on purpose, some are incorrect due to lax standards. As a thinking human being, you can decide to trust one source over another. But an LLM treats all the information it was trained on as equally correct. So it can generate factually incorrect information while presenting it to you as 100% factually correct.
Using an LLM as a shortcut to find something is like playing Russian roulette: you might get correct information 5 out of 6 times, but that one other time it is guaranteed to be incorrect.
No, I understood that. That’s why I said “if sourced ethically & responsibly.”