- cross-posted to:
- technology@lemmy.world
Yeah, and the way it will confidently give you a wrong answer instead of either asking for more information or saying it just doesn’t know is equally annoying.
Because giving answers is not an LLM’s job. An LLM’s job is to generate text that looks like an answer. We then try to coax that framework into generating correct answers as often as possible, with mixed results.
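A tiny sketch of that point in Python, assuming the Hugging Face `transformers` library and the small public `gpt2` checkpoint (both just example choices): all the model actually produces is a probability distribution over possible next tokens, so the “answer” is simply whichever continuation scores as most plausible, true or not.

```python
# Minimal sketch: an LLM scores next tokens by plausibility,
# with no built-in notion of whether the continuation is true.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The capital of Australia is"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    # Scores for every possible next token at the last position
    logits = model(**inputs).logits[0, -1]

# The model never "answers"; it just ranks continuations by likelihood.
top = torch.topk(logits.softmax(dim=-1), k=5)
for prob, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode(int(token_id)):>12}  p={prob:.3f}")
```

Everything downstream (instruction tuning, RLHF, retrieval) is layered on top of that ranking to push the plausible-looking text toward being correct, which is why the failures look confident rather than uncertain.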