The human in the loop isn’t any kind of moat for humanity, and shrinking the need for a human will be the goal of any engineer building these systems. Some believe hallucinations are inescapable given the design of LLMs, but at the same time we see systems improving so that hallucinations become rarer… when will the rate be on par with humans?
