
  > I've taken to asking it to "skip the mediocre nonsense and return the good solution on the first try".
Is that actually how you're prompting it? Does that actually give better results?



Stuff like this working is why you get odd situations like "don't hallucinate" actually producing fewer hallucinations. To me it's one of the most interesting things about LLMs.
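A minimal sketch of the pattern being discussed: the two prompts differ only in an added system-level nudge, so any difference in output can be attributed to it. The instruction strings and `build_messages` helper are illustrative only; the message format is just the common chat-message shape most LLM APIs accept, not any specific vendor's API.

```python
# Hypothetical comparison of a baseline prompt vs. one with the kind of
# nudge described in the thread ("skip the mediocre nonsense...",
# "don't hallucinate"). Nothing here calls a real model.

BASELINE_SYSTEM = "You are a helpful coding assistant."
NUDGED_SYSTEM = (
    "You are a helpful coding assistant. "
    "Skip the mediocre nonsense and return the good solution on the first try. "
    "Don't hallucinate."
)

def build_messages(system: str, user: str) -> list[dict]:
    """Assemble a chat-style message list (the format most LLM APIs accept)."""
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]

question = "Write a function that deduplicates a list while preserving order."
baseline = build_messages(BASELINE_SYSTEM, question)
nudged = build_messages(NUDGED_SYSTEM, question)

# The user message is identical in both; only the system instruction differs,
# which is what makes an informal A/B comparison of outputs meaningful.
assert baseline[1] == nudged[1]
assert baseline[0] != nudged[0]
```

Sending both message lists to the same model and comparing the answers side by side is the informal test the commenters are describing.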





