
Hallucination is definitely a problem, but it can be somewhat mitigated by good prompting. GPT-4 seems less prone to hallucination, and this should improve over time.

You can view the prompts used for generating docs here [1] and the prompts used for answering questions here [2].

[1] https://github.com/context-labs/autodoc/blob/master/src/cli/...
[2] https://github.com/context-labs/autodoc/blob/master/src/cli/...
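
For a rough idea of the "good prompting" approach, here is a minimal sketch of a grounding prompt in TypeScript. This is not the actual autodoc prompt text; the function name, wording, and parameters are hypothetical, just to show how constraining the model to the provided context helps cut down hallucination:

    // Hypothetical sketch, not autodoc's real prompt. The idea is to force the
    // model to answer only from retrieved context and to admit when it can't.
    function buildQuestionPrompt(projectName: string, context: string, question: string): string {
      return [
        `You are an AI assistant answering questions about the ${projectName} codebase.`,
        `Answer using ONLY the context below. If the context does not contain the answer,`,
        `say "I don't know" rather than guessing.`,
        ``,
        `Context:`,
        context,
        ``,
        `Question: ${question}`,
      ].join('\n');
    }

    // Example usage:
    // const prompt = buildQuestionPrompt('autodoc', retrievedDocs, 'How does indexing work?');
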



