
Just out of curiosity, in what sense is Codex better trained than CodeGen?



OpenAI hasn't said exactly how they trained code-davinci-002, so this is speculative, but I'm reasonably sure it was trained on more data and more languages than CodeGen, and for longer. It was also trained using fill-in-the-middle [1].

[1] https://arxiv.org/abs/2207.14255
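For anyone unfamiliar with fill-in-the-middle: the paper's core idea is a data transformation where a fraction of training documents are split into (prefix, middle, suffix) and rearranged so the model learns to generate the middle given the surrounding context. A minimal sketch of the PSM ("prefix-suffix-middle") variant, using placeholder sentinel names since the exact tokens used for code-davinci-002 aren't public:

```python
import random

def fim_transform(document: str, fim_rate: float = 0.5) -> str:
    """Randomly rearrange a training document into FIM (PSM) order.

    Sentinel strings here are illustrative placeholders; real training
    uses dedicated special tokens added to the vocabulary.
    """
    if random.random() > fim_rate:
        # Leave a fraction of documents in ordinary left-to-right order,
        # so the model retains normal autoregressive ability.
        return document
    # Pick two random split points, dividing the text into
    # prefix / middle / suffix spans.
    i, j = sorted(random.sample(range(len(document) + 1), 2))
    prefix, middle, suffix = document[:i], document[i:j], document[j:]
    # PSM order: the model conditions on prefix and suffix, then
    # learns to produce the middle, terminated by an end token.
    return f"<PRE>{prefix}<SUF>{suffix}<MID>{middle}<EOT>"
```

At inference time you feed `<PRE>…<SUF>…<MID>` and let the model complete until `<EOT>`, which is what makes infilling (e.g. editing the middle of a file) possible with a purely left-to-right model.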



