
> Now I'd be interested to see GPT-3 trained on code samples from open-source repositories. Would it compile?

Check out:

https://www.tabnine.com/blog/deep

FTA:

"Deep TabNine is trained on around 2 million files from GitHub. During training, its goal is to predict each token given the tokens that come before it....Deep TabNine is based on GPT-2."

So this is GPT-2, not GPT-3, and it's designed for line-by-line autocompletion, but given the way things are headed, I gather the answer to your first question is approaching "yes"...
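For what it's worth, the "predict each token given the tokens that come before it" objective in the quote is just standard causal language modeling applied to source code. Here's a minimal sketch of that objective on a toy snippet, using Hugging Face's off-the-shelf GPT-2 (the "gpt2" checkpoint, the tokenizer, and the toy snippet are my own assumptions for illustration, not anything from TabNine's actual pipeline):

    from transformers import GPT2Tokenizer, GPT2LMHeadModel

    # Off-the-shelf GPT-2; Deep TabNine trains on ~2M GitHub files instead.
    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    # Source code is treated as a plain token stream, like natural-language text.
    snippet = "def add(a, b):\n    return a + b\n"
    inputs = tokenizer(snippet, return_tensors="pt")

    # Passing labels=input_ids makes the model compute the causal LM loss:
    # at each position, predict the next token given everything before it.
    outputs = model(**inputs, labels=inputs["input_ids"])
    print(outputs.loss.item())

Fine-tuning that loss over a large pile of repositories is, roughly, the recipe; the autocompletion product is then just sampling continuations from the trained model.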
