
Yes, the most likely completion, but not necessarily from the same writer. The model learns from whatever text it has seen, in all kinds of formats, including interviews. So when you pose a question, the most likely next sentence is often the answer (as if from another person). To make that explicit, just put a prefix like "Q1? A1. Q2? A2. Q3?", where Q3 is your question, and let it auto-complete. The most likely completion is then "A3".
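
A minimal sketch of that prefix trick, assuming the Python openai client as it existed around the GPT-3 beta (the example questions and engine name are just placeholders):

    import openai
    openai.api_key = "..."  # your API key

    # Few-shot Q/A pairs stitched into one prompt, followed by the real question.
    examples = [
        ("What is the capital of France?", "Paris."),
        ("Who wrote Hamlet?", "Shakespeare."),
    ]
    question = "What is the boiling point of water at sea level?"
    prompt = "".join(f"Q: {q}\nA: {a}\n" for q, a in examples) + f"Q: {question}\nA:"

    # The most likely continuation after the final "A:" should be the answer ("A3").
    resp = openai.Completion.create(engine="davinci", prompt=prompt,
                                    max_tokens=32, stop="\n")
    print(resp.choices[0].text.strip())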



Now I'd be interested to see GPT-3 trained on code samples from open-source repositories. Would it compile?

Could it output a decent implementation of an algorithm if you were to feed it the comment describing it? How about more general statements about inputs and outputs?

The holy grail would be to code just by describing what you expect the code to do: a few plain-language sentences (maybe a more structured subset?) stitched together, with the API glue autocompleted. A rough sketch of what that might look like is below.

And for reverse-engineering? Train it on drivers, then feed it packet captures. Could it make sense of the data?
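
To make the "code just by describing it" idea concrete, here's a rough sketch of what such a prompt could look like (the function name and description are made up for illustration; the body is what you'd hope the model fills in):

    # Hypothetical "code by description" prompt: a plain-language spec plus a
    # signature, with the body left for the model to auto-complete.
    description = (
        "# Fetch the JSON document at `url`, retrying up to 3 times on failure,\n"
        "# and return the parsed result as a dict.\n"
    )
    prompt = description + "def fetch_json(url):\n"
    # Sending `prompt` to a completions endpoint (as in the sketch above) asks
    # the model to write the function body, i.e. the API glue.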


> Now I'd be interested to see GPT-3 trained on code samples from open-source repositories. Would it compile?

Check out:

https://www.tabnine.com/blog/deep

FTA:

"Deep TabNine is trained on around 2 million files from GitHub. During training, its goal is to predict each token given the tokens that come before it....Deep TabNine is based on GPT-2."

So this is GPT-2, not GPT-3, and it's designed to give line-by-line autocompletions, but given the way things are headed, I gather the answer to your first question is approaching "yes"...
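
To make the quoted training objective a bit more concrete, here's a toy illustration (the token list is made up; a real tokenizer and a neural model do the actual work):

    # "Predict each token given the tokens that come before it":
    tokens = ["def", "count_x", "(", "word", ")", ":"]
    for i in range(1, len(tokens)):
        context, target = tokens[:i], tokens[i]
        # Training maximizes P(target | context); at completion time the model
        # emits its most likely next token, which is what powers the
        # line-by-line autocompletions.
        print(context, "->", target)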


There was some good discussion about this on another GPT-3 thread this weekend, but I don't have the link handy.

The author prompted GPT-3 with arithmetic questions like: what is 10-1 (9), 100-1 (99), 1000-1 (999), 10000-1 (9099); i.e., after a while it can't really "recurse" deeply enough to get the right answer anymore. The author also asked it some coding questions; it could answer something like "write a Ruby function to count the number of Xs in a word" but not "reverse the list [foo bar baz]" (not the exact examples, sorry). There again, it seems to reach a point where it gets the idea but can't compute deeply enough to actually answer this sort of question.

Edit: I found it! https://news.ycombinator.com/item?id=23887637


https://twitter.com/sharifshameem/status/1284095222939451393

I mean, this tweet is what started the latest round of GPT hype.


Yes, no doubt it's impressive. But some people are speculating that a lot of cherry-picking went into this demo. I have access to GPT-3, but I am unable to reproduce such results.


Classic tale :)




