
Well, I think it is interesting as an illustration of what the model is actually doing (generating words based on the corpus) and what it is not doing (reasoning based on an internal world model or a system of logic or mathematics).

People tend to forget this, and it is a useful reminder.

So if you target ‘complex systems engineering’ with such a model, you will inevitably fail unless the work you’re doing is already in the training data, in which case you’ll get a bad copy of it.




