
I was curious to try this myself. I asked it to encode provided sentences using rot13 and, while it rarely did so correctly, it did produce valid encoded words.

Asking it to encode "this is a test sentence" produced:

* guvf vf n grfg fvtangher ("this is a test signature")

* Guvf vf n grfg zrffntr. ("this is a test message.")

* Guvf vf n grfg fnl qrpbqr. ("This is a test say decode.")

* guvf vf n grfg fgevat ("this is a test string")
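For reference, rot13 itself is trivial to compute: it rotates each letter 13 places through the alphabet, so applying it twice is the identity. A minimal Python sketch of the correct encoding (the expected output for the test sentence is "guvf vf n grfg fragrapr"):

```python
import codecs

def rot13(text: str) -> str:
    # Rotate each letter 13 places, preserving case; non-letters pass through.
    return codecs.encode(text, "rot_13")

print(rot13("this is a test sentence"))  # guvf vf n grfg fragrapr
print(rot13(rot13("this is a test sentence")))  # round-trips to the original
```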



> it did produce valid encoded words

I wonder if that's a by-product of some of those words existing on the internet and being part of its training set, or being close enough in context to show up in its pattern matching, rather than any real "understanding".


Well, it's not like GPT-3 has any other way of "understanding" anything.



