
For example, if it consumes lots of data that uses the pronoun "he" for doctors, it's much more likely to spit out "he" in a medical context.
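You can see this directly by probing a masked language model. A minimal sketch below, assuming the Hugging Face transformers package and the public bert-base-uncased checkpoint (the prompt sentence is my own illustrative example): it asks the model to fill in a pronoun in a medical sentence and prints the probability it assigns to "he" versus "she".

    # Sketch: compare pronoun probabilities a masked LM assigns in a medical context.
    # Assumes: `pip install transformers torch` and the bert-base-uncased checkpoint.
    from transformers import pipeline

    unmasker = pipeline("fill-mask", model="bert-base-uncased")

    # Restrict the candidates to the two pronouns we want to compare.
    results = unmasker(
        "The doctor told the nurse that [MASK] would operate in the morning.",
        targets=["he", "she"],
    )
    for r in results:
        print(f"{r['token_str']}: {r['score']:.3f}")

If the training data mostly paired "doctor" with "he", the score for "he" comes out noticeably higher, which is exactly the distributional bias being described.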

Since the model lacks world knowledge for new words, it also sometimes attributes a word to a specific cultural origin or group that is completely wrong.
