
OK, what about when I talk to you and say, "So, I saw Mike the other day in an accident"? What do you reply? "Oh, how is Mike? Did Mike hurt Mike?" Or do you reply, "Oh, how is he? Did he hurt himself?"


And adding to this, isn't the whole issue assuming that Mike is male? Imagine training a medical bot that gets stuck in a loop asking about pronouns during an automated 911 call, for instance.

Modern society isn't easy to program for, if we're trying to make robots that conform to (western, minority) "norms".


So don’t code that for a bot that responds to 911 calls.


Going back to my original post: "I kind of think that we need to steer AIs away from trying to talk like humans at all."

The question is: why would I be talking to an AI about an accident involving another person?

Is it a medical AI? Then perhaps it would be better to refer to Mike as "the patient", etc., which is how it would be handled in code.
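A minimal sketch of what I mean (the names and strings here are hypothetical, not from any real triage system): follow-up questions get built from role nouns, so the bot never has to guess a pronoun at all.

    # Hypothetical sketch: build follow-up questions from role nouns
    # ("the patient", "the caller") instead of inferring pronouns.
    def follow_up(role: str, event: str) -> str:
        # The role noun stands in wherever a pronoun would have gone.
        return f"Was {role} hurt in the {event}?"

    print(follow_up("the patient", "accident"))
    # Was the patient hurt in the accident?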

Are you looking to make friends with your AI? Then no, I think you need to go find real humans to talk to, who can navigate complex human social interactions better.

Robots are not a substitute for humans.


>Robots are not a substitute for humans.

Well, there is Replika.ai, which seems to be doing alright for itself.


But then you're not speaking to Mike, right? You're speaking about Mike. So unless the person you're speaking with gets offended on Mike's behalf, I don't see what the problem is.


A lot of people get offended on other people's behalf


It's easy to just say "they", and it's grammatically correct.



