
"Sentience" is a weird word because it's a such fuzzy concept, but I think if we define it properly there are testable answers to the question of machine sentience.

I would define sentience as an awareness of oneself in relation to one's environment that generalizes across different aspects of self and environment (e.g. social, physical, etc.)

One of the key aspects of intelligence is learning from a third-person perspective - e.g. you observe another ape hunting with a stick and decide to emulate that behavior yourself. This seemingly simple behavior requires a lot of work: you have to recognize other apes as agents similar to yourself, then map their actions onto your own body, then perform those actions and see whether they had the intended outcome.

Current RL agents are not capable of this. The data used for GATO and similar imitation-learning agents are from a first-person perspective. If an agent could learn from a third-person perspective, via direct observation of its environment, I would say that would be the beginning of machine sentience.
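To make the distinction concrete, here's a minimal toy sketch of what a third-person imitation loop would have to do - none of these names come from GATO or any real library, they're just placeholders for the steps above: infer what the demonstrator did from an outside view, retarget that action onto your own body, try it, and check the outcome.

  import numpy as np

  rng = np.random.default_rng(0)

  OBS_DIM = 16     # third-person observation of the scene (assumed size)
  ACTION_DIM = 4   # observer's own action space (assumed size)

  def observe_demonstrator():
      """Stand-in for a camera view of another agent acting."""
      return rng.normal(size=OBS_DIM)

  def infer_demonstrator_action(obs, inverse_model):
      """Inverse model: guess what the demonstrator did from the scene."""
      return inverse_model @ obs

  def retarget_to_self(demo_action, retarget_matrix):
      """Map the demonstrator's action onto the observer's own body."""
      return retarget_matrix @ demo_action

  def execute_and_score(action):
      """Stand-in environment step; returns a scalar outcome."""
      return -np.sum((action - 1.0) ** 2)  # toy objective

  # Components the agent would actually have to learn; here just initialized.
  inverse_model = rng.normal(size=(ACTION_DIM, OBS_DIM)) * 0.1
  retarget_matrix = np.eye(ACTION_DIM)

  best_score = -np.inf
  for episode in range(10):
      obs = observe_demonstrator()
      demo_action = infer_demonstrator_action(obs, inverse_model)
      my_action = retarget_to_self(demo_action, retarget_matrix)
      score = execute_and_score(my_action)
      best_score = max(best_score, score)
      # A real agent would update inverse_model / retarget_matrix from the
      # outcome here; this sketch only records the best result so far.

  print("best outcome over 10 observed demonstrations:", best_score)

The hard part, and the part missing from first-person imitation datasets, is learning the inverse model and the retargeting step from raw observation rather than having actions handed to you in your own coordinate frame.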


