
As someone said once, machine dictatorship is very easy—you only need a language model and a critical mass of human accomplices.

The problem is not a Microsoft product being conscious in a human-like way; it's humans treating it as if it were.

This lowers our defences, so when it suggests suicide to a potentially depressed person (cf. examples in this thread), the suggestion might carry the same weight as if another person had said it. And not just any person, but one who seemingly knows everything and knows a lot about you (cf. examples in this thread), qualities which among humans usually signal wisdom and age and command all the more respect.

On the flip side, if subsequent generations do succeed at adapting to this, and we end up in a world where exhibiting human-like sentience no longer warrants being treated as a human by another human, what would the implications be for humanity?

It might just happen that the eventual AIrmageddon will be caused by humans whose worldview was accidentally poison-pilled by a corporation in the name of maximising shareholder value.



The /r/replika subreddit is a sad reminder of exactly what you’re talking about. It’s happening, right now.


Oh god, I rubbernecked at that place a year or so ago. It was pretty sad then, but boy, has it escalated.



