
The other elements that may be required are some version of the continuous sensory input that, for us, creates the sensation of "living", and (this one is a bit more philosophical) the sensation of suffering, together with a baseline assumption that the entity's goal is to take actions that help it avoid suffering.



That's when it gets dangerous: when we try to really recreate animal (human) characteristics in digital form. Combine that with likely 1,000x to 1,000,000x increases in performance, and you get superintelligent digital creatures taking over.

Instead, we can focus on the Star Trek computer type stuff that we have with GPT and be incredibly careful about deploying those more animal/humanlike models and higher-performance compute. If we deliberately create the next species in digital form, make it 100x or 10,000x faster-thinking and smarter than us, and then enslave it, that is not only totally stupid but also proven unnecessary by the generality of the latest AI models.


I think an AI might gain extra qualities from feeling suffering etc., but I don't think those extra qualities are rationally beneficial.




