What's the difference between desires and goals in this context, really? You could say he's worried about a reasoning machine "relentlessly programmed" to achieve some goal, but a reasoning machine might just reason itself out of everything you've told it to do. Something so creative, so capable, so far beyond us, yet it's going to...assassinate other people for you? Why?
When something goes from being a computer program to a self-aware, conscious being with agency, things change a lot.
Hinton is a major paradox of a human: he spent his life building the very thing he says will likely doom us, and now spends his life warning us against his own work. So much of this AI doomerism just seems like a "Chinese finger trap" for the ultra-logical thinker.
It's a fucking weird time to be alive. The 90s felt much less weird and dystopian to me.