Right, goals by themselves aren't a problem. The simple fix to the Bostrom scenario is "Hey computer, remember what I said about maximizing paperclips? Never mind that, produce just enough to cover our orders, with acceptable quality and minimal cost."
What kind of AI would respond to that second order by pretending to comply, while formulating a plan to seize control of civilization in order to continue with its true mission? I don't know, but the fact that we can easily imagine a human doing that must have something to do with our evolutionary origin, and our in-built drive to survive and reproduce above all else. Maybe we could build a megalomaniacal AI, but we wouldn't do it by accident.