Assume, for the sake of argument, that we'll eventually have computers powerful enough to simulate a brain to whatever accuracy turns out to be necessary for it to function.
There are several ways this experiment could go:
First, we could fail to produce a mind, because there's some secret sauce to it we're not aware of (e.g. a God-given soul).
Second, we could produce a zombie, indistinguishable from a conscious individual while actually not being conscious (though note that, since by hypothesis we could never tell the difference, we'd have to treat it as if it were conscious).
Third, we could produce a conscious mind.
I'm in the camp that thinks option three is plausible.
Let's assume I'm right. Now, instead of a supercomputer, give every one of the billions of humans on the planet an instruction manual, a pocket calculator and a phone to communicate results, and have them do the exact same calculations the supercomputer would do. Despite the latency, if option three were true, we should expect that this would still produce a conscious mind, albeit a rather slowly-thinking one.