Similarly, by the Copernican principle, wouldn't our "local" quantum computer also then be burdened by the work being sent from a near-infinity of parallel universes?
I think the answer to that is no, and the reason involves entropy. Of all the possible universes, most have very low probability. So when we "create" a bunch of parallel universes for our computer, those universes could already exist, and they would interfere with the calculation (not in the sense of slowing it down because too many things are running on it, but more like a radio channel picking up interference from nearby channels). In practice, though, this won't happen, because those pre-existing universes carry very low corresponding probability.
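To make the "weak radio channel" intuition concrete, here's a toy numerical sketch (my own illustration with made-up branch probabilities, not anyone's actual model): when two branches contribute amplitudes to the same outcome, the interference cross term is bounded by twice the square root of the product of their probabilities, so a branch with vanishingly small probability shifts the result by a vanishingly small amount.

```python
import numpy as np

# Toy sketch: two "branches" contribute complex amplitudes to the same
# measurement outcome. The observed probability is
#   |a + b|^2 = |a|^2 + |b|^2 + 2*Re(a * conj(b)),
# so the interference (cross) term scales with the *amplitude* of the
# weak branch, i.e. with the square root of its probability.
p_strong, p_weak = 1.0 - 1e-12, 1e-12      # made-up branch probabilities
a = np.sqrt(p_strong)                      # dominant branch, phase 0
b = np.sqrt(p_weak) * np.exp(1j * np.pi)   # weak branch, worst-case phase

cross_term = 2 * (a * np.conj(b)).real
print(abs(a) ** 2)       # ~1.0: probability from the dominant branch alone
print(abs(a + b) ** 2)   # shifted by only ~2e-6
print(cross_term)        # ~-2e-6, tiny because sqrt(1e-12) = 1e-6
```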
On the other hand, there is self-interference from poor engineering of the computer. I think this is a very significant problem right now, though it will usually be described in terms of decoherence.
I would say "yes" to that. But fortunately, with the work being so widely distributed, the load on our "local" quantum computers would effectively be zero (i.e. x/∞). Unless, of course, our universe is the oddball and most others are running at full capacity. That's a depressing possibility.
The load could be zero, it could be infinite, or anywhere in between. Infinite universes sending infinite work is ∞/∞, which is an indeterminate form. It's not possible to know whether that tends towards something like 0 or something like positive infinity without having some way to measure it. But it's an error to just assume it's zero.
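To spell out why ∞/∞ doesn't resolve on its own, here's the standard limit picture (using W(n) for the total incoming work and N(n) for the number of universes sharing it, names I'm inventing for the sketch): the per-universe load W(n)/N(n) can tend anywhere depending on the relative growth rates.

```latex
% Same "infinity over infinity", three different outcomes for the
% per-universe load W(n)/N(n), depending on relative growth rates:
\lim_{n\to\infty} \frac{n}{n^{2}} = 0, \qquad
\lim_{n\to\infty} \frac{c\,n}{n} = c, \qquad
\lim_{n\to\infty} \frac{n^{2}}{n} = \infty .
```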
I completely agree. I was thinking it highly unlikely that other universes would be operating at full efficiency, so the average would land more on the zero side. But, yeah, that's not how infinities work, I suppose :) Still, I'm enjoying thinking about it.