Quantum computers are not just faster classical computers; quantum algorithms have no direct classical counterpart. In fact, the literal clock speed per quantum operation may be slower, but for certain problems the overall computation still gets done faster.
But there’s no computation a quantum computer can do that a classical computer can’t. In that sense, it’s still a Turing machine. It can do some class of computation faster (in principle).
Richard Feynman describes such a computation, one that produces incorrect results on a classical computer and correct ones on a quantum computer, in "Simulating Physics with Computers"[0]. He follows one physical experiment, detailing every step from start to finish and how it converges to an incorrect result on a classical computer.
However, Feynman does not claim what you say he claims. His whole article is about efficiently simulating QM on a classical computer, and he shows that is not possible given what we understand of quantum probability today, at least (he does this by explicitly asking for a computer which does not grow exponentially in size with the size of the physical system it wants to simulate).
In modern CS terms, what he is discussing is whether the complexity classes BPP and BQP are equal or not (as far as we know today, just as he claims, BPP seems to be a proper subset of BQP, but this is not proven).
This is all perfectly in line with my claim, and is in fact explicitly there in Feynman's paper: a classical computer can perfectly simulate a quantum system/computer, but requires exponential time/space to do so (as far as we know).
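To make the "exponential time/space" part concrete, here is a minimal sketch (my own illustration, not a construction from Feynman's paper) of the naive way to simulate a quantum computer classically: a brute-force state-vector simulator stores 2^n complex amplitudes for n qubits, and even a single one-qubit gate has to touch all of them.

    import numpy as np

    def apply_single_qubit_gate(state, gate, target, n_qubits):
        # The state vector holds 2**n_qubits complex amplitudes, so both memory
        # and time grow exponentially with the number of qubits -- the
        # "exponential explosion" a brute-force classical simulation runs into.
        psi = state.reshape([2] * n_qubits)
        psi = np.tensordot(gate, psi, axes=([1], [target]))  # contract gate with target qubit
        psi = np.moveaxis(psi, 0, target)                    # move the target axis back in place
        return psi.reshape(-1)

    n = 20                                  # 2**20 amplitudes (~16 MB); n = 50 would need petabytes
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0                          # |00...0>

    hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    for q in range(n):
        state = apply_single_qubit_gate(state, hadamard, q, n)

    print(state.size, "amplitudes stored")  # 1048576; every extra qubit doubles the cost

Cleverer simulators exploit structure to do better than this naive approach, but as far as we know none of them escapes the exponential scaling in general.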
Perfectly except for random number generation, right? The randomness of QM can never be decoded or attacked. If we are talking about fudging/"simulating" randomness with classical pseudorandom number generators and relying on computational complexity to keep the patterns from being decoded, and quantum computers are theoretically able to speed up factorization of large numbers, don't we have a problem?
Isn't there a point where we could say we can't really simulate powerful enough quantum computers, because they can "decode" the patterns behind any pseudorandom computation so quickly?
Like, no one in their right mind would be using classical computers at that point for anything except basic computation and word processing. We might as well say humans with pen and paper can simulate quantum computers. Quantum computer capabilities may outstrip classical computers nearly as much as they outstrip humans with pen and paper.
"Simulation" kind of loses it's meaning when taken far enough.
Random numbers are random numbers. You can calculate the probabilities (the wave function amplitudes, which are deterministic); or you can use any source of random noise to reproduce the data, once you adjust for the difference between quantum and classical probability.
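To illustrate the first half of that point (the probabilities themselves are deterministic; only the sampling needs a noise source), here is a toy single-qubit sketch of my own. It glosses over the interference effects that make the general adjustment between quantum and classical probability hard.

    import numpy as np

    # Deterministic part: amplitudes of a single-qubit state a|0> + b|1>.
    theta = 0.7
    amplitudes = np.array([np.cos(theta), np.sin(theta)], dtype=complex)

    # Born rule: measurement probabilities are |amplitude|^2 -- deterministic to compute.
    probs = np.abs(amplitudes) ** 2
    print("p(0), p(1) =", probs)                        # sums to 1

    # "Random" part: any classical noise source (here an ordinary PRNG) reproduces
    # the measurement statistics once the probabilities are known.
    rng = np.random.default_rng(seed=42)
    samples = rng.choice([0, 1], size=100_000, p=probs)
    print("empirical frequencies:", np.bincount(samples) / samples.size)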
For an extreme example, there are finite patterns behind the lava lamps at Cloudflare or the chaos of Jupiter's storms. These are sufficiently random for any need I can currently imagine, though.
But then I also see that "classical computers can never be built big enough to explore more than 400, actually more than, probably 100 qubits, 100 qubits doesn’t seem like very much. No classical computer can do the calculation of following what 100 qubits do" https://blog.ycombinator.com/leonard-susskind-on-richard-fey...
Are we really in no danger of quantum computers being able to decipher the patterns behind these traditional sources of noise? There is no pattern to quantum randomness. Aren't we going to have to switch to truly random sources of noise eventually, instead of pseudorandom ones (anything non-quantum)?
A lava lamp is a chaotic system. The same initial conditions, no matter how precisely measured, will not result in the same outcome; the trajectories will diverge. It is non-deterministic in the real world.
Wait, let's be clear. In the real world a lava lamp is basically impossible to predict as it evolves. And that is due to chaos. So far so good.
But this chaos is deterministic. Chaos means highly sensitive to initial conditions and involves nonlinearity, but it is still entirely classical and deterministic. In the real world we do not measure things precisely enough to keep track of a lava lamp's deterministic evolution, but it is there within the chaos.
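To make "deterministic but unpredictable in practice" concrete, here is a toy sketch with the logistic map (obviously a stand-in, not a model of a lava lamp): the exact same input reproduces the exact same trajectory on every run, but a perturbation in the 12th decimal place eventually produces a completely different one.

    def logistic(x0, r=3.9, steps=70):
        # Iterate the logistic map x -> r*x*(1-x): a fully deterministic rule
        # that behaves chaotically for r around 3.9.
        xs = [x0]
        for _ in range(steps):
            xs.append(r * xs[-1] * (1 - xs[-1]))
        return xs

    a = logistic(0.200000000000)   # same input -> same trajectory every run: deterministic
    b = logistic(0.200000000001)   # same rule, input perturbed in the 12th decimal place

    for step in (0, 30, 50, 70):
        print(f"step {step:2d}: {a[step]:.6f} vs {b[step]:.6f}")
    # The two trajectories agree at first, visibly drift apart somewhere around
    # step 40-50, and are completely unrelated by step 70 (the exact step depends
    # on r and on the size of the perturbation).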
So I am wondering if a quantum computer, which Susskind says takes only 100 qubits to outperform any Turing machine constructible ever, may one day do better at picking out the deterministic patterns behind the chaos of things like lava lamps. If that happens, we may need more extreme versions of chaotic systems to keep secrets. And since quantum randomness is the only true randomness in the universe, forever indiscernible in principle, will all deterministic, chaotic means of adding "randomness" one day be replaced with quantum sources of randomness, due to how powerful quantum computers are?
*Now you could say even the lava lamp involves quantum randomness, because everything is ultimately quantum. But because it is so macroscopic, it behaves more classically than a smaller quantum system.
> a quantum computer, which Susskind says takes only 100 qubits to outperform any Turing machine constructible ever
It's very important to understand that this only applies to a limited set of algorithms. QCs are not universal accelerators. In particular, if picking out these deterministic patterns from the apparent chaos were an NP-hard problem, the QC would be just as slow as any other computing machine we know of so far.
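To put a rough number on "just as slow": for unstructured brute-force search, the best known quantum speedup is Grover's quadratic one, not an exponential one. A back-of-the-envelope sketch, where the 1e9 operations/second figure is just an assumed round number:

    # Order-of-magnitude comparison for unstructured (brute-force) search over a
    # 256-bit space. All figures are rough; 1e9 ops/second is an assumed round number.
    search_space = 2 ** 256

    classical_ops = search_space          # ~1.2e77 evaluations
    grover_ops = search_space ** 0.5      # ~3.4e38 iterations: a quadratic, not exponential, speedup

    ops_per_second = 1e9
    seconds_per_year = 3.15e7

    print(f"classical: ~{classical_ops / ops_per_second / seconds_per_year:.1e} years")
    print(f"grover:    ~{grover_ops / ops_per_second / seconds_per_year:.1e} years")
    # Both are hopeless -- a quadratic speedup does not make an exponential search feasible.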
You're also misunderstanding how chaotic systems work. With a chaotic system, even if you know the precise time evolution rules, you're not going to be able to predict the outcome at time T, because a tiny difference in the initial conditions, or a tiny interference from the outside world, will mean vastly different outcomes.
In fact, QCs would be particularly BAD at predicting the outcome of a chaotic system, because QCs can only give answers up to some error bound, unlike classical computers which can perform exact calculations. But the error introduced by the QC itself is probably going to compound the imprecision in the initial measurements of your chaotic system.
One final note that is important to state: the problem with predicting chaotic systems is not physical or computational, it is mathematical. Even simple systems can have solutions that vary by orders of magnitude more than the variation in their parameters. Solving such a system is easy and fast, but the solution is physically meaningless: a 0.01% error in the measurements can mean that your solution is off by a factor of 100.
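To attach a number to that: in a chaotic system the gap between two nearby trajectories grows roughly like exp(lambda * t), where lambda is the Lyapunov exponent, until it saturates at the size of the attractor. A back-of-the-envelope sketch with an assumed, purely illustrative exponent:

    import math

    # How many steps until a 0.01% measurement error has grown to order 1, i.e.
    # the computed "solution" no longer says anything about the real system?
    lyapunov = 0.5      # assumed exponent; a plausible order of magnitude for a strongly chaotic map
    epsilon = 1e-4      # 0.01% relative error in the initial measurement

    steps = math.log(1 / epsilon) / lyapunov
    print(f"~{steps:.0f} steps until the error is order 1")   # ~18 steps with these assumed numbers

No amount of computing power, quantum or classical, buys back precision you never had in the initial measurement.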
If chaos is not a problem for computation, why do we always hear about weather simulations needing better and better supercomputers? If simulating/computing chaos were just about getting precise enough initial measurements, then better supercomputers wouldn't seem to be needed for weather simulation, right?
I was thinking along the lines of running currently observed data backwards to fine-tune the initial conditions. That must require lots of computational power. Are we sure quantum computers and quantum algorithms won't speed this part up? That has to be somewhat isomorphic to factoring large numbers, for which I thought QC does have a potential advantage. But maybe chaos is distinct from factoring; I don't know much about how it would be modeled computationally. I realize most of the battle is getting proper initial conditions. But quantum computing is also coming at a time when quantum sensing is growing. To me, with quantum computing speed-ups and quantum sensors, we have the two ingredients necessary to make progress on chaotic systems: better initial conditions and better computational methods. That was my thought. Sorry for the ramble.
It might be possible to predict the pattern in the lava lamp, but to do so would require cutting it off from the rest of the universe. Light, heat, convection with the room air, and variations in local gravity are all going to affect the flow.
>"That's all. That's the difficulty. That's why quantum mechanics can't seem to be imitable by a local classical computer."
I don't think the argument is about efficiency. "a classical computer can perfectly simulate a quantum system/computer" is not explicitly there; the paper is an argument against that. It seems to me you're treating anything that falls short of strictly proving BQP > BPP as support for your claim.
> The rule of simulation that I would like to have is that the number of computer elements required to simulate a large physical system is only to be proportional to the space-time volume of the physical system. I don't want to have an explosion. That is, if you say I want to explain this much physics, I can do it exactly and I need a certain-sized computer. If doubling the volume of space and time means I'll need an exponentially larger computer, I consider that against the rules (I make up the rules, I'm allowed to do that).
He emphasizes this again in the section about computing the probabilities:
> We emphasize, if a description of an isolated part of nature with N variables requires a general function of N variables and if a computer simulates this by actually computing or storing this function then doubling the size of nature (N -> 2N) would require an exponentially explosive growth in the size of the simulating computer. It is therefore impossible, according to the rules stated, to simulate by calculating the probability. [emphasis mine]
So when he uses the term 'computer' he doesn't mean 'abstract Turing machine', he explicitly means 'realizable/efficient Turing machine'.
If we want to reach AGI within the lifetime of the universe, and quantum effects are required for consciousness [within the lifetime etc...], it stands to reason we'd need quantum computers.
Yes, that is true, at least as far as we know today (it's not yet mathematically proven that quantum computers can't be efficiently simulated by probabilistic classical computers, even though we are almost certain of this).