Quantum Computers as Bayesian Networks (arxiv.org)
43 points by psakkaris on July 28, 2016 | 9 comments



My opinion of this paper is not very high. It's not totally confused or crankery or anything like that, but the author doesn't show the data structure they propose doing anything well. Also they seem to think that it will somehow lead to a polynomial-time classical simulation of quantum mechanics, for no good reason.

I'll just list three quotes that bugged me:

> The X, H, R(k), M and SWAP gates are discussed in detail and results show linear scaling as the number of qubits are increased.

How did they manage to spend linear time per operation when the given operations can't even make two qubits interact? Just working on the qubits individually takes constant time per gate, not linear time.
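To spell that out (a quick Python sketch of my own, not the paper's data structure): with only those gates, n qubits stay separable, so simulating them is just n independent 2-amplitude vectors, and each gate touches one of them.

    import cmath, math

    # With no entangling gates available, n qubits stay separable: just n
    # independent 2-amplitude vectors. Each gate below touches one of them,
    # so the cost per gate is O(1) no matter how many qubits exist.

    def H(q):                       # Hadamard
        a, b = q
        s = 1 / math.sqrt(2)
        return (s * (a + b), s * (a - b))

    def X(q):                       # bit flip
        a, b = q
        return (b, a)

    def R(q, k):                    # phase 2*pi/2^k on the |1> amplitude
        a, b = q
        return (a, b * cmath.exp(2j * math.pi / 2**k))

    state = [(1, 0)] * 1000         # 1000 qubits, all |0>
    state[0] = H(state[0])          # O(1), independent of the qubit count
    state[3] = R(X(state[3]), 2)
    state[0], state[3] = state[3], state[0]   # SWAP is just relabeling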

> The efficiency is still unresolved for these coherent control gates. The number of edges can still grow exponentially for coherent control gates, however, it remains to be seen if duplicate edges can be efficiently merged.

They didn't do the hard part yet, but the easy part was easy, so they expect the hard part to be easy.

> if BPP = BQP then it is possible that quantum mechanics can be de-randomized by a deterministic theory thus confirming Einstein’s conviction that nature does not play dice

Confusing "can be simulated classically in polynomial time" with "is not random". The existence of merge sort doesn't prove that quick sort isn't using randomness.

The arguments for randomness in quantum mechanics aren't based on complexity in the first place (e.g. we can trivially simulate Bell tests).
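To be concrete about "trivially": the quantum prediction for a CHSH test takes a couple of lines to compute classically (a sketch of my own, using the standard singlet correlation E(a, b) = -cos(a - b)):

    import math

    # Quantum prediction for a singlet pair measured at angles a, b:
    # E(a, b) = -cos(a - b). Computing this classically is trivial; it's
    # the *local hidden variable* bound of 2 that the quantum value
    # violates, and a simulator that tracks the joint state violates it
    # just as well.

    def E(a, b):
        return -math.cos(a - b)

    a, ap = 0.0, math.pi / 2               # Alice's two settings
    b, bp = math.pi / 4, 3 * math.pi / 4   # Bob's two settings

    S = abs(E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp))
    print(S)   # 2*sqrt(2) ~ 2.828 > 2: the classical simulation still
               # "passes" the Bell test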


Exact inference in Bayesian networks is known to be NP-hard (see http://www2.stat.duke.edu/~sayan/npcomplete.pdf); apparently it's actually #P-hard. I'm not sure inference in Bayesian networks is even in BPP in the first place, so the idea that this representation is grounds for proving BPP = BQP seems pretty unlikely to pan out.


Thank you for the criticisms; they will really help with revision so I can explain the work better:

1. The work is "easy" or "trivial"

I disagree. It does not yet scale the way we want for coherent control gates, but we were able to scale the terminal QFT accurately by using graph theory on a classical computer instead of matrix mechanics. That is an alternate algorithm at the least, so not very easy. For example, if someone told you to go implement the terminal QFT on a classical computer using graph theory, how quickly do you think you could have done it? Or even better, what if I tell you now: "implement an alternate scalable terminal QFT algorithm on a classical computer without using QuDot Nets"; how easy does that question look?

2. Maybe I did not explain the IF BPP=BQP argument well

IF BPP = BQP, and that is a BIG IF, then it is very likely that a different deterministic theory could reproduce the randomness in quantum mechanics. We are not saying that quick sort isn't random; we are saying that quick sort can be replaced by a non-random algorithm. Why? Because there is much evidence that P = BPP, which implies that randomized algorithms can have equivalent non-random algorithms, similar to how primality testing was shown to have a non-random equivalent (the AKS algorithm). "Quantum Computing Since Democritus" does a nice job of explaining this.
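AKS is the textbook example, but here is a more hands-on illustration (my own sketch, not from the book): the usual randomized Miller-Rabin primality test becomes fully deterministic for 64-bit inputs if you swap the random witnesses for a fixed set that is known to suffice.

    # Derandomizing a randomized algorithm: Miller-Rabin is normally run
    # with random witnesses; for n < 2**64 the fixed witness set below is
    # known to be sufficient, making the test deterministic.

    WITNESSES = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)

    def is_prime(n):
        if n < 2:
            return False
        if n in WITNESSES:
            return True
        if any(n % p == 0 for p in WITNESSES):
            return False
        d, r = n - 1, 0
        while d % 2 == 0:
            d //= 2
            r += 1
        for a in WITNESSES:
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(r - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False
        return True

    print(is_prime(2**61 - 1))  # True: a Mersenne prime, no randomness used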

Lastly, it could be that this graph algorithm for quantum computing will never be able to scale coherent control gates. But at the very least we came up with an alternate algorithm to do quantum computing on a classical device. Not completely worthless, right? What else are you going to do on a Saturday night? :-)

Thanks again for the feedback, even though it was negative.


For 1: I didn't mean to imply anything about the work being easy or hard, by the way. I was just contrasting against the difficulty of proving BQP=BPP. (e.g. I might refer to Terence Tao's work on Navier-Stokes [1] as 'solving an easier version', but certainly no connotation about the difficulty of the work would be intended).

But I do think you should avoid making such large claims based on what was actually done. It comes off as naive, unaware of how much work has gone into trying to make these things work. Of course people have tried hierarchical models!

For 2: You're still confusing "I have an efficient deterministic algorithm to make predictions about X" with "X must be deterministic". The efficiency of the algorithm is totally independent of whether the underlying system is "really" random! Some random processes are downright trivial to make predictions about, even with really truly literally random inputs. And some are arbitrarily hard to predict, even when we replace the randomness with a cryptographic pseudo-random number generator (but don't tell you the seed).
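To illustrate that second half: here is a completely deterministic process that no efficient observer can distinguish from true randomness (a toy sketch of mine, using SHA-256 as a stand-in for a cryptographic PRNG):

    import hashlib, os

    # A fully deterministic bit stream: bit i is a function of (seed, i).
    # With the seed kept secret, no efficient observer can tell this
    # apart from true randomness: determinism is not predictability.

    SECRET_SEED = os.urandom(32)   # hidden from the observer

    def bit(i):
        digest = hashlib.sha256(SECRET_SEED + i.to_bytes(8, "big")).digest()
        return digest[0] & 1

    stream = [bit(i) for i in range(16)]
    print(stream)   # looks random; is completely determined by the seed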

Consider that we already have deterministic algorithms that predict quantum mechanics. The fact that the algorithms take a long time to run has little bearing on the predictions they make. Finding a more efficient algorithm for quantum mechanics won't turn it into a deterministic process. You'll still be able to pass Bell tests, and the algorithm had better predict that too or it's demonstrably wrong!

1: https://terrytao.wordpress.com/2014/02/04/finite-time-blowup...


It is 12:00 am. I just read your "implement an alternate scalable terminal QFT algorithm" note.

So, because of the restrictions on earlier operations, our input is a bunch of separable qubits. That helps a lot. At first I was thinking of turning phase gradients on one side into increments on the other, but there's a much easier way. Just measure each qubit after it gets Hadamarded. All the phase steps can then be conditioned classically.

Now it's 12:10 am.

3 more minutes to make an example circuit in Quirk... and done: http://algorithmicassertions.com/quirk#circuit=%7B%22cols%22...
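Spelled out as code instead of a circuit, the trick is roughly this (a quick Python sketch, assuming the separable input described above; not the paper's method):

    import cmath, math, random

    # Semiclassical QFT on a product-state input: Hadamard each qubit,
    # measure it immediately, and fold what would have been controlled
    # phase rotations into classically-conditioned single-qubit phases
    # on the not-yet-measured qubits. No entanglement ever needed.

    def qft_sample(qubits):
        """qubits: list of (amp0, amp1) pairs. Returns one QFT sample."""
        bits = []
        for j, (a, b) in enumerate(qubits):
            # phase kicks conditioned on earlier measurement results
            theta = sum(math.pi * bits[i] / 2**(j - i) for i in range(j))
            b *= cmath.exp(1j * theta)
            # Hadamard, then measure in the computational basis
            a, b = (a + b) / math.sqrt(2), (a - b) / math.sqrt(2)
            bits.append(1 if random.random() < abs(b)**2 else 0)
        return bits[::-1]   # the QFT ends with a bit reversal

    # e.g. 4 qubits all in |0>: QFT|0..0> is uniform, so samples are uniform
    print(qft_sample([(1, 0)] * 4))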


Agreed. They should probably have held off until they could demonstrate a set of gates sufficient for universal quantum computation (e.g. Hadamard, T, and CNOT). These are well known and have been the target of experimentalists for decades now. In fact, what has held back a quantum computer from being made thus far is that no single hardware implementation has achieved the required accuracy across the full set of gates. If single-qubit operations were all that was needed, we'd be done by now :)

The tone of the article also seems to imply that the quantum information community is remiss for assuming that P != NP. Everyone is aware that this is an open question; it's probably the largest open question in all of mathematics. So if this new approach does yield a way to find efficient solutions to NP-hard problems, that would be amazing and prize-worthy.


This. The result is utterly trivial since they are skipping coherence:

> It remains to be seen if we can extend the linear (or polynomial) efficiency to coherent control gates.

> The efficiency of our implementation of coherent control-U gates requires further study. Our results so far show that in the worst case the number of edges grows as 2n where n is the number of control qubits.

Yup. Nothing to see here.


(You lost the exponent in the quote. It's 2^n, not 2n.)


All kinds of architectures and algorithms can be implemented at the chip level in silicon. But, until recently, the "just make a faster von Neumann machine" school of chip design has always beaten the "make a task-optimized chip" school.

Is there a similar tension in the design of quantum computers? Can we make an educated guess about which design school will dominate?





