I did graduate research in an area of experimental physics that was quite close to state of the art quantum computing research. It was very exciting, but all the research was pretty much in extreme conditions that aren't really usable for a realistic computing device (ultra-high vacuum, nanokelvin temperatures, high magnetic fields or laser trap confinement, etc.). From the looks of it this paper is in the same vein. Good progress, but the gate isn't quite "open" yet.
That's true but once Science conquers something, engineering isn't very far behind. The apparatus to build Bose-Einstein Condensates was a fucking monster of vacuums and quantum optics but now we can fit a Bose-Einstein Condensate on a microchip. https://arxiv.org/abs/0912.0553
I just read that paper, and while it's hardly just a microchip, it's damn impressive considering the technology involved and the requirements it's fulfilling. It's amazing that we can now build devices smaller than a hotel minibar that produce ultra-high vacuums, and design systems capable of active cooling down to <1 K without any cryogenic refrigeration at all.
The vacuum, temperature and confinement requirements would seem to be huge problems relative to the exotic material requirements of current gates (not to diminish the significance of accomplishing this in silicon). As far as I know these 'hard' problems have remained pretty constant in quantum computing. Are we making progress on any of them? Basically, in your opinion, how are things progressing in the field with respect to the big problems standing between us and ubiquitous quantum computing mainframes?
I haven't touched this stuff in a while so take this with a grain of salt.
This is good progress, since it's about an order of magnitude faster for operations than previously constructed CNOT gates (for what a CNOT actually does, see the sketch after this list). However, it still has many caveats on the way to a semiconductor-like leap towards real quantum computers:
a) There's a low-temperature bound for the level of fidelity they want.
b) It doesn't let us engineer quantum chips with more qubits efficiently, since decoherence between quantum dots is still a major issue.
c) The science-journalism title is a bit misleading. While this happens in a silicon structure, there's nothing new about that; what's new is that they managed to use the driving resonance to control dephasing during the gate operation.
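For readers who haven't met the gate itself: an ideal CNOT is just a fixed 4x4 unitary that flips a target qubit when a control qubit is 1, and it's exactly the operation that turns a superposition into two-qubit entanglement. A minimal numpy sketch of the idealized math (ignoring all the noise and decoherence the caveats above are about):

```python
import numpy as np

# CNOT in the computational basis |00>, |01>, |10>, |11>,
# with the first qubit as control: flip the target iff the control is 1.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Control qubit in superposition: state = (|00> + |10>) / sqrt(2)
state = np.array([1, 0, 1, 0], dtype=complex) / np.sqrt(2)

# One ideal gate application entangles the pair into the Bell state
# (|00> + |11>) / sqrt(2): amplitude 0.707 on |00> and on |11>.
print(np.round(CNOT @ state, 3))
```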
My biggest pet peeve about quantum computing is that no one can answer the question: "what can quantum computing do for me?"
The answer I hear is that it helps solve the traveling salesman problem in record time, and that's all great and everything, but how will quantum computing be able to do things such as decrease the time it takes to train an RNN, or look up data in a database?
We don't know. If someone manages to find an algorithm for such things, it'll be helpful there. It's as if we're rediscovering the computer itself from the beginning: we're still not sure what it's good for, but the things we have managed to find algorithms for, it appears to be really, really good at.
Basically, don't worry too much about it, unless you wish to hunt for such algorithms yourself.
The one thing it's definitely promising for is simulating quantum mechanics, something classical computers are very bad at and which has serious practical uses. For that, quantum computers will always be very useful.
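Some back-of-the-envelope arithmetic on why classical machines are so bad at this: just storing the full state of n qubits takes 2^n complex amplitudes. A tiny sketch, assuming 16 bytes per amplitude (two float64s):

```python
# Memory needed to hold an n-qubit statevector: 2**n amplitudes * 16 bytes.
for n in (20, 40, 50, 60):
    pib = (2 ** n) * 16 / 2 ** 50  # pebibytes
    print(f"{n} qubits: {pib:.6g} PiB")
```

By around 60 qubits you'd need exabytes of RAM, which is why exact classical simulation hits a wall so quickly.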
Also, apart from inventing/discovering new algorithms, there's a second part: inventing how to use them on particular "real-world" (or "non-quantum-programmer world") problems. So maybe one of the known algorithms could actually somehow help with "training an RNN", but probably nobody has yet invented a way to rephrase ("refactor") the "training an RNN" problem so that it gets boosted by one of the known quantum algos (a set you can look at as "the quantum API").
Now that I think of it, the few most well-known algorithms (like Shor's) are probably famous because "real world" use cases were found for them that are understandable to non-quantum engineers (i.e. "factoring integers into primes"), as opposed to "transforming a hamiltonian foobzdringle into a hilbertian mesoism in laplacian meta-subordinates" :P
There is no added Turing power in QC over classical computing. Turing complete is Turing complete. Think of a QC as a coprocessor for certain classes of problems, like integer factorization.
> The traveling salesman problem is NP-complete and therefore not known to be solvable in less time on a quantum computer.
That's only a problem if you need the exact solution. If you're OK with 99.9% quality, approximate solutions are much cheaper. So the question is, is it necessary to invest in that last 0.1%?
Approximate solutions are about solving problems with limited resources and scaling up. Quantum computers will be fast at solving certain types of problems exactly, while approximate algorithms are fast at solving practical problems well enough.
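To make the "approximate is cheap" point concrete, here's a minimal sketch of the classic nearest-neighbour heuristic for TSP. It runs in O(n^2), and on random instances it typically lands within a few tens of percent of the optimal tour, whereas the exact solution is believed to require exponential time:

```python
import math
import random

random.seed(0)
cities = [(random.random(), random.random()) for _ in range(100)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def nearest_neighbour_tour(points):
    # Greedy O(n^2) heuristic: always hop to the closest unvisited city.
    unvisited = points[1:]
    tour = [points[0]]
    while unvisited:
        nxt = min(unvisited, key=lambda p: dist(tour[-1], p))
        unvisited.remove(nxt)
        tour.append(nxt)
    return tour

tour = nearest_neighbour_tour(cities)
length = sum(dist(tour[i], tour[i + 1]) for i in range(len(tour) - 1))
print(f"Approximate tour length for 100 cities: {length:.3f}")
```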
I really enjoyed it for the layman's perspective while still exposing technical depth. I hadn't thought about how one of the big challenges of quantum computing is figuring out how to morph a traditional parallel algorithm into a quantum algorithm (with all the weirdness that entails).
The work on quantum computing is basic research. You can't expect to be able to predict the actual real-world impact of basic research while it's still being developed.
Quantum mechanics is a good example. It's had massive, real-world impact, but nobody could have predicted that.
The details need to be fleshed out, and people then need to explore things with it, play around, do further research on top of it.
Quantum annealing can be used for optimization of functions. Shor's algorithm for integer factorization would solve (and cause) a whole host of problems for us.
But in general, a quantum computer is not strictly a superset of improvements over a Turing machine. It's just different. And while it might reduce your database query time, unless you are already working at that level of abstraction the change will probably be invisible to you.
Quantum computing is certainly able to solve integer factorization and related problems (like breaking RSA encryption), just not for practical problem sizes yet.
You are probably thinking about NP-hard problems, which may or may not be solvable faster than on classical computers.
Shor's algorithm can factorize integers on a quantum computer asymptotically faster than on a classical computer, but the algorithm requires at least a 128-qubit quantum computer to factor a 128-bit number and thereby break something like 128-bit RSA encryption. No one yet has, or is publicly admitting to having, a reliable 128-qubit device.
In some cases, error correction can be achieved simply by running the algorithm a few times.
Error correction is immaterial for the factoring case. You multiply the factors and if you do not get the original number you know you have to run it again.
In Shor's algorithm, the quantum computer finds the period p of the function x -> B^x mod N, where N is the number you are trying to factor and B is a random number less than N. The quantum computer may find p with imperfect accuracy. Running the algorithm repeatedly can improve the accuracy of p. You can also try other numbers close to p. Both are examples of error-correcting a quantum algorithm.
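Here's a minimal sketch of that check-and-retry loop, with the quantum period-finding subroutine swapped out for classical brute force (only feasible for toy N, but it shows why a noisy answer is cheap to verify):

```python
import math
import random

def find_period(B, N):
    # Stand-in for the quantum subroutine: smallest p with B^p = 1 (mod N).
    p, x = 1, B % N
    while x != 1:
        x = (x * B) % N
        p += 1
    return p

def factor(N):
    while True:
        B = random.randrange(2, N)
        if math.gcd(B, N) > 1:
            return math.gcd(B, N), N // math.gcd(B, N)  # lucky guess
        p = find_period(B, N)
        if p % 2 == 0 and pow(B, p // 2, N) != N - 1:
            f = math.gcd(pow(B, p // 2, N) - 1, N)
            if 1 < f < N:
                return f, N // f  # verification is free: f * (N // f) == N

print(factor(15))  # e.g. (3, 5)
```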
The CPU-GPU analogy is very good for all sorts of reasons. For some things there's no speedup, for some important things there's a huge speedup, for other things there's some significant speedup, but not game-changing.
I tend to think of applications where you might want to brute-force search or simulate over some large set as being where quantum computing will be most significant, but I say that very vaguely and timidly.
This gives a number of applications and algorithms; I found some of the references in it really interesting to read:
Here's my rather crude understanding of what QC can do (please correct me if this is wrong): let's say you have a problem that requires examining 2^N different possibilities, so the brute-force solution would have O(2^N) runtime. QC can potentially help solve this problem in O(2^(sqrt(N))) time instead. This can make a world of difference. For example, if N=1024 and examining each possibility took 1 nanosecond, the classical search would still take far, far longer than the age of the universe. With a QC with 1024 qubits, it would take 1 microsecond to solve.
I'm afraid you're a bit off. A quantum computer can search N possibilities (such as when, say, looking up data in a database, though I doubt you'd go to the trouble of using a QC for that) in time O(sqrt(N)). So for 2^N things, that's 2^(N/2), not 2^(sqrt(N)). Obviously still a big improvement.
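Rough arithmetic behind that correction, assuming one query per nanosecond and an age of the universe of roughly 13.8 billion years:

```python
import math

age_of_universe_ns = 4.35e17 * 1e9        # ~13.8 billion years, in nanoseconds

log10_classical = 1024 * math.log10(2)    # log10 of 2**1024 queries
log10_grover = 512 * math.log10(2)        # log10 of 2**(1024/2) Grover queries

print(log10_classical - math.log10(age_of_universe_ns))  # ~281.6
print(log10_grover - math.log10(age_of_universe_ns))     # ~127.5
```

So for this example, even the quadratic Grover speedup leaves you about 128 orders of magnitude short of the age of the universe; the speedup is real but nothing like "1 microsecond".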
Of particular note, there are significant speed-ups for optimization and constraint satisfaction, cryptanalysis, machine learning and indeed 'looking up data in a database' (Grover search).
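For anyone curious what that Grover search looks like concretely, here's a toy statevector simulation of the textbook O(sqrt(N))-query version over an 8-entry "database" (the marked index is arbitrary):

```python
import numpy as np

N, marked = 8, 5                              # 3 qubits, one marked entry
state = np.full(N, 1 / np.sqrt(N))            # uniform superposition

for _ in range(int(np.pi / 4 * np.sqrt(N))):  # 2 iterations for N = 8
    state[marked] *= -1                       # oracle: flip the marked amplitude
    state = 2 * state.mean() - state          # diffusion: invert about the mean

print(np.round(state ** 2, 3))                # P(marked) comes out ~0.945
```

Two oracle calls already concentrate about 94% of the probability on the marked entry, versus an expected 4 to 5 lookups classically.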
> The answer I hear is that it helps solve the traveling salesman problem in record time, and that's all great and everything, but how will quantum computing be able to do things such as decrease the time it takes to train an RNN, or look up data in a database?
"Quantum computers can search arbitrarily large databases by a single query" [1]. That's Grover's algorithm from 1997, one of the pinnacle results that got people excited about quantum computing. If you haven't come across it before, then you must not have read too much about quantum computing!
To me the most exciting prospect is exponentially faster simulation of quantum systems. Computational chemistry really sucks because either you rely on heuristics or you can't simulate systems with more than a handful of atoms. If we had big quantum computers we might make real inroads into understanding protein folding and could figure out how to make artificial proteins to use for all kinds of applications. Proteins are Nature's nano-robots. Full Drexler nanotechnology awaits.
However, quantum computing will likely do nothing for you as a person other than make your life more difficult, since it changes the complexity of breaking non-post-quantum-cryptography (non-PQC) algorithms.
I would expect, as the technology matures and the transistors needed become smaller, to see quantum computers instead become quantum chips that you would install as a PCI-Express card, just like a GPU, to assist a normal CPU in solving some of the problems that quantum computing is better at.
Your bank could give you a quantum keyfob. All of your encrypted transactions are now entangled, so you know that nobody has spied on them: the act of observation modifies the result.
Can the idea of quantum computing be simplified/abstracted to say that instead of doing binary (0, 1) calculations, it does calculations in an arbitrary base, e.g. base 16 (hex) or base 1024?
If this is true, it would be easier to code, and compilers could spit out very efficient machine code.
Base has nothing to do with anything here; talking about "qubits" is just a convention, since computation is usually thought of in terms of bits. The fundamental idea of a quantum computer has nothing to do with binary in particular. Same as for classical computers.
It's not clear to me why you claim this would be related to efficient machine code, though.
Current "digital" microchips use electrical "on" (1) and "off" (0) states, i.e. base-2 (binary) mathematics.
Someone explained quantum mechanics to me as supporting more states, not just on (1) and off (0); that it can have dozens or hundreds of states. Is this a correct, state-of-the-art understanding?
I imagine how cool it would be to have more states, not just 0 and 1. A computer running on more states could encode machine code in fewer instructions, so it could compute more at the same clock frequency.
So why downvote me? Either explain where I got something wrong, or skip it if you know less than I do.
> Someone explained quantum mechanics to me as supporting more states, not just on (1) and off (0); that it can have dozens or hundreds of states. Is this a correct, state-of-the-art understanding?
No, that's not really correct. Being able to put states in superposition is very different from having more classical states.
> I imagine how cool it would be to have more states, not just 0 and 1. A computer running on more states could encode machine code in fewer instructions, so it could compute more at the same clock frequency.
That doesn't really make any sense. A cycle is still a cycle. How many instructions you can encode in a given length doesn't tell you anything about how fast they execute.
Even going with an extremely charitable reading, I don't think there's any way anything that you're talking about could lead to anything better than a constant-factor speedup (and not a large one). The point of quantum computing isn't to get mere constant-factor speedups. Such a thing would be insignificant in comparison to what quantum computers can actually do.
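A quick sketch of why more states per symbol buys only a constant factor: a base-1024 digit carries exactly log2(1024) = 10 bits, so any number needs 10x fewer digits, and that ratio never grows with the size of the input:

```python
import math

x = 2 ** 4096 + 12345                           # an arbitrary large number
digits_base2 = math.floor(math.log2(x)) + 1     # 4097 bits
digits_base1024 = math.ceil(digits_base2 / 10)  # each digit holds 10 bits
print(digits_base2, digits_base1024, digits_base2 / digits_base1024)  # ~10x
```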
FYI: This is for a very different substrate (quantum dots -- basically individual electrons trapped in tiny, electrostatically defined pockets in silicon) compared to the substrates that have seen more attention for quantum computing (superconducting circuits and trapped ions). They are behind in terms of raw qubit counts -- which are not that meaningful -- and in terms of their overall abilities, but they hope to catch up by building higher quality components that cross the fault-tolerance threshold first.
If, like me, you don't already know a lot about quantum computing experiments, this article probably shouldn't interest you much. At best, this is one small step of the many thousands that will be taken on the way to building a working computer.
Anybody who can read the actual PDF -- is this at room temp and no large magnetic field? Those two things seem to be the big stumbling blocks for these types of quantum computers...
Yep, still a major issue. This was run at 0.15 kelvin (about -273 °C / -459 °F) and they'd like to get it colder:
> In these experiments, the electron temperature accounts for 10% of the reduction in our visibility, and spin relaxation accounts for the remaining 5-10%. In our experiment we readout the qubits sequentially, reading the left qubit first. Relaxation of the right qubit during the readout of the left qubit is what leads to the asymmetry of the expected visibilities. Relaxation contributions to the readout fidelity can be mitigated by using a faster readout technique such as RF reflectometry or by incorporating a cold amplifier into the readout circuit (9, 10). Further improvements can be made by reducing the electron temperature or using cavity-based measurement approaches (11).
The paper is linked below and is way out of my depth, but the only mention of temperature I could find was 150 mK, so I'd assume this is not done at room temperature.
I know nothing about this stuff but this caught my eye:
“The challenge is that it’s very difficult to build artificial structures small enough to trap and control single electrons without destroying their long storage times,” said David Zajac, a graduate student in physics at Princeton and first-author on the study. “This is the first demonstration of entanglement between two electron spins in silicon, a material known for providing one of the cleanest environments for electron spin states.”
Electrides like calcium aluminate electride can readily trap individual electrons. And they are stable at room temperature.