Lockheed Martin pays $10M for D-Wave's Quantum Computer (vancouversun.com)
69 points by ricksta on April 7, 2013 | 58 comments



"Quantum computers operate at speeds unattainable by even today’s most powerful supercomputers, operations that are so fast, they can process millions of calculations in a fraction of the months, even years, traditional computers take"

God damn I hate bullshit lines like this used about QCs. The whole article gives the usual misleading impression of QCs being generically faster than normal computers.

They're not.

They can answer some problems much, much faster than traditional non-QC computers because they are capable of running classes of algorithm that rely on quantum effects.

Don't get me wrong - that's a pretty darn useful subset of problems... the future of QCs is full of rosy cool stuff... but this isn't just like upping the clock cycles of a CPU.

It doesn't make everything faster. Completely different classes of constraint are being tweaked.

QCs aren't going to make everybody's laptop or smartphone rilly rilly fast.


Any chance they could render textures, geometry, bullets and blood spray really fast?


Are there any sources you know of that discuss what kinds of problems can be solved faster with quantum computing? I'm curious about them.


Wikipedia has a fairly decent list of quantum algorithms (http://en.wikipedia.org/wiki/Quantum_algorithm), with Shor's algorithm being one of the standard examples of where quantum computing completely crushes classical computing.
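
If you're curious what that looks like, here's a toy sketch of my own (not from any QC library): Shor's algorithm is classical number theory wrapped around one quantum step, finding the period r of a^x mod N. The brute-force loop below stands in for that quantum step, which is exactly the part a quantum computer does exponentially faster.

  from math import gcd

  def find_period(a, N):
      # Brute-force the period r of a^x mod N; this is the one step a
      # quantum computer does exponentially faster than this loop.
      x, val = 1, a % N
      while val != 1:
          x += 1
          val = (val * a) % N
      return x

  def try_to_factor(N, a):
      # Classical pre/post-processing around the period-finding core.
      if gcd(a, N) != 1:
          return gcd(a, N), N // gcd(a, N)  # lucky guess, no quantum needed
      r = find_period(a, N)
      if r % 2 == 1:
          return None                       # odd period: try another a
      y = pow(a, r // 2, N)
      if y == N - 1:
          return None                       # trivial root: try another a
      return gcd(y - 1, N), gcd(y + 1, N)

  print(try_to_factor(15, 7))  # -> (3, 5)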


In addition to the Wikipedia list, Scott Aaronson's blog (http://www.scottaaronson.com/) has a bunch of useful stuff.


D-Wave's machines are very different from "regular" quantum computers, and the sort of problems they solve faster than conventional computers are travelling salesman problems and those closely related, e.g. "how can I lay out this printed circuit board so the traces don't cross and are as short as possible?"
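
For a flavour of the input format: as I understand it, D-Wave's machine takes problems posed as Ising/QUBO energy minimisation and anneals them physically. Below is a purely classical simulated-annealing stand-in for the same formulation; the toy QUBO and cooling schedule are my own illustration, not D-Wave's actual API.

  import math
  import random

  def anneal(Q, n, steps=20000, t0=2.0):
      # Classical simulated annealing over a QUBO: minimise the sum of
      # Q[i, j] * x[i] * x[j] with x in {0, 1}^n. D-Wave's hardware takes
      # problems in (roughly) this form but anneals them physically.
      def energy(x):
          return sum(q * x[i] * x[j] for (i, j), q in Q.items())

      x = [random.randint(0, 1) for _ in range(n)]
      e = energy(x)
      best_x, best_e = x[:], e
      for step in range(steps):
          t = t0 * (1 - step / steps) + 1e-9  # linear cooling schedule
          i = random.randrange(n)
          x[i] ^= 1                           # propose a single bit flip
          e_new = energy(x)
          if e_new <= e or random.random() < math.exp((e - e_new) / t):
              e = e_new                       # accept the move
              if e < best_e:
                  best_x, best_e = x[:], e
          else:
              x[i] ^= 1                       # reject: flip the bit back
      return best_x, best_e

  # Toy QUBO: maximum independent set on the path graph 0-1-2
  # (reward picking a vertex, penalise picking both ends of an edge).
  Q = {(0, 0): -1, (1, 1): -1, (2, 2): -1, (0, 1): 2, (1, 2): 2}
  print(anneal(Q, 3))  # expect ([1, 0, 1], -2)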


D-wave's device has been used to solve exactly zero problems faster than a classical computer. Whether the current prototype even harnesses an actual quantum speed-up hasn't been shown.


For the record, this is also true for every quantum computer in existence. What is clear, though, is that quantum computers have already solved problems in fewer steps than classical computers.

And to be fair, it's going to take a while to beat current computers. We've had decades to get them faster and faster, and we will need some time before we can perform low-level quantum operations nearly as fast and get enough qubits to reach problem sizes that classical computers struggle with.


> For the record, this is also true for every quantum computer in existence.

Yes, I'm well aware. The difference from D-wave is that the labs at IBM, UCSB, and Yale haven't claimed they've built a quantum computer (mostly).

> What is clear, though, is that quantum computers have already solved problems in fewer steps than classical computers.

This statement is false and probably not even interpretable in a sensible way. See, for instance, my esteemed colleagues on why some labs' claims to have factored small numbers like 15 are bogus:

http://arxiv.org/abs/1301.7007

> And to be fair, it's going to take a while to beat current computers. We've had decades to get them faster and faster, and we will need some time before we can perform low-level quantum operations nearly as fast and get enough qubits to reach problem sizes that classical computers struggle with.

Although it will take a while, it will probably not be because classical computers are so great. I could be wrong (I'm no expert and there could always be scaling surprises), but I'm willing to bet that the wait until we perform the first quantum computation that can't be done by hand will be longer than the gap between that milestone and the first computation that can't be done with a classical computer.

The short version is that you're mistaken about how research in quantum computers has been progressing. There have been fantastic successes and advances, but they cannot be measured by "number of qubits" or "difficulty of computations done". There simply hasn't been any real computation performed. You should think of researchers as being at the stage of still trying to get the first transistor to work, not the Moore's-law stage of trying to cram the 2^Nth transistor into the silicon.


So will we ever see quantum computers as consumer products, or are their uses too specialized for that?


If we do, they will be bundled with a classical CPU, which does most of the work. However, it's more likely that quantum computing will be used through the internet.


I think there is a world market for maybe five quantum computers.


While I would agree on a small market, that sounds a lot like what was said about the PC years ago...


I believe that was the point of the joke.


Well, sorry I didn't catch it.


D-Wave's advancements are very interesting to me for a few reasons. The first is that initially many people suspected D-Wave was a scam, because the most successful research efforts in quantum computing used just a few qubits, and D-Wave claimed a massive improvement (something like 128 or 256). Scott Aaronson (http://www.scottaaronson.com) was perhaps the most vocal critic. Over the years, the criticism has softened, and D-Wave has managed to get a paper or two into Nature. I think the truth of what they've achieved is somewhat less than what their marketing machine would like to suggest, but it's nevertheless very impressive (and D-Wave is certainly a place I'd like to work at if I could).

To clarify, D-Wave has not developed a general-purpose quantum computer, and in fact the term "general purpose" is kind of ill-defined for quantum computing anyway. Right now, there are a lot of different quantum effects that are used in different ways to accomplish specific tasks. I believe D-Wave's device uses quantum annealing to solve certain optimization problems, but someone correct me if I'm wrong.

The little I do know about quantum computing relates to my area of study: simulation. The computation required to exactly solve the Schrodinger equation scales as 2^N in the number of particles (or whatever basis the equation is set in). Even the largest supercomputers are incapable of doing more than a few atoms [which, incidentally, is actually what I'm attempting to accomplish right now for a project that I should be working on instead of posting on here...]. Anyway, with quantum computers, the scaling would be O(N) instead of O(2^N), so you could perform incredibly accurate simulations that reach chemical accuracy. Chemical accuracy is kind of the holy grail of simulation, because it means you can predict actual, macroscopic chemical properties of a variety of substances without doing any real-world experiments whatsoever. I believe it has been accomplished for things like pure hydrogen and quite a few bosonic systems (bosons are easier to simulate since they don't suffer from the fermion sign problem - http://en.wikipedia.org/wiki/Numerical_sign_problem).
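
To make the 2^N wall concrete, here's the back-of-the-envelope cost of merely storing the state vector on a classical machine (assuming two-level particles and 16-byte complex amplitudes):

  # Classically storing the full state vector of N two-level particles:
  # 2^N complex amplitudes at 16 bytes each (complex128). A quantum
  # computer would hold the same state in just N qubits.
  for n in (10, 30, 50, 100):
      tib = (2 ** n) * 16 / 2 ** 40
      print(f"N = {n:3d}: {tib:.3e} TiB")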

Anyway, I probably sound like I know more than I really do, but hopefully this gives you an idea of what kind of applications a real, working quantum computer could be used to achieve.


People keep referring to Scott Aaronson like he changed his mind, and it's pretty clear he didn't. He just didn't want to be the Chief Naysayer anymore; he has more important things to do and frankly that shouldn't be his calling card.

I think the only part of "general-purpose quantum computer" D-Wave can claim is the "computer" part. They have not proven that any part of their annealing is "quantum". They have pretty epic lithography/fabrication skills, and are barreling ahead without any regard for coherence. They also have just an insane number of control lines, so there's some innovation there.

But their net worth becomes negative if you count the bad press for quantum computing in having a charlatan claim they have 1024 qubits and are "500,000" times faster at solving Sudoku. Other fields are having this PR problem too, where even careful program reviews are getting brutalized by the PR news cycle (see recent dark-matter experiment: http://profmattstrassler.com/2013/04/03/ams-presents-some-fi... -- forgive the typography)


Disclaimer: I have no idea what I'm talking about in this area, so I reserve the right to ask dumb questions!

Is it possible that in the future rather than having a sort of general "quantum computer", something like quantum processors will be built for specific computing tasks? Each very different in config and unique, specifically designed and used for one thing and one thing only?


Sure it's possible, and that's probably what will happen with the first few quantum computers. =D


Out of interest, when chemically accurate simulations are made, are there any surprises? Or are the estimations that are normally used good enough for most purposes?


D.E. Shaw is currently one of the leaders (if not the leader) in molecular dynamics (i.e., studying the motion of the specific atoms in a molecule, or watching the dynamics of a chemical reaction), and it's a really hard problem. Roughly speaking, you have to perform a constant (but fairly large) number of operations to compute the configuration of the molecule(s) for every _femtosecond_ of simulated time. It would be great to see things happen at the scale of tens of microseconds, but that is a LOT of computation.

This is useful because it would mean that we could see exactly how reactions take place and perhaps even engineer interesting chemical/biological phenomena. Certain processes in your body depend on proteins moving certain substances or reacting in certain ways, and if we can simulate all that with a computer, we can start to build chemical/biological tools for, for example, fighting certain diseases.
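
To put a number on "a LOT": at a one-femtosecond timestep, a single simulated microsecond is a billion integration steps, each dominated by the force evaluation. A minimal velocity-Verlet step looks like the sketch below; the harmonic force and the particle mass are placeholders of mine, not a real force field.

  import numpy as np

  DT = 1e-15  # one femtosecond, in seconds

  def forces(pos):
      # Placeholder: a harmonic trap. A real engine evaluates bonded and
      # nonbonded terms here, and this call dominates every step's cost.
      return -1.0 * pos

  def velocity_verlet(pos, vel, mass, dt=DT):
      # One standard velocity-Verlet integration step.
      acc = forces(pos) / mass
      pos = pos + vel * dt + 0.5 * acc * dt ** 2
      vel = vel + 0.5 * (acc + forces(pos) / mass) * dt
      return pos, vel

  pos, vel = np.array([1e-10, 0.0, 0.0]), np.zeros(3)  # one particle, metres
  for _ in range(10):
      pos, vel = velocity_verlet(pos, vel, mass=1e-26)  # ~ a light atom, kg

  print(f"{int(1e-6 / DT):,} steps per simulated microsecond")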


> Out of interest, when chemically accurate simulations are made, are there any surprises? Or are the estimations that are normally used good enough for most purposes?

Well - it's more that there are entire classes of things that we can't simulate in any reasonable amount of time that (in theory) QCs should be very good at.

Protein folding, for example. Finding the lowest energy state that proteins fold into is really, really hard and slow. QCs can theoretically do it very, very quickly. This opens up whole areas of experimentation and validation that are closed to us at the moment because the feedback cycle on solutions is so darn slow and/or inaccurate.

Protein folding errors are at the heart of diseases like Alzheimer's, Huntington's and Parkinson's. QCs capable of simulating the chemistry involved would be a huge help in attacking those problems.


I'm not an expert on Comp Chem, but you might want to rephrase your question. If the simulation is "chemically accurate", of course it will match the real world, by definition of "accurate"...


My understanding of these things is that the dynamics of the real world are not easily measurable at this scale. An alternative is to simulate using packages such as Gromacs:

  http://www.gromacs.org/About_Gromacs
These packages are in wide use and truly incredible, but they rely on relatively crude approximations. It is possible to get a decent paper out that uses a simulation as evidence to support an idea.

The aim of my question was to see if chemically accurate simulations come up with significantly different answers or to see if the current way of doing things is a good enough approximation.


You bring up an excellent point, papaf. The accuracy of the empirical force fields used in molecular dynamics simulation engines (such as Gromacs) is a hotly debated topic. Even without quantum computing, the issue can be addressed with conventional computers that perform quantum mechanical calculations on interacting molecular fragments. The forces calculated from quantum mechanics can then be compared with those calculated from molecular dynamics force fields (as an example see Sherrill et al. http://onlinelibrary.wiley.com/doi/10.1002/jcc.21226/full ). Such studies have shown that current force fields fail to model certain chemical interactions and need to be improved. Specifically, the underlying functional forms used to model molecular forces need to be revised.

Currently such investigations are limited in scope by the large computational resources required to perform a single quantum mechanical calculation on a molecular fragment. With quantum computers, tens of thousands of such calculations could be performed and the results could be used to optimize new molecular force fields through multivariate regression.


Nice to see another GT person on HN! (Did my undergrad there.) Have you worked with Dr. Sherrill? It's funny you mention him; I was actually reading one of his presentations on electron-electron correlation last night.


Cool. I'm a grad student in another chemistry theory group. Dr. Sherrill definitely has the best notes on electronic structure calculations.


This part of the article makes me facepalm so hard, as someone who knows a little about quantum computing and a lot more about computer vision:

> Quantum computers operate at speeds unattainable by even today’s most powerful supercomputers, operations that are so fast, they can process millions of calculations in a fraction of the months, even years, traditional computers take.

Quantum computers can run some algorithms that normal computers can't, and those can be much faster, but they're not usable for general computing, so this statement makes no sense.

> They can even be taught and can recognize objects in images, a task standard computers struggle with.

Er... what? I wouldn't say that standard computers can do vision easily, but it's a problem of finding the right algorithms, not computing power.


$10 million is cheap. True, it only does discrete optimization problems, but I think you could make your money back in a few years renting it out at $5k/hour (that's only 2,000 billable hours) and consulting on translating problems to that domain.

Will some clients be wildly overpaying for something they could do equally well on regular computers? Sure, and they'll love every $ of it because of the bragging rights. Is this an efficient use of the hardware? Certainly not, it'll probably be exploiting <1% of the system's potential. Doesn't matter. People will frequently pay more for novelty than actual utility. If their vanity subsidizes the tiny subset of research computation that would have serious economic benefits, I call that a win-win.


So it's worth like 1/10th a Google employee.


Lockheed had been working with D-Wave for a while. If they decided to actually buy one of their computers, this probably means that they liked what they were seeing.


This is the second one LMT has purchased. Which means they liked it enough to buy the upgraded model.


I wonder if Google will buy this version, too. I think they've been working with D-wave for a few years now.

http://phys.org/news180107947.html


I can almost guarantee they will buy this version.


So I am in my final year of high school now, and I'm not studying physics next year. The only thing about 'quantum physics' that we had was emission spectra and light interference. Quantum physics sounds _really_ interesting though, so I would love it if anyone could point me in the right direction with where and how I start learning this stuff. Thanks :D


This is great to see! Of course, this stuff isn't easy to jump into, but I recommend checking out some of the free online courses. Coursera once had a quantum computing course (!) taught by a pretty famous guy in the field. It didn't have quantum mechanics as a prerequisite, either!


I will definitely check out the Coursera courses. thanks!


a) Learn this pair of amazon keywords: "dover physics" and "dover math".

b) Buy this book to get going: http://www.amazon.com/Quantum-Mechanics-Simple-Matrix-Physic...

c) Then buy this one to scare professors: http://www.amazon.com/Mathematics-Classical-Quantum-Physics-...


thanks! The books aren't even that expensive. Will definitely check them out!


Did anyone try their developer portal? I'm all excited to hear that they'll put their quantum computer online after the beta. http://www.dwavesys.com/en/dev-portal.html


"500,000 times faster than its predecessor"

How much power do quantum computers use and when will they be affordable enough to put in a smartphone/tablet?

And why isn't any other company doing this?


Quantum computers are still in their infancy, and a lot more work is needed before we can even begin talking about practical, effective use. Consumer market readiness is even further away.

It's probably not worthwhile for quantum computing chips to enter the consumer market, because (AFAIK) quantum computers are only very good at solving a very specific set of problems (e.g. integer factorization), but their advantages over classical chips diminish (or even become negative) for general purpose computation. Of course as quantum chips develop, quantum algorithms will develop/evolve with them so that might change.

There has been some skepticism over whether their quantum computing chip (and similar ones that other companies develop) is actually a quantum computer (e.g. whether true quantum entanglement was observed).

There are other companies doing this, such as IBM.


Could a quantum computer mine bitcoins?


Affordability will come; the really difficult problem with the technology from a portable perspective is that you need an exceptionally low temperature and very low vibration environment, otherwise you'll never anneal.

Yer qubits get too warm and shakey in your pocket...


There is no need to keep them there. You can just ssh to some qubits from your pocket.


I stopped reading when the article suggested that millions of calculations, when done on other computers, take months.


If you define one calculation as one instruction carried out in a single clock cycle, even the stupidly cheap Atmel chips (which power the Arduino) run at 16 MHz, which is 16 million instructions per second.
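
Spelled out (taking, as above, one calculation per clock cycle):

  CALCULATIONS = 5_000_000  # "millions of calculations"

  for name, per_second in (("16 MHz Atmel AVR", 16e6),
                           ("3 GHz desktop core", 3e9)):
      print(f"{name}: {CALCULATIONS / per_second:.4f} seconds, not months")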


So does this mean Shor's Algorithm is going to kick in for real and ruin all our securityz?


> So does this mean Shor's Algorithm is going to kick in for real and ruin all our securityz?

Nope.

First, as I understand it, the D-Wave stuff isn't a system that can run Shor's.

Second, Shor's only ruins security for a certain class of crypto algorithm. There are already algorithms today that are proof against it (e.g. the McEliece cryptosystem http://en.wikipedia.org/wiki/McEliece_cryptosystem).

Third, if you're really worried about the man cracking your s3cr3t stuff with quantum computers, go pick the right cryptosystem ;-) There are plenty of symmetric encryption systems whose key lengths only get effectively halved by Grover's algorithm.
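
The halving, spelled out: Grover's algorithm needs on the order of sqrt(2^n) oracle queries to search an n-bit keyspace, so the work drops from 2^n to 2^(n/2):

  # Grover's search over an n-bit keyspace takes ~sqrt(2^n) = 2^(n/2)
  # oracle queries, so the effective key length is halved, not zeroed.
  for n in (128, 256):
      print(f"{n}-bit key: ~2^{n} classical tries, ~2^{n // 2} Grover queries")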


People tend to equate asymmetric crypto with the likes of RSA; systems where the efficient factoring of large numbers is a death sentence. But there's a whole slew of other asymmetric cryptosystems without such properties, e.g. elliptic-curve cryptography.


Shor's algorithm also breaks elliptic-curve cryptography.


Oh, wow, I've never seen the variant that breaks ECC; that's really quite awesome. Time to read some papers!


Nielsen, Michael A.; Chuang, Isaac L. Quantum Computation and Quantum Information. p. 202 is what you want ;)


This is not correct anymore; according to D-Wave, they do have a factoring algorithm now.


Interesting. Do you have some pointers to that?

I thought the D-Wave 1 & 2 both had specialist quantum chips that couldn't run Shor's.



Hmmm... can't find anything else on that apart from the mention in that blog post. Is there more detail anywhere?

They also implicitly state that it's not Shor's, which is (to the best of my knowledge) the most effective integer factorisation algorithm for general purpose QCs currently known.

They also say that they "have a factoring algorithm". They don't say that it's actually implemented on running hardware.

I also notice the complete lack of clarification on the followup question that poked on the explicit "executed it on real hardware" question ;-)


Probably not anytime soon. Right now a 4 year old kid can beat the best implemented Shor's algo (http://www.nature.com/nphoton/journal/v6/n11/full/nphoton.20...).



