Rigetti Forest 1.0 – programming environment for quantum/classical computing (medium.com/rigetti)
173 points by reikonomusha on June 20, 2017 | 71 comments



Hello HN! We at Rigetti are really excited to announce and release Forest. My colleague @dangirsh and I will be here today to answer any questions that might come up about Forest and quantum computing!

Some potentially HN-interesting links:

pyQuil, a Python library for quantum programming: https://github.com/rigetticomputing/pyquil

pyQuil on RTD: https://pyquil.readthedocs.io/en/latest/

Grove, a collection of quantum algorithms in Python: https://github.com/rigetticomputing/grove


> We’ve gone from single-qubit devices to fully-functional 8-qubit systems that are now in the final stages of validation.

How hard of a problem can you solve with 8 qubits? For example, if you're implementing Shor's algorithm -- it looks (very naively) from the Wikipedia article that to efficiently factor a number of size N, you need about log(2N) qubits. So with 8 qubits you could factor the number 32 efficiently (is that calculation right?). Can you do things more difficult than 'factor 32' with 8 qubits?

(not intended as an attack at all, I genuinely just do not understand how to reason about the power of quantum computing devices with X qubits)


This is a good question. 8 qubits will not solve any problems that a classical computer can't. In fact, considering just how blazing fast modern CPUs are, you can simulate an 8-qubit chip faster and with more fidelity than you can with 8 homegrown qubits.

Shor's algorithm isn't a good candidate for near-term use of a quantum computer. Other algorithms, such as the variational quantum eigensolver or the quantum approximate optimization algorithm are better. These can be used to solve chemistry and optimization problems, for example.

We actually made a demo of a little game you can play with 8 qubits. It may not be able to handle the load of HN if it finds its way to the masses, but check it out! [0]

[0] http://demo.rigetti.com
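
Back on the algorithms point: to make the hybrid quantum/classical idea concrete, here's a toy sketch of the loop these variational algorithms use (plain Python + SciPy, not pyQuil and not our actual API; the "quantum" step is faked with a 2x2 matrix, whereas on hardware that number would come from repeated measurements of a parameterized circuit):

    import numpy as np
    from scipy.optimize import minimize

    # Toy stand-in for the quantum step: prepare RY(theta)|0> on a single qubit
    # and return the expectation value of Z.
    def energy(theta):
        state = np.array([np.cos(theta[0] / 2), np.sin(theta[0] / 2)])
        z = np.array([[1, 0], [0, -1]])
        return state @ z @ state

    # The classical optimizer proposes new angles; the quantum device only ever
    # evaluates the cost. This division of labor is what lets small, noisy chips
    # do useful work inside a larger classical loop.
    result = minimize(energy, x0=[0.1], method="Nelder-Mead")
    print(result.x, result.fun)    # theta converges to ~pi, energy to ~-1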


> you can simulate an 8-qubit chip faster and with more fidelity than you can with 8 homegrown qubits.

That is really useful to know! At what point does that stop being true? like, if I had 100 qubits could I still simulate them faster on a classical computer than with actual quantum qubits? (I imagine this is a hard question to answer because it's a complexity theory question, and I guess quantum computers with 100 qubits don't actually exist in real life, but I'd be curious to know what known bounds there are on the answer)


Right now, if you want to simulate 40 qubits, you need a computer with at least 16 terabytes of RAM. 41 qubits needs 32 terabytes. You can't do this in a single machine, but you could get these with enough computers linked together. But big iron doesn't scale exponentially, so at some point, you run out of atoms in the universe to use as transistors.

And 100 qubits. That needs more than 18 quintillion terabytes of RAM. In other words, it's not going to happen.

With more memory, you need more time. Every quantum operation [0] needs to touch all of that memory. And if it takes 10 ns to touch one byte of memory, then it will take hours to even do one simple operation on 36 qubits. [1] On a quantum computer, an operation would be on the order of nanoseconds.

I would say this, as a rough answer to your question: A quantum computer is practically faster and more useful when you can no longer touch the RAM necessary to simulate it in 1 millisecond. Going off of our 10 ns rule, 1 ms = 10^6 ns. So if 10 ns allows us to do something with 1 byte, then in 1 ms that's 10^5 bytes, or about 100 KB.

And, drum roll, at 16 bytes per complex amplitude that's only about 12 qubits [2].

[0] There are some very special cases where this isn't true, but these special cases also don't give you any sort of quantum advantage.

[1] There are some tricks you can play to speed things up, like parallelizing across processors. But now you'll have to factor in communication time, memory, and bandwidth.

[2] One factor I'm not considering is the fidelity. Superconducting qubits can be thought of as analog devices with analog noise.
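
If you want to check these numbers yourself, here's the back-of-the-envelope arithmetic in plain Python (assuming a full state vector of 2^n complex amplitudes at 16 bytes each, and the 10 ns-per-byte figure above):

    # Back-of-the-envelope cost of brute-force state-vector simulation.
    def simulation_cost(n_qubits, ns_per_byte=10):
        bytes_needed = (2 ** n_qubits) * 16           # one complex double per amplitude
        seconds_per_pass = bytes_needed * ns_per_byte * 1e-9
        return bytes_needed, seconds_per_pass

    for n in (36, 40, 41, 100):
        b, t = simulation_cost(n)
        print(f"{n} qubits: {b / 2**40:.3g} TiB of RAM, {t:.3g} s per full pass")
    # 36 qubits -> 1 TiB and ~3 hours per pass; 40 -> 16 TiB; 41 -> 32 TiB;
    # 100 qubits -> ~1.8e19 TiB, i.e. tens of quintillions of terabytes.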


The largest full quantum computer simulation done so far is 45 qubits, which was performed on the Cori supercomputer [0][1]. The main limitation in these kinds of exact simulations is memory, so simulating 50 qubits would require about 1000x the memory of Cori, which is infeasible.

There are some other methods of approximate simulation, but that gives the cutoff for full, exact simulation of a quantum computer.

[0] https://www.nersc.gov/news-publications/nersc-news/science-n... [1] https://arxiv.org/abs/1704.01127


A small nitpick on the game scoring:

I assume once the "computer" reaches a new local maximum score (i.e. each time it finds a way to get a new high score) you're storing the time it took the algorithm to reach that score, even if it goes off and looks for other, lower-scoring combinations afterward.

However, the "player's" score is dependent entirely on when they submit their solution. If I click the bubbles to reach a maximum score of 11 early on, and then spend a significant amount of time trying other combinations to make sure it's the best I can do, my score is based on how long I took trying all combinations instead of how long it took me to first reach 11 (like how it scores the computer).

I know it's structured as a "race", but this little discrepancy in scoring makes it feel a lot more like it's testing heuristics of how well you can reason "this is probably the maximum score" instead of actually tackling it in the same way a computer would.


As a software engineer looking for a highly marketable and differentiated skill set, given your projections for quantum computing roadmap, when should I start exploring this area? (or: when should I start writing code and doing side projects)


Now. "Exponential" is faster than most people, including myself, can believe. When one unit of resource doubles your computational capacity, it doesn't take many units.

I like to use an analogy. Adding 1 GB of RAM these days isn't that big of a deal. You can maybe open two more tabs in Chrome. :) Adding 1 giga-qubit to your computer would make it 4.6 x 10^301029995 times better. That's unimaginably more powerful than anything any human can think of.

We don't have quantum software engineering figured out. And it's not going to be figured out by a few academics, although they may lay some good foundations. It's going to be figured out by the same folks who figured out traditional computing: people who try stuff, break stuff, and experiment.
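
(If that giga-qubit number looks made up, it's just arithmetic: adding 10^9 qubits multiplies the dimension of the state space by 2^(10^9). A quick check in Python, working in log10 so we never hold the actual integer:)

    import math

    log10_factor = 1e9 * math.log10(2)            # ~301029995.66
    mantissa = 10 ** (log10_factor % 1)           # ~4.6
    print(f"{mantissa:.1f} x 10^{int(log10_factor)}")   # -> 4.6 x 10^301029995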


I disagree. Through Scott Aaronson's writings, I gather we still don't have conclusive evidence [1] that anyone's built an actual quantum computer capable of more sophisticated computations than a human 10-year-old can do with pen and paper, and furthermore that it's not clear we'll be able to build such a machine in our lifetimes. I'm not saying it's impossible, but we know it's going to be really hard.

What you're ignoring with your exponential growth argument is that it is also exponentially harder to add one qubit while maintaining usefulness (i.e. long decoherence times).

I'd say, anyone who doesn't want to work either in academia or on "vanity projects" like D-Wave's much-hyped collabs with Lockheed, Google etc. should wait half a decade and see.

[1] Arguably the D-Wave machines are faster than a human, but we don't have evidence (yet) that it's not just a fancy annealing ASIC.


I do not dispute the claim about the existence of a quantum computer which surpasses its classical brethren. Scott is correct.

I do disagree that it is exponentially more difficult to add a qubit. Coherence times are something to optimize, and densely packing qubits while maintaining those coherence times is also difficult, but the notion of adding a qubit to a system doesn't come with an inherent exponential difficulty.

Regarding whether it is useful or not to learn quantum computing for your profession, if it's true that systems can be built that grow with exponential power, then they'll be relevant faster than one might think.


> I do not dispute the claim about the existence of a quantum computer which surpasses its classical brethren.

It's less "please beat a $20000 server stuffed with GPU accelerators" and more "please beat a 6502, or to start with at least an abacus".

The problem is that everything is so toy-level so far that it's not even in the category of "computation tool".


Again, I do not dispute this claim and also do not see it as a problem. Folks working on it, including myself, would like to see it as a viable replacement for any computational device. And it's not, right now.

We work on it, though, for two reasons: (1) our current, and so far accurate, understanding of physics says with certainty that a quantum computing device is superior to its classical counterpart, and (2) while the problem is not easy, there seem to be just the right number of engineering problems (signal integrity, signal routing, superconducting non-magnetic fab, etc.) in our way that we feel we can tackle them in a timely manner. Rigetti in particular is a company that believes that having a full-stack team will allow these interdisciplinary problems to be solved faster.

No one, on our team at least, is under any illusions about where we are. As the article says, 8-qubit chips are in the final phases of validation. As I say, 8-qubit chips can be simulated faster on your shiny Intel chip. Does that mean the entire enterprise is useless? No. It is a stepping stone for a company that has raised less money than many CoolNewLikeUberButForX apps you see pop up here. I find that unimaginably remarkable.

When I answer questions about quantum computing, however, I want to share my and others' visions about it based off of what we know, in a relatively accurate fashion, that is understandable by a general audience.


Isn't it true though that there would be an O(n^2)-type difficulty in adding extra qubits, since they all need to interact?

Or is that an oversimplified view?


They do not all need to interact directly with one another. You can create full entanglement even if they only interact linearly, i.e., each qubit with its nearest neighbors along a line. It just means you pay a penalty in the compilation of your program.

Architectures with higher two-qubit connectivity are merely an optimization.
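
A quick sketch of what I mean, in plain NumPy (just an illustration, not how the QVM works internally): a Hadamard plus a chain of nearest-neighbor CNOTs fully entangles a line of qubits.

    import numpy as np
    from functools import reduce

    I = np.eye(2)
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    def on_qubits(gate, start, n):
        # Embed a gate on qubits start..start+k-1 into the full 2**n-dimensional
        # space (qubit 0 is the most significant bit of the index).
        k = int(np.log2(gate.shape[0]))
        ops = [I] * start + [gate] + [I] * (n - start - k)
        return reduce(np.kron, ops)

    n = 4
    state = np.zeros(2 ** n); state[0] = 1.0       # |0000>
    state = on_qubits(H, 0, n) @ state             # superpose qubit 0
    for q in range(n - 1):                         # CNOT 0-1, then 1-2, then 2-3
        state = on_qubits(CNOT, q, n) @ state
    print(np.nonzero(state)[0])   # [0 15]: (|0000> + |1111>)/sqrt(2), fully entangled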


> Now. "Exponential" is faster than most people, including myself, can believe. When one unit of resource doubles your computational capacity, it doesn't take many units.

Please don't make bullshit claims about exponential speedups. I don't know exactly what technology you are claiming to have, but statements like this cause me to believe less in your technology, not more.

We've been through the cycle of unfounded hype many times (with D-WAVE and others). Scott Aaronson has an entire category on his blog filled with depressingly many posts debunking the same bullshit over and over [0].

[0] http://www.scottaaronson.com/blog/?cat=17


The simplest quantum algorithm shows an exponential speedup over the best classical solution; this, of course, is the toy example often used in QC texts of determining whether or not a function is constant or balanced (Deutsch).

But, as John Preskill points out, this is not even the really interesting thing here. Quantum simulation actually lies outside the class NP, because there is no efficient way to verify the solution of such a simulation.

This area is where quantum computers, in my opinion, are the most interesting: we will be able to do things we simply cannot do on a classical computer. And for the record, most people commenting should know that D-Wave operates using the adiabatic model and is not a universal quantum computer.
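
For anyone curious, the constant-vs-balanced example is small enough to write out in a few lines of NumPy (a plain state-vector sketch, not any particular toolkit): one call to the oracle U_f decides the question.

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

    def oracle(f):
        # U_f |x>|y> = |x>|y XOR f(x)>, as a 4x4 permutation matrix.
        U = np.zeros((4, 4))
        for x in (0, 1):
            for y in (0, 1):
                U[2 * x + (y ^ f(x)), 2 * x + y] = 1
        return U

    def deutsch(f):
        state = np.kron([1, 0], [0, 1])            # |0>|1>
        state = np.kron(H, H) @ state              # Hadamard both qubits
        state = oracle(f) @ state                  # a single oracle call
        state = np.kron(H, np.eye(2)) @ state      # Hadamard the query qubit
        p_one = np.sum(np.abs(state[2:]) ** 2)     # probability the query qubit reads 1
        return "balanced" if p_one > 0.5 else "constant"

    print(deutsch(lambda x: 0))       # constant
    print(deutsch(lambda x: x))       # balanced
    print(deutsch(lambda x: 1 - x))   # balanced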


The size of the state space in which the qubits live is exponential in the number of qubits. This is because the qubits live in an n-fold tensor product of two-dimensional Hilbert spaces. Performing an operation on a single qubit is the same as performing a 2^n-dimensional unitary transformation on the state of the system.

This is not disputed by experts in the field of quantum computing, including Scott.
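
Concretely, in NumPy terms (an illustrative sketch): the single-qubit gate gets tensored with identities into a 2^n x 2^n matrix, and applying it runs over every one of the 2^n amplitudes.

    import numpy as np
    from functools import reduce

    n = 10
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    ops = [np.eye(2)] * n
    ops[3] = H                                   # Hadamard on qubit 3 only
    U = reduce(np.kron, ops)                     # I (x) ... (x) H (x) ... (x) I
    print(U.shape)                               # (1024, 1024): 2**10-dimensional

    state = np.zeros(2 ** n); state[0] = 1.0
    print(np.count_nonzero(U @ state))           # 2 nonzero amplitudes afterward, but
                                                 # the update swept all 1024 of them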


I know what a Hilbert space is, and I also know that this 2^n-dimensional space cannot be accessed except through a destructive measurement operation. An exponential state space does not imply that there is exponential computing power to be harnessed there.

As an analogy, when you execute a randomized classical algorithm, the size of the state space in which the bits live is also exponential (and at the end you observe the result, and your uncertainty collapses from a probability distribution to one of its possible outcomes). Yet you would look at me like I'm crazy (or a fraud) if I claimed that randomized algorithms have exponentially more computing power than deterministic ones.

The only way in which the quantum case differs from the classical picture above is that amplitudes have a phase and can thus interfere (constructively or destructively). The art of creating quantum algorithms lies entirely in orchestrating favorable interference patterns.
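
The simplest concrete example of that difference (a short NumPy sketch): apply a fair "coin flip" twice. The classical stochastic matrix stays at 50/50, while two Hadamards interfere destructively and return the qubit to |0> with certainty.

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)       # quantum "coin flip"
    flip = np.array([[0.5, 0.5], [0.5, 0.5]])          # classical stochastic coin

    probs = flip @ flip @ np.array([1.0, 0.0])         # classical: start at "heads"
    amps = H @ H @ np.array([1.0, 0.0])                # quantum: start at |0>
    print(probs)                # [0.5 0.5]  no interference
    print(np.abs(amps) ** 2)    # [1. 0.]    the |1> amplitude cancels out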


The way I like to explain it simply to people is that right now, you can use frameworks to manipulate probability distributions. Quantum logic gates are basically a restriction on the operations you can use to combine pdfs. Ultimately, unless you can find a clever way to convert an algorithm into one that uses pdfs and then achieves a pdf where one single value has 99% of the EV, QM ain't gonna help ya.


It seems like we should be more careful when saying "exponential" increase in computational performance. For many quantum algorithms, the speedup is actually superpolynomial [1] [2]. In some sense, this is due to the fact that the state space grows exponentially but, as you correctly pointed out, it can only be accessed in a destructive manner. For many algorithms (e.g. Shor's), the net result is a superpolynomial improvement in the resources required for solving a practically important problem (factoring).

Unfortunately, the nuance of superpolynomial vs exponential is lost in many high-level discussions about quantum computing. Maybe we should just say "much, much faster" ;) To make matters worse, quantum computing textbooks often present Simon's Problem [3] as a showcase for truly exponential speedup. It turns out this is misleading, as I've never heard of a practically relevant algorithm with truly exponential speedup.

[1]: http://math.nist.gov/quantum/zoo/

[2]: https://en.wikipedia.org/wiki/Time_complexity#Superpolynomia...

[3]: https://en.wikipedia.org/wiki/Simon%27s_problem
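
To put rough numbers on "much, much faster" for factoring, here is a sketch in plain Python. The classical figure uses the standard heuristic cost of the general number field sieve, and the quantum figure is just a generic cubic in the bit length as a ballpark for Shor's algorithm; neither is meant to be precise, only to show the superpolynomial gap opening up.

    import math

    # Heuristic GNFS cost: exp(~1.923 * (ln N)^(1/3) * (ln ln N)^(2/3)), reported as log10.
    def gnfs_log10_cost(bits):
        ln_n = bits * math.log(2)
        return 1.923 * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3) / math.log(10)

    # Ballpark polynomial cost for Shor: cubic in the number of bits, as log10.
    def shor_log10_cost(bits):
        return 3 * math.log10(bits)

    for bits in (512, 1024, 2048, 4096):
        print(bits, f"classical ~10^{gnfs_log10_cost(bits):.0f}",
              f"quantum ~10^{shor_log10_cost(bits):.0f}")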


> In some sense, this is due to the fact that the state space grows exponentially

What I take issue with is precisely the conflation of the size of the state space with the quantum speedup. Shor's algorithm is fast because the QFT (quantum Fourier transform) creates an interference pattern that can reveal the period of certain functions, and the QFT can be implemented efficiently because of its specific structure. As I said before, the size of a classical state space of a probability distribution is also exponentially large, so no, the root cause is emphatically not the size of the state space, but the way in which that space can be manipulated and the fact that amplitudes add up in a way that's not linear (when looking at the resulting probabilities).
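
To see the period-finding point concretely (a plain NumPy sketch, with the classical FFT standing in for the QFT): Fourier-transform a state whose amplitudes are periodic and all the probability concentrates on multiples of N/r. That interference pattern is exactly what Shor's algorithm reads out.

    import numpy as np

    N, r = 256, 8
    amps = np.zeros(N)
    amps[::r] = 1.0                              # nonzero amplitude every r-th index
    amps /= np.linalg.norm(amps)

    spectrum = np.fft.fft(amps) / np.sqrt(N)     # unitary DFT, playing the role of the QFT
    probs = np.abs(spectrum) ** 2
    print(np.nonzero(probs > 1e-12)[0])          # [0 32 64 ...]: multiples of N/r = 32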

Note that Grover's algorithm achieves only a quadratic speedup with the same size of state space as Shor's. Your explanation doesn't add up, it just adds to the confusion.

I just think that it's very important to stay far away from the (wrong, but pervasive in pop science) idea that quantum computers are fast because they "try exponentially many solutions in parallel". Excessively highlighting the size of the state space is already a step too far in that direction for my taste.

My words are a bit harsh, but I do appreciate the fact that you are engaging honestly, and please don't take my skepticism personally. I would like to hear what your technology brings to the table, how it differs from competing approaches, etc.


I am in agreement with your sentiment here. Adding a qubit does not mean that every single thing you do on a quantum computer doubles in speed, which is a possible way to interpret some of my statements.

From a purely personal perspective, I do think that it is very interesting that we can affect the entirety of a state with an otherwise linear number of physical operations. Whether that is useful in providing lots of exponential or even polynomial speedups in the arena of practical algorithms is yet to be determined. I suspect that with a robust enough computer, the answer will be a resounding "yes".


> As I said before, the size of a classical state space of a probability distribution is also exponentially large...

This is true, but a single state in a classical probability distribution is not exponentially large. Because of superposition, a single quantum state can be associated with an exponentially large number of amplitudes. As you mentioned, quantum algorithms rely on the interference of these amplitudes. However, if you could somehow assign a complex amplitude to each state in a classical probability distribution, you would still be limited to manipulating only one amplitude at a time. It is in this sense that the exponential scaling is important.

> Note that Grover's algorithm achieves only a quadratic speedup with the same size of state space as Shor's. Your explanation doesn't add up, it just adds to the confusion.

I didn't mean to imply that all quantum algorithms have superpolynomial speedups. But (especially) for the ones that do, I was thinking about the exponentially large set of amplitudes being manipulated in parallel.

> I just think that it's very important to stay far away from the (wrong, but pervasive in pop science) idea that quantum computers are fast because they "try exponentially many solutions in parallel".

100% agreed.


Ah, looks like I botched parts of this:

> However, if you could somehow assign a complex amplitude to each state in a classical probability distribution, you would still be limited to manipulating only one amplitude at a time.

This is probably just more confusing. What I should say is that classical probabilities have no physical manifestation that you can directly manipulate - they just denote our lack of information about a system. Amplitudes in quantum systems can be related to probabilities, but they don't represent lack of information. The probabilistic nature of quantum systems is deeper than that: measurements project superposition states onto classical states in a probabilistic way. But before this projection, we're forced to say that the physical state of the system is in superposition. Even more, the amplitudes associated with each part of the superposition state are part of the physical definition of the state. In this sense, they are more "real" than classical probabilities.

For exponentially large superposition states, there are an exponential number of amplitudes. When we act on the state in certain ways, we update all of the amplitudes in parallel. There is no counterpart to this when acting on classical states, including when you have incomplete information about the state (and thus an exponentially large probability distribution).

> But (especially) for the ones that do, I was thinking about the exponentially large set of amplitudes being manipulated in parallel.

Let me try again. The built-in exponential in the physical state (as I described above) helps me see how quantum speedups (especially super-polynomial ones) could even be possible. You're right that there's more to the story than just having an exponentially large number of amplitudes, but it's an important part of the story!


But you said "When one unit of resource doubles your computational capacity" which I believe is what your parent comment rightly called bullshit.


> Adding 1 giga-qubit to your computer would make it 4.6 x 10^301029995 times better. That's unimaginably more powerful than anything any human can think of

That's not true and you should know better. For example, there are very few problems for which quantum computers are known to perform better than standard computers.


It is true that I am not being mathematically precise in my statements. The precise way to say what I said is: in order to completely represent an arbitrary state in the space of one billion qubits, you will need a number of bytes exponential in that number of qubits. If we have, as mathematical entities, one billion additional qubits, this is equivalent to increasing the dimension of our existing system by a factor of 2^(1 billion).

Of course, I am saying "mathematical entities", and almost all practitioners of quantum computing are aware of the challenge to actually build them.


> there are very few problem for which quantum computers are known to perform better than standard computers.

http://math.nist.gov/quantum/zoo/


That is a great list! (But I don't think it contradicts what I wrote)


What would be signs that this is taking off commercially? Will explore the GitHub repo!


There are lots of "checkpoints" one can imagine with the commercialization of a technology. Right now, large industry players, whose survival depends on their tech strategy, are investing in quantum computer R&D. I don't mean that these companies are themselves trying to build quantum computers, but they are interested in applying them to their hardest technical problems.

Quantum computation is such a new and different computing paradigm, that whoever is prepared will be able to reap the benefits much earlier. And, if the promises of scaling are true (they are from a fundamental physics standpoint), such companies will propel themselves far ahead of the competition.

I would say that, in the current stage of development of quantum software and hardware, even a seasoned software professional will not—on short order—be able to apply the tools directly to their problems. As a programming language enthusiast, it's like taking a long-time K&R C programmer, and asking them to be productive in Agda. It's not that they can't, but they probably won't be able to do it by tomorrow. It'll take time, energy, and investment to think in new ways.

I personally believe that commercialization will become more and more apparent when services are accelerated by quantum computation. But how many people are going to share that secret sauce?


I'd hope one milestone is "Someone with no knowledge of physics, nor a desire to gain any, is capable of programming with this hardware/software."

Is that feasible? Is it desirable?


That's my goal! It is desirable, and I think it is feasible.

I said in another comment that I think the best thing we can do is get quantum devices in the hands of people and let them play. Unfortunately, for a long time, quantum computers and their programming have been so utterly out-of-reach and opaque that that has been difficult. Now I think we are taking good steps toward opening up the possibility of experimentation.


Just to give you an idea, I've spent about 30-45 minutes reading over various materials (the GitHub links). I think my level of knowledge would be equivalent to understanding how dup, drop, and rot work in FORTH (or car and cdr in Lisp)... Basic element manipulation (bit/qubit, stack, and list).

The difference, though, is that in FORTH and Lisp I only needed to understand there was a container of multiple items. For basic element manipulation here, I needed to understand matrices.

At this rate, it would take hours before I understand how to write a basic program. And my trailblazer sense is already tingling (that I should let others be pioneers).

Normally I'd just resume lurker mode at this point, but my interest in combinatorics is driving my curiosity towards understanding what might be possible.


I admit it is an unusually large leap to get to anything useful. We have been blessed to have such a fantastic and intuitive understanding of classical computing. We can pick up most new programming languages gradually and efficiently. When the fundamental object of manipulation is this wacky thing called a "state vector in 2^n-dimensional Hilbert space" as opposed to "a bag of bits", and operations must be reversible, and ... and ... and ..., things are just harder.

I hope we (both Rigetti and the quantum computing community at large) can continue to refine and simplify the concepts at hand.
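
For anyone following along, the smallest version of those "fundamental objects" fits in a few lines of NumPy (illustration only): a qubit is a length-2 complex vector, a gate is a 2x2 unitary matrix, and reversibility is just the statement that U times its conjugate transpose is the identity.

    import numpy as np

    X = np.array([[0, 1], [1, 0]])                 # quantum NOT
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard

    qubit = np.array([1, 0])                       # |0>
    print(H @ qubit)                               # equal superposition [0.707, 0.707]
    print(X @ (X @ qubit))                         # applying NOT twice undoes it: [1, 0]
    print(np.allclose(H @ H.conj().T, np.eye(2)))  # True: unitary, hence reversible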


Unlike adding RAM, adding each extra qubit is also exponentially harder, since maintaining coherence of all the qubits becomes more and more difficult. That's why scaling from the tiny quantum computers we have today (which are not useful) to a useful quantum computer remains a decades long research agenda.


You might take a look at this other reply:

https://news.ycombinator.com/item?id=14598516


Also what percentage of the engineers at Rigetti actually think that you are in any sense "leveraging the multiverse" with this sort of technology, and how many of you prefer an alternative explanation?

I understand if you don't want to make a public statement about such divisive (and perhaps, more importantly, ill-defined) matters in this context. I'm just curious about the way that the people who are actually building these things tend to view them...


I think this is the best question so far :)

After an informal poll of ~30 of our scientists/engineers, ~8 said they subscribe to the many-worlds interpretation [1] [2] of quantum mechanics. To be honest, this is a discussion that comes up surprisingly infrequently at the office!

This result reminds me of Sean Carroll's "Most Embarrassing Graph in Modern Physics" [3]. When top theoretical physicists are polled, it seems like no interpretation of quantum mechanics even takes the majority! In my view, the lack of consensus around this (after ~100 years) underscores the strangeness of the theory.

I should mention that David Deutsch (one of the pioneers of quantum computing), Stephen Hawking, Max Tegmark, Sean Carroll, John Preskill, and many other prominent physicists prefer the many-worlds interpretation (citation needed).

If you like the "leveraging the multiverse" view of quantum computing, be sure to watch David Deutsch's video lectures [4]! In his view, multiple universes is the way to explain the power of a quantum computer.

[1]: https://en.wikipedia.org/wiki/Many-worlds_interpretation

[2]: https://plato.stanford.edu/entries/qm-everett/

[3]: http://www.preposterousuniverse.com/blog/2013/01/17/the-most...

[4]: https://www.youtube.com/watch?v=24YxS9lo9so


@dangirsh is polling people at the company now to answer this question!

As for me, the only way I could even attempt to understand this quantum mechanics business was through the multiverse perspective. Fun fact: The QVM actually implemented the multiverse interpretation of quantum mechanics. Every qubit measurement, in particular, would branch the wavefunction into two separate universes. This was a perfect way to bring my computer to a grinding halt in no time.
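
For the curious, here's a toy sketch of what that branching scheme looks like (plain NumPy, emphatically not the real QVM): instead of sampling an outcome, every measurement keeps both projected wavefunctions, so k measurements can leave you dragging around up to 2^k branches.

    import numpy as np

    def measure_and_branch(branches, qubit, n):
        # Each branch is a (weight, wavefunction) pair; measuring splits it in two.
        new_branches = []
        for weight, state in branches:
            for outcome in (0, 1):
                # Project onto this measurement outcome (qubit 0 = most significant bit).
                mask = ((np.arange(2 ** n) >> (n - 1 - qubit)) & 1) == outcome
                projected = np.where(mask, state, 0.0)
                p = np.sum(np.abs(projected) ** 2)
                if p > 0:
                    new_branches.append((weight * p, projected / np.sqrt(p)))
        return new_branches

    n = 3
    state = np.full(2 ** n, 1 / np.sqrt(2 ** n))   # uniform superposition
    branches = [(1.0, state)]
    for q in range(n):
        branches = measure_and_branch(branches, q, n)
    print(len(branches))                           # 8 branches after 3 measurements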


Your GitHub account [0] gives Common Lisp some love. Why did you choose to use this language? Specifically, for the implementation of your Quantum Virtual Machine?

[0] https://github.com/rigetticomputing


That's my doing.

When we started thinking about quantum programming languages, we didn't know what they should look like. Experimenting with different languages efficiently requires a language that makes language-building easy. I find that there are two classes of languages that provide that: Lisp-likes with metaprogramming and ML-likes with a good type system including algebraic data types.

Quil [0], our quantum instruction language, came out of language experimentation in Lisp. In fact, the Quil code:

    H 0
    CNOT 0 1
    MEASURE 0 [0]
used to look like this:

    ((H 0)
     (CNOT 0 1)
     (MEASURE 0 (ADDRESS 0)))
This was a no-hassle way to play around without thinking about how to wrangle lex and yacc with their shift/reduce and reduce/reduce issues. If you've ever used them, it takes a while to get what you want and to integrate it into a product you're trying to create.

In addition to the need to construct languages, we also needed to simulate/interpret the language. Simulating the evolution of a quantum state is very expensive (which is why we are building quantum computers!), and is usually relegated to some closer-to-metal language like C.

Unfortunately, in C, such high-speed numerical code is very difficult to experiment with, extend, maintain, etc.

Fortunately, over the past 30 years, Lisp compilers have become excellent at producing native machine code. With a compiler like SBCL [1], you can produce code which is very nearly optimal. For example, we have a function to compute the probability from a wavefunction amplitude. Here it is:

    > (disassemble #'probability)
    ; disassembly for PROBABILITY
    ; Size: 90 bytes. Origin: #x22B67F5A
    ; 5A:       F20F104901       MOVSD XMM1, [RCX+1]
    ; 5F:       F20F105109       MOVSD XMM2, [RCX+9]
    ; 64:       F20F59C9         MULSD XMM1, XMM1
    ; 68:       F20F59D2         MULSD XMM2, XMM2
    ; 6C:       F20F58D1         ADDSD XMM2, XMM1
    ; ...
There's some additional code that follows, but it disappears because this function is also inlined at the call sites.
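
(For readers who don't speak x86: all that disassembly computes is the Born-rule probability of an amplitude, |a|^2 = re^2 + im^2 — the two MULSDs and the ADDSD. A rough Python equivalent of that function, for illustration:)

    def probability(amplitude: complex) -> float:
        # Born rule: |a|^2 = re^2 + im^2, which is what the MULSD/MULSD/ADDSD above compute.
        return amplitude.real ** 2 + amplitude.imag ** 2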

Writing the QVM and associated compiler in Lisp has allowed us to move fast. The code is compact, extremely robust, and extremely fast.

I could say a lot more about this, but that's the gist of why we thought it was a sensible decision.

[0] https://arxiv.org/abs/1608.03355

[1] http://www.sbcl.org/


So if I understand it right, this process requires some pretty specialised and delicate equipment... for instance, you have to be able to get helium-3 down to something like as cold (compared to us) as the sun is hot. For what I'm asking it doesn't matter if that is technically true... point is you need some pretty fancy technology.

Now it's true that classical computers used to take up whole buildings. Living people remember this. And progress is supposed to be getting faster and faster. But given the particularly arcane constraints... how long, if ever, before this kind of technology can be a part of the daily lives of most Terran citizens? Will it ever be possible for us to have it at home? Or will we always have to send out requests to more centralised machines that will then send us back answers?


I don't know whether or not we will have the technology in the future to eliminate the need for such environments. Different kinds of qubits are an active area of research: ions, dots, diamond vacancies, etc.

The bigger point to realize, I think, is that most people's computing doesn't even happen at their home anymore. Much of it happens on some blade in a server rack. I think, for the time being, quantum computing will be just like that.

It was pretty inconceivable, looking back at the history of computing, to think that vacuum tubes would be miniaturized to fit in little boxes in bedrooms. You really needed the discovery of the transistor and the integrated circuit to let that happen. That hasn't happened yet with qubits, and I don't think one can predict it with any certainty.


Thanks. That helps me to think about it in a more realistic and pragmatic way. Good analogy!


How is Forest different from IBM's quantum experience?


Great question. Both the IBM Q experience and Rigetti Forest allow users to write quantum algorithms with Python that can execute on real quantum hardware. Forest is different in 3 main ways:

1. Forest was designed with near-term applications in mind. Specifically, it uses our quantum instruction set (Quil) [1], which was designed for implementing classical/quantum hybrid algorithms [2]. These algorithms can leverage near-term quantum devices significantly more than "textbook" quantum algorithms (like Shor's). IBM Q places much less emphasis on hybrid computation.

2. Forest provides raw access to the quantum hardware. There's an API [3] for users to run "analog" experiments to understand the performance and noise characteristics of our qubits. If you're developing near-term applications for quantum computers, having access to this physical layer of quantum devices is crucial. IBM Q doesn't provide a similar API to my knowledge.

3. Programs written with Forest can execute on up to 30 virtual qubits on the Rigetti QVM [4]. This allows users to develop quantum algorithms ahead of any physical device that can run them. Especially if you include noise modeling (we do), 30 qubits is well beyond what you could simulate with your laptop. IBM Q offers a 20-qubit simulator; a 30-qubit state space is roughly 1000 times larger.

I must mention that IBM recently announced their experience will have up to 17 real qubits! This is larger than any physical device Forest is currently connected to, and represents exciting progress.

[1] https://medium.com/@rigetticomputing/introducing-quil-a-prac...

[2] https://arxiv.org/abs/1509.04279

[3] http://pyquil.readthedocs.io/en/latest/qpu.html

[4] http://pyquil.readthedocs.io/en/latest/qvm_overview.html


Can you measure and reinitialize qubits? Testing fault-tolerant error correction with eight qubits would be very exciting.

Is there a chance that you can roughly summarize noise levels, to give an idea of what to expect? Something along the lines of Table 2 (page 6) in arXiv:1705.02771 would be helpful.

A. Bermudez et al. "Assessing the progress of trapped-ion processors towards fault-tolerant quantum computation", arXiv:1705.02771 https://arxiv.org/abs/1705.02771


The computing model is given by Quil [0]. Section III-F talks about exactly this idea of "measurement-for-effect". You can use measurement in Quil as a way to project into a state that you want. (You can even use a conditional instruction to get feedback and flip it with an X if it measures into an undesired state.)

[0] https://arxiv.org/abs/1608.03355


"Is there a chance that you can roughly summarize noise levels, to give an idea of what to expect?"

You can find metrics for some of our more recent hardware here: http://www.rigetti.com/papers/Demonstration_of_Universal_Par...


Awesome, thanks for the reply. Would you be providing the computational resources to run those 30 qubit simulated experiments? Would there be a limit? That's no small system size!


Yes, we provide the QVM [1] as a cloud-hosted service and it's capped at 30 qubits for now.

[1]: http://pyquil.readthedocs.io/en/latest/qvm_overview.html


I came across this video while trying to understand what quantum computers are about... After all, Feynman himself said: "If you think you understand quantum mechanics then you don't understand quantum mechanics." and "If you cannot build it, you do not understand it."

I wonder if anyone could link to something that makes the stuff clear.

https://www.youtube.com/watch?v=dKAF9OCQtIo

"QBism is NOT NEW but at least people are reviving what Bohr thought. QM just involves expectations of observables and the Born rule is just "metaphysical fluff." The confusions are all about false counterfactuals."


If you're looking for a short, and practical introduction using Python, then one is included as part of the documentation for pyQuil (part of the Forest toolkit):

http://pyquil.readthedocs.io/en/latest/intro_to_qc.html


Hi there! I see CL repositories on GitHub too? Do you guys also actively use Lisp at Rigetti?



What is the state of languages for programming quantum computers?

I noticed you seem to be using assembly at the moment. Are there higher-level languages out there at the moment, or are those still a ways off?


We have a Python library called pyQuil [1] for writing high-level programs for quantum computers. The "assembly" you mentioned is Quil [2], which we don't expect most users to use directly.

Other high-level languages for quantum computing include LIQUi|> [3] and Quipper [4]. Each approach has its strengths, but we believe Quil/pyQuil is the best choice for near-term applications. See the Quil paper [5] for more details.

[1]: https://pyquil.readthedocs.io/en/latest/

[2]: https://medium.com/@rigetticomputing/introducing-quil-a-prac...

[3]: https://www.microsoft.com/en-us/research/project/language-in...

[4]: http://www.mathstat.dal.ca/~selinger/quipper/

[5]: https://arxiv.org/pdf/1608.03355.pdf


Do you guys provide access to actual quantum/hybrid computers now or is it all simulated?


Signing up automatically gives you access to run on the simulator. For select users, we are providing some limited access to one of our prototype quantum processors. It's the one discussed in this paper [0]. This initial access doesn't allow a user to run full quantum programs like on the simulator, but one can test out a few interesting experiments. You can read about what experiments are available here [1]. If you'd like to apply to run some experiments on our hardware, then email us with some information about you and your interests at support@rigetti.com.

The access to quantum hardware is limited at this point, but we'll be adding new features over the coming months.

[0] http://www.rigetti.com/papers/Demonstration_of_Universal_Par... [1] http://pyquil.readthedocs.io/en/latest/qpu.html


How many issues are there going from the simulator to the practical circuit? I've had issues with the IBM computers.

Thanks for all the cool stuff to play with!


We haven't integrated arbitrary circuit execution features in yet, but when we do we'll make the transition as smooth as possible.

It's important to remember though that noise is a fact of life for this generation of small prototype quantum processors. You'll definitely see a difference between a perfect simulation and true hardware.
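
As a toy illustration of why (a made-up 5% readout error rate, not a spec for our hardware): even a simple bit-flip error on readout visibly smears the ideal 50/50 split of a Bell-state experiment between '00' and '11'.

    import numpy as np

    rng = np.random.default_rng(0)
    shots, p_flip = 10_000, 0.05

    # Ideal Bell-state samples: half the shots read '00', half read '11'.
    outcome = rng.integers(0, 2, size=shots)
    bits = np.stack([outcome, outcome], axis=1)          # perfectly correlated pairs
    noisy = bits ^ (rng.random(bits.shape) < p_flip)     # flip each bit with prob. 5%

    for name, data in (("ideal", bits), ("noisy", noisy)):
        counts = {f"{a}{b}": int(np.sum((data[:, 0] == a) & (data[:, 1] == b)))
                  for a in (0, 1) for b in (0, 1)}
        print(name, counts)    # noisy run shows '01' and '10' counts the ideal one lacks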


But does it run Linux?


We couldn't make the binaries small enough for our current device ;)

We use Linux heavily to develop our own quantum OS. Programs written with Forest targeting the QPU [1] pass through the quantum OS, which runs on the control systems at our facilities. Thanks to our cloud API, users don't need to know these details.

[1]: http://pyquil.readthedocs.io/en/latest/qpu.html


too bad "rigetti" in Italian means literally "you puke"


Unlike in Italian, the company name is pronounced with a hard G. And as far as I know, "rigetti" in Italian is "you discard".

Take it as discarding the transistor era of computing for wonderful qubit era. :D


It's both "discard", "puke", and "throw again"


It is the founder's last name.


I read this as "Bighetti" from HBO's Silicon Valley.



