Scientists use quantum device to slow chemical process by factor of 100B (sydney.edu.au)
155 points by yoquan on Sept 4, 2023 | 82 comments



I was reading a book hoping to work out how to derive E=mc². My idea was to go from being about 400 years behind science to merely being 100 years behind. This sort of reporting makes me realise those 100 years are not linear.

I had to go check that this was real - https://www.nature.com/articles/s41557-023-01300-3 - because it could just as easily have been a marketing site for the next Marvel movie for all I could ground it in my understanding of experimentation.


I'm not sure I really "get" the 400-year-old physics. Things like the Lagrangian, the Legendre transform, the principle of least action, and the Hamiltonian. I understand them on a superficial level and I know very well how to use them, but I just don't really get it. Why can we even write equations for this stuff? I have a degree in physics, which makes it all the more depressing.


> Why can we even write equations for this stuff?

It's a (very) weird way to write the same equations as Newton. It happens to be easier to solve in some cases, because calculus works like that.

It's conceptually not very different from using a Fourier or Laplace transform to simplify some signal handling. But 400 years ago they didn't have any easy way to understand it, so it got an aura of magic that never went away.

(But you probably already know all that. You probably just didn't internalize it because of that aura.)


Alexandre Koyré wrote a book and a lot of articles about this (his studies of the Copernican and Galilean moments are well known), and they made me realize how different their thinking could be. His works are sometimes hard to read because of the mathematics involved, but also because of the extensive quoting in Italian, Latin, Greek, etc. (though with Google Lens that shouldn't be a problem in 2023).


That's "just" special relativity. I would suggest to see it exposed in current textbooks.

Spoiler: E = m_rest c² + E_kinetic, unless you redefine the mass as a function of velocity - something that people used to do a century ago but that is unusual nowadays.
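For concreteness, here's that split worked out symbolically (a quick sketch with sympy; this is just the standard textbook expansion, nothing specific to the article):

    import sympy as sp

    m, v, c = sp.symbols('m v c', positive=True)

    # Total relativistic energy E = gamma * m * c^2
    E = m * c**2 / sp.sqrt(1 - v**2 / c**2)

    # Expand around v = 0: rest energy + Newtonian kinetic energy + corrections
    print(sp.series(E, v, 0, 6))
    # m*c**2 + m*v**2/2 + 3*m*v**4/(8*c**2) + O(v**6)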


Mentioning Marvel in this context made me think of Quicksilver saving everyone. https://iv.ggtyler.dev/watch?v=ZnZqB5Z75zI


This is a really bad way to study anything. Actually, this is what sets apart the naïve hobbyist interested in physics from someone studying physics systematically and efficiently as part of a PhD program. I made this mistake repeatedly. I started with an annotated version of Newton's Principia, spent about a year just pulling my hair out, and gave up, only to realize how much time I had lost on all kinds of minutiae, noise, and random things that were completely useless for developing a good understanding of core principles. I repeated this mistake by buying Maxwell's original publications and wasting yet another year. And then again with Einstein's original papers, wasting yet another year.

I thought I would do things differently and everyone else was doing it wrong. I was the one doing it wrong. I didn't learn much from all these original manuscripts. I also lost precious years in which I could have obtained a real Master's degree. It is super important to understand that original manuscripts have tons of noise and baggage that only make sense in historical context. They also have unbaked ideas which are super hard to digest, if you can digest them at all. A ton of experts have already spent their lives distilling these original writings into material that fits with everything else, is easy to digest, and doesn't have all that noise. So get a good textbook and follow that. Stop chasing original manuscripts.


Yeah, the beauty of physics is that an idea can always be abstracted and re-explained in an easier-to-digest way. It's extremely rare that the first presentation of an idea is a very nice way to understand it.

Also, the fashion of physics changes. Back in the day, physicists seemed much more comfortable writing long intuitive arguments with lots of words, and were always happy with a strong connection to classical physics, which they obviously had a strong understanding of (since at the time, that was all of physics). These days, it's far more common to prefer a more mathematically oriented approach; people prefer to see an equation rather than paragraphs of intuition dumping. Also, we don't care so much about a connection to classical physics, since most physicists are now quite comfortable with quantum mechanics.

Even as a physics researcher it can be painful and difficult to go back to the original papers for things. This is also reflected in the popularity of Wikipedia among physicists. It's much more likely that I'm going to understand the explanation of (for example) Bell's theorem on Wikipedia than by reading his original paper "On the Einstein Podolsky Rosen Paradox." People have come up with better examples, emphasised important points better, and refined the understanding.

We also can't forget the impact that LaTeX (or equation typesetting in general) has had on physics. Reading some of these old typewritten equations can add unnecessary cognitive load.

Now, this is unique to the physical sciences. None of this can be relied on with philosophy (at least, continental philosophy). Sure, there are good modern summaries, like those on the SEP, but you can never really be sure that your interpretation of the author will be the same.


There is still something amiss: an interactive way to break LaTeX symbols down into 3Blue1Brown-style video explanations, the glyphs coming alive to express their purpose.


This is also why I don't understand people who insist we need to read the original philosophy texts.

I can't. I tried reading e.g. Plato and I just want to argue with him forever. He starts with assumptions which were "obvious" then, but are just obviously wrong now. It's much better to read a modern interpretation / summary which takes the bits that are still relevant and contextualizes them with what we know today.

(Note - if you want to study the history of something, by all means do read original texts. But if you actually want working knowledge of the subject, it's a horrible, horrible way to go.)


> I tried reading e.g. Plato and I just want to argue with him forever.

That’s the whole point of reading Plato. You’re being exposed to the Socratic method.


Then I have not explained myself correctly.

If I take a philosophy class, I can talk to the teacher, discuss things, explore questions and contradictions. And I do :-).

When Plato says "let us all agree there exists a perfect blue independent of any real world blue", I want to argue that no, "blue" is a human construct and dependent on accidental biology. We can't even be sure that we all experience the same blue, and what "blue" is is purely definitional. There exists no inherent "blue" other than us arbitrarily taking a chunk of the EM spectrum and attaching a loose meaning of "blue" to it.

Plato did not know of the EM spectrum or the cones in our eyes. Half the things he starts with "it is obviously true that..." make me scream "no it is not! I challenge your key assumptions". But he's dead and can't argue with me :-). So for me, that's not the whole point of reading Plato - it's the whole frustration.

-----

Edit: you've added "that way you're exposed to the Socratic method", which changes the gist of the comment, but I'll still argue it's a horrible way to do so. I understood the Socratic method for a decade if not two before I tried Plato. If I had tried Plato first, the dubious and implausible arguments would have turned me off completely. Both the teacher's and the student's proposals are frequently at odds with our current understanding, so as a reader I'm not invested in either, and the argument feels farcical as opposed to logical.

My point is:

If somebody asks "what is the Socratic method?", pointing them to Plato is a horrible (and frankly elitist / gatekeeping) way to go. It can be explained in a few minutes with excellent, understandable modern examples. You can get working knowledge quickly and effectively. There are circumstances and situations where reading the original is relevant! But if you want to understand the basics, or even gain intermediate knowledge of the actual subject, reading the original Newton or the original Plato is not a great way to start.


> but he's dead and can't argue with me

But you’re alive and you can supply both sides of the argument, if you believe the topic is interesting. The concept of “perfect blue” is beautiful and powerful - even if it’s wrong! It can lead to very interesting ideas, such as “universe as a mathematical structure”, or having a soul. Just play along and be flexible.

Also, you can prompt GPT-4 to argue with you as Plato would.


This reminds me of Kierkegaard arguing with himself in various publications under pseudonyms. I think he must have had some intense cognitive dissonance about most things for most of his life. It makes for a great way to flesh out a topic from both a pro and con standpoint, but it can't have been easy to go back to "normal" thinking modes after doing so.


> When Plato says "let us all agree there exists a perfect blue independent of any real world blue", I want to argue that no, "blue" is a human construct and dependent on accidental biology.

Assuming some premise for the sake of an argument is not just a Plato thing, and it's not particularly relevant whether the specific premise is true.


Oh for sure but I think it's a spectrum.

I find it most productive when the assumption is asked and granted for something uncertain, interpretable, subjective, etc. Or when we are just having fun :-)

If a) you are building a serious proposed model of reality and b) start with something demonstrably false, I'd have to be in a pretty specific mood to go along with it :-D


You're not reading.

You've fallen into the age-old trap of reacting to written text.

His argument might well be farcical; suspend disbelief like you would watching some Michael Bay nonsense.

Do you let people get to the end of their sentences IRL? I've noticed that some people won't tolerate a building narrative in basic sentences, and want to deal only in conclusions.

I really do miss the days when people regularly had multi-layered, long running conversations. Now everything has to be said in discrete steps, one by one.


Thx for the response!

(To your very last sentence, I'd venture that kind of feels like the Socratic method :-)

But to the rest of the post, can you elaborate? Are you suggesting relaxing the skepticism / logic muscle and reaction for a while, and reading to see where it goes? I'd be willing to discuss that approach, though I'm also aware that's exactly what my cousin pushing Deepak Chopra keeps saying :-P.

In the finite time we have on this earth, how does one choose where and when to relax the filters, as opposed to saying "this is not bringing value, there are better materials to ingest"?

And the other context, which I feel we still haven't addressed, is this: apart from whether there is any advanced, isolated value in reading Plato, do we really feel it's a good suggestion for getting somebody introduced to the concepts? I still feel there are far better ways to introduce somebody to the core ideas; then advanced readers who want to zero in, experience the original material, and immerse themselves in the context of the times (and the good / bad / ugly / archaic that goes with it) can certainly choose to do so.

Thx again!


Does there exist a perfect number four independent of any real world number four?


There's a related idea, which is that there's no reason that the first person to discover something will happen to find the best way to teach it. In fact there are many reasons for that not to be the case: their own understanding is nestled in a complex web of adjacent trivia, some of which is critical but they don't realise it, and some of which is unrelated but seems vital to them.


On the other hand, some of Darwin's original work - On the Origin of Species in particular - is well worth a read.


From skimming Galileo, he could also be a nice read. His writing is entertaining and has a lot of examples, so it could help in internalizing some basic ideas.


Yeah! Worth noting that both Galileo and Darwin intentionally published their most important works in the form of easily digestible "pop science" intended for the inclined layman. This was because they knew they faced near-universal opposition in the scientific establishment of the time.


Maybe I got a bad translation but that was completely unreadable to me.


As a complete novice, I had a similar revelation naively trying to derive one of Maxwell's mind-blowing insights connecting light to magnetism. It was well accepted that if you just divided the electric constant by the magnetic constant (in their respective inverse square force formulas), you end up calculating the speed of light. The units and numbers matched up nicely using old definitions and units. I wanted to see it soup to nuts in modern terms, but so much has changed not only in the definition of magnetism, but also new standardization of units, more accurate measurements of vacuum permeability... I just gave up. I'll take Maxwell's word for it.
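For what it's worth, the modern SI version of that division still works out in a couple of lines. A quick check (the constants are the CODATA values as I remember them, so treat them as approximate):

    import math

    eps0 = 8.8541878128e-12   # vacuum permittivity, F/m
    mu0 = 1.25663706212e-6    # vacuum permeability, H/m

    c = 1 / math.sqrt(mu0 * eps0)
    print(c)  # ~2.998e8 m/s, the speed of light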


The derivation is pretty simple, but a little messy in the integral formulation of Maxwell's laws

https://phys.libretexts.org/Bookshelves/University_Physics/B...


This reminds me of how nobody reads Keynes and everyone has read Hicks's interpretation of Keynes instead. Except in this situation there is quite a large discrepancy between the two.


That's what the shoulders of giants are for. No need to grow your own beanstalk.


A factor of 1e11, while large, is commonly found in chemistry where most phenomena have a log relationship. The standard pH scale, for example, already spans 14 orders of magnitude.
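For reference, pH is a negative base-10 logarithm of hydrogen-ion concentration, so each pH unit is a factor of 10. A quick check:

    from math import log10

    def pH(h_conc):
        """pH from hydrogen-ion concentration in mol/L."""
        return -log10(h_conc)

    print(pH(1e-14) - pH(1.0))  # 14.0 -- the scale spans 14 orders of magnitude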


I think there is a key difference between magnitudes of ion concentration and magnitudes of time. At these scales, a faster detector seems to be a much less trivial ask than a more sensitive one.


You've got to love the declaration at the end of the article:

> The research was supported by grants from the US Office of Naval Research; the US Army Research Office Laboratory for Physical Sciences; the US Intelligence Advanced Research Projects Activity; Lockheed Martin; the Australian Defence Science and Technology Group, Sydney Quantum; a University of Sydney-University of California San Diego Partnership Collaboration Award; H. and A. Harley; and by computational resources from the Australian Government’s National Computational Infrastructure.

This is sponsored by the military.


Sure, but you're using the internet courtesy of military-funded research, using electricity routed through distribution and substations that exist thanks to military-funded research, and travelling places based on GPS, courtesy of direct military research. Your modern life is only possible because, just like everyone else, the military has a budget allocated for researching computing and electronics advances they can use, which invariably translates into things the private sector can use.

So what's your point?


Well, it does make it clear that human advancement proceeds along terms set by military diktat - there's nothing natural about it.

Why should we be worried that the military fund the internet, quantum computing, etc? Can there be any reason for concern?

Could it be that the military is less concerned about external threats, and way more concerned about managing internal ones? And that funding technology to create a technocratic panopticon has been long in the making? If the aim is directing technology towards technocratic infrastructure, such goals are really pretty much complete, when you think about it. Surely only a couple more elections before it is switched on, if that.


There's nothing natural about modern society, so again: what's your point? Because not being natural is kind of literally the whole point of society itself: we removed ourselves from "the natural order of things" because that allowed us to kick progress into a higher gear.

You live in a highly curated society brought to you by, for an incredibly long period of time, military advances making its way into civilian life. (Does that mean only military advances? Of course not, don't pretend that ∃A, A→B means ∄ C, C→B)

Especially since, and this is something people seem to forget, the US military actually works for the US. Unlike congress or the NSA, it doesn't give a shit about oppressing fellow Americans, it literally funds projects it knows will benefit progress. Sure, it funds whackjob cultists like Raytheon too, because they need weapons for playing war, but the vast majority of US military research is actually for things that benefit everyone, not just the military.


> There's nothing natural about modern society, so again: what's your point?

It makes it very clear how society and its technology are being steered by the military. Most people are under the impression that scientists etc. are free to decide how best to uncover truth. But no, this is not free investigation - it is directed.

> You live in a highly curated society brought to you by, for an incredibly long period of time, military advances making its way into civilian life.

I agree. But the military is in service to money and power, just like everything else. They are there for specific jobs, including funding the technology they want.

> Because not being natural is kind of literally the whole point of society itself: we removed ourselves from "the natural order of things" because that allowed us to kick progress into a higher gear.

What "we"? What choices did anyone make? This is all forced behaviour.

> Especially since, and this is something people seem to forget, the US military actually works for the US. Unlike congress or the NSA, it doesn't give a shit about oppressing fellow Americans, it literally funds projects it knows will benefit progress.

Why would the military not be subject to the laws of money and power, unlike other agencies? None of these institutions are there for our benefit. They are there to control.

You might like the subjugation, and feel comforted by the fact some powerful force is running things, preserving you from all the fears they tell you are out there. Personally I would like to experience reality without the manipulations and orchestrations of all these governance institutions. This is also known as freedom. But then, people are so trained and invested in this nonsense, as if it was some sort of God, they would be unable to function if this strong guidance were taken away.


Is that a bad thing? Isn't a large amount of computing research paid for by the military?


So they simulated a chemical process, and the simulation ran 100B times slower than the actual chemical process?

Kind of like how simulating anything in detail tends to be a lot slower than the actual thing you're simulating?

Is the difference here that it's basically an analog rather than a digital simulation?

Not following if there was any breakthrough here or not.


> “Our experiment wasn’t a digital approximation of the process – this was a direct analogue observation of the quantum dynamics unfolding at a speed we could observe,” he said.

No, they didn't simulate it in the way we typically simulate. They created a physical process that was an analogue of the actual process, but 100B times slower, so they could directly observe it.


That sounds awfully like an analogue simulation.


Not an expert, but what I'm understanding is this:

With a regular simulation, whether analogue or digital, you create a mathematical model, and evaluate that model in some way to get a result, which hopefully matches the actual phenomena in whatever it is you were trying to simulate.

Here, it's more like instead of measuring the phenomenon, they measured something that is physically related to it. There seems to be no modelling involved.


Like a scale model aircraft in a wind tunnel?


Like a scale model aircraft in a tunnel, with a fluid that behaves like air but is just different enough that it can flow slower. Before FEA and CFD, people did a lot of work with miniatures. I remember simulating a long metal rod for an oil rig with a plastic pipe filled with liquid mercury, so that it'd flex the same way the kilometer-long metal rod would.
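The classical basis for that kind of trick is dimensional similarity: match the relevant dimensionless group (for flows, the Reynolds number Re = v*L/nu) and the scaled system traces out the same physics. A rough sketch, with made-up illustrative numbers:

    def model_speed(v_full, L_full, L_model, nu_full, nu_model):
        """Flow speed the model needs so that Re = v*L/nu matches full scale."""
        return v_full * (L_full / L_model) * (nu_model / nu_full)

    # A 10 m wing at 100 m/s in air, tested as a 1 m model also in air,
    # needs 1000 m/s -- which is why tunnels pressurize the air or swap
    # in another fluid to change the kinematic viscosity nu instead.
    print(model_speed(100, 10, 1, 1.5e-5, 1.5e-5))  # 1000.0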


Sounds like a semantics debate about the meaning of the word "simulation".


No. An analogue system is not a simulation. A simulation is some approximation - a model - of a physical system that you compute with some finite precision.

An analogue is a physical entity with the same physics as some other thing.

A straightforward engineering example would be scale model of an airplane in a wind-tunnel.


MONIAC was an analogue system which implemented a model - an approximation - of the UK economy. From https://en.wikipedia.org/wiki/MONIAC

> The MONIAC (Monetary National Income Analogue Computer), also known as the Phillips Hydraulic Computer and the Financephalograph, was created in 1949 by the New Zealand economist Bill Phillips to model the national economic processes of the United Kingdom, while Phillips was a student at the London School of Economics (LSE). The MONIAC was an analogue computer which used fluidic logic to model the workings of an economy. The MONIAC name may have been suggested by an association of money and ENIAC, an early electronic digital computer.


You're getting confused between the words "Analog" and "Analogue".

Analogue: something that is similar or comparable to something else either in general or in some specific detail : something that is [analogous] to something else

Analog: of, relating to, or being a mechanism or device in which information is represented by continuously variable physical quantities

GP's comment has nothing to do with whether the simulation is "analog" (as opposed to digital), their point is that instead of being a "model" of a way a physical system might behave, the mechanism in this case has the exact same physical properties represented in a different form. An example might be doing a calculation in decimal vs doing it in binary—the exact steps you take will look different, and the answer might look strange, but there's absolutely no doubt about the fact that you'll get the same answer however you happen to compute it. Another synonym for the word "analogue" in the sense that the GP is using it might be "isomorphism"—you can prove that some transformation holds, and then you can do whatever you want to the transformed version and know that the results you get can be "transformed back" to the original form and reinterpreted.
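A trivial version of that decimal-vs-binary point, if it helps (a toy example, obviously):

    # The "same" computation carried out in two representations:
    x = 0b1010 + 0b0110   # binary notation
    y = 10 + 6            # decimal notation
    assert x == y == 16   # changing the representation preserves the answer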


Then so is Wikipedia. The quoted text was "The MONIAC was an analogue computer".

Ahh, "analogue - (mostly Commonwealth) Alternative form of analog" https://en.wiktionary.org/wiki/analogue#Adjective .

https://dictionary.cambridge.org/us/example/english/analogue... gives examples like "The new digital technology—which we welcome—merely replaces the present analogue system" and others.

English is a nuisance.


My (non-expert) brain maps it to this: your output is a curve of some kind with the same mathematics as the problem you actually care about. The term "quantum computing" can really refer to a number of quantum-mechanical systems with fundamentally different properties (e.g., boson vs fermion) that have the same mathematics as some classes of problems.


I'm no expert but my understanding is that the word 'simulation' is a misnomer in this context. A better way to describe what they're trying to do is a proxy system. Effects observed in this proxy system would be used to predict interactions in reality. Whereas simulations are better at describing how real world interactions diverge from our models.


> Kind of like how simulating anything in detail tends to be a lot slower than the actual thing you're simulating?

From the article[1]:

Our approach avoids the limitations of direct experiments on molecular systems, where only few observables such as spectra and scattering cross sections can be measured. [...] A further advantage comes from the ratio (r) of the ion’s natural timescale (ms) and the measurement speed (ns), leading to an increase in the observable timing resolution of r ∼ 10^6. This improves the achievable resolution of chemical dynamics measurements relative to ultrafast observations.

So seems this would be more like running a computer simulation with a super-short timestep, allowing you to extract more details of a process. It's only related to the wall-clock in that they're using a physical analog system, rather than a computer.

> Not following if there was any breakthrough here or not.

Again from the paper:

Our approach to quantum simulation using an MQB trapped-ion system makes chemical dynamics that are otherwise unmeasurable directly accessible in the laboratory. This is a key demonstration of the utility of small-scale quantum computational devices to offer practical insights into chemical dynamics and resolve intractable problems in chemical physics.

Seems the measurement itself was a showcase for the techniques developed.

[1]: https://arxiv.org/abs/2211.07320


Was expecting they'd be using the quantum Zeno effect[1], which really does slow down the time-evolution of what you're measuring.

[1] https://en.wikipedia.org/wiki/Quantum_Zeno_effect


> This allowed them to design and map this very complicated problem onto a relatively small quantum device – and then slow the process down by a factor of 100 billion.

They say they mapped the problem.... So is this a model of an observation, or an actual observation?

> “Until now, we have been unable to directly observe the dynamics of ‘geometric phase’; it happens too fast to probe experimentally.

> “Using quantum technologies, we have addressed this problem.”

'cos if it's a model, they are obviously still not observing whatever-it-is directly, right?

PS I'm pretty sure they are talking about their model.


I’m stuck on how it is known that the quantum analog is operating the same as what it is an analog of. I’m completely out of my depth so I fed the paper to Claude 2 and, after some back and forth, got this:

“You raise a good point. There is no absolute certainty that the analog quantum system operated in exactly the same way as the original chemical reaction dynamics it was meant to model. Some key caveats and limitations include:

- The analog system is still an approximation, so there may be small differences in how the dynamics play out compared to the real system.

- Mapping a complex molecular system onto qubits necessarily requires simplifications and abstractions that could influence the outcomes.

- Factors like experimental errors, imperfect state preparation or measurement in the trapped ion system may introduce discrepancies.

- Important details like multi-particle interactions or higher-order effects may not be fully captured.

- Verification that the analog system exhibits the same identifying signatures or phenomena as the natural system would strengthen confidence in the analogy.

So while the researchers aim to design the quantum analog to faithfully mimic the essential physics, perfect equivalence cannot be taken for granted due to modeling approximations and technological limitations. The mapping should be validated by testing for characteristic properties before concluding the slow-motion "observations" definitively represent the original phenomenon. With improvements, analog quantum simulation could provide increasingly accurate models of chemistry.”

Is this a reasonably well grounded statement? And if so, how can anybody hope to verify the analog is exhibiting “the same identifying signatures or phenomena as the natural system” if the whole point is that we can’t observe the natural system with any precision to start with?


Does anyone have any idea how a person could go about trying to understand the basics of what's going on here, even in just a hand wavey way? This is so far above my head it seems like magic.


There are chemical processes, most notably photochemical processes like photosynthesis, that funnel molecular states into new configurations so rapidly that observation of the details of the reaction is impossible. This research simulated one class of such reactions (conical intersection reactions) by modeling the wave function that governs the time evolution of the reaction on a quantum computer, allowing them to run the reaction 100B times slower than it occurs in nature and to measure the quantum state evolution. They are thus able to get a clear picture of how the reaction proceeds, as measured by several different observables, from the simulation.

It's a direct demonstration of the utility of quantum computation in molecular modeling. The meta-relevance, to me at least, is that it demonstrates real progress in one of the areas where quantum computation is most likely to have an important impact.
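A toy illustration of why time-rescaling works at all (a sketch, not the paper's actual mapping; the two-level Hamiltonian and units here are made up): if the device implements a scaled-down Hamiltonian H/r, exactly the same state evolution unfolds, just r times slower in wall-clock time.

    import numpy as np
    from scipy.linalg import expm

    H = np.array([[0.0, 1.0],
                  [1.0, 0.0]])       # toy two-level Hamiltonian (hbar = 1)
    psi0 = np.array([1.0, 0.0])      # start in state |0>

    r, t = 100e9, 1.0                # slowdown factor, "natural" evolution time

    psi_fast = expm(-1j * H * t) @ psi0              # the fast process
    psi_slow = expm(-1j * (H / r) * (r * t)) @ psi0  # slowed-down analogue

    assert np.allclose(psi_fast, psi_slow)  # same dynamics, stretched in time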


Thank you, this is helpful.


I guess some theoretical chemistry basics omitted in the short article wouldn't hurt:

In order to describe chemical reactions or atomic arrangements in terms of wave equations, one normally treats the motion of the nuclei (slow/heavy) and the motion of the electrons (fast/light) separately, simplifying the Schrödinger equation via the Born-Oppenheimer approximation.

In introductory chemistry textbooks [0] a diatomic example is mostly used as an illustration; for >2 atoms, usually only the ground state is considered. This is because (1) in a diatomic setting the vibrational degrees of freedom of the nuclei reduce to 1, and (2) the ground state can be well distinguished from other electronic states.

However, when studying (advanced theoretical) chemistry or materials science, polyatomic arrangements with tightly packed electronic states and a lot of nuclear degrees of freedom are the norm, and the theory of so-called conical intersections of electronic energies is essential in that regard.

Early on this was taken into account as the Jahn-Teller distortion [1]: a kind of spontaneous symmetry breaking which seemed exotic when it was first described in the 1930s. In the same vein, Teller later proposed an ultrarare occurrence, within a few vibrational periods (sub-femtosecond), by which a loss of electronic excitation was not followed by a photon being emitted: radiationless decay. Now, in refined orbital models [2], this seems to be a normal state of affairs, e.g. in organic chemistry [3].

Because of the tiny timescales involved, theoretically predicted phenomena like the geometric phase / Berry phase (which itself has the Foucault pendulum, in relation to Earth's latitude, as its mechanical analogue [4]) had not been observed yet. So, borrowing from a topological analogue (Dirac points) [5], a quantum simulation seemed feasible.

To be honest, the actual paper [6] linked in the article was hard to follow, so I found a similar paper [7] where the presentation of the general idea is clearer and more concise.

[0]https://chem.libretexts.org/Courses/Pacific_Union_College/Qu...

[1]https://en.m.wikipedia.org/wiki/Jahn%E2%80%93Teller_effect

[2]https://core.ac.uk/download/pdf/9426023.pdf

[3]https://en.m.wikipedia.org/wiki/Quenching_(fluorescence)

[4]https://en.m.wikipedia.org/wiki/Geometric_phase#Foucault_pen...

[5]https://condensedconcepts.blogspot.com/2015/08/conical-inter...

[6]https://arxiv.org/pdf/2211.07320.pdf

[7]https://arxiv.org/pdf/2211.07319.pdf


What does this mean? Can you slow down time? I would assume you must also slow down physics in order to slow down chemistry. No?


They're using an analogous system. The speed of the process in the analog does not have to be the same as in the original system, in terms of wall-clock time.


Can a quantum device speed up a chemical process by the same?


Unfortunately they didn't change the speed of the process. They mapped the process and then changed the replay speed to watch it, like changing the playback speed on a video.


One technique, with repeating processes, is to observe adjacent very thin slices of time from different cycles. This is how movies of short light pulses moving through space are made.



> According to the discoverers, a minute amount of administratium causes one reaction to take over four days to complete when it would normally have occurred in less than a second.

There are 86,400 seconds in a day, and 345,600 seconds in 4 days.

This quantum-slowing effect reduces the speed by a factor of 100,000,000,000.

Administratium is about as close (on a log scale) to the actual speed of the reaction as it is to the slowed-down reaction.
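A quick check of that "as close" claim on a log scale, using the numbers above:

    import math

    admin = 4 * 24 * 3600   # administratium: ~345,600x slowdown
    quantum = 100e9         # this experiment: 1e11x slowdown

    print(math.log10(admin))            # ~5.54 decades above 1x
    print(math.log10(quantum / admin))  # ~5.46 decades below 1e11x
    # almost exactly the midpoint, logarithmically speaking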


I think you might have miscompared to a day instead of a second? Unless I missed something. But you're right that if we were to be observing the administratium reaction over one millisecond in the slowed scenario, it would take 2.73 nanoseconds at full speed, nothing like femtoscale.


> ...administratium causes one reaction to take over four days to complete when it would have normally occurred in less than a second.

Let x = administratium slowdown effect = 4 days / 1 second

x = 4 days * 24 hours / day

x = 96 hours * 60 minutes / hour

x = 5760 minutes * 60 seconds / minute

x = 345600 seconds

Wrote this out to also convince myself, I am 35, did physics for 3 years in undergrad, and am apparently still bamboozled by orders of magnitude regularly. Completely unintuitive!

In fact, lemme do it in reverse, I'm shocked.

Given x = slowdown factor = 100B = 100_000_000_000, a reaction that takes 1 second would take:

x = 100_000_000_000 seconds / (60 seconds / minute) = 1_666_666_666 minutes

x = 1_666_666_666 minutes / (60 minutes / hour) = 27_777_777 hours

x = 27_777_777 hours / (24 hours / day) = 1_157_407 days

x = 1_157_407 days / (365 days / year) ≈ 3,170 years!


Wow. Comparing 4 days to 3,170 years is such a better illustration than comparing 2.73 nanoseconds to a femtosecond. It makes me realize how linear my in-the-moment imagination of nano-to-femto is.


Or a day to a thousand years, give or take a little.


What's the difference between a millionaire and a billionaire?

1 million seconds == 11.57 days

1 billion seconds == 31.71 years


If you swallow it, your computer will appear 100B times faster!


Now that's some quality humor!


You know, I think these sorts of scientific research explainers would be better if they actually just said what was done, what the insight was, and what we can now do or what predictions we can now make.

I understand the desire to make the discovery accessible, but this does not accomplish that. If we measure information by "what predictions can a reader now make that they couldn't make before", then this press release is information-free.

Instead we have a lot of words to attempt to create the impression of having read something.


This stuff is straight up magic.


It's literally not


There's always someone who doesn't know all the dictionary meanings of "magic." From Google's copy of the Oxford English dictionary:

- a quality that makes something seem removed from everyday life, especially in a way that gives delight.

- something that has a delightfully unusual quality.

- very effective in producing results, especially desired ones.

- (informal, British) wonderful; exciting.


So it doesn't tick any of those boxes except maybe half of the first? Sounds like it's literally not magic.


I'd say all of them but YMMV.


When was the last time quantum physics sparked delight in a member of the general public? Sorry, let me rephrase: has there even been a first time?


I'm a member of the general public and it's done that lots of times for me.

Besides that, "member of the general public" isn't part of Oxford's definitions. If the "magic" commenter above is a professional quantum physicist, that doesn't invalidate their feelings about this. If you're more jaded, that doesn't make your own feelings more valid or authoritative.

In any case, this is getting a bit tedious and silly.


Is anything? Or is there a possibility they didn't intend for an implicit "literally" to be placed in there?


It's fair to say "sufficiently advanced technology".


why take it literally



