Hacker News | rramadass's comments

Not sure what is going on in the QC world; with this ACM prize it has become even murkier.

As Sabine Hossenfelder (Theoretical Physicist) points out, QC-related companies are seeing a surge in investments and marketing. It is as if somebody knows something that the "common public" doesn't - https://www.youtube.com/watch?v=gBTS7JZTyZY

I don't know enough about the science/technology to form an opinion but have recently started down the path of trying to understand it - https://news.ycombinator.com/item?id=46599807


> It is as if somebody knows something that the "common public" doesn't

oooorrr - and hear me out - investments are inherently hype-based and irrational, and there is too much money flying around to make actually smart decisions


Nope.

Quantum Computing (QC) is unlike previous technologies, which were all mostly "logical structures" (i.e. the underlying Physics/Technologies were well-known). For QC, the viability of both the core Physics itself and its realization through Technology is questioned by some Physicists/Technologists themselves. But in 2024/2025 many Govts. and Companies have both started investing heavily in QC. Moreover, advanced countries have implemented export controls on QC technology, prohibiting the export of quantum computers above 34 qubits.

And now the ACM prize for something done long ago in quantum information.

Finally, note that QC algorithms can be simulated (for small numbers of qubits) on conventional computers, and current AI technologies may also play a part here, i.e. implementing QC algorithms on the "Cloud supercomputer" and applying AI techniques to them.
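
To make the simulation point concrete; classically simulating a circuit just means storing all 2^n complex amplitudes of the statevector and applying gates as linear algebra. A minimal sketch (assuming Python/NumPy; a toy illustration only, not how production HPC simulators are engineered):

    import numpy as np

    n = 3                                   # number of qubits
    state = np.zeros(2**n, dtype=complex)   # full statevector: 2^n amplitudes
    state[0] = 1.0                          # start in |000>

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

    def apply_1q_gate(state, gate, target, n):
        # View the statevector as an n-axis tensor, contract the 2x2
        # gate against the target qubit's axis, then flatten back.
        psi = state.reshape([2] * n)
        psi = np.tensordot(gate, psi, axes=([1], [target]))
        psi = np.moveaxis(psi, 0, target)
        return psi.reshape(2**n)

    for q in range(n):                      # put every qubit into superposition
        state = apply_1q_gate(state, H, q, n)

    print(np.abs(state)**2)                 # uniform 1/8 over all 8 outcomes

The giveaway is memory; the statevector needs 16 * 2^n bytes (doubling with every added qubit), which is why classical simulation stalls somewhere in the 30-50 qubit range.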

The logical inference is that there has been some technological breakthrough (one or more) in the realization of QC qubit technologies, QC algorithms running efficiently on the cloud, AI usage for QC, etc. Nothing else explains all of the above facts.

See also: The Case Against Quantum Computing by Mikhail Dyakonov (Professor of Physics) - https://spectrum.ieee.org/the-case-against-quantum-computing


> Quantum Computing (QC) is unlike previous technologies

aaand you entered the "hype and irrational" territory. I dare you to reread your own comment, it is funny

right now QC is 5 orders of magnitude away from practical systems - there's NO profit to invest for. It's all research that is being hyped and overpromised because there's not enough money in that sector and because established players (like google) don't want to lose their face

viability of core physics does not imply immediate creation of product. I'd point to fusion, but that's also currently getting over-hyped 15-20 years too early

governments are only investing the same way as into particle accelerators - in form of research grants

simulation of QC is both extremely trivial (in "exponentially-slower" way) and existentially impossible (the whole sector would not exist if it was actually possible to use good old normal CPUs fast enough). Bringing in "AI technologies" only shows you as a gullible idiot that still parrots ai bubble without understanding exact details

If there is a breakthrough - it is secret government information, and it would not be available to non-government companies, especially those you can invest into. The moment such breakthroughs reach the market, knowledge of the very existence spreads - and yet all current known progress is dull.

The only evidence worth anything out of what you brought up is the export controls - and those have been extremely pre-emptive in preparation for geopolitics and far future tech. Error-correction barely started to be useful at 100 qubits, so 34 makes no sense other than to minimize brain drain with base tech


> aaand you entered the "hype and irrational" territory. I dare you to reread your own comment, it is funny

You have not understood the first thing about what I had pointed out.

> right now QC is 5 orders of magnitude away from practical systems - there's NO profit to invest for. It's all research that is being hyped and overpromised because there's not enough money in that sector and because established players (like google) don't want to lose their face

While there has been hype, in the last couple of years things seem to have changed, now culminating in the awarding of the ACM Turing Award. Do you know anything about the Physics/Mathematics behind qubits (eg. probabilities/superposition/phase/noise etc.) and/or how they have been realized via technologies (eg. superconducting/photonics/trapped-ions etc.)? People are looking at "hybrid" quantum computers, i.e. conventional+quantum (eg. IBM, Fujitsu), and at shuttling qubits on silicon (eg. Hitachi), which allows existing foundry technology to be used for QC. This is huge.

> viability of core physics does not imply immediate creation of product. I'd point to fusion, but that's also currently getting over-hyped 15-20 years too early

Non sequitur.

> governments are only investing the same way as into particle accelerators - in form of research grants

No, Govts. are actively funding startups in this area and including technology research/transfers in their Free Trade Agreements with other govts.

> simulation of QC is both extremely trivial (in "exponentially-slower" way) and existentially impossible (the whole sector would not exist if it was actually possible to use good old normal CPUs fast enough).

Simulation of QC is not "extremely trivial" but requires HPC technology. Datacenter/Cloud technologies are also utilized here. Generally, only around 30-50 qubits have been simulated, with 50+ qubits being exponentially prohibitive in terms of compute power/memory.

> Bringing in "AI technologies" only shows you as a gullible idiot that still parrots ai bubble without understanding exact details

To use your own language: this right here shows that you are just a clueless idiot about this domain. AI is a tool applied to various domains, eg. AlphaFold for protein structures in Biology, which solved an almost intractable problem. People are doing the same with QC+AI. There are a bunch of papers on this; for your edification start with Quantum Computing and Artificial Intelligence: Status and Perspectives - https://arxiv.org/abs/2505.23860

> If there is a breakthrough - it is secret government information, and it would not be available to non-government companies, especially those you can invest into. The moment such breakthroughs reach the market, knowledge of the very existence spreads - and yet all current known progress is dull.

This demonstrates your gullibility. Since one of the best-studied use cases for QC is cryptography, if there has been a breakthrough in some lab (govt/academia/company, all of whom have secrets), the powers-that-be would not want it to be widespread, mainly for security reasons. But hints might have been given and investments encouraged. Almost all QC companies have a govt. tie-up, and cryptographic technologies have already been subject to export controls from the very beginning. Another scenario is defense applications. There are plenty more but these two are the main ones.

> The only evidence worth anything out of what you brought up is the export controls - and those have been extremely pre-emptive in preparation for geopolitics and far future tech. Error-correction barely started to be useful at 100 qubits, so 34 makes no sense other than to minimize brain drain with base tech

That is the obvious superficial take. Given what I have written above, what if semiconductor technology, i.e. the "hybrid" QC+Conventional approach, allows one to simulate 100+ qubits easily now? What if there have been some breakthroughs from using AI on QC algorithms, both existing and new ones? Have any formerly intractable problems in Physics/Chemistry/Biology/Mathematics been made tractable now due to AI usage? How many of these can be implemented on a QC? Etc. Etc.

To summarize: you have to look at the whole complex picture before drawing conclusions. Merely parroting trivialities like "hype" is meaningless.


Commercialization can bring in speculators and hype. And I'd argue that speculation is necessary for accelerating market development. Commercialization brings with it unique forcing functions that don't exist in academic settings, and this has historically led to the acceleration of functional products. The first step is building a quantum computer to learn how to build a quantum computer. That step is done; while research continues in many areas, the remaining commercialization challenges are largely engineering in nature.

I've only seen 34 qubit simulators (eg AWS SV1). My understanding is that 34 qubits uses 512GB of RAM, and each additional qubit doubles the RAM requirement. So, simulating 50 qubits would require about 33.6M GB of RAM.
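
That doubling is easy to sanity-check. Assuming ~32 bytes per amplitude (a complex double plus a working copy - an assumption picked purely to match the quoted 512GB at 34 qubits):

    # Back-of-the-envelope RAM for full statevector simulation
    BYTES_PER_AMPLITUDE = 32   # assumed; calibrated to 512GB at 34 qubits

    for n in (34, 40, 50):
        gib = (2**n * BYTES_PER_AMPLITUDE) / 2**30
        print(f"{n} qubits: {gib:,.0f} GiB")

    # 34 qubits:        512 GiB
    # 40 qubits:     32,768 GiB
    # 50 qubits: 33,554,432 GiB (~33.6M GB, i.e. roughly 32 PiB)

So 50 simulated qubits is petabyte territory, far beyond any single machine.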

100 logical qubits seems to be the minimal threshold for interesting/useful quantum computing, albeit with very limited use cases. Classical still beats most. Quantinuum will hit that number in 2027, and IonQ (often cited as being a hype-machine) is expected to have 800 logical qubits in 2027.

The industry is moving out of the NISQ era (noisy intermediate-scale quantum) and into the fault-tolerant QC (FTQC) era. NISQ is experimental. FTQC is commercial (ie reliable, repeatable).


Nice, an informative, meaningful comment. From your userid, would I be correct in deducing that you are affiliated with the QuBOBS Project? - https://qubobs.irif.fr/portfolio/

You are certainly right that commercialization (and speculation does play an important role here) serves as a forcing function to accelerate the development of products. But this needs to be done somewhat in sync with, or a little ahead of, the actual science and engineering. When the subject is inherently difficult to understand (as is the case with QC), it can very easily get out of hand and become just snake-oil/bullshit exploited by hustlers/grifters/charlatans.

Do you have any links to more information on the points that you make above? Specifically on hybrid quantum-classical systems and silicon-based shuttling qubits which can use current foundry technology? To me, this seems to be the future since both scaling and availability are taken care of.

As regards scaling of qubits, Caltech recently achieved a 6,100(!) qubit array - https://www.caltech.edu/about/news/caltech-team-sets-record-...

Wikipedia also has a list of quantum processors and their specs - https://en.wikipedia.org/wiki/List_of_quantum_processors


Check out Eric Weinstein's latest theory about how frontier physics has moved "dark" (take it with a grain of salt; some of the other things he says might tempt you to discount him completely)

I mentioned this in the other thread on Tony Hoare;

Theories of Programming: The Life and Works of Tony Hoare published by ACM in 2021 - https://dl.acm.org/doi/book/10.1145/3477355

See the "preface" for details of the book - https://dl.acm.org/doi/10.1145/3477355.3477356

Review of the above book - https://www.researchgate.net/publication/365933441_Review_on...

PS: You can check with some lady named "Anna" on the interweb for book access :-)


You can find more on his website - https://dwheeler.com/

How he wrote it: AI Agent Teams Finished My Sci-Fi Novel in 12 Hours - https://barrgroup.com/software-expert-witness/blog/ai-agents...

This is an excellent research presentation in the History of Mathematics from Ancient Sanskrit Sources.

The Presenter: Professor Clemency Montelle from University of Canterbury - https://www.math.canterbury.ac.nz/~c.montelle/


For a beginner;

1) C++ Primer, 5th edition (updated to C++11) by Stanley Lippman, Josée Lajoie, and Barbara Moo. Don't bother with any other book until you have made a full pass over this. You can later update yourself to C++20/C++23/etc. from the books by Stroustrup/others.

2) Inside the C++ Object Model by Stanley Lippman. An old classic to understand the "C++ Abstract Machine" built on top of the "C Abstract Machine".

PS: The Definitive C++ Book Guide and List - https://stackoverflow.com/questions/388242/the-definitive-c-...


This is an article with quite the wrong framing.

Motherhood is Nature ensuring the propagation of the species.

What is wrong today is that the Western World (from the 20th century onwards) has demolished the support system that was built into societies of all cultures to sustain motherhood (extended family/relatives/neighbours/societal help/etc.) and forced Mothers to do everything solo and alone. Add in the current fast-paced, consumerist, more-selfish, i-can-do-everything, career-mom, single-mother, media-driven societal culture and you have an almost impossible-to-bear load on Mothers today.

See also;

‘It felt shameful’: the profound loneliness of modern motherhood - https://www.theguardian.com/lifeandstyle/article/2024/aug/21...




> Ahh yes.

This is a hallmark of a low-quality comment on HN, as is the snarkily dismissive tone of this whole comment. It's also specifically against the HN guidelines to attack the weakest possible interpretation of what someone says. HN is intended for people who are looking for a higher standard of discussion than what is available elsewhere. If you want to keep participating here, please read the guidelines and make an effort to observe them. https://news.ycombinator.com/newsguidelines.html


Ahh yes. Usual low effort strawman with nothing meaningful to discuss.

Faraday was the quintessential "non-formal" Scientist. He was proof that you don't always have to formalize everything mathematically (in a domain) before understanding and contributing to it. While formalization is important, it is not the be-all and end-all that it is often made out to be.

Here is a great communication from Faraday to Maxwell on receiving one of Maxwell's papers;

Maxwell sent this paper to Faraday, who replied: "I was at first almost frightened when I saw so much mathematical force made to bear upon the subject, and then wondered to see that the subject stood it so well." [Faraday to Maxwell, March 25, 1857. Campbell, Life, p. 200].

In a later letter, Faraday elaborated:

"I hang on to your words because they are to me weighty.... There is one thing I would be glad to ask you. When a mathematician engaged in investigating physical actions and results has arrived at his conclusions, may they not be expressed in common language as fully, clearly, and definitely as in mathematical formulae? If so, would it not be a great boon to such as I to express them so? translating them out of their hieroglyphics ... I have always found that you could convey to me a perfectly clear idea of your conclusions ... neither above nor below the truth, and so clear in character that I can think and work from them". [Faraday to Maxwell, November 13, 1857. Life, p. 206]

PS: You can read Faraday's (and other 19th century scientists) letters at the Epsilon website - https://epsilon.ac.uk/


Thank you very much for the link.

This letter from Maxwell to Faraday: https://epsilon.ac.uk/view/faraday/letters/Faraday3354 is very surprising to me. Already they were thinking of gravity in field-theoretic terms. Just to set the context: Riemann's Habilitationsschrift came just 3 years earlier but was not published till the late 1860s.


Faraday actually did some experiments to try and show that Electricity and Gravity were related, and in 1851 published "On the possible relation of Gravity to Electricity" in the Philosophical Transactions of the Royal Society!

This article gives the details; Michael Faraday, grand unified theorist? (1851) - https://skullsinthestars.com/2009/03/06/michael-faraday-gran...

His ability to conceptualize/intuit and to devise theories and experiments to test them was unparalleled. Maxwell could not have come up with his formal mathematical equations if he did not have Faraday's conceptual work to build upon. It goes to show that concepts/intuition must always go ahead of formalization.


Because SK Hynix has substantially ramped up its production of HBM memory for GPUs (due to AI demand), and HBM requires more silicon (wafer area) per bit than standard DRAM. Since companies produce HBM and DDR memory in the same factories and on (more or less) the same equipment, shifting production to HBM is a double whammy: less silicon for DRAM and hence less DRAM production.

See this video by Anastasi In Tech to understand the memory crisis - https://www.youtube.com/watch?v=KghkI5Oh_lY


> IDK what Dijkstra believed in terms of how programmers should have looked like,

https://news.ycombinator.com/item?id=47373080

