'Room-Temperature Superconductivity Achieved for the First Time'
Quanta Magazine ought to know better than to lead a story with a misleading headline like that. Moreover, they've made it even worse by 'exaggerating' the temperature by quoting it in degrees Fahrenheit instead of Celsius (using Fahrenheit is a no-no in science).
That superconductivity was achieved at room temperature but only at such enormous pressure is hardly worth reporting, as it's of no practical value whatsoever and thus of interest only to fundamental research.
Fundamental research is of great importance. It is where the breakthroughs of practical value begin.
Our understanding of electricity did not emerge with the lithium battery, the transistor, or solar cells. It began with scientists studying static electricity, experimenting with Leyden jars, and making frogs' legs twitch, all without practical value.
Superconductivity in any context at room temperature is a substantial breakthrough, doubly so because the experimentalists were operating with semi-viable guidance from theory. This is physics at its best. It may well lead to higher-Tc materials.
You didn't read what I said. I was being critical of Quanta's sensational and misleading headline; I was NOT being critical of fundamental research.
'That superconductivity was achieved at room temperature but only at such enormous pressure is hardly worth reporting, as it's of no practical value whatsoever and thus of interest only to fundamental research.'
It shouldn't be necessary for me to labor the point, but my quote above does NOT say that I'm against fundamental research (in fact, I'm very much for it and always have been).
Frankly, I'm annoyed that you would even suggest such a thing. Just because you disagree with part of what I've said does not give you the right to twist my words into something altogether different, something I did not say. I am very careful in what I say online, and I usually labor the point by restating it in different ways so that I'm not misinterpreted (unfortunately I did not do that here, as what I was saying seemed clear enough).
With respect to science reporting generally: sensational and misleading reporting by the media, often encouraged by scientists themselves to increase their chances of getting new grants, etc., has done a great deal of damage to science and scientific research over the last five or so decades. Probably the 'best' example of this is cancer research, which has turned great swathes of the public off science big-time because the enthusiastic promises of cures were never met. You only have to see how far science and scientific research have dropped in importance in the eyes of the public over the last half century to know that. In other posts I've even spelt this out with examples.
Moreover, I stand by what I said: this research into superconductors is so exotic that it is of little practical value now. I am not saying that it won't be of value in the future (only time will tell).
It wasn't obvious until a few years ago that high-pressure superconductivity was a thing at all.
It's absolutely worth reporting that room-temperature superconductivity is achievable. There are a bunch of applications in things like quantum computing that this makes simpler.
The title definitely should have mentioned the pressure needed though. It's the difference between "revolutionary" and "probably no practical applications, but nice for academia I guess".
Quanta is not written for scientists, it's written for lay people. They do good work translating today's science and math into everyday language without too much oversimplification, but it's always a balance.
Most American (and many British) readers use Fahrenheit to describe the weather and room temperature, and given the context, it absolutely makes sense to use it in this article.
'Quanta is not written for scientists, it's written for lay people.'
This is all the more reason why Quanta should not misrepresent or exaggerate the research. You should read what I have to say above in reply to ISL 4 about the matter.
'Most American (and many British) readers use Fahrenheit to describe the weather and room temperature, and given the context, it absolutely makes sense to use it in this article.'
You're not trying to tell me that Quanta wasn't hamming up the report by using Fahrenheit, are you? Come on, pull the other one.
Next, I suppose you'll be telling me that a failure to enforce SI units wasn't what doomed the Mars Climate Orbiter. The UK is supposed to be metric, but unlike Australia the conversion was stuffed up because it wasn't enforced by legislation (in Australia, hardly anyone knows what Fahrenheit means these days).
The fact that the US is so damn backward in this matter is down to a combination of factors: (a) bloody-minded libertarianism, in that people can't be told anything even if it's for their own good (just look at the US's fiasco over masks and COVID-19 and you'll get the message); (b) industry (mainly heavy industrial/machining) pulled too many strings in Congress, saying the conversion would cost too much; and (c) the US education system is far too fragmented to agree on anything, let alone the metric system/SI units. Thus the US is the laughing stock of the world, as it's the only major power left officially on imperial (British) units. It's even more laughable given what took place in the US in 1776! How many more centuries does the US need?
That said, the people who write for Quanta, and Quanta's editorial policy, ought to be in line with the rest of the scientific world. Keeping Fahrenheit only 'absolutely makes sense' if one is still living in a time warp that should long since have ended. Sorry!
Wow, this transformed into a weird microrant. I didn't mean to offend - apologies if I have. To be clear - imperial units are awful. Everyone knows this. Americans know it, Brits know it, and while we might occasionally defend this system tongue-in-cheek, we know it's objectively idiotic. We're the ones that have to live with this thing every day.
> b) industry (mainly heavy industrial/machining) pulled too many strings in Congress saying the conversion would cost too much
This is really the only reason. The U.S. is somewhat unique because it had already heavily industrialized prior to the international push toward metric, and had basically no rebuilding to do after the war. It's similar to the technical debt that affects most giant old companies, but at nation scale. An (arguably) poor early design choice in industry-building and nation-building.
In history, generally, the largest economic entity gets to use whatever units it wants, and force the rest of its trading partners to deal with it. The British Empire picked their units, which the USA inherited, and kept using due to inertia. If/when either the E.U. or China eclipse the USA's GDP, you can probably expect the USA to try again to shift to metric.
This is all unrelated to anything, of course. I assume the writers of Quanta articles are scientifically literate, prefer metric, but know where most of their readers are from and have chosen to use their language. I have lived in both England and America, and to this day I still don't know what room temperature is in Celsius. I really don't think this was written with misleading, malicious intent - it's just an article that wasn't written for you, but rather for (the tremendous number of) people still living in a "time warp" like me.
More about if or when the US converts to metric. Earlier today, I happened to come across some new material that was too relevant to these posts to let it pass by. Here's a quote from part of my post [https://news.ycombinator.com/item?id=24819990] on a related matter:
"... Incidentally, by sheer happenstance, earlier today in connection with another matter altogether, I was reading the preface in one of my old mathematics textbooks, that being Calculus by Stanley I Grossman, first edition 1977 [43 years ago] and I came across an interesting comment that's pertinent to this discussion. I quote:
"… Moreover, I have included "real-world" data whenever possible, to make the examples more meaningful. For example, students are asked to find the escape velocity from Mars, the effective installment rate of a large purchase, and the optimal branching angle between two blood vessels. Finally, as most of the world uses the metric system and even the United States is reluctantly following suit, the majority of the applied examples and problems in the book make use of metric units."
Oh dear, dear, whatever happened? Keep in mind that Grossman was no minor author; he was well in tune with what was happening, he wrote many mathematics textbooks, and they were widely used in colleges, universities and schools in the US and across the world. Moreover, he wasn't an outsider looking in: he was based at the University of Montana, and his publisher was Academic Press, based in New York!"
Yeah, there was a brief period in the late 70s (1975 - 1982) where we gave metric a go. My parents remember learning it in school, and some large businesses began using both systems to prepare for more strict mandates.
Right, I remember driving on Californian roads around that time that were marked in km/h, and the same, if I recall correctly, in Hawaii, only to later find that they'd reverted to miles per hour!
There's nothing like a bit of hyperbole to get a debate started (and these days there's never enough of what I'd call formal debating going on).
I don't live in the US but I've been there many times and have even worked there in a technical capacity (and most of my relatives are US citizens), so I understand where you're coming from and why the US thinks the way it does about temperature measurement.
Another comment I'd make is that the US's continued use of imperial measurements/standards imposes a significant cost on other countries that have to trade with the US. Exports from the US that are in imperial measurements cause all sorts of compatibility and maintenance problems, and similar problems occur with imports to the US, which must be in imperial units; this adds considerable extra cost for manufacturing plants in many countries. Another problem is that in recent years the Chinese have borne much of the brunt of having to manufacture stuff to two standards: metric for themselves and everyone else, and imperial for the US. That's the position in theory anyway; the trouble is that the overflow from China's manufacturing for the US often flows elsewhere, so we end up here with nuts, bolts, etc. in imperial sizes, and that also causes a multitude of problems. You only have to go to one of our major hardware stores to see the problem firsthand: there are thousands of essentially identical (duplicated) items, except that one lot is in imperial and the other in metric units (1/4" thread ≈ 6 mm, etc.). This doubling up means big price hikes because of the need to hold smaller individual quantities and carry extra line/inventory items, not to mention all the extra shelf space. Yes, it's a right damn mess because of the US!
The same goes for the US's 110-117 volt/60 Hz power system, which often sees exported equipment blowing up from overvoltage on 220-240 volt/50 Hz systems (or needing 240-to-110 V transformers to be usable at all), since much of the rest of the world runs on 220-240 V; we don't have the luxury of huge copper reserves à la the Bingham Canyon mine [sorry, as it once had] to waste on thick low-voltage copper busbars. I know this from experience: only recently I blew up a very expensive Tektronix spectrum analyzer that was 'correctly' wired with a 220-240 V plug but was still internally set for 110 V. Even when US manufacturers realise that there are power standards other than their own and supply power transformers to suit, we often find they've forgotten that everyone else is on 50 Hz and not 60 Hz, so the transformers overheat (lower frequency thus less inductive reactance)!
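To make that last point concrete, here's a rough back-of-the-envelope sketch (my own numbers, assuming an idealised transformer designed for 60 Hz and run at the same RMS voltage on 50 Hz):

    # Transformer EMF equation: V_rms ≈ 4.44 * f * N * A * B_peak.
    # At a fixed supply voltage, peak core flux density scales as 1/f,
    # and the magnetizing reactance X_m = 2*pi*f*L scales with f.
    f_design, f_actual = 60.0, 50.0

    flux_ratio = f_design / f_actual       # peak flux at 50 Hz vs the 60 Hz design
    reactance_ratio = f_actual / f_design  # magnetizing reactance at 50 Hz vs 60 Hz

    print(f"peak core flux rises by a factor of {flux_ratio:.2f}")           # 1.20
    print(f"magnetizing reactance falls to {reactance_ratio:.2f} of design")  # 0.83

    # Roughly 20% more flux pushes a core sized for 60 Hz toward saturation,
    # so magnetizing current and core losses rise and the transformer runs hot.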
There's little wonder much of the world gets annoyed with American Exceptionalism.
'I still don't know what room temperature is in Celsius.'
I know this is a commonplace view in the US, but I must admit it's a pretty odd one for most of us outside the US (except perhaps the UK, though even there people still have some notion of Celsius). I'm old enough to remember Fahrenheit. At school, the first thing we did on entering the science lab was to write down the barometric pressure, humidity and temperature in our workbooks, in both Celsius and Fahrenheit (Fahrenheit in brackets, as it was the less important of the two). The room-temperature point is also puzzling, for that one is so well known: 20°C is nominally 68°F. If you've ever done photographic processing and bought a Kodak developing thermometer (even in the US), then it had both scales on it and a big black line at this magic room temperature of 20°C. Moreover, the conversion formula was drummed into everyone until it was a mantra (almost all lab equipment is calibrated at 20°C, even in the US):
T(°F) = T(°C) × 9/5 + 32
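For anyone who wants to check the arithmetic, a minimal sketch of the conversion and its inverse (the function names are my own):

    def c_to_f(t_c):
        """Convert degrees Celsius to degrees Fahrenheit."""
        return t_c * 9 / 5 + 32

    def f_to_c(t_f):
        """Convert degrees Fahrenheit to degrees Celsius."""
        return (t_f - 32) * 5 / 9

    print(c_to_f(20))  # 68.0 -- the 'magic' room temperature, 20°C = 68°F
    print(f_to_c(68))  # 20.0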
'I really don't think this was written with misleading, malicious intent'
Quanta Magazine ought to know better than to lead a story with a misleading headline like that. Moreover, they've made it even worse by 'exaggerating' the temperature by quoting it in degrees Fahrenheit instead of Celsius (using Fahrenheit is a no-no in science).
That superconductivity was achieved at room temperature but only at such enormous pressure is hardly worth reporting, as it's of no practical value whatsoever and thus of interest only to fundamental research.