Quote: "First, scientific knowledge advances most rapidly, and is of most value to society, not when its course is determined by the “free play of free intellects” but when it is steered to solve problems."
But the history of science boldly and flatly contradicts this claim. The most productive and society-reshaping products of science arise in pure, not applied, research.
Quote: "Second, when science is not steered to solve such problems, it tends to go off half-cocked in ways that can be highly detrimental to science itself."
Also contradicted by history. As just one example, the success of Bell Labs over the decades resulted not from a focus on solving particular problems, but from a focus on research for the sake of research -- pure science.
The author of the article raises an alarm about a supposed scientific crisis, and eventually reveals what he thinks is the source of the problem -- a waste of scientific talent spent on pure research. He needs to read the history of science with an open mind.
Quote: "It was military purchases that kept the new transistor, semiconductor, and integrated-circuit industries afloat in the early and mid-1950s."
That's true, but it's misleading: the development of the transistor at Bell Labs wasn't an applied-science project; it resulted from pure research in materials science and physics.
The author isn't reporting on the state of science, he's complaining that it's not what he thinks it should be, in a way that stands at odds with science's history.
I think your claims require a bit more backing than mere assertion. Certainly, some of the most important research (now described as "pure") was done with applications immediately in mind.
For example, Newtonian physics always had the goal of calculating artillery trajectories. Nuclear physics had the goals of energy/weapons. Probability theory, operations research, and most of our modern computational infrastructure came directly from people trying to do applied work. Nonlinear wave equations, to discuss a niche example I know well, are primarily motivated by applications in photonics.
I'm very well aware of the many anecdotes of very pure research turning out to be useful later. But there are also a huge number of anecdotes of pure research being directly motivated by providing theoretical justification for/analysis of applied work.
So I don't see any compelling reason to believe your unsupported assertions.
1. Quantum mechanics was never motivated by the thought of semiconductors (and therefore computing technology). If you were motivated by building a computer, you would never have discovered quantum mechanics.
2. Probability theory was "invented" to understand/solve gambling problems. Nobody anticipated how widely it would be used.
When paradigm shifts occur, it takes a long time for the effects to percolate, before we can even get a feel for the space of possible applications. However, if resources (including smart people's time) are not spent laying the foundations, one could never have taken aim at the applications! If one is always chasing applications, who spends time and money on the preliminary legwork?
At any point, if resources are directed through some small set of people who decide and enforce the directions to be pursued, then the outcomes will be more representative of their biases than reality. Those few people effectively act as a bottleneck for human ingenuity.
First of all, quantum mechanics was motivated by the thought of semiconductors. One of the primary use cases of it was explaining the photoelectric effect [1]. Other primary motivations for QM were explaining and predicting chemical reactions, radiation sources, and nuclear energy.
Secondly, probability and statistics - as you note - were invented to understand/solve gambling problems. Virtually every early advance was then made by people attempting to use it. These include Gauss predicting the orbit of Ceres, Graunt and Halley (yes, he also spotted Halley's comet) doing insurance, Galton and Pearson studying evolution and developing eugenics, and Gosset using statistics to brew better beer.
Probability and statistics are perhaps the worst possible example of pure research that - purely by chance - happens to be useful later.
[1] Interestingly, the classical belief that the photoelectric effect proves the quantization of light is wrong. The Schrödinger equation + continuous electromagnetic fields actually exhibit the photoelectric effect.
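To sketch the standard semiclassical argument (the one usually attributed to Lamb and Scully; the notation below is generic textbook notation, not anything specific to this thread): treat the atom quantum mechanically and the field as a classical wave,

    i\hbar \, \partial_t \psi = \left[ H_{\mathrm{atom}} + e \, \mathbf{E}_0 \cos(\omega t) \cdot \mathbf{x} \right] \psi

First-order time-dependent perturbation theory then gives transitions into the continuum only when \hbar\omega \ge W (the binding energy), with ejected-electron energy E = \hbar\omega - W. That reproduces the frequency threshold and the linear energy-frequency law with the field never quantized.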
> First of all, quantum mechanics was motivated by the thought of semiconductors.
Excuse me? Quantum mechanics was developed in the 1920s, long before any thought of applying it to any practical problems. Semiconductor research came much later, and that was also largely pure research into material properties carried out at Bell Labs, a facility noted for its isolation from any commercial application of its work.
So, entirely false.
> One of the primary use cases of it was explaining the photoelectric effect [1].
You're confused. Einstein explained the photoelectric effect in 1905, then spent the remainder of his career objecting to the quantum theories that developed from this starting point, all without any practical applications in mind by any of the participants.
> Interestingly, the classical belief that the photoelectric effect proves the quantization of light is wrong.
There is no such belief, so discussing it is pointless.
> Probability and statistics are perhaps the worst possible example of pure research that - purely by chance - happens to be useful later.
They're examples of pure research into mathematical ideas that -- purely by chance -- happen to have practical applications. How is that a bad example of the point that pure research is the source of most insights into nature?
Claiming that quantum mechanics (QM) was motivated by semiconductors borders on arguing in bad faith.
A glance at [1] will show that people were thinking about issues leading up to QM for several decades. Planck's equation relating energy to frequency of light (in several ways the first "quantum" idea, introducing what we today call Planck's constant) was motivated by understanding the "ultraviolet catastrophe" [2] (a purely "theoretical" endeavour, as some would call it today). Planck's work preceded Einstein's explanation of the photoelectric effect by several years. Even when the photoelectric effect was observed, it was first noticed in zinc (IIRC); it was only in the 1930s that QM was applied to understand the functioning of semiconductors.
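To make the "catastrophe" concrete (these are the standard formulas): the Rayleigh-Jeans spectral energy density of blackbody radiation,

    u(\nu, T) = \frac{8 \pi \nu^2}{c^3} \, k_B T

grows without bound at high frequency, so the total energy diverges. Planck's hypothesis E = h\nu replaces it with

    u(\nu, T) = \frac{8 \pi h \nu^3}{c^3} \cdot \frac{1}{e^{h \nu / k_B T} - 1}

which is finite when integrated over all frequencies and matches the observed spectrum.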
You are confusing all the things we use QM for today with all the reasons for which it was first conceived. Many of those reasons of course spurred development in QM after it was conceived -- but none of those motivations would have led to QM in the first place.
With regards to your comment on the development of probability:
There was always a reason/purpose for which something was conceived. So claiming that it was "motivated by applications" is tautological. The relevant question to ask is whether the applications today are different from the original motivations. If they are, then frankly, it doesn't matter what the original motivations were... the idea would have been difficult to conceive starting with the eventual application in mind. E.g.: Without an understanding of probability, linear algebra and differential equations, there would have been no quantum mechanics. Somebody observing the photoelectric effect could not have developed those tools for their "application".
I notice your other comment on the thread (OP) talks in analogy with physical training. IMHO, such an analogy is misguided for endeavours that cannot be specified well enough to be managed with the goal in mind. Basic research is often not amenable to that because it has tons of unknown unknowns [3].
The photoelectric effect was first discovered in silver chloride solution. I don't know that much about the energy bands of silver chloride, so I won't comment about whether that was a semiconductor. (I also know very little about liquids; basically all the physics I did happened in semiconductors.)
The first solid state demonstration was in selenium, which is a semiconductor. This is what I was thinking of when I said that the photoelectric effect was semiconductor physics.
> The relevant question to ask is whether the applications today are different from the original motivations. If they are, then frankly, it doesn't matter what the original motivations were... the idea would have been difficult to conceive starting with the eventual application in mind.
You are defining "pure" in a far more expansive way than the article does. Your definition is actually so broad that it doesn't contradict the article at all.
The article claims that science, with the goal of building cool military applications (or presumably life tables or brewing beer) will work better than curiosity driven applications. Then it claims the fruits of those labors will be useful elsewhere. Now you seem to be agreeing with this, or at least not disagreeing.
Note that the article isn't saying "don't figure out fundamental physics". It's saying "go build a giant wall of ice to keep the Mexicans out, and a better understanding of pure thermodynamics will be one output of that project."
Also note that I'm not arguing for the premise of the article, necessarily. I'm simply arguing that it can't be casually dismissed without even an argument. My analogy is meant to be suggestive, not to prove the point.
> The photoelectric effect was first discovered in silver chloride solution. I don't know that much about the energy bands of silver chloride, so I won't comment about whether that was a semiconductor.
Even if the first example of the photoelectric effect had originated in a semiconductor, that could not be used to argue that the research was motivated by the goal of practical application. By that reasoning, the fact that particle physics is about atoms, and that atoms can be used to make weapons, could be used to construct an absurd argument that all research involving atoms has the ultimate goal of designing weapons.
> The article claims that science, with the goal of building cool military applications (or presumably life tables or brewing beer) will work better than curiosity driven applications.
The phrase "curiosity driven applications" assumes what it should be proving. Not all curiosity into nature has application in mind, indeed that's not now pure research is defined.
> You are defining "pure" in a far more expansive way than the article does.
Pure research is research meant to discover properties of nature, without any concern for practical application. That's hardly worth discussing as though there's any controversy about the definition.
> Without an understanding of probability, linear algebra and differential equations, there would have been no quantum mechanics. Somebody observing the photoelectric effect could not have developed those tools for their "application".
I disagree strongly, and history might too. For example, Heisenberg invented matrix multiplication for quantum mechanics. In general, physicists have been very happy to invent entire fields of mathematics (e.g., calculus!) for their direct application. At the very least, I think you're overselling your point.
Taking nothing away from Heisenberg, he reformulated a particular quantum system in the new language. But in order to be able to apply QM to all kinds of systems, one needs to also reformulate it in a universal language. At that point, Max Born interpreted Heisenberg's example using his knowledge of math -- Here's the story I found: https://en.wikipedia.org/wiki/Heisenberg%27s_entryway_to_mat...
----
Max Born had two seminal contributions:
1. Interpreting the noncommutative structure in Heisenberg's observation as linear operators
2. The "Born rule" that relates amplitudes involving the wavefunction (quantum state) to probabilities.
For both of those, he used existing math knowledge. He did not re-invent linear algebra or probability theory. And differential equations were part of the standard toolkit of physicists by then (given the successes of the theory of heat and the wave theory of light).
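For concreteness, here are the two contributions in modern notation (standard textbook forms, not Born's original presentation). Identifying Heisenberg's arrays as linear operators yields the canonical commutation relation

    [\hat{x}, \hat{p}] = \hat{x}\hat{p} - \hat{p}\hat{x} = i\hbar

and the Born rule ties the quantum state to measurement statistics:

    P(a) = |\langle a | \psi \rangle|^2

Both are expressed entirely in the language of pre-existing linear algebra and probability theory, which is exactly the point.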
> For example, Newtonian physics always had the goal of calculating artillery trajectories.
Excuse me? The Principia Mathematica's real purpose was to compute artillery trajectories? Sorry, this claim is absurd, and the burden of evidence just shifted to you.
> Nuclear physics had the goals of energy/weapons.
That's entirely false. What we now call nuclear physics originated in particle physics. Weapon research is a recent spinoff of this field, and is not a source of new insights into nature.
When Leo Szilard realized that a chain reaction was possible, he wasn't trying to solve a practical problem; he was simply brainstorming a pure-research idea (and, it is said, while waiting to cross a street).
When Lise Meitner and Otto Hahn discovered nuclear fission, they were engaged in pure research, they were not trying to solve any specific problem.
The Manhattan Project and all that followed from it were applied-research spinoffs of basic breakthroughs in pure research.
> But there are also a huge number of anecdotes of pure research being directly motivated by providing theoretical justification for/analysis of applied work.
Your claim is self-contradictory. If the goal of a research program is the solution of a practical problem, then it's not pure research.
> So I don't see any compelling reason to believe your unsupported assertions.
I don't have unsupported assertions; the history of science has copious examples in support of the idea that most of our insight into nature arises in pure research.
> The most productive and society-reshaping products of science arise in pure, not applied, research.
Yes. Unfortunately, from the perspective of the researcher (or even the research team), pure research is a lottery, whereas applied research offers a better chance of success.
> ... pure research is a lottery, whereas applied research offers a better chance of success.
That depends on how one defines success. Was Einstein successful? His work was pure research, and he was never financially successful, but his work resulted in a greater understanding of nature, a result we all share. Someone could argue that because Einstein was never wealthy, his work failed, but the defects in that argument are clear.
From a political perspective, pure research funding is certainly a lottery, but that doesn't address the question of what kind of science produces most of our insights into nature. The answer is pure research.
Yup, IMHO you have a better grasp than the author.
I've mainly built & developed labs for the analysis of very high-dollar materials, using the same advanced research equipment that the academics & industrial researchers occasionally have breakthroughs on themselves.
I'm an extreme problem-solver, compared to most of the other operators of similar equipment, and more experienced. On my own, it can still be challenging to make a lab both profitable and sustainable.
So you do exactly this, substituting commercial activities for "teaching & administrative duties":
"You will need forty hours a week to perform teaching and administrative duties, another twenty hours on top of that to conduct respectable research, and still another twenty hours to accomplish really important research.... Make an important discovery, and you are a successful scientist in the true, elitist sense in a profession where elitism is practiced without shame.... Fail to discover, and you are little or nothing."
It follows that even if you are not an anti-social person, you will need to take on some anti-social responsibilities.
That little feature of natural science has not changed since Edison was alive.
Once the money-making milestone has been reached, however, technical problems occasionally still need to be addressed, sometimes requiring a higher rate of breakthroughs compared to institutional operators.
After those emergency challenges are overcome, why not put the same effort into untapped areas that come to mind, which might be addressable using the uniquely developed features of the particular facility?
In the right hands this can sometimes lead to more commercially-exploitable breakthroughs than the direct solution of commercial problems themselves.
There is also quite an advantage to owning your own apparatus compared to working on institutionally owned gear. Either way, over the last few decades I have consistently tried to invalidate the following statement:
"Scientific progress on a broad front results from the free play of free intellects, working on subjects of their own choice, in the manner dictated by their curiosity for exploration of the unknown."
In that area I have completely failed. That simple concept just gets more validated every decade no matter how I try to find an alternative.
It turns out, you take a true intellect who has the innate curiosity and chooses to explore that particular unknown, provision them with some applicable resources, and allow them free play. In return you get more scientific progress for your money than any other way found so far.
It doesn't even always require "amazing powers of observation" -- just powers that aren't below average, as they are in many professors or technical administrators.
"Scientific progress on a broad front results from the free play of free intellects, working on subjects of their own choice, in the manner dictated by their curiosity for exploration of the unknown."
'On a broad front' -- I think that's probably correct. If you have specific goals you want to achieve, sure, you should probably direct research at that goal to have the best chance of achieving it. But if you are more interested in exploring the wild frontiers of science, less directed efforts are the way to go. Of course in reality we want to do both, but the article offers no such grounded perspective.
Anyway, who is being lied to, and to what end? Is anyone really fool enough to think that all of US science spending has been purely provided to scientists free of all strings? Has there really been an actual coordinated effort to persuade anyone that this is true?
I read Bush's statement as an aspiration, not really a statement of incontrovertible fact. Can aspirations be lies? Is there some terrible conspiracy afoot? This pudding is being very heavily over-egged.
The latter part of the article is a cogent and reasonable criticism of some of the problems in modern science. There's a lot of house cleaning that needs to be done. But that cause is not well served by dour, grandiosely pronounced, clickbaity conspiracy mongering.
The best way to get a human into good physical shape is to prepare them for a fight. I've never been in better shape than when I was boxing - I was strong, I was fast, my cardio was great. In theory, nowadays I should be in better shape. Rather than focusing my time on bag work, drills, footwork, etc., I could be focusing on fitness. Yet in reality I'm nowhere near my fighting peak. I can do a lot of pullups, but I doubt I could crank out more than 20 burpees right now.
The reason for this is that I've lost my focus: if my cardio sucks, the result is no longer getting punched in the face.
It's an interesting hypothesis, and one that should not be dismissed out of hand, that societies behave in the same way. Think about our modern malaise - we have no grand projects, particularly in the public sector. All we do is funnel money in the general direction of something we like - indefinite optimism, in Peter Thiel's language.
Consider California high-speed rail, supported by both the president and the governor of CA. Eight years later, lots of money has been spent but no track has been laid [1]. Would eight years of delay on a vital project be acceptable to a nation preparing for war? I suspect not.
[1] There is no technological barrier here. The Qinhuangdao–Shenyang high speed rail - 250 miles long - was built in 4 years.
> Think about our modern malaise - we have no grand projects, particularly in the public sector.
Interestingly, I've heard that, in economic return-on-investment terms, grand projects are almost always failures.
> [1] There is no technological barrier here. The Qinhuangdao–Shenyang high speed rail - 250 miles long - was built in 4 years.
The barriers are other than technological, sure - I don't know about California, but the things delaying the next high-speed rail line in my own country are court cases, appeals, and political disputes over matters like: some houses need to be demolished to build the stations; the line might disturb the ecology of some wetlands; the line will make a naturally beautiful area less so. Along with some analysis-paralysis issues (is this the best use of public funds? The model for the original analysis was wrong! Will the line still be in the right place by the time it's built?)
I suspect China, or a hypothetical America-at-war (or even America-at-cold-war), would not worry about the first category, and would take higher risks on the second. We've become a lot more risk-averse as a society, sure. I'm not convinced that this isn't simply a rational response to a safer world, where most citizens, on the whole, enjoy a pretty good life. Risking a few deaths and some blighted regions for the sake of a bit more growth makes more sense the poorer you are.
It doesn't need to be a fight. We can generalize to competitions, where we have examples such as athletes and gymnasts (who in particular are amazing). But you can also look at performers outside pure competitions too: acrobats and dancers are also very fit individuals.
"Fight" and "competition" are too specific. There just needs to be some well-defined goal to move towards; it just so happens that competition is one of the best ways to provide that. And since competitions need not be violent, let's prefer to use that terminology instead.
Didn't even see a mention of what science really is. I bet you many "scientists" don't know (pun intended) and couldn't tell you what science is.
Knowing an experiment or observation is not repeatable _is_ knowledge that _can be_ useful. What are the causes of non-repeatability? That's even more useful to know.
The observation and experimentation processes of science may be reflexive and would be worth investigating.
The money spent on "science" has created ecosystems where the novel, accidental discovery might be more likely.
The claim that "science" is self-correcting must be supported by evidence.
Finally, the media reporting of science does an injustice to scientists by making claims not put forth in the publications, so frequently that you are wise to disregard any media reporting of science and go straight to the paper itself.