
Maybe it's more fruitful to not just consider the Death of Corporate Research Labs, but also ask ourselves "why did they even exist in the first place?" In this day and age, the expectation, for better or worse, is that the State sponsors research; this was very much not the case from, say, 1850-1950. Luminaries did tons of basic science research under the auspices of corporate (like Irving Langmuir) and private (like Peter Mitchell) labs.

Without assigning causality (I happen to think that the primary cause is the "low-hanging fruit" phenomenon), the transition to state-sponsored research has coincided with a general loss of scientific productivity in many fields. Even in some fields with explosive growth, like genomics, there has been major corporate involvement (half the human genome was sequenced by a not just private but Wall Street-traded entity).

Yet the narrative of late has been "if the state doesn't sponsor it, nobody will". I don't feel like I have a solid understanding of why corporate research labs were a thing, or even how science got done in the pre-Manhattan Project/Vannevar Bush era.




None of the bedrock game-changer research from that period - EM theory, thermodynamics, radioactivity, QM, relativity, the original discovery of DNA - was funded by corporations.

Commercial research was responsible for conceptual refinements of existing insights, and for innovations developed as industrial and consumer applications, but not for the original intellectual breakthroughs that made specific classes of inventions and devices thinkable.

Computing itself was state-sponsored - first with Turing and Church's theoretical work, then with various practical implementations whose development was accelerated by WWII, then with state subsidies and grants for further military and industrial development.

Recently state funding has become less and less effective because of the financialisation of academia. Universities now have explicit financial goals and lean heavily on the grant funding game.

Previously the culture was far more tolerant of blue sky inquiry that might not have an immediate payoff - or any payoff at all.


Thermodynamics is an interesting one, because the theory was created to explain the reality of steam engines, a commercial invention. At the time of the first steam engines (Newcomen's is a good starting point), the most popular theories (Aristotle's) declared that there could not be a vacuum.

I would also suggest that computing began no later than Babbage; Turing definitely contributed to the field, but Babbage and Clement kicked it off (largely with government funding).


Now imagine if someone had founded a corporate research lab around Babbage's work...


Do you think the technology was ready? AFAIK mechanical computers were (and still are) prohibitively expensive.


The first electromechanical computers weren't cheap either.


It helps to have a war to win I guess.


To add to your argument: Xerox PARC's progress was based on Engelbart's research in HCI on mainframes, which AFAIK was also state funded.


> Recently state funding has become less and less effective because of the financialisation of academia.

That, along with the decrease in overall amount. State funding for research seems to be more limited than it used to be during much of the last century, especially if we normalize by population size.


Why would we normalize by population size (or GDP)?

Are discoveries harder in bigger countries? So would we expect a single researcher to be more productive in England than in the US?


Just assuming a normal curve: regardless of where you set the threshold for "minimum ability to count as a researcher", between two otherwise comparable populations of different sizes you would expect more people capable of, and actively driving, research as the population grows.

Higher GDP correlates with higher per-person productivity and more specialization, as workers decommoditize themselves relative to global labor. Plus, pushing the envelope often requires more funding, since there isn't an off-the-shelf option for something novel to reduce costs.

They aren't perfect measures by any means (there are things which boost either one without boosting research), but they are decent enough proxies.


A single researcher is rarely productive on their own. Effective research requires a tight network of likeminded individuals and support structure.

If the number of corporations needing new discoveries keeps growing but the funding is fixed, that means either the researchers are spread thin or there’s a centralization of research that has a hard time transitioning into the wild. (Both situations exist, and it sucks.)


Replace 'single researcher' with 'a fixed size research group' in my comment.

Your argument about a minimum size research group for productivity is interesting. I am not quite sure whether that means we should normalise by population size, though.


> None of the bedrock game-changer research from that period - EM theory, thermodynamics, radioactivity, QM, relativity, the original discovery of DNA - was funded by corporations.

Ever heard of the Solvay conferences?

While not embedded in one company, these conferences were definitely sponsored by an industrialist, and they strongly impacted 20th-century advancements in physics.

https://en.m.wikipedia.org/wiki/Solvay_Conference


Or look at the history of the chemical and pharmaceutical industry in Germany.


Well, Student's t-test was funded by industry, and it's definitely bedrock.

Langmuir flow, plasma physics, and fundamental surface chemistry. Basically all of organic chemistry, too. Hormone biochemistry, etc.


One thing to remember is that not all research was being done in corporate labs. A lot of it was being done in universities, just as today. How was it funded? By competitive funding agencies similar to today's NIH/NSF, except run privately. It's interesting that you bring up Vannevar Bush. He actually ran one of these private funding agencies (the Carnegie Institution) before being tapped to run the Office of Scientific Research and Development, and he modeled it (and the NSF, which he later ran) after the Carnegie -- that is, he had scientists write proposals which were evaluated by their peers, and the ones judged best were funded.


Up until the 1950s pharmaceutical development was entirely a government enterprise: https://www.google.com/url?q=https://www.slu.edu/law/academi...


What? No.

People seem to get confused on what actually goes into bringing a new drug to market. Yes, a significant amount of basic research is done through government funding. But discovering a new drug target is about 20% of the way to a new drug.

You still need someone to develop new candidates, screen them, identify promising compounds, bring it through years of clinical trials, develop a manufacturing process, get FDA approval and then sell it.

A good rule of thumb is that the cost of R&D is typically 1/3 research and 2/3 development. Development is almost always entirely private. And a significant amount of research is too.


When people talk about the death of corporate research labs, they're largely talking about the death of fundamental research -- places like PARC, Bell Labs, etc. Corporate research to take fundamental discoveries and find better ways of productizing it is alive and well.


At least from the pharmaceutical perspective, there has definitely been a shift away from basic research. Why? Well, in the past, it was drug companies that had the money to actually fund it. It wasn't a big deal to spend $1M on the latest lab equipment. We had university researchers asking for time on our equipment.

From talking to some of the old-timers, universities started to get better and better funding (NIH grants, royalties, etc) and eventually could buy the equipment they needed themselves, no need to rely on drug companies.

So it might be more accurate to say that basic research funding has grown and most of that growth happened on the university side. Pharma companies have either held steady or decreased their spend on basic research (it really depends on the company).


I know these comments are semi-discouraged, but I honestly found your response to be so insightful that I wanted to thank you for sharing your perspective. At least, it fits with what I've seen (GF in PhD program, stepdad has an extremely successful lab at a non-profit institute doing basic science research in immunology), but not seen stated explicitly.


Pre-1950 may have been different. The NYS Department of Health Lab developed antitoxins for diseases like anthrax, as well as antifungals (e.g. nystatin), in that era.


My comment was more about the use of "entirely" in the OP's statement. There was a lot of drug research done before the 1950s and there still is today, but before 1950 it certainly wasn't only the government doing it.


Great point, I missed that nuance!

Certainly with the costs associated with drug approvals being exponentially higher today, only the Federal government could possibly be in the drug business.


Basic research is one of the riskiest aspects though. It is much easier to justify internal research on a product where basic research shows that it works and suggests it will scale. MUCH harder when it is a moonshot. You'll notice there's a difference in capital between companies that do the former vs the latter.


True. Basic research can take years to yield anything useful. However, basic research is also relatively low cost compared to the cost of, say, running a clinical trial.

A single phase 3 clinical trial can run into the hundreds of millions of dollars. You can fund a lot of basic research for that kind of money.

When I was in grad school, my PI got grants of ~$2M a year and that supported a lab of ~10 researchers.


> When I was in grad school, my PI got grants of ~$2M a year and that supported a lab of ~10 researchers.

There's a big pay difference between grad students, industry researchers, and government researchers, with grad students being the lowest on the totem pole. Which should make sense, because they are researchers in training. There are many different types of research, too.


That's true, but even if you quadrupled the costs to account for market rate salaries, you're still two orders of magnitude smaller than typical development costs.


I'm not saying Phase III isn't really expensive - it is. But you're also ignoring the failure rates of Phase I and Phase II. If you only count the Phase I's that make it to Phase III and beyond, then you're ignoring >80% of the basic research cost.


Big Pharma is a classic example of what happens when private enterprise becomes so powerful that "regulatory capture" occurs. They can charge what they like, and undertake other dirty marketing practices, because nobody can compete.


> half the human genome was sequenced by a not just private but Wall Street-traded entity

They reportedly were so successful in large part because they re-used a lot of data already painstakingly produced by publicly funded research labs.


No, they pivoted to shotgun sequencing earlier, which had better scaling properties because it's self-decoupled. There was a hangup about it only affording asymptotic completeness (aka completeness was "only theoretical"), and then they realized how silly that statement is, and how the decoupling trade-off gave them an advantage because of their deep pockets.


> In this day and age, the expectation, for better or worse, is that the State sponsors research

But the state (at least the US state) has been cutting spending on research over the last several years - and those cuts have accelerated under the current administration which does not seem to understand the value of basic scientific research.


Cutting research is an overstatement: US federal R&D funding has been increasing, just at a significantly lower rate than GDP. Since it's ideas being created and not widgets, you'd expect the same number of researchers to create at least as much positive impact for society through idea generation.

It seems obvious that they haven't.

I don't think it's anything nefarious about government funding (or lack thereof) that did it; it's just we've already picked all the low hanging fruit. Plucking all the undiscovered ideas at the frontier of knowledge will be increasingly expensive, because if they were cheap to discover, we already would have.

Though I always wonder if there's some trove of cheap ideas hanging out somewhere...


I totally agree that stuff gets harder, but in my specific area of expertise, programming languages, I see the perfect example of untapped engineering potential. Thus, when others say there is an R&D pipeline problem in other fields, I'm inclined to give them the benefit of the doubt.

Also, if I may toot my own horn, I think better programming languages have the potential to revolutionize all knowledge work, making everything else more productive, and helping unlock whatever other basic research might remain woefully untapped.


I'd love to hear some more details of the untapped engineering potential!

I have to admit some skepticism. What follows are my opinionated thoughts as a naif: new languages have helped a lot in creating new models of social organization. To be concrete, the significant 90s languages were forgiving enough, powerful enough, and easy enough to learn that they allowed for a kind of triumph of the programmer proletariat over the artisans. That allowed for the explosion of information technology that we've been observing for the past 3 decades, which enables new business models and makes old ones substantially more efficient.

But I don't think a contemporary PL person would think of Java as a prime exemplar of the value that the research community brings to the world. And when I think of more recent language innovation (Go, Rust), they've only been incremental improvements that were anticipated decades ago. And although I love them both, neither has shown any signs of enabling new social modes like earlier languages did: my like of them is purely aesthetic. All the real, productive low hanging PL fruit has already been taken.

Opinionated, like I said =)


> But I don't think a contemporary PL person would think of Java as a prime exemplar of the value that the research community brings to the world.

For someone focused on the language itself, I agree. But consider the JVM: an enormous amount of research has been done on, and applied to, the JVM. Garbage collection and just-in-time compilation are two particular fields that the JVM has benefited from enormously.


APL was created a long time ago and still we aren't using array languages. It's not like we haven't come up with great ideas (statically typed FP and array languages); it's just that the industry is too pathetic to actually capitalize on them.


APL is alive and well in NumPy and R; good ideas typically do get diffused even to industry. However, industry only adopts what fits into the existing systems. Good ideas that cannot be adopted incrementally tend not to diffuse well (e.g. dataflow parallel programming).


> New languages have helped a lot in creating new models of social organization

Glad you agree!

> To be concrete, the significant 90s languages were forgiving enough, powerful enough, and easy enough to learn that they allowed for a kind of triumph of the programmer proletariat over the artisans.

In terms of practice, yes, the languages that would go on to be popular and didn't drag one down with manual memory management were created in the 90s. The technology was older.

> Java as a prime exemplar of the value that the research community brings to the world

Agreed. Still not as un-innovative as Go though!

> And when I think of more recent language innovation (Go, Rust), they've only been incremental improvements that were anticipated decades ago.

Rust fully admits it is recycling old research. Its productivity gains are diminished by the fact that it is targeting systems programming, which is less productive. If somebody made a Gc<T> and an implicitly garbage-collected version of Rust to compete with Go and Java (of course with great FFI to regular Rust for free), it would already be a better candidate than any other popular language, and that's with no research required.

> I'd love to hear some more details of the untapped engineering potential!

So, I am a Haskeller. I would say there are 3 tracks that interest me:

1. The "classic track" of fancier type systems. The Dependent Haskell work is interesting because it should allow the hodge-podge of language extensions we have today to be streamlined into fewer features, reducing complexity. Writing proofs means less productivity, however, combining libraries with lemmas might be extremely productive. Think being able to blindly mash stack overflow answers together with extreme fidelity. That should be a goal.

2. The "categorical strack". Lambda the ultimate....mistake :D. Our programs are higher order, but there is often a first order steady state, like the things we put on whiteboards for our managers. Our current practices utterly fail to capture that first order steady state anywhere in the code, and as languages become "more functional" (java 8+, ES whatever, etc.) we're basically throwing more lambdas at problems without principle.

Using some categorical primitives---and I mean the "realer" ones in https://hackage.haskell.org/package/categories not the ones in https://hackage.haskell.org/package/base that need a lot of squinting to make out the original math---gives us a chance to perhaps fix this. Categories putting the output and input on equal footing helps, and the simple wire diagram makes the plumbing / dataflow way of thinking that every programmer should employ much more apparent. "Point free" programming sucks, but I ultimately think we can get something like Scratch that makes sense for programmer-adjacent types, and yet is useful for "actual work". (There's a rough sketch of the wire-diagram style after this list, too.)

3. The "incremental track". There is a decent amount of literature around incremental / reactive programming and models of computation, but it needs to be properly synthesized. I think this is will be huge because it will improve techniques that actually match what real world programs do (toy "everything terminates" stuff from school is actively harmful when students extrapolate that theory has nothing to do with practice). This will defeat an entire class of concurrency woahs that currently is a huge source of productivity loss---I don't mean just race conditions, but the more general "how can I analysis (with the ooriginal "break apart" connotations) my program into simply peaces and then synthesize a holistic understanding". In terms of how this will actually happen, this strongly ties into the above.

Let me say 2 more things:

1. We should have "one language to rule them all", in that I can write my firmware, kernel, end applications, and everything in between without compromise in it. People think this is silly ("engineering is tradeoffs, amirite?") but it isn't. The language would become a "dialect continuum" that supports vastly different, clashing idioms for those different domains (GC everywhere? Manual memory management? No call stack even?), but with flawlessly type-safe FFI between them.

2. Non-programmers think "applied math" means plugging in numbers and writing regular natural-language prose to go with it. Math isn't numbers, and people's notion of it must be fixed accordingly. Proof theory means the whole argument, not just the numerical processing, can be formalized. This is how many jobs should work.


Thanks for the meaty comment! I was expecting a bunch of stuff in the "fancier type system" category and was pleasantly surprised.

Do you have any references to share to learn more about potential practical uses of categories? Assume the knowledge of someone who isn't intimidated by the wiki page on categories but who learns by reading it.


> Thanks for the meaty comment!

Sure! Hope it wasn't too rambly.

> I was expecting a bunch of stuff in the "fancier type system" category and was pleasantly surprised.

Yeah after enough functional programming, one's sense of the new lowest-hanging fruit switches.

> Do you have any references to share to learn more about potential practical uses of categories?

I'm afraid not. I suppose there are plenty of blogs on Haskell and category theory, and the nLab for the actual math, but I don't know of a resource that zooms out from the neat tricks and tries to discuss broad needs for better programming and architecture.

But I'm relieved to say that where I work, we are working on some things in the vein of things for tracks 2 and 3. It will be open sourced when it's ready, so... stay tuned, I guess?


That's untrue, and not really relevant to the point anyway. The changes are small relative to total spending, and budgets have, for the most part, been increasing.

NIH budget for example increased from $32 billion in 2016[1] to $41.68 billion in 2020. Trump has proposed a 6% cut for the 2021 NIH budget but that is not finalized yet.[2]

NSF budget increased 2.5% from 2019 to 2020, to $8.3 billion. [3]

NOAA +4%. [4]

DoD Science and Tech: +1%. [5]

[1] https://www.nih.gov/about-nih/who-we-are/nih-director/fiscal...

[2] https://www.the-scientist.com/news-opinion/trump-proposes-si...

[3] https://www.aip.org/fyi/2020/final-fy20-appropriations-natio...

[4] https://www.aip.org/fyi/2020/final-fy20-appropriations-natio...

[5] https://www.aip.org/fyi/2020/final-fy20-appropriations-dod-s...


It isn't the budgets, it's the scope (at least in IT/computing). In the last decade(s) there has been a move away from "blue sky" research to focusing on ideas that can be implemented in short timeframes like one year or less. "Little r, Big D" it's sometimes called. Places that used to fund core research in higher risk sectors have scaled back to focus on things that can be transitioned into products within 1-2 years.


Doctoral students are absurdly good value for money, especially in nations that have a high density of high-prestige universities, and they both train the workforce and are trained themselves. So I think it's an obvious move for states (especially ones with a ton of grand old universities) to milk that for all it's worth. It's a way of transforming historical prestige, and a smattering of cash, into a competitive economic advantage.

I guess the flipside of this is that, for a business, it makes way more sense to give money to a university than it does to start your own lab. You'd be paying real (and probably decent) wages to people who wouldn't be nearly as motivated as the PhDs who do the same work for less than minimum wage.


I'm sure it's not the whole answer, but given large profitable companies with (for the moment) unassailable market positions, the answer may come down to "Because they can."

Everyone likely agrees that most large companies should have some longer-term speculative projects going on. As those companies get wealthier and fatter, those research organizations want to grow, they help position the company as an "innovator," and they may even have an OK payback over a long enough time horizon.

As to that time horizon, people have been arguing about that and it's swung back and forth at different forms for as long as I've been in the industry.

I think it's also true that there were always a relatively few large true research labs. There's a reason that, say, Bell Labs, Xerox PARC, and IBM Research come up (in roughly that order) and then most people have trouble naming examples.


RCA had one, and the amount of money they wasted trying to invent three different barely-working home video systems is arguably a big factor in why they went bankrupt.


Is there anything like ARPA's VLSI Project running today? That project has to have had one of the greatest cost:benefit ratios ever.


DARPA does have some interesting VLSI projects running currently. Some of the ones I've noticed are:

Chiplets: https://www.darpa.mil/program/common-heterogeneous-integrati...

PIPES - optical chip io : https://www.darpa.mil/news-events/2020-03-16

Automated ML chip generation: https://www.darpa.mil/attachments/Real%20Time%20Machine%20Le...


Maybe separating state-funded research from privately funded research is the wrong perspective: for example, in France there was a major entanglement of private and public structures (organizations, funding, people) after WW2, and that is still the case, even at the European level (is research done in state-funded labs with/for Airbus in Toulouse public or private research? Does it make sense to try to disentangle state, private, and European funding there? Who really decided that there must be aeronautical research here and now?).

A more dynamic view might better explain what’s going on: the flow of public money, the grant criteria, the state-funded researchers who go to work for private companies for some years and then come back…

Some example of this entanglement:

- in France any company can ask for a dedicated Credit Impôt Recherche, a tax credit for research purposes. Most seem to be granted for projects that are very light on fundamental research. That allowed many small companies to set up research/lab teams. It seems like a more lightweight continuation of the European policy of helping companies integrate research and create consortiums.

- when looking up public AI research in France, I found that nearly all the top AI/ML researchers based in France have been recruited by Google and Facebook. They are still fully part of their original, state-funded labs, but now work for, and are paid by, Google/Facebook. They seem to still do the same kind of research they were doing at their state-funded labs.

I’m sure there are other examples in other countries.


In the old days most of the corporate research was in the physical sciences because the products, large "mainframe" computers and their peripherals, were very mechanical and cast in hardware. The advances of the next generation of product were driven by faster chips, denser storage, printers that handled more complex print streams, faster networks, better user interfaces, etc. Now that software and interoperability standards have eaten the world and made IT hardware a commodity, the competitive advantage rarely comes from physical science advancement. Relatively few companies now need physical hardware improvements to drive the next-gen product: Intel, AMD, 3M, storage vendors, maybe a few more. The barriers to entry in a software world are so low that buying startups is more cost effective than investing in your own people. There's a survivorship bias, I think, in looking at what startups get acquired. For every one that gets acquired there are probably 10+ that don't. Why would a corporate CEO invest in 11 research projects knowing only one might pan out if he could just wait and pick a winner from 100 startups?


I think part of it is a scale question--there are a number of discoveries that require a huge investment in both manpower and equipment. Beyond the work of corporate research labs, there are some discoveries that don't have immediate payoff--think for example general relativity. Yet, attempts to measure it led eventually to atomic clocks and to GPS.


Implying research today is anywhere close to research of the past. The amount of capital and complexity to do fundamental research far eclipses what was required in the past.


This is absolutely the reason. So much was unknown at the time that a huge chunk of inventions and discoveries were completely accidental. There's no way a young adult can build a particle accelerator and play with it in his basement and discover something new. But countless people did it in the past with chemicals, electricity, pendulums, optical devices, and mechanical contraptions.


Maybe as engineering got more advanced economies of scale paid off more and more. We don't get all our food from family farms either.

Not trying to be some fascist statist, let's just recognize that since the 1980s, we've not been in a new gilded age so much as a gilded age cosplay (e.g. we are not actually closer to well functioning free-market capitalism for the "real" economy), and growth and large-scale investment have stalled accordingly. A lot of libertarian and decentralization ideology has grown up in the era since under false pretenses.

I would love devolved power and decentralization to actually work, but we need better, more honest theory on how to make it good, and less blind faith that centralization and monopolization will fail because they're bad and "the arc of history bends towards justice".


> I would love devolved power and decentralization to actually work, but we need better, more honest theory on how to make it good, and less blind faith that centralization and monopolization will fail because they're bad and "the arc of history bends towards justice".

I mean, come on. The only way you will get to that theory is if you have brave people who try (maybe that's self-serving), and if you aren't at least questioning the narrative of "the only way forward is through the state", especially when there is (increasingly sporadic, anecdotal) evidence to the contrary, then you are tacitly discouraging people from trying.


I am trying? For example I like to work on Nixpkgs because I think it is the Wikipedia of the software commons. I am doing some things with Nix and IPFS at the moment for similar reasons.

I share my doubts here because HN tends to overly romanticize small organizations. I don't mean to tell people to give up and work for FAANG.


Edison made them fashionable. There was also a lot of low hanging fruit to pluck in the 20th century.


Do you have anything to back your claims?



