Fine article from a reputable source, but there's a tendency I despise in modern journalism: making it unclear that they're talking about a declining growth rate (positive first derivative, negative second derivative: still increasing, but with downward convexity), not a decline (negative first derivative).
But hey, mainstream news sources fail at elementary school math, at least here they're making the effort to fail at high school math.
This isn't a news article, it's primary-source information, as it's from the New York Federal Reserve discussing data-based correlations between economic factors and outcomes.
"Once electronics were small enough to be used in almost all industries, their effect on productivity vanished."
eh, I figured it out, but I think even that only works if the writer and reader share the assumption that the thing in question inevitably grows, so the reader can mentally insert "(rate of increase of)"...
"VR slowdown", I'd think fewer people are buying in, but no one's throwing their headsets away. "Border crossing slowdown", the number of people crossing the border is actually lower than before.
that only works if the writer and reader have shared assumption
It's the Federal Reserve writing for the sort of people who read studies published by the Federal Reserve. That's a pretty specific audience, and one very likely with a large pool of shared assumptions. This wasn't written for HN randos.
I think many hn randos are capable of understanding the piece, but those people will probably not make comments because the thesis here is not exactly non-obvious.
> Declining Productivity and Moore’s Law
> In the U.S., average labor productivity growth in the 1995-2004 period was 2.85 percent per year. This productivity growth significantly declined in the following decade
Title of section says "declining productivity"; body says "productivity GROWTH declined". Those are two very different things.
The article also states that once electronics “saturate” an industry (i.e., they are adopted everywhere they can be), their effect on productivity is zero. Not true. Their effect on productivity GROWTH may be zero, but take away the electronics and productivity would decline (according to the article's own assumption that electronics improve productivity).
So it seems like the term productivity is jargon for “productivity growth” in this discipline. Still, this seems a bit sloppy and lazy. How hard is it to s/productivity/productivity growth/g?
Another problem with the article seems to be that they are assuming causation and simply using a regression to prove it. There is probably a kernel of truth to the story they are telling, but their argument seems to be begging the question.
I disagree, productivity by itself doesn't imply growth. For me "slowdown in productivity" unambiguously means a decline. You also don't write "slowdown of the vehicle speed" when you mean a decrease in acceleration.
It still talks about a productivity slowdown afterwards, repeatedly, when in fact productivity increases by 1.27% per year even in the worst period analyzed.
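To make the growth-rate-vs-level distinction concrete, here's a toy calculation (the starting index of 100 is arbitrary and my own; the two growth rates are the ones quoted from the article):

```python
# Toy illustration: a "slowdown" in productivity GROWTH still means
# the productivity LEVEL keeps rising, just less steeply.
level = 100.0  # arbitrary productivity index in 1995

for _ in range(10):        # 1995-2004: 2.85% growth per year
    level *= 1.0285
after_fast_era = level     # ~132.4: productivity rose

for _ in range(10):        # following decade: 1.27% growth per year
    level *= 1.0127
after_slow_era = level     # ~150.3: still rising during the "slowdown"

print(round(after_fast_era, 1))
print(round(after_slow_era, 1))
```

In both decades the level only goes up; what declined is the slope.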
Well, that used to be untrue: for a long time, the future was a lot like the past on human-lifetime scales. But now we have technical, and consequently cultural and even environmental, changes that make it likely the future will be unlike the past, so now it is true. But maybe someday that will change and the future will be like the past again.
I think some alternative hypotheses should be considered:
1. Workplace culture has changed drastically over the last 20 years, at least in Sweden. Comparing the work ethic of my parents generation to that of millennials is… like comparing night and day.
2. We’ve been living in a radical monetary policy experiment for the last 20 years, with zero (or even negative) interest rates and quantitative easing. It should also be noted that the previous change in the productivity growth rate (in the '80s) coincided with a dramatic change in monetary policy.
> Comparing the work ethic of my parents generation to that of millennials is… like comparing night and day.
It's easier to keep a strong work ethic when the returns on work are consistent. Harder to keep when people work their asses off until their forties and still can't afford owning a house and sustaining a family on the income of one member of the couple. Just one or two generations ago most people could routinely achieve that in their 20s.
I worked at some very high paying tech companies, most of my coworkers could and did easily afford housing and family, and yet the work ethic was simply not there.
That is unfair. The twitter post has pictures of articles along with their sources going back to the 1800s. That is not a bar a random search comes up with. If it is so easy, then use the same bar and find one-sentence sources going back to the 1800s backing the thesis that people nowadays have less work ethic.
The monetary quantitative easing is keeping up with technological productivity improvements: central banks inflate currencies as much as they can get away with without getting a revolution from the working people.
Yes, I have a feeling this is how it works “big picture”, but I don’t have any trustworthy sources for that claim. Do you know of any? I’m thinking academic economists who have written on the subject or something like that.
Interest is, after all, the return on capital. While an individual borrower may be over the moon to have to pay less interest, on aggregate it means lower return, and return on all capital across the economy is growth.
I'm wondering if the decline after the mid-2000s isn't due to computers so much as the internet.
Once somebody has an internet-connected computer at their desk, they have more distractions (from internal applications like email to external ones like Facebook or reading news online). They also have more kinds of problems that can interrupt their work (server failure, network problems, viruses).
I think it already makes sense just by looking at computers. A computer ca. 2000 was mostly a machine for productivity. Sure, you could play games, look at pictures, videos or similar things, but the whole design of everything in that computer was still very office-like (cue pictures of people playing solitaire on Windows XP).
Look at e.g. Windows 10/11 today. Open the Start menu on a default installation and you will get a much more time-wasting user interface (Candy Crush, I am looking at you).
Windows is no longer the operating system meant for professionals (even if you get the professional version), it is trying to sell you a sweet promise of a lazy joyful time at every corner.
Windows has tried to become more like iOS or Android, which are arguably systems designed for consumption more than production.
The internet has played its role in all of this for sure, as it is no longer our bosses' wallets they are after, but our attention and time.
It's been getting worse for a long time. Windows 2000 was the best version and it's been downhill from there. I was lucky to be persuaded to switch to Mac in the mid-2000s and miss most of the awfulness.
At this point, it's the worst of both worlds: baroque, distracting and confusing, but without genuine interest or charm. It's like staring at one of those failed neural net generated pictures where your visual cortex is vaguely stimulated but you have no idea what you're looking at.
My kids will never touch Windows. It would feel like the electronic equivalent of letting them live on McDonald's and food coloring.
That’s why I moved from Windows to Mac. There’s no distraction for me there, aside from what I choose myself. Even Linux OSes seem more distracting for some reason. Maybe Mac has a more cohesive feel.
Productivity here doesn't mean "how much you get done" but "how much money is made off you". There may be some ways in which the two ideas correlate, but it's a complicated relationship and it's not even always in the same direction.
They can also use the internet to quickly look up things, ask questions to a giant group of experts, not likely to be available in-company. As well as the countless other productivity boosting resources made available through it.
I worked in an office in the late 1990s/early 2000s when it transitioned from Novell NetWare on DOS and Windows 3.x to Windows 95 and 98 on the internet.
People using the internet on their work machines added so many problems:
- People spending a lot of time on shopping or porn. (This was before Facebook, and I don't think MySpace was around or popular yet.)
- People downloading and installing software they shouldn't. Which ranged from pirated software to adware to malware, or even something benign that conflicted with the tools they needed for their job. (This was mostly fixed when we migrated people to Windows 2000.)
- Viruses.
- People setting up file-sharing nodes for music and porn.
All of these problems came about once we added access to the internet.
Admittedly, it should have been rolled out differently, perhaps using something like Windows NT with firewalled networks.
But giving people access to the internet just allowed a lot of people who were naive to shoot themselves in the foot and waste a lot of resources fixing their computers.
At a former place of work the secretaries became experts at using Excel and even Access or Crystal Reports. I remember one had a beefy SQL book on her desk to make queries directly from the company's Oracle databases.
Eventually they got newer titles of "executive assistant" with better pay.
The IT department I worked at also lobbied for a policy that when computers were upgraded, the secretaries should get the newest and most powerful ones, since they were doing the bulk of the computing.
First, it is based on manufacturing, which has been a minority of economic activity throughout the period it discusses. To explain anything, it needs to explain services.
Second, the model is too elaborate and depends on a hard-to-measure variable (saturation).
Let's use a really simple model in which work (service work) is made up of two tasks, decision-making and communication.
Recall that 1995-2004 is the exact period when cell phones and email diffused throughout the economy. These improved the rate at which communication tasks could be completed, so productivity grew in this task. But as with everything there are diminishing returns to further improvement.
The other task, decision-making, is much more varied, being different for truck drivers, real estate agents, nurses, and yoga instructors (to select a few examples). Increasing productivity in the decision-making task across the whole economy is much harder, which is why productivity growth has slowed down.
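That two-task story is basically an Amdahl's-law argument; here is a quick sketch with my own toy numbers (the 50/50 task split and the 4x speedup are hypothetical, not from the comment above):

```python
# Toy two-task model of service work: each unit of work needs some
# communication time and some decision-making time.
def output_per_hour(comm_speedup: float, decision_speedup: float,
                    comm_share: float = 0.5, decision_share: float = 0.5) -> float:
    # Amdahl-style: total time per unit is the sum of each task's share,
    # divided by how much that task has been sped up.
    time_per_unit = comm_share / comm_speedup + decision_share / decision_speedup
    return 1.0 / time_per_unit

baseline = output_per_hour(1, 1)     # before cell phones and email
email_era = output_per_hour(4, 1)    # communication 4x faster, decisions unchanged
print(email_era / baseline)          # → 1.6, not 4.0
```

Speeding up only the communication task quickly hits diminishing returns; further aggregate gains require improving decision-making, which differs by occupation.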
We've not tapped the full productivity of computers yet. It'll still be a while before we get capability-based security built into everything; then there'll be another boom as creativity is once again released, without the artificial drag of a horrendous security model of ambient authority making everything unsafe.
How feasible is this alternative theory: the growth of computing systems, while increasing productivity, has also allowed a substantial growth in bureaucracy.
For a) you can at least measure the ratio of managerial to non-managerial staff (which has become substantially more heavily weighted towards management since 1983).
Embarrassing that software can be completely elided like this. What gains it has brought have been offset by the stifling of creativity and the ossification of process. Maybe there are a few more variables here?
Slower growth, why is that a problem? That is still growth.
Are we really expecting not only everlasting growth but exponential growth until the end of time?! Come on! Even the 1.27% growth rate seems unsustainable; we should be happy about it, not seeking faults in failing to achieve something unattainable.
Most measured economic growth is like this: getting more from less. For example, a process that allows producing 2x more electricity per unit of fuel would increase the productivity of the generating station, and be counted as economic growth.
Economic growth also measures increased consumption of natural resources, so it's a mixed measurement admittedly.
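The "more from less" point is just a ratio; a minimal sketch with hypothetical generator numbers (these figures are made up for illustration):

```python
# Hypothetical power station: a process change doubles electricity
# output per unit of fuel, with fuel consumption unchanged.
fuel_units_per_day = 1000
mwh_before = 500
mwh_after = 2 * mwh_before  # 2x output from the same input

productivity_before = mwh_before / fuel_units_per_day  # 0.5 MWh per fuel unit
productivity_after = mwh_after / fuel_units_per_day    # 1.0 MWh per fuel unit

# Measured output doubles with no extra resource consumption,
# which national accounts would count as growth.
print(productivity_after / productivity_before)  # → 2.0
```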
Except that GDP has no conception of the impact of improved productivity.
For example, cars are an incredibly inefficient and expensive transportation mode from a capex and operating cost perspective. Switching to a public transit model would increase quality of life, but would shrink GDP, because the operating costs for public transit are orders of magnitude less than for car-oriented transit.
So even though public transit would improve efficiency, the first-order effect would be to reduce GDP.
Do you think humans have run out of things to invent and improve? Are standards of living good enough that we can be content to stop here? Why exactly should we expect (let alone hope) that growth will stop?
If you want to improve standards of living for more people with respect to the socioeconomic realities of each person, you’re going to end up with an outgoing cash flow towards some welfare policy or humanitarian cause that funds people’s basic needs and healthcare, which won’t count as “growth” in the capitalist sense, which is the sense being used here.
While I agree that increasing productivity is the prime directive when it comes to improving standards of living, dismissing the benefits of wealth transfer is naive.
Direct wealth transfer is a useful tool for mitigating market failures like labor market monopsony and the various negative externalities of excessive inequality. When used correctly, it does in fact improve economic efficiency and societal well-being.
Living standards have been improved far more effectively by bringing more people into jobs with higher productivity, driven by capitalist systems.
Look at the tens or hundreds of millions of people brought out of abject poverty in China and India and other countries by adopting free market economic principles.
> you’re going to end up with an outgoing cash flow towards some welfare policy or humanitarian cause that funds people’s basic needs and healthcare, which won’t count as “growth” in the capitalist sense
Properly executed redistributive policy absolutely increases growth in the "capitalist sense", whatever that means. That's kind of the whole point.
Why should there be a limit on human productivity? After all it means we are getting more efficient at the things we are doing, doing more with less, is that not a laudable goal? Any time we have approached a limit we have found a way to cheat it, that’s cause for celebration.
> Slower growth, why is that a problem? That is still growth.
Capitalism increasingly consolidates wealth in the hands of the major capital owners who already control most of the economy. For that reason, in order to have a society that at least has a semblance of being a functioning one, there must be growth to provide some wealth/income to the majority.
When a capitalist economy is no longer able to grow exponentially, it starts eating itself from inside.
What about the adoption of cloud, SaaS, and improvements in information systems in industry (e.g. Flexport for freight)?
The article doesn’t touch on software at all and instead attributes productivity growth purely to Moore’s Law, which seems counterintuitive since both HW improvements and SWE improve the SW running on top.
The article attributes ~12% of productivity growth to electronics miniaturization. It isn't meant to be a comprehensive analysis of the factors of productivity growth. The author is testing a hypothesis about electronics miniaturization and finds it correlated with some of the productivity acceleration during the 80s and 90s. He is using a new dataset and his analysis is shaped by that.
But it does provide some interesting ideas about measuring new contributions and identifies them in the combination of electronics and domains of industry.
- access to computing devices increased from 1980-2005 which increased productivity
- microchips could be added to products, increasing the utility and variety of the product, and hence productivity.
I mean, maybe. Seems reasonable. I suspect that the chase after productivity now has to stop in the physical and be looked for in code and in organisational arrangements, i.e. we don't need so many managers, but we do need more coders. And how do we get coders to co-ordinate? Mostly through openness, democracy and ... testing :-)
The proposed model explains the slowdown in terms of production technologies and materials, in particular microelectronics. But is it fair to use a model that does not take into account the effect of the internet during the last 20 years?
Not only has the hardware changed the production landscape, but so have the networks.
The conclusions of the work might be correct, but in my opinion the model seems too incomplete to draw conclusions from.
From what I understood, in the end the conclusion was that only a relatively small part of the productivity difference can be explained by electronics miniaturization.
And while we can find reasons productivity increased faster in the 90s, it's not difficult to find reasons why productivity should be increasing now as well.
Software, network and sensor technology advances rapidly. Biotechnology and medicine too (MRI, better cancer treatments, RNA vaccines, etc).
In the same period of time (80s/90s) there were enormous changes of the manufacturing economy in the US. Reforms to the financial system. Outsourcing to Asia etc.
So many factors not taken into account. Such science only serves to create compelling stories of the past, it doesn't have predictive value.
IMVHO it's not a matter of "we can integrate more, creating a scaling effect" or "we have done too much, too much intricacy"; it's a matter of basic technology being designed for business purposes instead of serving the whole of society.
Try a modern car: it's a crapload of crapware. It's not because we have tried to integrate too much, or because the automotive sector doesn't know IT; it's because modern cars are NOT designed for the human, the formal owner, but as a service, a way to (ab)use the human owner to milk extra money. A similar thing happens with cloud+mobile integration.
The SOLE way IT works is an open ecosystem where ANYONE can see, change at any level, contribute and discuss, because IT is not a layer of automation over something else (typically a mechanical/electrical thing) but a logic layer between humans and their own machines. Productivity falls because user freedom and power fall. Users now are just a mechanical arm of a crapware system, doing the part the crapware system can't do alone. The system is designed to milk money out of them, not to serve them.
If the big players of IT want real productivity, they MUST look back at their own history: the more they abuse users, the more short-term profit they extract, then comes a fall. Either they come back to FREE software and open hardware, where they compete on features instead of lock-in and crappy UIs/tools, or they will be pushed aside as a generic commodity, sooner or later eroded by some other sector that makes more money than them (the energy sector seems a candidate so far).
Off topic: It’s really hard to read on a mobile device. I’d really wish blog posts would look more like the iOS reader mode: https://i.imgur.com/K9nDmpN.png
Please explain how it would be "easy" to save 40% of those costs. The chips are doing things that are legally required or that customers demand, like controlling the ABS or streaming Bluetooth audio. Sure it's theoretically possible to build a cheap car with minimal electronics but it wouldn't be legal to sell, and few customers would even want it. (Yes I am aware there are some people who want a simple car. They aren't a large enough market segment for any manufacturer to care about.)
I'm all for computer controlled engine fuel and ignition, but otherwise I don't want computers in my car.
I was happy with my 1987 Chevy S-10 Blazer until the oil pan rusted out due to winter road salt. Then I upgraded to a 2001 GMC Yukon. The Yukon has some automatic stuff, and I hate it. If the newer cars have more automatic stuff, then I will hate that, too.
I don't believe you are correct that the laws and regulations require automation that needs computers in cars. E.g., as I recall, we had anti-lock brakes long before we had computer chips in cars.
I used to be a car guy, so I understand quite a lot about cars and how they work. E.g., need computer chips to monitor the transmission? We've had transmissions for 100 years, and quite good automatic transmissions since the 1950s, all without computer monitoring.
For the drivers wanting Bluetooth, ..., an entertainment center, a communications center, a lot of self-driving, a touch screen, I doubt it.
Use computers to help with door locks? To me that is directly from wack-o land.
For what the computers do, I don't want it. For their buzzers and flashing lights, I wish they'd just go away. I don't want to pay for the chips when I buy the car, put up with the false alarms, or pay for repairs.
E.g., my Yukon has a little light on the instrument panel that says something about "BRAKE". Well, it's a tiny, obscure thing, so I looked at the PDF of the owner's manual I downloaded and got the details. It's a light about the parking brake. The parking brake is NOT on, and the light is a false alarm. I wish the light would stop, quit, turn off, go away, disappear, be gone. A false alarm for the parking brake and then reading up in the PDF -- WASTE of time. Busy work. Deliberate extra complexity.
To me, most of that computerization looks like a crowd pushing/following a fad, a fad that will go away in a few years.
It's like the low flow plumbing: Maybe, I doubt it, but maybe in a desert there is a reason. Otherwise it is a waste of my time -- takes longer to fill a pot with water. I don't live in a desert. We have plenty of water. Low flow is the result of people making work for themselves and hurting my life. But, the talking about saving water has died down and maybe the same for low flow plumbing. Hopefully any low flow is implemented with just a little restrictor that can be removed, drilled out, or otherwise disabled.
I suspect that the cars are designed so that no matter what happens to the computerization, the car will still function -- the doors will still open and close, the door locks will still work, etc. That is, the designers knew that the computerization is glitz, nonsense, that can't be permitted to be essential to the car.
The US has a compulsion, can't quit, "change the cars!!!!". Can't leave good enough alone. Bunch of people making jobs for themselves, jobs that don't need doing -- jobs doing things for helpless me I very much do not want done.
And, whatever else, that 40% is a big waste of money, and the US has more than enough pressures on family budgets to waste money.
Big Point: US is failing at family formation and having kids. The birth rate is so low that we are going extinct, literally, rapidly. IMHO, the main reason is that US families are short on money for having kids. The two big expenditures are a house and two cars. If we could save the 40% on each of the cars, then, net, the US would have more kids.
But I might be wrong, in small ways or big ones.
This whole thing is part of a big pet peeve of mine: I want to keep things simple and spend my time, energy, and money working on my startup or whatever, certainly not mud wrestling with maintenance of absurd nonsense on a car.
E.g., several times a day my Firefox browser reminds me that there is a newer version I can download. I HATE that. I do NOT want a newer version. Where can I even pay money to STOP the notifications of updates? A newer version will make changes, there are no changes I want, the changes may make some things worse, and there is nearly no chance the changes will make any improvements. The version of Firefox I use has something it calls "pockets". I don't know what they are, and I don't want to devote the time or energy looking into what they are.
The Internet, TCP/IP, the Web, URLs, HTTP, HTML, CSS, Ethernet, WiFi, etc. are all very nice and simple. I just want to use them, not futz with more that is worthless.
I needed a laptop, got one, and it had Windows 10. I HATE Windows 10. My development computer is Windows 7 Professional, and THAT is what I like and want. From all I've heard about Windows 11, I will hate it even more. I just want an operating system to do the basic things, let me run my programs and write more programs.
It goes on this way -- people trying to make a living by making stupid changes and then forcing them on me.
No I am 100% correct. FMVSS requires stability control. It is impossible to meet that regulatory requirement without computers. And that's just one example, there are others. Read the rules.
Besides safety rules, new vehicles are also legally required to meet stringent emissions and fuel economy requirements. As a practical matter that also requires computers.
> No I am 100% correct. FMVSS requires stability control.
So, we have the DOT, Department of Transportation, the NHTSA, National Highway Traffic Safety Administration, and the FMVSS, the Federal Motor Vehicle Safety Standards.
49 CFR Part 571 Electronic Stability Control Systems for Heavy Vehicles
As I read this, it is only for "heavy vehicles",
"This proposes to establish a new Federal Motor Vehicle Safety Standard No. 136 to require electronic stability control (ESC) systems on truck tractors and certain buses with a gross vehicle weight rating of greater than 11,793 kilograms (26,000 pounds). ESC systems in truck tractors and large buses are designed to reduce untripped rollovers and mitigate severe understeer or oversteer conditions that lead to loss of control by using automatic computer-controlled braking and reducing engine torque output."
It is not clear that that regulation ever got implemented. And it is less clear that it got implemented for passenger cars.
"When ESC detects loss of steering control, it automatically applies the brakes to help steer the vehicle where the driver intends to go."
Also
"ESC has been mandatory in new cars in Canada, the US, and the European Union since 2011, 2012, and 2014, respectively."
So, yup, I'm driving and the ESC gets confused and applies brakes at some of my wheels and reduces my engine power. No thanks.
Sounds like the DOT, NHTSA, and the FMVSS went all obsessive and compulsive. And maybe they made the stockholders of Bendix really happy. I'd like to see some statistics, (1) when the ESC, electronic stability control, helped and when (2) when it failed and caused an accident.
I quit reading at the claim of 1000 chips in a passenger car.
For your
> Besides safety rules, new vehicles are also legally required to meet stringent emissions and fuel economy requirements. As a practical matter that also requires computers.
I agreed that computer based engine controls are good. They are terrific.
To see some of why, just get a little understanding of what we put up with before, with carburetors, chokes, ignition points, centrifugal advance springs and weights, etc.
E.g., for a carburetor, it sits on top of the intake manifold. It has several ounces of gasoline in its float bowl. Turn off a hot engine, and heat from the hot cylinders conducts to the float bowl and boils and evaporates the gasoline then heats the carburetor and warps its shape changing its fuel/air mixtures. The worst is the choke: For a cold engine, just block the flow of air into the carburetor and suck in lots of raw gas. In cold weather with a cold engine, only a little of the gas evaporates and creates a combustible mixture, and the rest is wasted, washes down the cylinder walls and into the oil pan, dilutes the oil, and shortens engine life.
And the ignition system: Maybe occasionally it delivers a good spark at the right time!
These are some of the worst problems, but there are more.
Computer controlled fuel and ignition deliver a good mixture, hot or cold engine, hot or cold weather, sea level or high altitude, and a good spark on time. Engines get to last about twice as long, 200,000+ miles.
But computer-controlled engines don't require anything like the "1000 chips". Add in ESC and we're still far short of 1000 chips. So, your
> No I am 100% correct.
is an exaggeration!
1000 chips? For a car? GADS!
The chips are "40%" of the cost of a car? Double gads!
How to save nearly 40% of cost of a car? Sure, get rid of nearly all the "1000" chips.
For ESC, good to see that, if there is any question, I should be able to disable the thing just by cutting some signal wires. Same for anti-lock brakes.
I've driven about 1 million miles to the present without an accident and without any of that fancy stuff for the brakes. Okay, I'll go along with the dual master cylinder requirement. For the rest, no thanks.
Congress and the DOT, NHTSA, and FMVSS -- politicians get into mechanical engineering. Not good. Looks like my 2001 Yukon is a jewel of special value!!!!
For a car of 2011 or later, looks like I will need a list of the wires to cut to disable nearly all the electronics. I don't want some computer applying my brakes or keeping me from applying my brakes.
1000 chips in a car -- still sounds like way too many.
The chips 40% of the cost of a car? Yup, sounds like a great, fast, easy way to lower the cost of a car by about 35%! Cars are a very competitive, price sensitive market -- 35% off the price is a great way to get a lot more market share!!
Yup, even in my 2001 Yukon, when I pull into my garage, the headlights come on automatically!!! Sick-o. Demented. Deranged. Delusional. And if it fails, it could leave the headlights on too long and run the battery down. Fortunately, I have plenty of means to recharge the battery.
Dead batteries? I remember. That's how I got to college. When my battery went dead, I pushed the car down out of my driveway -- got it rolling and then jumped in. Once out of the driveway I rolled and steered it so that it was headed downhill on the street. Then pushed the car, got it rolling, and then reached in and swatted the gear shift lever to put the transmission in third gear (test question, why not first gear?), kept pushing, and as the engine began to start jumped in, pulled the door shut, put the transmission back in neutral, let the engine get going fully, and then drove to college.
And I need computers in my car? Laugh of the millennium!
Politicians, keep your dirty, crooked, paid off, incompetent, fumbling hands OFF my cars.
Computer-controlled engines are extremely vulnerable to EMPs. Read "One Second After". Congress held hearings on this, with the military saying it is one of the worst threats the US faces, right before 9/11 eclipsed it.
All modern US military combat vehicles rely on computerized engine control modules. They are shielded to some extent. But concerns about vehicle vulnerability to EMPs are exaggerated. Due to the inverse square law, the flux drops off rapidly unless you're close to the explosion. And if you're close to the explosion then you have other problems.
I thought they were talking about the lost working time whenever my machine locks up because a job I'm running uses up all my cores/memory, or I have too many browser tabs open.