Hacker News
U.S. workers have gotten less productive – no one is sure why (washingtonpost.com)
353 points by pseudolus on Oct 31, 2022 | 695 comments




I've read that doctors now spend as much as 50% of their time documenting their work. Companies such as Epic, which provide the software that hospitals use to build databases of patient data, have been big winners in the new world of hospitals-depending-on-software. But did the doctors become more productive? By almost any measure, they became less productive.

People in tech keep thinking more tech will solve problems and they keep underestimating the flexibility of the old models. For instance, most large companies used to be run by armies of secretaries, and the senior secretaries functioned as what we would now call "project managers" -- they made calendars, oversaw who was working on what, followed up to keep track of whether work was being done, and kept a close eye on how money was being spent. The crucial thing about having humans overseeing such work is that humans can take a flexible approach to the rules: they know when to break them. By contrast, systems that are highly dependent on software tend to be more rigid. Software doesn't know when its rules should be broken.

The flexibility of the old system is constantly underestimated, and the rigidity of the new systems is often misunderstood.

In his book "The Design of Design", Fred Brooks talks about the power of trust, and he contrasts that with situations where everything needs to be negotiated and specified in a contract first. High-trust systems are flexible and fast, whereas a system where every detail needs to be specified in a contract is slow and rigid. We should stop and ask ourselves: which of these does our favorite Agile methodology resemble? Are we specifying things in needless detail?


I don’t think productivity was ever the goal of this software. It was to have a record that is standard, digital, transferable, etc. Doctors fought it as long as they could because they knew what it meant for them.

I remember pretty early demos in the early/mid 2000s when I was doing some clinical grunt work in college. I had written some software to make my department's life easier, so I was offered up as the hospital's liaison for the software evaluation. This is when I formed my "never replace a terminal-based app with a GUI-based app and expect productivity gains" theory. Everyone working in the hospital knew the terminal app: they'd type in some random 3-letter code and a screen would pop up. Then they would memorize how many tabs apart each field was. Without a mouse, people could just hum along inputting data at blazing speed once some muscle memory was in place. Everyone had little cheat sheets printed out for the less frequently used commands/codes. When you replace this with a browser/desktop GUI with selectors and drop-downs and reactive components, it tends to 1) require mouse usage for most people and 2) lose the ability to do the quick data entry I described. The pretty interface becomes a steady stream of speed bumps that reduce productivity. Since then I've witnessed it in banking and other industries too.


IMHO, this is because the people writing GUIs these days are mostly incompetent, or hamstrung by "web" technologies.

Early GUIs didn't have the problem you describe because they were designed as discovery mechanisms for the underlying functions. AKA, the idea was that after clicking File->Save a dozen times you would remember the keyboard accelerators displayed on the right-hand side of the menu. Or, if nothing else, remember that the "F" in File was underlined along with the "S" in Save (or whatever), which would lead people to just press Ctrl-S, or Alt-F, S. Then part of testing was making sure that the tab key moved appropriately from field to field, etc.

I remember in the 1990s spending a fair amount of time doing keyboard optimization in a "reporting" application I wrote (which also had an early touchscreen) for use by people whose main job wasn't using a computer. Then we would have "training" classes and watch how they learned to use it.

So much of this has been lost with modern "GUIs"; even the OS vendors, which should have been keeping their human interface guidelines updated, did stupid things like _HIDE_ the accelerator keys in Windows if the user wasn't pressing the Alt key. Which destroys discoverability, because now users don't have the shortcut in their face. Never mind the recent crazy nonsense where links and buttons are basically the same thing, sometimes triggering crazy behaviors like context menus and the like. Or just designing UIs where it's impossible to know if something is actually a button because the link text is the same color as the rest of the text on the screen.


In my experience, with the rise of GUIs over TUIs we lost command buffering. If you knew what you were doing with a well-designed TUI, you could hit a sequence of keys that would be buffered and "replayed" as the next screen(s) loaded. Hit a sequence of commands in a GUI and they'll just get lost after the first one as the app/website loads.


What you describe is the natural outcome of having a single message loop with synchronous handlers, and describes e.g. Win32 just as well - a sequence of keys would simply end up as the corresponding sequence of window messages in the queue, and processed in order.

Where we partially lost that is when UX started to get async. It's not fundamentally incompatible with well-ordered input messages, but in practice, people who write async code all too often forget that it's async all the way (and not just for their favorite scenario). And so you get travesties such as textboxes that let you type text into them, and then erase all that when the app or the page is "fully loaded".
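
To make that concrete, here's a minimal sketch (plain Python, not any particular toolkit or the Win32 API) of the difference: with a single queue drained in order, typed-ahead keystrokes survive a slow screen load, while the async anti-pattern amounts to resetting the field after the user has already typed.

    # Minimal sketch: a single input queue drained in order preserves type-ahead.
    from collections import deque
    import time

    events = deque()
    textbox = ""

    def key(ch):
        # Keystrokes keep arriving even while the app is "busy".
        events.append(ch)

    for ch in "12345":      # the operator types ahead...
        key(ch)

    time.sleep(0.1)         # ...while the next screen is still loading

    while events:           # the loop drains the queue in order; nothing is lost
        textbox += events.popleft()

    print(textbox)          # -> "12345"

    # The async anti-pattern described above is the equivalent of doing
    #   textbox = ""        # "initialize" the field when loading finishes,
    # which wipes the input the user already buffered.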


Amen. I talked to some guys in their 20s in the oil and gas industry in Houston and they preferred using the old TUI systems because of this exact reason.


On a related note: the same is true for keyboard macros in Emacs, Vim, etc. I often send a bunch of related messages, each slightly different from the others, using Emacs keyboard macros and Gnus. Felt great the first time I discovered it; saved perhaps 3 hours I would have spent writing and debugging a script.


The App Library feature is underrated. A clean screen and search is just so much better than having to scroll through page after page of icons.


Excuse me, what is TUI?


Terminal user interface


Thank you


I agree with you. Although I don't think it's incompetence so much as laziness. Not just "too lazy to make a good UI" but "too lazy to find out what makes a UI good." I've seen so many coworkers happy to slap some basic form together and expect that to be good enough.

I'm constantly writing UI for sports teams who do not at all like to waste time with these kind of fiddly UI elements and flows. Most of them would likely stick to Excel if our solutions are more cumbersome (which is a high bar to meet/beat, but rightfully so). They need to be able to easily get to data and relevant, connected pieces of data, quickly enter data into relatively complex forms, and have it all be clear, reliable, and fault-tolerant. This means making some tradeoffs, particularly around what is considered modern UI aesthetics, and doing things most UI developers don't need to do such as automating little things, adding hotkeys, etc.


> Most of them would likely stick to Excel if our solutions are more cumbersome (which is a high bar to meet/beat, but rightfully so)

Tying this back to the top-level comment: at some point I've realized that so many SaaS would be better off as an Excel sheet, because they're fundamentally just a slower, buggier, much less productive and uglier version of one. But the point isn't efficiency and empowerment. The point is control: the SaaS forces the user to a specific workflow. One that makes it easier for developers to develop (by constraining the problem space), or for company to monetize, or for corporate to retain legibility - but rarely to actually help the end-user.

As an end user from a generation that was taught Excel and MS Access at school, I'd go as far as saying that 50+% of web applications I use would be an order of magnitude more useful if the UX was that of Excel or Access.


> Tying this back to the top-level comment: at some point I've realized that so many SaaS would be better off as an Excel sheet

I preach this all the time. Anyone who has to enter data really wants Excel, not the fancy wizardized stepped workflow that seems to be the norm nowadays.


You have to (at least!) account for concurrent edit & access. I remember the hellscape that was emailing files like ProjectPlan_v6_beambotEdits_latest.xls to team members.


This is what MS Access was developed for.


Nowadays with team shared storage like Box, Dropbox, One Drive etc, it's "HEY EVERYONE ELSE CLOSE Project_Costs_and_Totals_11.22(15).xlsx I NEED TO ADD LAST WEEKS DATA"


> for company to monetize

I suspect an overwhelming majority of use cases is for this reason alone.


So what you're saying is HTML5 and server side rendering should be the go to before any client side junk.


I dunno if they are saying that, but I wholeheartedly endorse this idea. Add the "client side junk" (fancy Javascript stuff, etc.) as enhancement on top for those who want it after the required functionality is being properly and reliably served by the "core technologies". Serve the need properly first, then make it "nice".


The problem is that people make the decision about what to use as inexpert users and that pretty GUI with all the space and Next buttons looks so easy to use to them. By the time they realize they didn’t actually want it, it’s too late. This is the reason people stick with things like Excel, it’s easy to transition up the sophistication ladder.


I don't have much to add, but you really nailed the sentiment I was going for exactly. I have been lucky to both be serving a very small user group that I can work closely with, and one whose existing workflows still exist and they can go back to if ours suck, so we _have_ to be better.


I had a coworker tell me that javascript should be added like color: stuff that makes it nicer to use for those that have it turned on, but not necessary for those that don't have it at all (or can't see color).

Kind of a hardcore position in my opinion, considering it ignores the advantages/use cases that sometimes exist for client-side code (like complicated SPAs where the user is explicitly buying into the idea of running lots of code in their browser; Figma comes to mind), but I still like to think about it here and there.


In the same spirit as "99% of companies are not Google and do not need Google's infrastructure":

99% of web applications are not Figma and probably do not need any client side JavaScript. The amount of JavaScript that's actually just wrong is incredible.

1. I don't care what your client side validation thinks. That is my email address.

2. Why are you serving people all of the news story if you are going to then hide it? We can block the JavaScript and read the news.


Well, both of your cases are things I happen to agree just shouldn't be necessary anyway, lol. Most of the time I'm giving up an email, it's not because I want to get emails from these jerkwads who are trading some small amount of service for the right to blast me with marketing emails (and they ALWAYS ignore my choice on the "please don't send marketing emails" checkbox, if they even have one). And news stories that aren't served over RSS are ones I don't want to read... I hate the modern state of the internet-as-capitalist-entity.

So I don't disagree on merit alone, but it seems if we wanna do a good capitalism, we need to make sure people are actually giving us real emails apparently


But you don't need client side JavaScript for that.

In fact, it makes the experience generally worse.


I feel like the push to make software accessible (in a new-user, not disability, context) and intuitive has made complexity the enemy. Instead of having software that grows with the user's capability, features are hidden from the top layer of interactivity or just cut entirely.

I was at the post office here in Australia a few years ago and saw the clerk's screen. It was one of those DOS-era full-screen red and blue text interfaces. She was flying through hotkeys and getting things done. People can learn; so much software treats them as infants.

And you know there's definitely someone looking at replacing that software with a modern GUI.


I did an internship there like 10 years ago at the end of my uni course, where one of my projects was to do a proof of concept of re-implementing their point-of-sale application using web technologies, maintaining the DOS-y look, feel, keyboard shortcuts etc.

At the time I had no idea why. In hindsight, it's hilarious. I'm guessing it was the result of a clash between someone that wanted to use modern tech for the sake of it and someone representing the users that told them to get fucked, probably went through 15 meetings before eventually getting palmed off on the intern so it didn't pick up too much steam (probably the only good decision that was made in the whole process).


For future reference (and anyone following along later), that is an "ncurses" terminal application.

You should see the customized JBHIFI terminal + keyboard.


I don't seem to have a shot of the terminal app, but I do for the keyboard:

https://photos.app.goo.gl/fLeDNp8rU7wN7siXA


What are those? Searching for "JBHIFI terminal" only brings up a store in Australia of that name selling square POS terminals.

Bloomberg terminals also have a custom keyboard and are essentially a terminal program.


Yeah JB Hi-Fi is a local retailer in Aus (guessing it's like a Best Buy but a strong focus on music CDs originally), it's probably a reference to the software the staff uses. Haven't had a chance to look over their shoulder at that one.


So should we be doing demos in Bash, with an ncurses or gum workflow before going back to write the proper system in C?


I don't see the problem with this. What is "Gum workflow" in this context?


Neither do I.

It was supposed to be read:

A [ncurses or gum] workflow.

Gum being an alternative method of making a Bash UI.

https://github.com/charmbracelet/gum


Thanks, that's a neat tool!


I remember that being a key difference between the graphical/modelling software SoftImage and Maya, with the former a TUI and the latter a GUI. The Maya approach won out because getting new people productive quickly was more important than their longer-term productivity, it seems. Maybe it has something to do with turnover, or maybe there are people who can do good creative work but can't adapt to using a TUI, or at least are frightened of it.


I think you’re onto something with this. SAP power users can input data at blazing speeds, because they remember so many of the codes. So this definitely isn’t just a GUI vs TUI paradigm.


It's also because it's enterprise software. Which actually isn't software; it's more of a platform. You have to do so much implementation detail that the GUI is just the result of some form-builder-type module. Everything I've ever encountered that was enterprise software felt like its GUI was not made by humans at all. I don't actually know how they get built, but they're almost never optimized for humans or the usage they're meant to benefit.


Nobody wants to pay for better GUIs in enterprise software, so no vendor puts any attention into them. An Enterprise Architect explicitly explained to me (when I was raising a point of choosing a software package that had much better UI) that good UX is a small factor and company (a bank in that case) would rather buy cheaper software and just have its workers suffer more, because it's deemed more cost-effective.


The definition of enterprise software is “the customer is not the user”. You don’t have to make the user happy, just the CxO.


And this is why the most polished part of most enterprise software packages is the dashboard/reporting function, the only part the C-levels might actually touch themselves.


Wouldn't productivity from better UI be something measurable that can be then advertised by vendors and extrapolated into savings for the customer? I feel like that could be a pretty compelling selling point to the CxOs.


How do you measure how much MS Teams sucks compared to Slack? All the CTO knows is that they have an enterprise license for Office along with Teams and the salesperson they talked to showed them how great it integrates with the rest of the suite.

In the EHR space, all the CMO (chief medical officer) cares about is it helps keep them in compliance. Why should they care if it’s hard for their office staff to use?


My company used UX as a competitive differentiator for Enterprise Software in the healthcare sector. It's so neglected by most software development companies, we were able to get the doctors as our champions to convince the bean counters to consider our products.


We did the same thing in the healthcare sector, but for safety event reporting! It's not like we even did anything groundbreaking with UX. One was just making it so it's easy to fill out a form quickly. You don't even have to touch the mouse if you don't want to. Users love it and it's increased event reporting by a good percentage.


This is true, look at time collection software or just about anything written with SAP.


I've not done that kind of SW in a long time, and never with someone else's platform. That said, the reporting application I was describing above was a platform in the same sense. It was largely an engine for generating the forms being filled out by the end users. Which is why there was so much effort on usability: the underlying form descriptions had to have tons of optimization flags for doing things like sorting common items to the top N of drop-down lists, or moving fields around in the form to match the ways the users thought about filling out the forms.

So there were two sides: the engine optimizations to assure things like tab orders on a form, and the actual writing of the form descriptions. In the first couple of organizations that adopted it, I wrote the forms and the engine in parallel, adding feature flags/controls as needed to support the desired UI outcomes. Later, after I quit, the lady who wrote much of the RFP responses started writing the actual form descriptions, because it was just as easy as drawing them out in the (Visio?) plugin she was using with MS Word and doing screen captures for the RFP. Then, I guess because she knew how to do it, she was doing the "tuning" as well.


> IMHO, this is because the people writing GUIs these days are mostly incompetent, or hamstrung by "web" technologies.

The latter is definitely not the problem. Even the Twitter re-design from a couple years back still supports all the old hotkeys.

All it takes to at least support a tab-based workflow is using the "tabindex" property [1] if your form isn't logically laid out already, and the rest can be done by capturing hotkeys.

Even multimedia content can be operated using hotkeys. Youtube is a good example. There's no excuse but laziness and incompetence IMO.

[1] https://developer.mozilla.org/en-US/docs/Web/HTML/Global_att...


It's not always just about tabindex. For example, it might be about understanding that there are multiple ways to fill out a form that make sense. Then hiding/showing pieces as needed and/or providing hotkeys to jump from field 1 to field 5, because the user doesn't want to fill out 2, 3, 4 since they are optional. It's about keystroke optimization. Sure, they can press tab 3 times, or they can just press Ctrl-5 (or whatever) to get there.

If you watch people use the Sabre command-line interface (the one from the 1970s?), you can see some of what I'm talking about when people are just filling out the forms with the submission line; it's less using the GUI and more just knowing some sequence of keystrokes that results in an action being taken.

AKA, it's possible to do both, without having the user wear out the tab key or grab the mouse all the time.


I think it depends a lot on how it is used. In the case of sabre, that is somebody's job. They use that interface everyday so the user needs to be able to use the interface efficiently and is willing to learn the shortcuts.

A sign up form that a given user will only use once in their lifetime is another matter. It needs to be simple lest the user abandons the process. Most users in that case are not willing to learn keyboard shortcuts, tabbing over optional fields probably makes more sense here.


Modern web culture is very much the problem. The tech can be keyboard-friendly, sure... but in practice, even the most basic stuff, like Enter to submit forms, is often not working because the "Submit" button is not a proper button, but a bunch of JS.


It is a problem, because with a TUI the keyboard is the first-class input device, whereas with GUIs and especially HTML it's an afterthought most of the time. Yes, there are exceptions like Twitter and Gmail, and then there are millions of other interfaces where the mouse is the only way to navigate.


I think it’s less about the design being bad and more about simplicity as a goal overriding everything else.

So learning an interface is frowned upon. Every interface must be designed as if the user has never seen it before.

Which is ok, as long as there’s also an alternative path which may take time and effort to learn but leads to increased efficiency and productivity.

Unfortunately, that alternate path takes a lot of effort. And worse, it leads to very few extra sales. The company which puts in the effort to build a complex but efficient workflow in addition to the initial easy workflow will get beat in the market by competitors who only focus on the easy workflow that looks great in demos leading to the C-suiters to buy it.

The best example is the pioneer of both these trends: Apple. It used to be that Apple insisted on a single-button mouse so the primary interface was extremely easy to use. Yet it spent a lot of effort including keyboard shortcuts, which were prominently displayed, everywhere. Unlike Windows, which basically only had Ctrl/Alt modifier keys, Apple has had the Ctrl/Cmd/Opt modifier keys as integral parts for a very long time, encouraging those shortcuts. Apple also put a lot of effort into making their own and 3rd-party applications easily scriptable (the choice of AppleScript, however, always held this back). The most MS did was VBA for Office.

But once Apple entered the iEra, it realized it didn't need any of this to sell products anymore. The massive lag in supporting basic keyboard shortcuts on the iPad when using a keyboard is one of the strongest pieces of evidence for this.


For Microsoft stuff in that time period, the scripting was meant to be done via OLE Automation (VBA is really just a scripting language with OLE Automation as the underlying object model). And it was much more pervasive than Office - remember the time when Microsoft products were all "Active ..."? Third-party apps were encouraged to do that as well, although few did.


> IMHO, this is because the people writing GUIs these days are mostly incompetent, or hamstrung by "web" technologies.

Completely agree with all of this.

Adding that it's not even that web technologies can't be used well. My work Mac had an issue with a Bluetooth mouse, and removing and adding a Bluetooth device with the keyboard is basically impossible: nothing gets highlighted to show you've tabbed to a control, and some of the crucial parts can't be reached with tab at all.

Modern GUIs are absolutely not fit for purpose.


Having worked on enterprise systems, it's not that the developers are always incompetent, it's more that everything is top down designed by committee. Nothing can be harmonized as everyone wants their workflow in there, each one slightly different, there's political infighting, jobsworths, people scared to lose their jobs, all under a strict budget and deadlines to meet.

No one knows what they really want till the software starts getting written and someone finds out they can't do their job, then come change requests (made a lot of money off of them). You kind of just get numb to it all.


GUIs have really profoundly regressed. Go read any UI design book from the 80s or 90s.

As you say the web is a culprit but so is attempting to shoehorn mobile designs into desktop.


This is nostalgia; you're remembering things as better than they were. Back then there were so many bad UIs in software: http://hallofshame.gp.co.at/shame.htm


Yes, there were plenty of bad UIs, and they got called out on sites like that because of it. If you read some of those "bloopers" you realize that the functionality the guy is complaining about is pretty much the default broken behavior these days. For example: applications that don't honor system colors.

With some OSes you can't even have fine-grained control over those defaults anymore. Dark modes have brought some of it back, so it's better now than 5 years ago, but still worse than 20 years ago. But people pretend that having a dark/light switch is the same thing as being able to customize the color of just about every layer of the UI and have the vast majority of applications honor it.


I suppose one way to save the situation would be to build libraries that let you easily build TUIs/efficient GUIs that interact with OpenAPI or GraphQL endpoints? If only there were a way to encode the workflow in addition to just the APIs, the UI could almost be generated.
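
As a rough illustration of that idea (the schema below is a hand-written stand-in, not something parsed from a real OpenAPI or GraphQL spec, and the field names are invented), a keyboard-only entry form could be generated from a machine-readable description of an endpoint's fields:

    # Hypothetical sketch: generate a keyboard-only entry form from a field
    # description. A real version would derive SCHEMA from the API spec itself.
    SCHEMA = {
        "create_order": [
            {"name": "customer_id", "type": "integer"},
            {"name": "sku",         "type": "string"},
            {"name": "quantity",    "type": "integer"},
        ]
    }

    def prompt_for(endpoint: str) -> dict:
        payload = {}
        for field in SCHEMA[endpoint]:
            raw = input(f"{field['name']} ({field['type']}): ")
            payload[field["name"]] = int(raw) if field["type"] == "integer" else raw
        return payload

    if __name__ == "__main__":
        print(prompt_for("create_order"))  # this dict would then be POSTed to the endpoint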


Could you recommend some well acclaimed older UI design books? I’m interested to learn what we’ve lost!


The Humane Interface by Jef Raskin


One other thing about the 90s data entry TUIs - we had very good tools that were specifically optimized for that, both in terms of ease of use for the developer, and in terms of efficiency for the end user of the resulting app. Remember dBase, Clipper, FoxPro etc?


Think how bad it's going to get when UX designers have only used phones and tablets.


They also fought it because they didn't go to medical school and survive residency to fill out forms all damn day—and they didn't used to have to, they had staff for that.

Then the computerized systems "replaced" that staff but all that really means is they cut the human time needed low enough that full-time workers weren't needed, but didn't eliminate it, so now that's another thing doctors have to do themselves.

AFAI can tell, the effect of tech overall is to cut some jobs while making the remaining ones harder and more stressful, while increasing so-called context switching.


> they cut the human time needed low enough that full-time workers weren't needed

No, that's not it at all. What GP is saying is that they cut the human expertise low enough that full-time workers weren't needed. The manpower savings never materialized because an app built for experts is faster than one built for casual users, and also because those experts, even with the high training cost, were ultimately cheaper per hour than the highly compensated people who now have to do the job because we 'made it easy'. First you devalue those experts by making their job harder, then you get rid of the job and make it someone else's, split between entry level staff and your most expensive employees.

> AFAI can tell, the effect of tech overall is to cut some jobs while making the remaining ones harder and more stressful, while increasing so-called context switching.

You still got there in the end.


> AFAI can tell, the effect of tech overall is to cut some jobs while making the remaining ones harder and more stressful, while increasing so-called context switching.

100% agree.

The Office suite is, in some ways, the worst thing that happened to corporations. Thanks to computers, everyone can now easily write a report, fill in a spreadsheet, or manage their meetings. Which means that everyone now has to write reports, fill in spreadsheets, and manage their calendars. This used to be a separate job. You had people specializing in those tasks, and they were focused and efficient at it. Now everyone does it on their own, and not only do we suck at this, it's also distracting us from the "main" job we're supposed to be doing.

The older I get, the more I feel computer revolution was in big part a bait-and-switch.


Great point(s). There's no longer a barrier to entry, and no longer a belief that just because you can doesn't mean you should. That behaviour gets reinforced: because doing X or Y is easy, that lack of friction builds a bias that the end product/output is 10x better than it really is.

You're not the only one with that bait and switch feeling.


But the doctors almost always were taking notes anyway. My stepfather (a doctor) fought a losing battle against electronic records because he had _decades_ of paper records stored in the "records" room of his office. It was largely the responsibility of the front desk to pull the patient's records and have them ready for him to read/check before seeing the patient, then clean them up and refile them. Long-term patients had pages and pages of handwritten notes, prescription histories, etc.

So part of the job has always been the record keeping. OTOH, as one of the other users mentioned, I've seen enough doctors using their computer records systems to know that the software is mostly garbage. The doctors now spend 2-4x as much time dealing with the shitty UI as actually typing in the notes.

(In the end he basically retired instead of converting to electronic records.)


Sounds about right

As for those that seemed to make the transition successfully: I noted that at Mayo Clinic, the doctors use live dictation software and dictate at least some of their notes into the system while the patient is still present, near the end of the visit. This immediate review sometimes brings up a few new questions (from either the doctor or the patient), and a bit more notetaking. So it looks like a very efficient system. They also have no apparent shortage of staff organizing things.

That said, I doubt every medical organization and office has the same quality setup as a top world-class institution. At some level of degradation, the system becomes more of a hindrance than a help, and that point is likely fairly near the top levels (so most of it is a hindrance).


Dictating while the patient is there is brilliant, because it double-checks both the doctor and the patient's understanding of what happened.


Forms give you validation errors or warnings instantly tho


Huh?

What form would give validation error or warning that a note the Dr is making either raises a new question by the patient or is in conflict with something the patient knows?

(& yes, from what I could see, there were also fields for patient data, date, etc, that are presumably validation-checked)


Yes, I thought so too

Edit: Also, I was generally quite impressed with their notes


I've never seen the same doctor twice in a row.

What I've noticed is that EMR has greatly reduced the amounts of screw-ups or delays caused by not having the right information at hand, or having to repeat tests. Also, since there's now a terminal in every examining room, I can see what amount of effort is required to use the EMR tool (Epic in the case of my provider), and it doesn't seem all that onerous. I can guesstimate the additional amount of time that they spend outside of clinic hours, completing their records for the day, and again, it doesn't seem onerous.

For a few years I had to fill out a lengthy medical history form, every time I visited a clinic, but that's pretty much gone today. My primary care doctor just retired, and her replacement took up the baton without skipping a beat. She can also easily delegate to her physician's assistant or nurse practitioner, so they can all work as a team, with instant access to the same information.

Now I have noticed something interesting. The urgent and primary care clinics all have a terminal in every examining room, and the clinicians perform their examinations while seated at the terminal, except when they actually have to poke around. That's where it seems quite efficient.

In the hospital wards, they still don't have a terminal in each room, meaning that each clinician has to look things up at centralized terminals, remember them (or not), and otherwise has no access to the information. If they need some information, they will come back with it the next time they make their rounds, which might be the next day. And they screw up. My dad had an episode that took him through an ER, to a regular hospital bed for a few days, then to a rehab ward. I had all of his records at my fingertips thanks to MyChart on my laptop. The doctors and nurses were lost; they completely overlooked the documented diagnosis that was at the root of his condition, and didn't believe me about it.

Some of the nurses in the hospitals now have a terminal on a wheeled cart, that they bring on their rounds.

What I'm guessing is that in the days of handwritten records, the doctors were mostly winging it.


They're taking notes with cheap pencils and paper! We should sell them a complex, messy-to-build fleet of machines!

It’s like cutting off a head when stitching a cut on the leg was the problem.


Some healthcare provider organizations now employ medical scribes who follow physicians around and do all their EHR data entry. This is expensive, but can be cost effective because then the physicians have more time to perform billable procedures.


The least capable doctor's time is worth $300/hr. The scribe is paid what, $25/hr?

This is so much like hearing of engineers that will not hire a $20/hr maid due to egalitarian reasons so they squat in filth or waste all their free hours cleaning, all while capable and willing cleaners starve. Insane.


Any competent engineer should at least have a maid, driver, nanny, servant, chef, gardener, pool cleaner, a mistress, dog walker and personal assistant /s


You are overestimating how much the least capable doctor makes (more like $100-150k) and underestimating how much somebody who can type medical information makes (more like $30-$35/hr).


I live below the poverty level. I receive Section 8 funds for housing. The Section 8 inspections and quality standards are so important to me that I hire a cleaning service about 6 times a year to make sure everything in here is spotless, because I'm really not that psychologically or physically capable of cleaning everything, even if I had limitless free time to do it. The maids cost about $130 a visit and they're worth their weight in gold, just so I have peace of mind and a consistently clean place to live. The City thanks me for it, too.


Section 8 inspects your house to make sure you're cleaning your room? That seems... strange.


Personal hours are not fungible. You can't replace a bit of cleaning time throughout the week with an additional $600.

Also there are plenty of costs to employing others besides the hourly cost. Large organizations have huge fixed costs to cover them. Consider:

- liability

- maintaining of knowledge and training to hire help

- skill to source and hire people


If so, that's hilarious, because that's precisely one of the jobs all these expensive, painful-to-use computer systems were supposed to replace. You'd take a year or two course at junior college, to learn shorthand and drill some medical terminology so you'd be less likely to make a bunch of simple transcription mistakes, then go to work.


It's doubly hilarious when you realize it's the inverse of what the office productivity software did to everyone.

Remind me again why do I have to manage my own meetings and prepare so many powerpoints and fill in so many forms as a... well, my position doesn't matter, because everyone is doing the same, no matter their role?


There's still a fairly large job market for medical transcriptionists, but that's a different job than being a medical scribe. Transcriptionists don't use shorthand any more, they mostly work from digital voice recordings. And they're typically not transcribing from scratch; now usually a voice recognition system does the first pass and then the human edits it to fix the ~2% errors. Transcriptionists don't usually work directly in EHRs, but their documents are fed into EHRs.


Microsoft recently bought Nuance for this very reason.


Ah, this is great to hear. I've been thinking about this approach for a while. Great to hear it's a Thing.


Sounds like modern startup devops-without-devops culture


Shift left amirite? Same with DBAs.


Don't need DBAs if you're hiring 10x full stack developers.


Seems an artefact of doctors not being employees.

If their employment status was the same as everyone else’s, there wouldn’t be any effort to replace admin staff with someone getting paid 10x as much unless there was actually a 90% reduction in work (doubtful).


You can reduce 90% of the work - but if the remaining 10% is shifted to someone that gets paid 50x as much, it's still a loss.


Yes, that was their point. You're just saying the same thing with different numbers.


Laughs from academia....


Trick is to be hourly!


That and the staff actually doubled over the same period


As someone who worked in the electronic health records industry, closely with design teams, and has thought more deeply about certain aspects of this problem than anyone else in history (not exaggerating), I think you're missing major factors.

First, yes, productivity was one of the goals of the forced move to electronic health records systems. The federal government passed the HITECH Act in 2009 creating economic incentives for doctors to switch to EHRs because it would be better for public health, Medicare billing, and also because it would supposedly unlock doctors to spend more time with patients and be more productive, via the use of technology.

The reason that third thing has failed, to my mind, is largely because the government, in the same act, required a HUGE list of requirements be met by the software designers making the EHRs. This list, by law, needs to be prioritized in scrums over customer requests and design thinking. Sometimes it makes good UX impossible.

Ironically, the government, hearing this feedback, actually added a new requirement to the list: "Safety-Enhanced Design" [0].

Go read that regulatory requirement and see if it makes any sense to you. That's why design sucks in EHRs.

[0] https://www.healthit.gov/test-method/safety-enhanced-design


What do you see as the way out of this mess?


Hmm. There are many layers of problems in the healthcare system, EHRs with bad design only one.

I will say that small, private clinics who only accept private insurance don't necessarily need to use a federally certified EHR. They are legally allowed to build their own system. One example is OneMedical. When you ask the doctors/PAs/nurses at OneMedical about their EHR, they actually love it.

I think the first step would be validating my anecdotal experience here by polling doctors using non-certified EHRs to see if they like them more. If they do by an overwhelming amount, I'd first take that evidence to the Office of the National Coordinator (the government agency created by the aforementioned HITECH Act who makes the certification criteria). I'd tell them that if they vastly simplify the criteria and add more flexibility, they'd make doctors happier.

I guess the problem is that happy doctors, unfortunately, isn't necessarily aligned with the existing federal laws. Probably you'd need to pass a new statute at Congress, this time around with the benefit of talking to a lot of people with experience making EHRs.


Do you think this is feasible and could happen within the next decade given moderate effort?

I considered targeting standardization in health tech for a while and uncovered a similar layer of red tape as you describe. We can't let our medical system continue to lag behind the rest of the world's, and good EHRs are a part of that.


I don’t think productivity was ever the goal of this software. It was to have a record that is standard, digital, transferable, etc.

Going a little further, this was appealing in part to avoid simple medical errors & oversights. Losing the record, mixing up records, incomplete history, and so on. Eliminating medical error is incredibly valuable but doesn't show up as "productivity".


This is amusing, as 10 years ago my wife (a decade-plus under 60, even now) showed up to a consultation with a doctor who remarked that she looked very good for someone over 60 suffering from a series of conditions that she did not have but that showed up in "her" medical records.


My wife is in her 30s but has had a lot of women's health stuff going on over the last decade. We stay completely within the same "healthcare network" of hospitals precisely because they actually use the same system and all the doctors can access it (obviously we like the providers as well!). But even for basic procedures where we could save a little by going out of this network, like lab work or imaging, we've learned it doesn't really work as promised. What we've learned is that it's still hard to get your records to stay together unless they're in the same company's database.


Yes. All the systems are set up this way.

The problem is: how do you allow departments to retain their fiefdoms in a world of centralised data? The answer is to spend a fortune on management consulting.


Or nationalize the documentation infrastructure. This is not a problem in the developed world: I give my doctor my tax ID and they can see my entire patient history, all of my medication, and relevant notes from other providers.


That won't happen in a federation like US or Canada, sadly. The states (provinces) would have to either agree on an infrastructure (hah!) or agree to give that power to the federal government (OMG lol).


I work in EHR as of this year. About a week into the new job, one of the older software engineers made a comment about how what we do should really just be a function of the federal government. I was like, "but wouldn't we be out of a job if they did?" And he said, "sure, but we'll probably be dead by the time they figure it out anyway."


If you're in the US, the federal government actually created one of the most widely used EHRs and sets of clinical applications in the world, called VistA. It was made for the VA but is used outside of it.

https://en.m.wikipedia.org/wiki/VistA

It's actually public domain and open-source. Unfortunately the VA is in an ongoing project to replace it with Cerner Millennium.


A project that's not going well. The VA system might be MUMPS-based, but it's very good, as you say.

The big issue from my EHR-adjacent vantage point is that all the money is in making giant, entrenched EHRs, whereas the future should be in lighter-weight services that conform to standards, and competition is between those services. That way you could gradually slice off VA functionality piece by piece, and have a successful, more efficient, gradual migration.


The UK has the NHS and does not have this in the slightest : - )


We do the same. My wife sees tons of suboptimal healthcare delivery due to doctors lacking the necessary information. In our current area, it is easy to find doctors who use MyChart and interface with the local hospital, so if we were to end up in the hospital, our medical history is immediately available.


The GUI apps have the benefit of being easier for onboarding. We've redesigned the workplace to deal with constant employee turnover.

I guess they also make more sense to management since it looks like something they could do themselves, or at least understand.


You can have both. GUIs were a breakthrough because they enabled much better discoverability, allowed images in the UI and so on. But they were also designed to be fully keyboardable and low latency.

Web tech broke all that:

- UI was/still is very high latency. Keystrokes input whilst the browser is waiting do not buffer, unlike in classical mainframe/terminal designs. They're just lost or worse might randomly interrupt your current transaction.

- HTML has no concept of keyboard shortcuts, accelerator keys, menus, context menus, command lines and other power user features that allow regular users to go fast.

We adopted web tech even for productivity/CRUD apps because browsers solved distribution at a time when Microsoft was badly dropping the ball on it. That solved problems for developers and allowed more rapid iteration, but ended up yielding lower productivity than older generations of apps for people who became highly skilled.


Well browsers solved multiple other issues too: cross platform apps, updating all clients in a single place, sharing data between devices, and the most important for many developers - switching software from an ownership to a rental model, killing piracy, and easy access to user metrics and data.

All of these (except logging on to the same data from all my devices, which is nice) benefit the developer at the expense of the user.


> All of these (except logging on to the same data from all my devices, which is nice) benefit the developer at the expense of the user.

Glad you pointed that out. And, in the most prevalent application of Conway's law[0], those changes both enabled, and are entrenched by, the "agile" practices in software development. Incremental work, continuous deployment, endless bugfixing, and webapps fit each other like a glove (the latex kind that's used for deep examination of users' behavior).

It also enables data siloes and prevents any app from becoming a commodity - making software one of the strongest supplier-driven markets out there, which is why the frequent dismissal of legitimate complaints, "vote with your feet/wallet", does not work.

----

[0] - https://en.wikipedia.org/wiki/Conway%27s_law


Yes, "updating all clients in one place" is what I meant by distribution. Windows distribution suffered for many years from problems like:

- Very high latency

- No support for online updates

- Impossible to easily administer

Cross platform was much less of a big deal when web apps started to get big. Windows just dominated in that time. Not many people cared about macOS Classic back then and desktop UNIX didn't matter at all. Browsers were nonetheless way easier to deal with than Windows itself.

Agree that killing piracy was a really big part of it. Of course, you can implement core logic and shared databases with non-web apps too, and the web has a semi-equivalent problem in the form of ad blockers.


You missed privacy. The user lost privacy with webapps.


I figured that came under "easy access to user metrics and data", but I did consider some kind of rhyme linking piracy to privacy but it was a little early in the day to commit that sin. It's probably worth mentioning twice anyway.


> HTML has no concept of keyboard shortcuts, accelerator keys, menus, context menus, command lines and other power user features that allow regular users to go fast.

HTML has had a limited concept of accelerator keys for years, but it's not pretty:

https://developer.mozilla.org/en-US/docs/Web/HTML/Global_att...


This is a good observation. Constant employee turnover also reduces worker productivity, as it means most current employees are juniors in their role (regardless of what their title says).


Problem is the GUI could have shortcuts for everything, but usually won’t.

It doesn’t help that the evaluators for a new system will also approach from the perspective of a new user, even though none of them will be a new user in some months.

I've so wanted to create AutoHotkey scripts for many tasks, but I end up having to use (x,y) clicks, where I get boned by every design touch-up (deliberate or side effect of another change).


> never replace a terminal-based app with a GUI-based app and expect productivity gains

I can imagine this being true. It seems that almost the whole software industry has failed to grasp the distinction between an appliance and a tool. An appliance you expect almost anyone to be able to use without training. A tool, well, you are expected to learn how to use it, and after that you are much more productive than before. And most software seems to be moving toward the appliance end.


I like this dichotomy. I'd want to add a third: the product. The product has ego; it needs to look nice, it needs to demo well, it is marketing. This is what the auto industry has become since the Model T, and it's what software has become since it was a tool. The problem is that with software, things like productivity typically take a hit as it moves further from tool to product. More so when the domain is something like EHR or ERP or E-anything.


>This is when I formed my "never replace a terminal-based app with a GUI-based app and expect productivity gains" theory.

Not in medicine (I run a small e-commerce business selling mostly used video games), but I've definitely noticed the same thing for us.

We have some terminal-based Python scripts I wrote to automate a lot of the data entry tasks like listing and shipping (entering tracking numbers, printing labels).

Everyone that uses the scripts is initially apprehensive, but then after maybe a day of getting used to the terminal turns into a powerful data entry God and they love it. Even had an employee gush about our shipping tool to a random supplier.
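
For flavor, here's a minimal sketch of the kind of terminal entry loop described above. The field names, prompt, and CSV output are hypothetical, not the actual scripts:

    # Hypothetical sketch of a keyboard-only shipping/tracking entry loop.
    import csv
    import sys

    def enter_tracking_numbers(out_path="tracking.csv"):
        """Prompt for order ID + tracking number pairs until a blank line."""
        with open(out_path, "a", newline="") as f:
            writer = csv.writer(f)
            while True:
                line = input("order_id<TAB>tracking (blank line to quit): ").strip()
                if not line:
                    break
                try:
                    order_id, tracking = line.split("\t")
                except ValueError:
                    print("expected two tab-separated values, try again", file=sys.stderr)
                    continue
                writer.writerow([order_id, tracking])
                print(f"saved {order_id} -> {tracking}")

    if __name__ == "__main__":
        enter_tracking_numbers()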


Back in the late '80s the government department I worked at had a dedicated data entry team with their own system (hardware and all).

The greenscreen data entry was highly optimised to not even require tabbing to different fields, but just run the values together in a large single field and the software would split them into fields and validate them.

I assume it was very fast and efficient for the experienced operators.


"Fun" fact: the Therac-25 tragedy was in part caused by this type of usage - folks who know it so well they just blast through the screens from memory. But the software in question wasn't resilient to this use-case, and apparently resulted in an inconsistent state.


Good example.

---

The system distinguished between errors that halted the machine, requiring a restart, and errors which merely paused the machine (which allowed operators to continue with the same settings using a keypress). However, some errors which endangered the patient merely paused the machine, and the frequent occurrence of minor errors caused operators to become accustomed to habitually unpausing the machine.

One failure occurred when a particular sequence of keystrokes was entered on the VT-100 terminal which controlled the PDP-11 computer: if the operator were to press "X" to (erroneously) select 25 MeV photon mode, then use "cursor up" to edit the input to "E" to (correctly) select 25 MeV Electron mode, then "Enter", all within eight seconds of the first keypress, well within the capability of an experienced user of the machine. These edits weren't noticed as it would take 8 seconds for startup, so it would go with the default setup.[3]

---

... which allowed the electron beam to be set for X-ray mode without the X-ray target being in place. A second fault allowed the electron beam to activate during field-light mode, during which no beam scanner was active or target was in place.

Previous models had hardware interlocks to prevent such faults, but the Therac-25 had removed them, depending instead on software checks for safety.

The high-current electron beam struck the patients with approximately 100 times the intended dose of radiation, and over a narrower area, delivering a potentially lethal dose of beta radiation. The feeling was described by patient Ray Cox as "an intense electric shock", causing him to scream and run out of the treatment room.[4] Several days later, radiation burns appeared, and the patients showed the symptoms of radiation poisoning; in three cases, the injured patients later died as a result of the overdose.[5]

---

In response to incidents like those associated with Therac-25, the IEC 62304 standard was created, which introduces development life cycle standards for medical device software and specific guidance on using software of unknown pedigree.[7]

https://en.wikipedia.org/wiki/Therac-25


This sounds like poor consideration for edge cases - not really a problem with the UI or people clicking through it too fast. Anything that could be interpreted as remotely fatal should've shut the machine down.


The control software should not be physically able to command the hardware to enter an invalid state. You can do that by only exposing the 3 valid modes to the software or only enabling power to the emitter if every piece of hardware is in the correct place when the software request arrives.

You also have a hardware lock on the power - this can be as simple as a hardware timer (an RC circuit suffices) which limits how long the emitter can be on within a given window to be safe.

Never trust the software. If you must trust some software, create a minimal set you CAN trust which isolates the rest of the software from the hardware.
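
As a toy sketch of that "only expose valid states" idea (the mode names and hardware fields below are illustrative, not the Therac-25's actual configuration), the trusted layer accepts only whole, known-good configurations and refuses to fire unless the physically reported hardware state matches:

    # Toy sketch of a minimal trusted layer: the rest of the software can only
    # request one of the whole, valid configurations, and firing is refused
    # unless the physically reported hardware state matches the requested mode.
    from enum import Enum

    class Mode(Enum):
        ELECTRON = ("electron", "low",  "scanner_in")
        XRAY     = ("xray",     "high", "target_in")
        LIGHT    = ("light",    "off",  "none")

    def fire(mode: Mode, hardware_state: tuple) -> None:
        if hardware_state != mode.value:
            raise RuntimeError(f"interlock: hardware {hardware_state} != {mode.value}")
        beam, power, fixture = mode.value
        print(f"firing {beam} beam at {power} power with {fixture}")

    fire(Mode.XRAY, ("xray", "high", "target_in"))   # ok
    # fire(Mode.XRAY, ("xray", "high", "none"))      # raises: target not in place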

You are correct; the discussion about how to exercise this bug (fast UI, blah blah) is interesting to hear, but totally irrelevant to the lesson (don't trust software).


They basically did no testing at all on that machine, and reused the previous software which relied on hardware safety interlocks which had been removed from the newer model. It's literally a textbook case of how not to do mission-critical software.


It sounds like we should consider and test the possibility that users of the software will become extremely familiar and want to use it much more quickly than we anticipate.


At Uni as a summer job I worked processing Corporate Actions for a large custodial bank. We used exactly the same kind of system where every action was 4 characters. I can still remember some of them despite it being 10 years since I did that job. Even more importantly, the screens were trivially scriptable so lots of the grunt work could be handled by writing export scripts, pulling a bunch of data into excel, processing it and occasionally posting the results back the same way.

Absolutely no way a modern system could be half as efficient, short of completely automating the whole job (which involved a lot of communication with other parties and basically freeform restrictions).


You could provide a terminal UI from within the GUI (it's not unheard of and can work quite well)


Alas, most modern software doesn't come with the option of a GUI. It's a HTML document pressed into service as a GUI with a greater or lesser degree of success.


I see. Well... piss.


Lol!


This memorization and strict adherence to past ways of doing things killed me as a developer. I was tasked with maintaining and customizing an Enterprise Resource Planning system. It was a terminal based app. Sometimes system upgrades would add a field to a screen. For example, perhaps country code was added, split from the phone number in the customer screen. I would frequently be requested to suppress the upgrade, or move the new fields so they didn't 'ruin' peoples memorized routines.

As the company customized more and more of the code base, upgrades to the system became more and more difficult. Every upgrade required a manual comparison of custom code to be merged with the baseline code. This led to skipped upgrades, and eventually a cessation of upgrades. Of course, after no upgrades over a long period of time it was eventually decided to move to a new ERP system.

I am still appalled at the things sacrificed to prevent disruption of a small group of peoples work flow.


But CLI apps should be extensible in the same way that .bashrc is.

If it can't be upgraded, it's not because it's a terminal app; it's because it's poorly designed.


> It was to have a record that is standard, digital, transferable, etc.

Considering how often I have to fill out the same goddamn forms (sometimes literally down the hall in the same building as another doctor), I think that goal failed miserably.


No practice or healthcare system accepts anyone else's records. There is supposedly a way for the patient to release records and have them sent around, but none of the doctors I've asked will accept that sort of nonsense. It's "NIH" for healthcare - if they didn't generate the record, they don't want it.

While in the hospital, the phlebotomists came after me for routine lab work at hospital prices. I declined and then I had a conversation with the nurse about releasing my labs from June to them. On script, she said "Mmm, 4 months is kinda old! We'd rather do our own!" so I filled out the ROI anyway, and curiously nobody offered to draw my blood again.


> I don’t think productivity was ever the goal of this software.

Thing to remember: finance/economists/rentiers have a different definition of efficiency and productivity than you do. In this case the productivity has to do with billing, not the uninteresting things that doctors do. By reducing the cost of billing and forcing doctors to document more billable items, more money can be extracted.


>I don’t think productivity was ever the goal of this software.

I'm not entirely sure about this. During the early digitization era productivity was a big driver. Modern word processors are a godsend if you've ever tried to typewrite a document for publication (forget anything with complex formulae) or dealt with actual physical spreadsheets. Office itself and its now many clones are a fantastic set of tools for productivity.

My opinion is that we've created a world in technology that now drives technology for the sake of tech and for financial reasons. I regularly deal with people who think it would be a great idea to build a system to automate some aspect of business that's already well optimized, or to generalize something they think is general but is really quite niche.

There are certainly cases where it makes sense to develop a system around something, but you need to consider the full cost/benefit tradeoffs, not just the benefits, which is what industry tends to do.


> [...] they type in some random 3 letter code and a screen would pop up. Then they would memorize how many tabs each field was apart from each other. Without a mouse, people could just hum along imputing data a blazing speed once some muscle memory was in place. Everyone had little cheat sheets printed out for the less frequently used commands/codes.

This comports with my most recent experience using SAP in 2018. I know, I know, SAP has GUIs and such now. This well known and profitable corporation under the Blackstone umbrella, though? Nope. It was exactly as you describe.

Those who had the time-in-service or the mentality to accept it excelled at their job, but uniformly skewed older (late 40s and up) or younger (under 25). At the time, almost everyone aged in between was entirely befuddled by it all.

Context: supply chain, procurement, purchasing, logistics, maintenance, work orders, inventory


> It was to have a record that is standard, digital, transferable, etc.

Which translates into productivity. If something is standard, digital and transferable it means you can increase the rate of output in relation to its input (which is the definition of productivity).


Right, but it's the records that are standard, digital and transferrable; not the work. So what you end up optimizing for is producing paperwork.


huh? if the records are "standard, digital and transferrable", it means all of the work associated with those records is sped up.

- Need to retrieve past doctor visits about a patient? person at front desk no longer needs to walk to the folder closet, then scan the whole thing to find your name and then read through all of the documents to find the relevant visits. just click a button.

- How about getting the prescriptions provided to you from a previous doctor? Reduction in time to phone / fax the previous doctor. just click a button.

- Want to check if your insurance covers your procedure? Receptionist calls the carrier, sits on a 6.5 minute customer service wait queue, then gets the info versus 1-click.

- and, and...

It was always about productivity.


The problem is that you've optimized time savings for the cheapest people for a hospital to employ at the cost of time spent by the most expensive people a hospital employs, eliminating a handful of cheap jobs while making the expensive jobs both less efficient and less pleasant.


It is more productive if the person just knows whether the procedure is covered because the insurance companies have stable standards and trust the medical providers, rather than having it all be JIT decisions based on rules that either constantly shift or are so vague/low-trust as to be "you, the medical person, can't decide yourself if this procedure is covered; you have to call us."

And back in the paper days, the staff would pull up the records for the days appointments. ER visits would have less data but normal medical care would be fine.


> if the person just knows if the procedure is covered because the insurance companies have stable standards and trust the medical providers rather than having it all be JIT decisions based on rules that either constantly shift or are so vague/low trust

None of those are related to the use or lack of use of technology. Those are purely bureaucratic rules set up by insurance carriers.

> And back in the paper days, the staff would pull up the records for the days appointments.

And sometimes those papers would get lost, or maybe they're still sitting in the folder on a door because someone forgot to clean them up, or they were in the wrong order so it took the person longer to find the person's name, appointments would shift, etc. etc.

I can't believe I'm having to explain the productivity advantages of a system of record to a technology-focused crowd...


> None of those are related to the use or lack of use of technology. Those are purely bureaucratic rules set up by insurance carriers.

They are very much related to use of technology, because they are enabled by technology. The degree of bullshit paperwork every white-collar worker has to deal with nowadays is a direct consequence of computers making it possible to make us do that work, and for the recipients to process it.

The benefits of those processes are whatever they are designed to be, but this creates a false image of net productivity, because the costs are now hidden, smeared across everyone's workload, adding to a vague sense of dissatisfaction and low productivity. In contrast, if you tried the same processes few decades ago, it would mean hiring dedicated people on both ends, and the costs - as measured by their salaries - would be clearly visible.

> I can't believe I'm having to explain to someone the productivity advantages of a system of record to a technology focused crowd...

You don't have to. But you're missing the disadvantages of the situation when maintenance of that system of records becomes a job distributed across everyone. It's not the digital recording per se that's the problem, but the fact that everyone is now also their own secretary.


> The degree of bullshit paperwork every white-collar worker has to deal with nowadays is a direct consequence of computers making it possible to make us do that work, and for the recipients to process it.

These are bold claims backed up by little to no data other than your anecdotal observations. Productivity has generally been on a steady upward trend in the US since it was first measured in 1947. My own professional service business, which does require a decent amount of "bullshit paperwork" would not have been possible at the scale it achieved without technology.

> In contrast, if you tried the same processes few decades ago, it would mean hiring dedicated people on both ends, and the costs - as measured by their salaries - would be clearly visible.

Ever seen Mad Men? There was literally a full floor full of human beings typing out bullshit letters on typewriters because computers didn't exist in the era.


> Ever seen Mad Men? There was literally a full floor full of human beings typing out bullshit letters on typewriters because computers didn't exist in the era.

That's my point: those people received salaries for typing out those letters, making the cost of it clearly visible to the business.


So your point is "we visibly saw the cost before, but no longer see the cost now with tech. Therefore we can conclude that the invisible cost now outweighs the visible cost... because ummmm it's no longer visible?"

Explain that one to me.


> and trust the medical providers

$68 Billion in medical fraud in the US

> https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6139931/

Part of the opioid crisis caused by basically bribing doctors

Yes, I'm well aware that when drug abuse was happening in the "inner cities" - while the government looked the other way because it was more concerned with propping up countries during the Cold War - the same people who now want to treat drug addiction like a "disease" when it's happening in "rural America" blamed "single mothers" and "lack of morals".


That will only be a net speed-up if the time spent on increased paperwork is smaller than the time saved from easier information retrieval, which it may or may not be; either way, it's an assertion that needs justification.

In general, I'll note producing documentation is fairly slow and tedious. It takes something like an order of magnitude longer to write a sentence than to read it. So this optimization is only going to be a productivity boost if this paperwork is accessed repeatedly, dozens of times in the course of treatment (the productive thing).


> the time spent on increased paperwork is smaller than the time saved from easier information retrieval

What paperwork creation increased as a result of digital record use?

I'm beginning to think y'all are conflating the increase of documentation with the use of digitalization. The two aren't mutually exclusive.


> I'm beginning to think y'all are conflating the increase of documentation with the use of digitalization. The two aren't mutually exclusive.

I think what we're talking about is a clear (if non-typical) example of the Jevons paradox[0]. Digitalization makes creating and processing paperwork more efficient, allowing the overall organization as a system to support/afford more of it. As a result, the amount of documentation and form filling increases.

----

[0] - https://en.wikipedia.org/wiki/Jevons_paradox


Hilariously, that Wikipedia page mentions nothing about productivity. In fact, an article that talks about the Jevons paradox says this:

According to the Ford Motor Company, its fuel economy ranged between thirteen and twenty-one miles per gallon. There are vehicles on the road today that do worse than that; have we really made so little progress in more than a hundred years? But focussing on miles per gallon is the wrong way to assess the environmental impact of cars. Far more revealing is to consider the productivity of driving. Today, in contrast to the early nineteen-hundreds, any American with a license can cheaply travel almost anywhere, in almost any weather, in extraordinary comfort; can drive for thousands of miles with no maintenance other than refuelling; can easily find gas, food, lodging, and just about anything else within a short distance of almost any road; and can order and eat meals without undoing a seat belt or turning off the ceiling-mounted DVD player.

A modern driver, in other words, gets vastly more benefit from a gallon of gasoline—makes far more economical use of fuel—than any Model T owner ever did. Yet motorists’ energy consumption has grown by mind-boggling amounts, and, as the productivity of driving has increased and the cost of getting around has fallen, the global market for cars has surged. (Two of the biggest road-building efforts in the history of the world are currently under way in India and China.) And developing small, inexpensive vehicles that get a hundred miles to the gallon would only exacerbate that trend. The problem with efficiency gains is that we inevitably reinvest them in additional consumption.[0]

In other words, you're too narrowly focusing on "miles per gallon", not the fact that "any American with a license can cheaply travel almost anywhere, in almost any weather, in extraordinary comfort; can drive for thousands of miles with no maintenance other than refuelling; can easily find gas, food, lodging, and just about anything else within a short distance of almost any road; and can order and eat meals without undoing a seat belt or turning off the ceiling-mounted DVD player."

For example, in the show Mad Men, you'd see a whole floor full of human beings whose sole job was to type out letters on typewriters. Now, those jobs are obsolete and the white collar worker is responsible for them. Still bullshit paperwork, but the white collar worker now has to do it.

Productivity has been on an upward trend since it was first tracked in 1947, with its sharpest 1H (first-half) decline ever this past year. Are you seriously suggesting white collar workers magically got more bullshit paperwork just these past 2 quarters?

[0] - https://www.newyorker.com/magazine/2010/12/20/the-efficiency...


That also never happened, did it?

Is there a "standard" medical record, or does each system implement its own proprietary format? Are the records transferrable? If so, why am I asked to fill out a complete medical history form on paper every time I visit a doctor, as if I'm a new patient, when all the doctors I see are in the same network and presumably use the same EHR system?


You are optimizing for the downstream consumers of the records, not _necessarily_ care, which is what you probably _want_ to optimize.


I used to work in Healthcare software (Not Epic).

Productivity is indeed a selling point.

I will also tell you that EHR software is universally hated by doctors. Does not matter who makes it. The company that cracks that will make billions.

One interesting idea was a voice assistant wired up to take inputs as doctors did their work. I don't think it went anywhere (yet).


I work in it too. And the US govt is not approving or even looking to approve new EHRs. The bureaucratic hurdles (and regulatory capture) are such that it is no longer feasible in this country. I would write one in a heartbeat if it wasn't a doomed venture.


There are lots of new EHRs being approved. Just go look on CHPL and you can see that there are 100 new EHRs that have made it through the 2015 Cures Update certification. It's hard and expensive, but it can be done, as I can attest, having gotten our product certified just this year.


> I don’t think productivity was ever the goal of this software.

Well, EHR is a glorified billing platform.


We can use an all HTML, Javascript-free interface that people can still memorize and quickly Tab through.
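
Roughly what that could look like (a minimal sketch, not from any real system; the field names and codes are made up):

    <!-- plain HTML, no JS: Tab order follows document order -->
    <form action="/orders" method="post">
      <label>Code <input name="code" autofocus maxlength="3"></label>
      <label>Qty <input name="qty" type="number" value="1"></label>
      <label>Ward
        <select name="ward">
          <option>ICU</option><option>ER</option><option>GEN</option>
        </select>
      </label>
      <button accesskey="s">Submit (Alt+S)</button>
    </form>

autofocus puts the cursor in the first field on load, Tab/Shift-Tab walk the fields in document order, and accesskey gives a keyboard shortcut for submit (the modifier varies by browser). None of that needs a script; the browser supplies all of it.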


But that's not what anyone was selling at the time. I'm sure complexity has only increased since then.

It was pre-AJAX and pre-"Javascript being useful", I think even pre-Firefox, and was IE6 only. So it was loading Java applets and stuff just to get some basic functionality.


Another reason is to satisfy insurers' increasing demands for documentation to back up billing.


Mouse moves are crack for ml algorithms if the interface is maintained somehow.


Was your hospital by any chance using Meditech as the terminal based application?


Sounds like Bloomberg.


> People in tech keep thinking more tech will solve problems and they keep underestimating the flexibility of the old models.

Related to this, but in a completely different context, I have had similar thoughts lately when eating out at restaurants in Spain. It's incredibly frustrating from a customer's point of view when the waiter taking your group's order has to use a newfangled tablet or phone-like device and tap through each individual order, often depending on the peculiarities of the app's UI and how the designers expect the process to carry out: “Are you all having the set menu? No? OK, first I need to know how many of you are having it?" [taps count on screen] "Right, now I need the starters but ONLY for the set menu orders..." etcetera—you get the idea. Then, while going through this unnecessarily slow process, God forbid someone who ordered from the set menu wants to change their main while the waiter is already taking the à la carte orders.

Meanwhile, in restaurants that haven't unnecessarily techified the process, the waiter can take the order in the way that's most practical given the circumstances and that best fits his way of taking notes. Ah, but then how does the order reach the kitchen without the tech? I have no idea; all I can say is that it worked fine before these things were put in place, and the manual system is by far the quickest and most flexible from a customer's point of view.


Classic business process mistake of trying to change a verbal contract into a form-letter.

"I'd like a quarter pounder with cheese and fries" is utterly unacceptable for buying a house or taking out a car loan, but it's the ideal way to order lunch. The people marketing, designing, and writing the application software have never worked in the business, of course, lack of experience has never made people like that pause, so they have peculiar ideas resulting in enforcement of weird and unproductive business processes.


Exactly why I cannot stand using the kiosks to order food at a fast-food restaurant. They take what used to be a five-second process to verbally state an order, and turn it into a multi-minute agony of taps, reading, canceling suggested upsells, etc. before finally getting to completion.

I guess the restaurant saves having to pay a person at the counter to take the orders, at the expense of massive customer frustration to the point where I hardly ever go to these places anymore. And then they wonder why their year-over-year sales are declining.


Our nearest McDonalds has a Siri-like virtual assistant thing taking orders at the drive-through. They've had to add taped-on paper notes telling people what it expects them to say to end the order. If you order anything with a number in the name, it may give you that many of it instead of one of it. I don't know how well it does at modifying mistakes but I'd bet the answer is "it can't, it just tells you to pull forward and talk to a real person".

It sucks.

And yeah, the damn order-kiosks manage to take saying the words "large black coffee" and turn it into a two-minute process.

Like "automated" checkouts, they're not automation, they're just making customers do more work than paid workers used to, to achieve the same outcome. The work's still happening, and is less efficient, the businesses just aren't having to pay for it. That's not automation.


> They've had to add taped-on paper notes telling people what it expects them to say to end the order.

Voice assistants are still stuck at that 90% point of "good enough to be impressive, bad enough to not work in practice". The selling point of free-form voice recognition is that you don't have to learn the language of the computer. I still believe this is bullshit. You have to do it anyway, but there are no references for it. Hell, even the developers don't know what the "language of the computer" is, because it's defined as "whatever the ML model doesn't choke on".

It would be much better if we stuck to structured natural-language queries with controlled vocabulary[0]. People would quickly pick up on it, and even if it sounded a bit artificial (not much worse than interacting with voice assistants sounds anyway), it would at least work.

> Like "automated" checkouts, they're not automation, they're just making customers do more work than paid workers used to, to achieve the same outcome. The work's still happening, and is less efficient, the businesses just aren't having to pay for it. That's not automation.

The older I get the more I see it: that's literally most software in a nutshell. Starting with the Office suite - yes, the word processors, calendars and spreadsheets - that killed dedicated secretarial jobs and distributed the work they did evenly across everyone.

----

[0] - https://en.wikipedia.org/wiki/Controlled_vocabulary


> and turn it into a multi-minute (...) suggested upsells

This is how the restaurant chain deploying those kiosks sees it.

> the restaurant saves having to pay a person at the counter to take the orders, at the expense of massive customer frustration to the point where I hardly ever go to these places anymore

This is self-correcting over time, as more and more places adopt such systems. It's just another supplier-driven market. Fast-food restaurant meals are not really commodities. If I'm in a mood for, say, a McDonald's cheeseburger, there is no other vendor I can get it from[0], so I'm going to suffer through the kiosk experience. It's calibrated to be just about bearable enough that I'm not going to skip the meal because of it.

Everything seems to become more and more like this, in every market - burgers, fridges, cars, phones, webapps, everything. As annoying and abusive as possible without making it a deal-breaker, offered on a supplier-driven market, so the non-annoying, non-abusive alternatives don't exist.

----

[0] - Yes, Burger King next door may have similar cheeseburgers, but they feel and taste different enough to be non-substitutable. But even if, guess what, they also have those stupid kiosks.


Market for lemons, or the old saying about bad money pushing out the good.

For another example see touch-screen automotive UIs, universally hated by everyone except the bean counters, so we're stuck with them forever.

Also WRT the BK next door, it's probably 5 minutes away by car or 30 minutes away on foot, which explains the corporate push toward "walkable cities" and so forth. At my last onsite office job, there was a deli on the first floor and we were essentially captive customers; in theory there were plenty of places within a half hour walk, but in practice it's raining and 40 degrees, so we were forced to shop at the company store, so to speak.


The trade off isn't in terms of the order, it's that you get back the time waiting in line behind several very slow people.


You can also order in the app from the comfort of your home, car, bus, sidewalk, thereby skipping the "standing and huffing at the kiosk" scene entirely.


When I was in Israel almost every fast-food restaurant had a kiosk like that, and they were all super easy to use and responsive. This might partially be an implementation problem.


Here in BC, Canada, many places don't even provide you a menu. You're expected to scan a QR code on the table, then use the website it leads you to. So the table fumbles around with their phones, using a website that usually has a terrible UX, and then the server arrives and enters your order on their tablet.

At least they don't require that we install an App. But I'm sure someone's thinking that would be a good idea.


QR codes mean there are no menus to clean off.


It also means they can data-mine you and upsell you crap. The tech stack is outsourced to some shady third party, and you either get a shitty webapp, or a PDF not suitable for mobile, because the restaurants still occasionally need the paper menu, and why do the work twice.


It also means analytics for everyone at the table, not just the person who paid and not only their credit card number.


That's just shitty software though. Good software gets out of the way and improves something. We could speculate on how to fix that scenario, but there's probably no incentive - in my experience there are fixations amongst tech people on profit-less ideas that end up getting squeezed awkwardly into applications such as bill splitting, digital ordering etc.


I completely agree. Software Engineers are no less prone to “when you only know how to use a hammer, everything looks like a nail” as anyone else.

There’s so many things we keep trying to shoehorn tech into that don’t need it— electronic ordering/serving food, planning a small gathering of friends, making a smoothie [1], “smart” fridges/toasters/stoves… these are all adding unnecessary knobs and bobbles to things humanity has gotten by just fine with for ages (the first since the dawn of civilization!)

As a general rule for "will this tech be useful" I think in terms of scale: is this new tech enabling/helping me to do/manage something 10x-100x better than I could with existing tools? Sure, I can organize a single dinner/cocktail party of a couple dozen people via paper invites or text messages and phone calls to caterers, and using tech for that is likely introducing unnecessary overhead; but if I'm a planner organizing many weddings of 100+ for a living then, yeah, obviously party-planning management software will be of use.

If not, its value is likely not worth the hassle.

[1] https://amp.theguardian.com/technology/2017/sep/01/juicero-s...


Having been employed in tech for over 20 years, I have found myself leaning further this way over time. The simplest solution is usually the best.


Waiters put the bits of paper they're writing on on a rack in the kitchen. Other times, they shout out the orders and remember which table gets what.


And shreds it all at the end of the day before the tax guy shows up.

Though Quebec, Canada, had such a problem with “zappers” that would delete orders from the electronic system that every restaurant now must be online with the tax authority and every receipt has a tax authority response code on the top.


What upsets me are those dumb QR Code menus. Battery dead? Out of service? No food for you amigo.


Or if you just don’t have a smartphone. Obviously that’s unthinkable though.


> I've read that doctors now spend as much as 50% of their time documenting their work. Companies such as Epic, which provide the software that hospitals use to build databases of patient data, have been big winners in the new world of hospitals-depending-on-software. But did the doctors become more productive? By almost any measure, they became less productive.

It’s less the software, and more the users, site-specific configuration and the environment they work in.

Non-US Epic users spend 20-60% less time on various EHR activities than their US counterparts. One of the most dramatic differences is time spent on ordering, which you would think would be as optimized as it could be.

Time spent documenting was 40 minutes/day for US users and 30 for non-US on average. Maybe some spend 50%, but that’s far from average.

https://jamanetwork.com/journals/jamainternalmedicine/fullar...

Another study found US clinicians write 4x longer notes (cited in above).

Now, does any of this improve clinical outcomes?


> Another study found US clinicians write 4x longer notes

I wasn't aware of this, but as the spouse of a medical provider I know that most US providers are burdened with an ever-present worry about malpractice.


This. So much this. In residency we are taught "document, document, document" and "this is a medical legal document," which leads to defensive medicine. Another reason we spend more time documenting is billing. Coders/billers keep coming back and asking us to add more details about a diagnosis. More details = more charges to bill, or an up-leveled visit. So at the next progress note or office visit, I go back and add more. More time is spent fighting the notes. Terrible EMRs that destroy notes add to the time spent as well. Looking at you, Allscripts (aka Allshits in my office).

Overall it's a sad state of medicine in the USA, which is terrible, because when I was younger the whole point of medicine for me was to help people and focus on the patient and the issues ailing them. Patients are still important to me, but now it's a race to the bottom: trying to document while seeing patients so I can go home without paperwork and live my life. I've got bills to pay and a 6-figure student loan that will take me another decade to pay off. At some point, it's all going to break down. A few of us are doing the concierge direct primary care model to avoid all this, which will unfortunately lead to more health disparities and inequalities.


> concierge direct primary care model

How does this reduce/eliminate the “better document this thoroughly in case I get sued” work?

Or are the legal worries overblown/over-relied upon for over-documentation?


Direct primary care (DPC)/concierge is mostly cash only, so you document as you see fit and are not held to the rules of having to use an EHR/EMR. Many of those who do this model are on paper, as they are more "old school" at this time. Younger physicians would document more just because they are taught this in residency, but some of the malpractice lawyers we had presentations with often said document well but don't overdo it, as over-documentation can mean more things to be picked at when you are being sued. That said, over-documentation helps billers code for every single thing they can, so they want more of it. In general, they will hassle you to over-document. They will send you messages in the EMR, send you emails, even sometimes use the pager service until you update your note.

I think the legal worries, in concierge medicine, are overblown because at that point you are seeing a physician doing more quality than quantity to make ends meet. You don't need to document to appease the biller, just enough to know what you did and so that you could share your chart notes with other providers in case the patient ever moved to another provider or (super rare) was admitted to the hospital and needed some documentation for whatever reason. In DPC, you are the boss and control how many patients you want to take on. In general private practice, you are at the mercy of the hospital system, payers, etc., and are pulled in 7 different ways to make money, which only benefits them and not the physician. Even then, with primary care, you are told you are a "loss leader" since we aren't specialists, and we are generally dumped on in the medical field.

In “private” practice, you need to see 20-30 patients/day to make your salary (break even) without bonuses within a hospital/health system. That means about 400-450 patients per month. Major payers limit you to 3,000 patients under your care overall.

Studies show that 1,000:1 should be the max patient:physician ratio, but many of us are around the 2-3k mark. In a true concierge medicine/direct primary care model, since it's mostly cash/subscription, most physicians average a 300-500 patient panel per year, which is about 4-6 patients a day. If you want to see more and make more on the side, see 400-600 patients. Are you an older doc who wants to practice but not see many patients? See 2-4 patients a day, 1-2 days a week, cover your costs, and still make a comfortable living.

Like I said, something is going to break in the medical system and it's only a matter of time. When it does, the healthcare disparities and inequalities will really become more apparent. Hospital systems/admins/insurance companies are now leaning on mid-levels like NPs/PAs as a bandaid for the issues of patient access, but what will happen when you burn them out too?

/rant

Tl;dr: in concierge medicine you're your own boss, so do whatever you want with your charting. You can chart 5 words or 5 paragraphs. Most physicians will chart enough to document well for medical and legal reasons and not have to worry about billers muddying the waters to over-document. Quality time with patients = less legal risk.


Gotta write as much disparaging stuff as you can on the patient to instruct future doctors not to provide medical care!


To clarify, the length was 4x longer. Some of the discrepancy is attributed to more keystrokes, but a lot of it is copying/auto-inserting stuff.

These automated analyses don’t capture whether the extra content is beneficial or not (it might be!).


> I've read that doctors now spend as much as 50% of their time documenting their work. Companies such as Epic, which provide the software that hospitals use to build databases of patient data, have been big winners in the new world of hospitals-depending-on-software. But did the doctors become more productive? By almost any measure, they became less productive.

I don't think "by almost any measure" is right. I think in a very narrow sense they've become less productive (they see fewer patients), but by your own admission they're building databases of patient data, which you seem to suppose are only useful to the likes of Epic, but obviously Epic has customers--notably healthcare researchers use this data to improve patient care, develop new medicines, and to precisely identify which medicines are likely to help a particular patient (and which medicines may even harm them!). This is stuff that clearly benefits society, and doctors' role in it should be counted as "productive", although we can quibble about the relative value of facilitating healthcare research versus seeing more patients.

Note that this isn't meant to vouch for Epic--I work for a company that consumes their data and anyone who has to integrate with them has nothing good to say about the software, but the role it plays is still incredibly important.


> notably healthcare researchers use this data to improve patient care, develop new medicines, and to precisely identify which medicines are likely to help on a particular patient (and which medicines may even harm them!)

The majority of the notes being written by doctors now is boilerplate. A lot of it is copy-pasted. It's written because of insurance companies (which have incentive to deny claims), because of liability (which gives incentive to leave a lot of notes behind to make it look like you thought about everything under the sun even if it wasn't applicable), and because of well-meaning but ultimately overly broad laws adding additional requirements even when they don't quite make sense.

I'm sure there is a treasure-trove of valuable data in there, especially compared to when it was all hidden away on physical paper. But you could probably reduce the paperwork that doctors do these days by a factor of 4 and not lose anything of value.


> But you could probably reduce the paperwork that doctors do these days by a factor of 4 and not lose anything of value.

Maybe, but this sounds like some vague hunch based on ???. I very highly doubt the healthcare industry would tolerate doctors wasting ~37.5% of their time (75% of paperwork time is wasteful * 50% of doctors' time spent on paperwork = minimum 37.5% of doctors' time wasted). Doctors are expensive, so recouping anywhere near 40% of their time would be a priority.

It seems more likely that the paperwork is actually pretty useful (but the utility is not obvious to the lay observer), or at least useful enough that the wasted time isn't significant to the healthcare industry (which is already struggling with margins and personnel).


Based on working in the industry, and hearing from healthcare practitioners not quite first-hand, but second-hand.

You mention profit efficiency, but all three of my points make sense even in light of that: (i) insurance is literally the way doctors get paid; (ii) lawsuits are hella expensive, and (iii) regardless of profit incentives you can't not follow the law.

The software that doctors use is terrible. It's a perfect combination of extreme complexity, domination by just a couple of companies (Epic and Cerner), legacy software (some still written in MUMPS, I hear!), and tons and tons of regulation.

https://en.wikipedia.org/wiki/MUMPS


The Veterans Affairs legacy system is still written in MUMPS.

I believe the VA has poured literally billions over the last decade to fully modernize it. It hasn't happened yet to my knowledge.


>I very highly doubt the healthcare industry would tolerate doctors wasting ~37.5% of their time (75% of paperwork time is wasteful * 50% of doctors' time spent on paperwork = minimum 37.5% of doctors time wasted). Doctors are expensive, so recouping anywhere near 40% of their time would be a priority.

Doctors are expensive, but malpractice lawsuits are more expensive. Documentation is extremely important for lawsuits. If you get sued because a patient you saw last year later died and they're alleging improper health care (just to use a random example), your defense is highly dependent on having meticulous notes that document every single examination finding and treatment administered. Your memory isn't going to be accurate, and the plaintiff's attorney is going to be looking for any errors in documentation they can use.


>I very highly doubt the healthcare industry would tolerate doctors wasting ~37.5% of their time

I am sorry, have you been in the workplace in the past 20 years, and have you tried adding up all the meetings that get shoved into your calendar? I am lucky if 50% of my day is not taken away from me.

To imply that someone from higher up wouldn't 'tolerate' that most of my time is spent in meetings is simply laughable; they are the ones creating them!


I don't dispute the value of documentation for CYA. I'm saying that if CYA were the motivating use case, electronic medical systems would look a lot more like a Word document than like Epic (Epic is designed to standardize patient histories so they can be analyzed for research, not for paralegal convenience).


Medical decisions are based primarily on financial profit, and patient outcome data is not required to determine which medicines are most profitable.

The point of extensive documentation is shielding from the worst of malpractice lawsuits. The legal system is still of the legacy opinion that doctors have a responsibility to their patients, as opposed to the more modern understanding of responsibility toward pharma company bottom lines, and all patients legally deserve the 100% successful participation trophy. So a documented decision with only a 95% chance of success means insurance payouts about 5% of the time, unless it's carefully documented that it was all the patient's fault or at least that the MD could not have known the outcome in advance.


> patient outcome data is not required to determine which medicines are most profitable

This is blatantly false. Pharma spends tons of money to buy this data for their armies of researchers in order to determine outcomes. There's an entire very lucrative industry (that I work in) which exists to source this data from hospitals and refine it for researchers.

If the point of documentation was CYA, then you wouldn't need complicated systems like Epic to standardize the documentation and make it available for electronic processing (you would just have some paralegal pore over the records of the individual patient).


This is a really good point about how we myopically understand the value stream of a process. Often, steps that we feel are bureaucratic waste provide a lot of value to someone else in the process.

With that said, I think most healthcare is correct to take a "patient centric" approach. What the OP seems to be making is a "doctor centric" take and, if one was to be overly cynical (I'm not), your post may skew to the side of a "researcher centric" or "societal centric" approach. Doctors should do what's best for their patient, not necessarily what's best for society, or themselves, or a lawyer, or a research lab. It's easy if you work in one of those tangential areas to take your eye off the ball.


I don't think documentation precludes doctors from caring for their patients, but it does limit the number of patients they can handle. This implies that healthcare is more expensive, which maybe seems like a bad thing for patients, but I think it's more of an indicator that we need to find a way as a society to pay for the societal good that is data collection and research--"medicare for all" is one conceivable incarnation.


>but it does limit the number of patients they can handle.

Or, by extension, it limits the amount of time with each patient if they have a throughput constraint to stay solvent.

Tbf, I’m not sure the data supports the claim that doctors spend less time with patients, but the increase in documentation does seem to correlate with doctor burnout.


Both can be true, and greater systems of medical research and analysis don’t necessarily lead to greater on-the-ground treatment.

As you’ve pointed out, access to those information systems is critical. I’d add the distribution of that information as well as the right economic incentives to participate in using that information.

I’m not sure we’ve really got any one of those things right.

Edit: adding a bit of humanity to the system, as the OP is hinting at, could very much be a part of the fix.


> Both can be true

Not really. You can't say "doctors have become less productive" without accounting for the value incumbent in the increased documentation effort.

> greater systems of medical research and analysis don’t necessarily lead to greater on-the-ground treatment.

Maybe not "necessarily", but in practice they do. Perhaps not in every incidence, but broadly the analysis results in better outcomes or else there would be no economic incentive to facilitate medical research ("the incentive is to sell more drugs!" <- insurance companies aren't going to pay for those drugs if they aren't proven to work).

> Edit: adding a bit of humanity to the system, as the OP is hinting at, could very much be a part of the fix.

That's not how I understand the OP, but I doubt anyone would object to "adding a bit of humanity" (abstractly) to healthcare unless it implies a reduction in empirical rigor.


> broadly the analysis results in better outcomes or else there would be no economic incentive to facilitate medical research

This is true to a degree, but outcomes for real healthcare rely on much more than research, as you’ve indicated.

Documentation is part of that research, of course, but the path from it to researchers' ability to work out better treatment is relatively lossy, whether the effects are short-term or long-term.

Actual treatment also includes the rest of healthcare (training, hell, even their housing costs), and rules-based or centralised administrative systems backed by insurance don’t necessarily create the right environment for that information to be propagated more widely.

People training to be health workers don’t use the frequency or quality of medical research papers to decide whether to become a doctor.

I think there’s a view you can take on the information topology here that’s a little odd in how it’s currently set up — documentation for front-line workers and information wealth for researchers feels like it’s relatively polarised.


The last good, independent physician I had was a fellow who played Chess on the weekends downtown. His practice moved about 5 times while I was a patient and he finally was snapped up by the VA.

He habitually called me "friend" and was very frank about my insurance not paying for stuff I was asking about, which I appreciated. He also once profusely apologized to me for placing a computer in between us. He said the new requirements of his practice made it so he had to pay more attention to the computer than to me, and we were both sad about that.

The doctors I got after that make no such apologies.


Reminds me of this anecdote: https://notalwaysright.com/trying-to-get-a-word-in-until-you...

It may or may not be true (I suspect some of the stories on that site are made from whole cloth), but it speaks to your post.


That documentation time is largely driven by the insurance industry which was really painful before these systems. It’s almost shocking how much more productive doctors are inside the VA.


Insurance is part of it, but not all of it. Government regulation is also a big part. EHR mandates under the Obama administration (whom I was generally supportive of, so not a criticism of his presidency in general) created a kind of "false pressure" to move to EHR immediately, rather than "naturally" adopting it at an organic pace, adopting whatever is most beneficial due to demand. I'm not anti-EHR, but the way those systems were adopted was definitely forced onto providers top-down, rather than bottom-up like traditional hospital record systems. Hospitals scrambled to implement them in time for deadlines, and there was essentially no room for pushback against poorly implemented structures that were pushed on hospitals.

More recently in my field I've seen additional layers of documentation requirements that have nothing to do with insurance, that are entirely state law.

I have no doubt in my mind that if the EHR rules didn't exist, EHRs would have been adopted much more gradually, and selection would have been dictated by the ability of EHRs to supply features in demand. More competition would have existed and it would have cost less. Maybe some government regulation would have been needed in terms of interoperability standards, but it could have been rolled out much, much better.

I don't think people fully comprehend the cost overruns associated with adoption of EHRs under government mandates, or how big of a shift there was from records being in-house flexible, and provider and patient-driven, to out-of-house inflexible, and IT-corporation-driven.


To play devil's advocate: without the stick of Medicare funding being at stake, doctors' offices and hospitals would defer EHR upgrades until the heat death of the universe. HIPAA was passed in the 90s with a safe harbor for faxing, and that's still the standard method to transfer a medical record.


I admit I don't know HIPAA very well, but could it be that the requirements for faxing are straightforward, or significantly more so than for other methods? If I'm the lawyer with a role in HIPAA compliance, I want the staff using a no-liability compliance appliance, not screwing up with email attachments in the clear or botching veracrypt containers or anything like that. Simple training: medical records go back on the shelf, in the shredder, or in the fax machine.


> It’s almost shocking how much more productive doctors are inside the VA.

Cerner has a contract to fix that.


What?


Cerner got $10B a couple years ago to prevent the VA system from working. It's happening per plan, so far.


I am seeing all this blah blah blah about how everything pins down to insurance.

But if medical pricing was such that insurance wasn't required, then patients would better audit their own care receipts and this fraud issue would eliminate itself, and as a side benefit we'd have sane pricing for medical care.


Word. Insurance is the greatest scam of all time.


I'm a web developer, and about 2 years ago my company implemented a new note-taking policy that takes up a very large portion of my time, makes me feel untrusted, and leaves me less happy than I was before. We are required to write a note on everything we do through the day, with time spent, and submit it by the end of the day. This is in addition to our project management system, where we already track our time and write detailed notes on what we do for each task. Prior to the policy I was able to finish work on time; now I dread the note portion of my day and usually end up spending an hour or so at the end of the day writing up my "journal". I also have zero extra time, which I used to use to learn new skills and techniques. It's an absolute waste of time and is destroying my urge to learn the new skills that used to help improve the company's processes and efficiency.


When I started my new role with my employer, I did this using a browser extension and a really neat time-tracker app. I forwarded every day's tracking data to my mentor and so he was able to see with a really fine grain what I was working on, and how long each task took me in a given day.

Our app-security policy put the kibosh on that, and I had to destroy my account once I realized it would not be feasible to keep it going forward.

My employer is really conservative about allowing 3rd-party apps, which is great for security, and puts us in a veritable Stone Age of productivity.


"Just a billing platform with some patient stuff tacked on"

https://www.youtube.com/watch?v=xB_tSFJsjsw


Zdogg, MD made my day!


I agree. Here in Finland the public healthcare organization in the capital region and surrounding areas chose Epic as the supplier of their new system. It has been a disaster: massive complaints from doctors about how unproductive it is to use, and also some issues that endanger patient safety. Apparently it's also programmed in MUMPS, which doesn't exactly sound like a great idea in the 21st century.

I'm not sure whether this choice was a case of incompetence or corruption, but the end result is clearly a giant waste of money. Maybe it generates a lot of data, but efficiency would be way more important for an organization like this, which is chronically underfunded and understaffed.


> I've read that doctors now spend as much as 50% of their time documenting their work. Companies such as Epic, which provide the software that hospitals use to build databases of patient data, have been big winners in the new world of hospitals-depending-on-software.

My daughter works at Epic, and she explained that one (though not the only one) of the big reasons health care is so expensive is because Drs have so many record-keeping requirements, and one reason they have these is because of liability. It would greatly help if Americans weren't so lawsuit trigger-happy. The real winners are the lawyers (and insurance companies).


Americans actually don't sue that often, a topic addressed in The Myth of the Litigious Society by David Engel [cf. https://blogs.lse.ac.uk/lsereviewofbooks/2017/02/01/book-rev...].


It's the ease with which one can sue, and the damage it does even if it's baseless, that poses a continuous threat.


Hi. Medical professional here.

You are high if you think we are better off with paper than an ePCR.

Sometimes we have to default back to paper; sometimes the pt record system goes down.

When that does happen, it's a literal fucking nightmare in the ER.

We plan on it happening and we train for it happening, but ePCRs have made our lives so much fucking easier. Simple fucking fact.

Have you ever had to read the pissed off notes in a hand-written chart from the last nurse who has been working 48 hours straight, taking care of 12 critical care pts? Huh? Give that a try sometime.


I talk a lot about this flexibility gap in my day job in UX. Getting an organization onto a digital platform is great but a lot of them don’t recognize all the small ways that their current system’s flexibility is helping them.

With big systems though I honestly think GUIs can only go so far, even at their very best, and any system that is required to be complex at some level will require expert knowledge of the system itself. That’s extra work and an extra burden for someone with the critical experience that an organization relies on.

For doctors and the like it would make sense to try a system where the critical person/expert has an assistant who is a systems expert and does a lot of the needed data entry and the like. Doctor doesn’t have to worry about the system, they can talk to the assistant who manages all the extra work. If the system needs to change for any reason the assistant manages that and the doctor doesn’t have to worry about it.

I think of this assistant role as the human API layer. It’s not far off from some social programs like insurance navigators, who help individuals find health insurance, including working through options and even—critically—filling out forms for folks.

ETA: It’s a thing! I didn’t know: https://en.wikipedia.org/wiki/Medical_scribe


This resonates strongly and I'm surprised more people aren't talking about it.

I recognized this problem in the 90's because I worked at a manufacturing plant where the software, even then, was getting in the way of workers needing to do something a bit non-standard (they developed workarounds over time).

It's also why I'm a big fan of the 80% solution, I think there's a level of hubris involved in going for the 100% solution for everything.

To your point about trust, it's something I've been thinking about recently. Not trust exactly, but the authoritarian nature of software, which expresses itself in, and is purchased for, the ability to distrust. You'll see this in things like debates about whether or not developers should be allowed to affect the deployment via yaml files. What makes it difficult is that there is often a legitimate need for these sorts of systems, especially for regulatory compliance, but more often than not they just get in the way of actual work.


The big reason EMR is so overbearing is to optimize billing… as far as economic statistics, that should show up as positive even if less patient care is actually delivered.


> I've read that doctors now spend as much as 50% of their time documenting their work. Companies such as Epic, which provide the software that hospitals use to build databases of patient data, have been big winners in the new world of hospitals-depending-on-software. But did the doctors become more productive? By almost any measure, they became less productive.

I'd ask another question: did lawsuits decrease? I'd imagine that a lot of this software is to avoid lawsuits. America is especially litigious and that's got to correlate strongly with the increase in bureaucracy.

I also suspect that a factor at play is that people are losing trust in the whole system. Most people now know that productivity has skyrocketed while salaries have remained relatively flat. With increasing economic disparity (not even just the West) it is no wonder that people become less productive. Who tries hard at a game that they believe is rigged against them? (doesn't matter if it is or isn't, just the belief)

> People in tech keep thinking more tech will solve problems

Because historically it has. But there are different types of tech. Tech enabled the modern world. It is the new medicines we have to cure illnesses that devastated populations. It is the chemicals that enable us to grow enough food to sustain our populations. It is everything from the wheel to the computers we use to make more efficient wheels that use fewer resources. But it is also naive to think that tech alone can solve every problem. It is also naive to think that tech can't create new problems. To create tech that solves problems we need to think long and hard about the intricate complexities involved and gather expertise from relevant domains (an often missed, but essential, component). The other problem is that people hand-wave away things like climate change saying "tech will solve it", waiting for the technology to magically appear rather than investing in it. I do think tech is an important tool in solving many of the problems we face, but you're right that they are not all technology dependent (which is a continuous scale of weights, not a binary option).

Also, we're a tech forum. Peoples' expertise here is in tech. So they see things through that lens and it is also very likely that the most we/they can contribute to solving these problems is, in fact, through technological means. The trick is to remember that tech isn't a cure-all and that the problems we face are exceedingly complex. Over simplifying is often harmful.


This has crept into schools as well. Many (most?) preschools around here have apps to document the kids' day, and they send out almost daily emails and reminders about various things. If you have kids at preschool, regular school and after-school centers you're getting a constant deluge of information. Add to it that they all have different apps that don't work well (schedule, attendance etc), so of course they use email in addition.

I love our current preschool precisely because they use very little technology: we actually talk to the teachers during drop-off and pick-up if needed, and get maybe one email a month. Their expressed philosophy is to be "present" with the kids and spend as little time as possible on other (administrative) things.


We just had a baby this month, and I was shocked by how much time the medical staff was spending entering data into Epic. So much that they couldn’t actually fully concentrate on giving medical care.

Everyone was very busy but it was very hard to get actual care.


>Software doesn't know when its rules should be broken.

Just to provide an example of this I ran into today: I'm doing a medical physics residency, and my supervisor was explaining that a new "fail-safe" incorporated into the software that reverted the collimator after every scan was now making the phototimer tests take twice as long, because we had to go back into the room and reset the collimator repeatedly. We tested a machine with the new system and one with the old system and it did in fact turn a 15-minute task into around 35 minutes.


I agree to a large extent with your comment. The "office" structure has changed a lot. Secretaries ran the show, and pretty much hand-held everyone in the office and kept the ship pointed in the right direction. Law firms and political offices still have this "outdated" model because if it ain't broken, don't fix it.

Now, your company pays insane amounts of $ for software, systems, and data! You have to do your job, schedule your meetings, keep up with the ungodly amount of email, instant messages, and Slack, enter the data in your CRM, talk on the phone, schedule the video meetings, etc., while also keeping track of your personal phone because your partner called or the school called and your dog is sick. There's just so much distraction now by not having "office hours" and having our whole lives in our pockets and in constant reach that I think productivity has to go down. It's impossible to focus on anything for any amount of time anymore with how accessible we are expected to be.


Last time I had a blood test it took 2 minutes to take my blood and 15 to enter crap in various forms on the computer. And I'm not even in the US. There was a lot of clicking involved.

To contrast, my first job was on an accounting program. We spent weeks making sure everything worked via just the keyboard and that some operations were as streamlined as possible, because in some cases it was going to be used by people creating hundreds of invoices per day.


Insightful...

Figuring out the right amount of automation in business is hard. Too much automation makes the company rigid to the point of breaking, but too little reduces productivity. This is a dilemma that factories/companies have to constantly re-learn. We see it regularly. A new manager comes in and decides that the company can save a truckload of money by completely automating production, but soon finds out that there are too many variables in life and that total automation can't work. The goal should be to find the sweet spot that produces the maximum while being flexible to change. But I don't think this is why productivity is dropping.

I suspect it's related to the reduction of reliance on foreign factories. The US wants to reduce its reliance on foreign manufacturing and bring all of that production back to the US. That will take time, capital, and the ability to learn how to do the work here, and it will hit productivity hard. All that change won't happen cost-free in terms of productivity and assets. The change started a few years ago. We are beginning to see the result.


Having once worked in the EMR/EHR space, a big thing to consider is some companies come in with their own workflows, processes, and ideas that they want to push onto physicians while other companies are way more accommodating in building "bespoke" solutions to specific problems.

The latter in my experience ends up providing better results for physicians, as they have been employed as domain experts in building the software solution to their specific workflow. I've seen it done in ophthalmology, disease- or injury-specific radiology, and diabetes-specific checkups and appointments, where I've seen as much as a 75% reduction in the amount of time the physician has to dedicate to looking up info, cross-referencing, and documenting.


> other companies are way more accommodating in building "bespoke" solutions to specific problems.

Then you upgrade and everything breaks!

But it’s an age old battle for and against standardization. I just walk around with several charging cables because each is “the best” for charging a small li-ion battery.


More correctly, fundamentally software can never know when its rules should be broken.

Computers are just state machines; without determinism and a few other properties they do not perform work. Determinism means there is only one next possible state/action.

Breaking rules would imply there are two or more possible states or actions, and that is a class of problems that computers cannot solve (they talk about this in compiler design courses). Most people subscribe to magical thinking because they don't know how computers work, or what they are even.


Are we sure the documentation isn't coming as required from the insurance companies?

I know many doctors and especially nurses who CYA on all their documentation, because otherwise insurance will try to pin an adverse reaction on them.


Payers in general (not just insurance companies) require high levels of documentation both to prevent fraud and to increase care quality. Most healthcare providers are highly ethical and only act in their patients' best interest. But there are always a minority of bad actors who will try to boost revenue by submitting claims for procedures that weren't medically necessary, or weren't performed at all. So the system needs checks for that in order to hold down costs for everyone, and prevent iatrogenic harm.

You will also find many cases where even good providers let things slip through the cracks and fail to give some patients the appropriate level of care. For example, diabetics should generally receive annual foot exams, eye exams, and hemoglobin A1c tests. If the payer doesn't see evidence of those in the EHR then they can prompt the doctor to resolve that care quality gap.


Well said. Particularly on the value of trust within systems.

In case it's of interest I wrote an article a couple of days ago on how "Digital Systems Fail Institutions" [1].

[1] https://techrights.org/2022/10/26/when-digital-systems-fail/


I rarely go to my doctor, but the few times I've gone it's been more data entry than medicine.


Capitalism is trying to move to a model where everyone is atomically isolated, an independent contractor making anonymous sales in a marketplace. That means lawyers, contracts, and rigid rules and documentation. The more pure a market is, the more lawyers and rules you need to make it work. The goal is to eliminate the kind of fluid high trust social networks that characterized work life in the past. Partly because if everything is more rigid, it's easier to exert control, partly to prevent things like unions from forming.

They want a perfectly abstracted workforce, free from the messy details that force you to treat your workforce as made up of people. Notice that we went from "personnel" to "human resources" to "human capital management". That's the Ayn Rand utopian dream made real. The idea is to turn people into units of production to be bought and sold, and you can't have a market for a commodity unless they are all the same. So, you must be made identical, so that the price of labor means something.

Every criticism leveled against systems of control in the book "Seeing Like a State" applies to capitalism too. When people talk about a free market, remember that they're talking about a free market for labor more than anything else. You are the product in this scenario, not the customer.

But even here, the purity of the labor market is hampered by rules designed to protect the laborer: OSHA rules, overtime rules, rules about how much a person can be controlled, and so forth. If you eliminate all those, and focus only on making labor freely able to be bought and sold, you can begin to see what a completely pure free market for abstracted, commodity labor would look like.

It is this: The most pure example of a free market for labor is a slave auction.


To get anything done, it seems I must speak to >=2 people on any customer support line.


There are huge productivity gains in private practices that eliminate web pages and email and switch to paper in filing cabinets only. This is also why fax machines still exist and are exclusively used by medical practices.


The article is about 2022 specifically, obviously pandemic related.


Message passing was basically a solution for this in the tech world: send messages to everyone, and those that implement that message act on it.
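For what it's worth, a minimal sketch of that broadcast style (all names here are made up, purely to illustrate that receivers which don't implement a message simply ignore it):

    class Bus:
        def __init__(self):
            self.subscribers = []

        def subscribe(self, subscriber):
            self.subscribers.append(subscriber)

        def publish(self, message_type, payload):
            # Every subscriber gets every message; only those that implement
            # a handler for this message type actually act on it.
            for subscriber in self.subscribers:
                handler = getattr(subscriber, f"on_{message_type}", None)
                if handler is not None:
                    handler(payload)

    class Billing:
        def on_patient_discharged(self, payload):
            print("billing:", payload)

    class Pharmacy:
        pass  # no handler for discharges, so it ignores the message

    bus = Bus()
    bus.subscribe(Billing())
    bus.subscribe(Pharmacy())
    bus.publish("patient_discharged", {"id": 42})  # only Billing reacts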


my experience at a healthtech startup is that any kind of cohesive product design gets perverted by nonsensical legal requirements and leadership pressure to check the boxes of comparable products, because that's how the products they've used in the past did it


> I've read that doctors now spend as much as 50% of their time documenting their work.

What's the source?


You’re blaming tech for legal issues and requirements.


is this really caused by tech or by more regulation and demands from insurance companies?


Brilliant observation.

In a way, it all might be a huge premature optimization.


Don't you worry, the next gen of blockchain apps, which are built on a bedrock of implicit trust, will make your concerns moot.


US Worker productivity is 0.7 standard deviations below its average over the last 3 years. It is 3.6% below all time high. It is higher now than at any point before July of 2020. YOY Productivity growth has dipped negative and then went back to positive 20 times in the past 22 years.

To be concerned about the current level of productivity requires either the attention span or the intelligence of a goldfish.

https://tradingeconomics.com/united-states/productivity
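If anyone wants to sanity-check claims like these, the arithmetic is simple enough to reproduce against the published index; a rough sketch, with placeholder numbers rather than the real series:

    import statistics

    # Placeholder quarterly productivity index for the last 3 years (made up).
    index = [110.2, 111.9, 112.4, 113.0, 112.1, 111.5,
             112.8, 113.6, 112.9, 111.8, 112.3, 111.6]

    latest = index[-1]
    mean = statistics.mean(index)
    stdev = statistics.stdev(index)

    z = (latest - mean) / stdev                        # distance from the 3-year mean
    below_peak = (max(index) - latest) / max(index)    # fraction below the high

    print(f"{z:+.2f} standard deviations from the 3-year mean")
    print(f"{below_peak:.1%} below the high of the series")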


It's just the corporation friendly mass media trying to combat calls for better worker treatment/compensation. This is just the next single from the album that brought us "quiet quitting", "the great resignation", "millennials are lazy", etc. It's important for the backers of these media outlets to float these stories out there, lest anyone become sympathetic to workers in light of the facts that minimum wage hasn't kept up with either inflation or productivity, that corporations are engaging in profit inflation, that the fed is intentionally raising rates to wrest back some power from workers, and so on.


Yeah, this "new" trend of "quiet quitting" sounds so silly. People have been behaving like that for a very long time and they've probably always done it.


It’s worse than silly. I had to ragequit this (https://hbr.org/2022/09/when-quiet-quitting-is-worse-than-th...) article from HBR on “quiet quitting” after reading:

> Quiet quitters continue to fulfill their primary responsibilities, but they’re less willing to engage in activities known as citizenship behaviors: no more staying late, showing up early, or attending non-mandatory meetings.

Yes, we should all mourn the withering of classical virtues, like sacrificing our health and time with loved ones in order to provide free labor.


Such nerve. Instead of talking about the downsides of increased surveillance and quantification of worker performance, blame the worker for not going above and beyond. As an employee in this new landscape, of course you'll save powder for your official requirements; that's OKRs and KPIs working as designed.


Also the cries for returning to the office


Literally one of these on the front page of the NYT today, full of chirpy anecdotes about people ditching the sweatpants and silly comments about how much our dogs will miss us when we go back to "real work".

You may kindly shove it, NYT. I'm not going back.

https://www.nytimes.com/2022/10/29/technology/return-to-offi...


Years ago the NYT was great. Now it's so cringy that half the time you wonder if it was hacked / hijacked by TMZ. Sadly, in terms of trust and credibility TMZ wins. It's never anything less or more than it promises to be. If only other media outlets were as honest and had as much integrity.


I agree but I’d say it’s become more like something that resembles Huffington Post than TMZ.


I mean, NYT also is telling us that lace is in style for men right now sooo if you have to ditch the sweatpants for the office maybe consider a lace overcoat? https://www.nytimes.com/2022/10/31/style/mens-wear-lace-fash...


really a bit hard to believe that these kinds of articles aren't paid for by some industry group


It is. The NYT Real Estate section is a pretty sickening read for most of the 99% who aren't totally disconnected from reality and don't have a million plus in cash to toss away on a second home.

Every time I've read that section, it felt like product placement for luxury home realtors and commercial property owners.


Do you believe they aren't, for some reason? It's not like they'll reveal who's buying ads if you ask.


I guess it’s helpful to own all the media when you need to spread your propaganda around.


Yes, this is "proof" that remote work is less productive.


> that the fed is intentionally raising rates to wrest back some power from workers

The Fed is increasing interest rates to keep the dollar attractive as a world currency, which allows the US government to keep increasing the national debt without sending interest rates through the roof.

It may seem counter-intuitive, but letting inflation go out of control would have a major negative impact on demand for debt issued in dollars or, to make it short, US government debt.

The only alternative would be to fund government spending with government revenues. Which means raising taxes on the categories of income that make up the largest share of national income.


Just remember who owns the WaPo and it makes a lot more sense.


> "In the first half of 2022, productivity — the measure of how much output in goods and services an employee can produce in an hour — plunged by the sharpest rate on record going back to 1947, according to data from the Bureau of Labor Statistics."

It annoys me that an article about a big drop in workplace productivity doesn't actually say how big the drop is. If it's only 3.6%, then that explains why they glossed over that part, as it would have undermined their narrative.


It serves the important function of giving journos an opportunity to interview business consultants about what CEOs think of their employees' poop breaks.

Which, in generating lots of hate clicks, is a huge economic boost in terms of Nonfarm Business Sector Labor Productivity!


> In the first half of 2022, productivity — the measure of how much output in goods and services an employee can produce in an hour — plunged by the sharpest rate on record.

Given the sharpness of the drop and its timing alongside wonkiness in a whole slew of other economic indicators, it's bizarre to me that the author (and the experts interviewed) would immediately assume that something changed with workers rather than that the metric is behaving weirdly.


Thanks for adding some context. The "max" or 25 year view on that chart provides some perspective. Still unusual that there's been a recent decrease, but almost every measure of the economy has seen some really weird values in the past 2 years.


I think a chart of the job titles of the dead / disabled from COVID with numbers would be incredibly useful.

My niche has shot through the roof in demand, salary, and remote work, with no sign of turning back.

My only macro explanation is that 20% left the gig entirely.


The Washington Post is Bezos's mouthpiece - lower worker productivity hurts his bottom line so we have to suffer through his paper complaining about it.


Also Bezos would like to normalize the intense surveillance, speed focus, and corner-cutting that leads his warehouses to have injury rates 80% higher than the rest of the industry.

https://thesoc.org/amazon-primed-for-pain/


Always read the comments first.


usually it's the other way around


It’s almost never the other way around


Shut up and work harder nerd.


> To be concerned about the current level of productivity requires either the attention span or the intelligence of a goldfish.

It seems like the median journalistic piece assumes this of the reader anyways?


And we still have GDP growth and no real signs of recession.


That article was awful with just guesses left and right on causes, without even measuring how the number is calculated.


*mentioning rather.


> requires either the attention span or the intelligence of a goldfish

Good thing I have both.


Wow. Much needed context.


These things always really need a giant flashing neon note that "productivity" doesn't mean how much workers get done but how much money is made off of what workers get done. They're only loosely connected, and most productivity gains have come from workers having to "do less" to "make more".


Weird story: From the mid-1970s to the mid-1990s, as-measured productivity significantly declined and then stayed at a lower level. This was during the initial few generations of technological impact on industry, including "just-in-time" inventory which kind of requires computerization. Yet, at this same time, "bosses and economists" were seen in public wondering if computers weren't a net negative on industrial production.

In addition to being weirdly defined, productivity is, as the graph demonstrates, very unstable over the short term.

If you want a longer version of the graph in the article, see "The 1990s Acceleration in Labor Productivity: Causes and Measurement" from 2006 (https://files.stlouisfed.org/files/htdocs/publications/revie...), page 190 (10 of 22).


> "just-in-time" inventory

That really was a net negative eventually. Covid managed to completely wreck the worldwide supply chains because of that idiotic approach. God forbid anyone keep any buffer in case anything happens.


It's definitely not an idiotic approach, and companies do keep buffers. In the most advanced cases, probabilistic models are devised to estimate how big these buffers should be. Asking for companies to keep buffers for unpredictable "once in a century"-type events is unrealistic.
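To make that concrete, here is one common textbook way such a buffer gets sized (just a sketch; real supply-chain models are far more involved, and the numbers below are made up):

    from math import sqrt
    from statistics import NormalDist

    def safety_stock(daily_demand_stdev, lead_time_days, service_level=0.95):
        """Units to hold beyond expected demand over the replenishment lead time."""
        z = NormalDist().inv_cdf(service_level)   # ~1.64 for a 95% service level
        return z * daily_demand_stdev * sqrt(lead_time_days)

    # Example: demand varies with a stdev of 40 units/day, supplier lead time is 14 days.
    print(round(safety_stock(40, 14)))  # about 246 units of buffer

The rarer the event you want to survive, the higher the service level and the bigger (and more expensive) the buffer, which is exactly the trade-off being argued about here.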


Yeah, that's the problem: those estimates usually end up around zero, it seems. Sure, on paper it checks out as most profitable, and most of the time it also works in real life. But you end up with this Rube Goldberg supply chain machine that can't be stopped, or you apparently end up with a cyclic dependency problem and cannot restart your production.

We build structures to take a one in a ten thousand year flood or earthquake, but it's too much to expect corporations to keep more than 2 weeks of stock? Sure.


Efficiency argues for keeping those buffers as small as possible. In a series of "normal years", a business that keeps their inventory down to what they need in the "normal case" will be more efficient than one which keeps a larger inventory to handle more rare events. As a result, the former will out-compete the latter in the market. (Inventory, in this example, is just one "buffered" resource that needs to be managed correctly under differing circumstances.)

Efficiency, past a point, is therefore the enemy of resiliency.

Until that day that something bad happens.

And then you have an issue where one business may be prepared for the bad event, but something downstream of it is not; they can produce all the widgets, but can't ship them anywhere for example.


Also, to some degree, in events like these what inventory is needed is not predictable, because humans are unpredictable and irrational at times.

Panic buying of paper towels early on in the pandemic, as an example, was not predictable.


> Panic buying of paper towels early on in the pandemic, as an example, was not predictable.

What? It was entirely predictable, except maybe for those high-end executives who live in hotels and/or with full-time house service, and are thus completely detached from the lives of ordinary people.

The panic buying was limited to the most obvious category of goods: basic consumables with some degree of shelf life, prioritized by survival and then comfort. This means food (particularly canned, shelf-stable, or freezable - plus baby food and pet food - and ingredients, including flour, yeast and baking soda), hygiene (soap, toilet paper!, and - perhaps specific to the pandemic - hand sanitizers and masks), fuel, household cleaning (including cleaning agents and, surprise surprise, paper towels), comfort consumables (coffee).

It's not hindsight on my part - think of how people individually decided what to buy. They didn't buy whatever they saw everyone else buying. They just asked themselves: what do I eat? How do I keep myself and the household clean? What else do I buy on a regular basis? What do I need to maintain a semblance of my current lifestyle? The answers, prioritized, were what everyone then went to stock up on.


That there are downsides is not in dispute, but you’ve not shown that it’s a net negative. Perhaps it is, perhaps the shortages we experienced because of a very unusual event like COVID outweigh all the day to day advantages of having little slack in the system. But you haven’t shown that.


>From the mid-1970s to the mid-1990s, as-measured productivity significantly declined and then stayed at a lower level*

*in the USA.

The post-Bretton Woods era is one of globalization, with American jobs being sent overseas (to more productive labor forces).


> "productivity" doesn't mean how much workers get done but how much money is made off of what workers get done

This is so true. The amount of bureaucracy has actually increased. This makes every worker work more. But this bureaucracy is unproductive work, and thus does not lead to a rise in income (for the company).

E.g. my healthcare provider uses fax machines (yes, FAX) to communicate with insurance providers. Fax is asynchronous and without confirmation/tracking of the work done. Often, the fax is sent but the other side simply files it in a random place or forgets to process the work. So I (the patient) now need to follow up for weeks with the insurance and healthcare provider to check on the status of that fax.

This is unproductive work and yet, it is taking a toll on every individual involved in this process.


But it increases GDP! Yay!


It doesn't increase GDP if they get less productive.


Many things increase GDP without being good. For instance Oil spills increase GDP as suddenly a bunch more (cleanup) work is being done and paid for.


There's an old joke about an "economic hero" being a wealthy man going through an ugly divorce while dying of cancer.



It's even stupider than that: it divides this figure by a largely fabricated estimate of how many hours people actually worked. This is a SWAG metric that is largely made up. The commentary is most likely irrelevant.


This is not how productivity is defined or measured.


I'm really curious how you think it's defined or measured then. I'm obviously abstracting a bit, but a lot of people in the replies here seem to think it's related to how much time you spend watching cat videos on company time and it's definitely not that.


It has nothing to do with what I think. Another commenter has already posted the definition from the Fed (FRED).

Productivity is not a measure of prices, profitability or anything similar. It’s a measure of output per unit of input.

Classic HN downvote fest because people incorrectly disagree with a factual post.


You were probably downvoted because you made a curt statement with no backing arguments like in this follow-up post.


from FRED: "The efficiency at which labor hours are utilized in producing output of goods and services, measured as output per hour of labor."

The Solow residual is technically total factor productivity but is generally accepted as labor productivity. It's just an accounting identity that is estimated along with GDP and other vaguely useful but not very accurate measurements like the unemployment numbers.
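For anyone unfamiliar with the jargon, a rough illustration of the two measures being discussed, with made-up numbers (labor productivity is just output per hour; the Solow residual is the output growth left over after accounting for growth in capital and labor inputs):

    # Labor productivity: output per hour of labor (placeholder figures).
    output = 21_000   # e.g. real output, in billions of dollars
    hours = 260       # billions of hours worked
    print(f"output per hour: {output / hours:.1f}")

    # Solow residual (total factor productivity) in growth-rate form:
    #   g_A = g_Y - alpha * g_K - (1 - alpha) * g_L
    alpha = 0.35                            # capital's share of income
    g_Y, g_K, g_L = 0.025, 0.030, 0.010     # toy growth rates of output, capital, labor
    g_A = g_Y - alpha * g_K - (1 - alpha) * g_L
    print(f"TFP growth: {g_A:.2%}")         # 0.80% with these toy numbers

Neither quantity is a measure of profit; both are ratios of measured output to measured inputs.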


Then perhaps you'd like to enlighten us?


Workforce productivity at the national level is typically defined by some measure of output, the amount of goods or services produced (typically GDP), over some measure of input, the number of hours worked/workforce participation.


So, how much money is made off of how much workers work?


No, not at all: for example, you can be a non-profit and still contribute to GDP, since you're still creating economic activity. Heck, even what the government does contributes to GDP, and that's not making money for anyone.


I think this is exactly what parent meant. "How much is made" doesn't strictly mean "profit" in terms of a for-profit institution. The net output of a non-profit is directed somewhere, either internal or external to the entity, and that can loosely be considered "making money", or at least in the sense I believe parent meant.


GDP is denominated in dollars, so this seems to be a somewhat vacuous position - yes, that's how we measure economic activity, but it doesn't have to involve money changing hands.

Productivity is based on the value of the work done, not any profitability assessment. The original post which set off this chain asserted it was about not how much workers get done but how much money is made off of what workers get done ... which is unambiguously wrong.


> The original post which set off this chain asserted it was about not how much workers get done but how much money is made off of what workers get done... which is unambiguously wrong

I don't believe the difference is consequential here, since the originating point still holds even using your definition. I wouldn't say it's "wrong" so much as imprecise, as the way I interpreted the statement would encompass your more detailed description.

It's like when I ask people "how much money" they make, I intend them to include non-cash compensation in the number (in dollar equivalent), and pretty much all do without additional prompting.


Yes. My point was that when people read these articles they think of a much more casual definition of productivity that has more to do with a sense of "getting things done," but the word is jargon for something that has little to do with that.

I was playing loose with the jargon meaning for sure, but I'm pulling out to what articles in the Washington Post or other economics-focused media really care about: the impact to corporate bottom line.


So, how much money changes hands off of how much workers work?


No, it's a measure of a weighted average quantity of output of goods and services (not of money) compared with the quantity of labor input.


"Quantity of output goods" is measured in dollars, as mediated by the current price, no?


Not in general, no. They do count the dollars, but they also measure the dollar-to-quantity ratios of various goods and services. The final productivity measure is based on these adjustments.

So if the amount of money that is exchanging hands goes up but the amount of goods and services produced stays the same, then the measured productivity does not go up.

You may be thinking about how GDP is calculated, specifically regarding government employees. For this category of spending, the "quantity" measured for the dollars-to-quantity ratio is simply the number of government employees. So as long as the government is hiring more people, the money they spend on those people counts towards real GDP, regardless of what those people are doing.

However, government spending is not used in calculating productivity, which measures only certain parts of the private sector where it is possible to also measure output of goods and services instead of relying on measures like 'employee counting'.


If this is a good faith question, you can answer it by reference to any good economics text; or even, you know, Wikipedia. But it feels like maybe it's not.

Here's another example: you volunteer at a homeless shelter, where you serve food on a soup kitchen line. You have contributed to the GDP of the United States. By all means, feel free to fit this into your preferred framework.


What I'm doing is putting how you describe productivity into the form from the original comment. And I'll stand by that last version: How much money changes hands off of how much workers work?

The input to labor productivity is how many hours are worked, correct? And no one is measuring output in terms of the number of bowls of soup produced by homeless shelters; those are converted to dollars based on a current index price.

So you have economic activity, how much money changes hands, compared to labor inputs, how much workers work. Simple?


> And no one is measuring output in terms of the number of bowls of soup produced by homeless shelters; those are converted to dollars based on a current index price.

The Bureau of Labor Statistics does have multiple teams dedicated to documenting how price is related to quantity of output. They don't literally count bowls of soup at every homeless shelter, but they do document millions of price vs quantity measurements on a regular basis. This data is then used in the calculation of productivity.


Your fundamental confusion is that you keep equating economic activity with "money changing hands", which is wrong.


Ok, here's a question for you: back in ancient days, most women worked at home. Is taking care of your own children and your own household an economic activity as would count in GDP, for example? Is someone making a bowl of soup for their spouse different from someone making a bowl of soup at a homeless shelter?

"GDP measures the market value of the goods and services a nation produces. Unpaid work that people do for themselves and their families isn't traded in the marketplace, so there are no transactions to track. ... The lack of reliable data influenced the decision to leave household production out of GDP in the internationally accepted guidelines for national accounting." (https://www.bea.gov/help/faq/1297)

"Economic activity" that does not equate to "money changing hands" in some form, isn't an "economic activity" that counts for GDP or productivity, right?


Yeah I don't think this is true. GDP is an economic measurement, not some kind of intrinsic thing.

So you can make a beautiful thing for your home - not GDP. Make it and give it to someone - not GDP either.

Pretty sure money has to change hands, or in the case of government, we measure it as $ spent.


Indeed, the value of volunteering is not counted in US GDP.


This is just factually incorrect. Volunteering at a homeless shelter does not contribute to US GDP.


Perhaps people have had to expend more energy just keeping their personal life together in the last few years. People with children have had to deal with the constant school closings, childcare facility closings, etc. and that has taken its toll. They may have family members who got Covid or had treatment for other ailments delayed by the pandemic's rush to treat Covid patients. They could've experienced a huge shift in the switch to remote working in 2020, and are now expected to make another huge shift back to in-office working.

This doesn't even account for the incredible decline in civility from customers if you work a customer-facing job. The slightest inconvenience or mistake can end up in a tantrum by an American adult that only sometimes gets captured on video. And in the meantime, a bunch of people walk around opining that "Nobody wants to work anymore" as if they deserve to be waited on hand and foot regardless of circumstance.


During Covid, people were hiding in their homes, quarantined, with nothing to do but work.

This year, companies expect workers to return to office, despite little change in conditions, except now we have to deal with all of the above issues you've just mentioned, AND the fact that employees have now proven they can work remotely perfectly well.

It should not be surprising that, in a system where healthcare is tied to employment, productivity jumped while people were locking themselves in their houses to escape a plague, that it dropped afterwards, or that it might keep dropping given the complete callousness of our current system.


All of what you've both said, plus the number of people who thought or still think COVID-19 is "no big deal" who now have a pulmonary deficiency and long-term mental fog.


Indeed.

In support of this—part of it!—here’s one paper of many: “The Neurobiology of Long COVID” by M. Monje, neurobiologist at Stanford, and A. Iwasaki, immunologist at Yale.

https://doi.org/10.1016/j.neuron.2022.10.006

People with measurable neurobiological issues will show a measurable productivity drop.


Prevalence of "Long COVID" is hard to measure at this point, but based on symptom reporting, about 15% of people who test positive for COVID have some symptoms more than 2 months later.[1] More data is available for people who have had heart and brain MRI scans, which show inflammation. It may be possible to diagnose this with a special eye exam that detects inflammation in the eye's blood vessels. The tests are still very experimental, but there are objective measures of damage.

[1] https://jamanetwork.com/journals/jamanetworkopen/fullarticle...


In support of this, I want to mention that there are known and measurable biomarkers that connect reported symptoms to undeniable physical symptoms.

Low cortisol is very common. That’s just one example. With low cortisol it’s hard to feel good or function 100%.

One paper of many that show this kind of relationship (a preprint): “Distinguishing features of Long COVID identified through immune profiling” – https://doi.org/10.1101/2022.08.09.22278592

It’s also important to note that there was a study that seemed to show that Long COVID was just anxiety or in people’s imagination. Because people who said they had Long COVID didn’t have SARS-CoV-2 antibodies. The thing is that not everybody seroconverts. Not everybody actually produces antibodies after infection; We simply can’t exclude Long COVID diagnoses based on looking at those particular antibodies. So that study can’t show what it has been said to show. (I can provide sources for this but am out of time. It’s on my honor.)


The best explanation I've seen thus far, assuming there aren't other complicating factors (which there often are), is that the immune system stays hyperactive in some people after infection. This causes excess production of IL-6 and IL-10, which in turn causes something called the kynurenine shunt. That in turn results in lowered central serotonin levels in the brain and heightened glutamate. This imbalance results in lowered dopamine, as the production of serotonin and dopamine are comingled. Dopamine usually breaks down to your typical epinephrine and norepinephrine levels. But since dopamine is at a lowered state, you get less epinephrine and norepinephrine. I'm guessing, but do not know the pathway, that lowered epinephrine and norepinephrine explain the lowered cortisol.

Either way, the treatment should be the same. Take a corticosteroid to lower the IL6 & 10 levels, ensure the patient is getting proper sleep and nutrients, and not doing anything to exacerbate lowered serotonin/dopamine levels, and cross your fingers.

I've started to hear of companies that are testing for elevated cytokines as proof of long covid, and my family member's hospital has started educating their staff on cytokine release syndrome in the rare patient who has a reaction to a vaccine shot, both of which, for me, imply the industry might be moving in this direction for an explanation. Fingers crossed.


I mean, it's such a small percentage; do you really think it would show up in national data?


What is such a small percent and where are you getting your data?


"Long COVID" and medical journals.... Even the highest estimates only have it as a very small percentage of symptomatic patients which is already a subset of the general population.


I've been reading that 10 to 20% of people who test positive have symptoms for at least a month, and as many as 5 to 7% still do three to six months later. The US workforce is around 160 million people. About 60% of the population is estimated to have had SARS-CoV-2 at least once.

60% of 160 million is 96 million. So 5% of that is 4.8 million people or so with at least a solid portion of a year of symptoms.

There's mounting evidence that some of the damage could take years to recover or even be lifelong.

The current word from the Mayo Clinic is about 20% of folks have what they're calling Post-COVID Syndrome between one month and one year after an infection, with at least one symptom likely from the viral infection. https://www.mayoclinic.org/diseases-conditions/coronavirus/i...

If we look at 20% of 60% of the workforce that's around 19 million people symptomatic for over a month after the initial positive test. Twelve percent, almost one in eight people in the workplace.
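Spelling out that back-of-envelope arithmetic (the percentages are the ones cited above, not my own data):

    workforce = 160_000_000
    ever_infected = 0.60 * workforce               # ~60% infected at least once
    print(f"{ever_infected:,.0f}")                 # 96,000,000

    symptomatic_months = 0.05 * ever_infected      # ~5% with symptoms at 3-6 months
    print(f"{symptomatic_months:,.0f}")            # 4,800,000

    post_covid = 0.20 * ever_infected              # Mayo's ~20% "Post-COVID Syndrome"
    print(f"{post_covid:,.0f}")                    # 19,200,000
    print(f"{post_covid / workforce:.0%}")         # 12% of the workforce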


My understanding from other articles is that 1 in every 3 people currently unemployed cite long covid as the primary reason for their inability to return to work, though that data may be a few months old now.


If you change it instead to be a subset of the working age population, does the percentage look higher?


The difficult part of it is: for some people it is literally nothing. We've had it four times now since 2020, even though we're properly vaccinated and as careful as life permits.

It’s a bit worse than a cold but much better for than the flu. So, yes, for us life just goes on with COVID. No need to change anything.


That is a problem. It's a dice roll each time a person is infected depending on a huge number of factors including what strain they've been hit with. Reinfections can give worse odds each time. People who got lucky once or twice before might be more careless thinking their luck will continue and end up screwing themselves.

https://www.webmd.com/lung/news/20220707/each-covid-19-reinf...


"Reinfections can give worse odds each time."

Not necessarily worse, but there is cumulative damage.

In the UK, about 7% of COVID cases that get medical attention are now re-infections. That number increases with time; it was only 4% last February. Immunity from getting COVID is time-limited and not that long. One study said "time between reinfections ranged from 90 to 650 days, with the average being 343 days".

Over a decade, this might gradually debilitate most of the population.


I'm hoping that over the coming decade we get better defenses. Improved vaccines, treatments, and a greater understanding of the virus could prevent a lot of that harm. It hasn't been long enough to see what the long term consequences of infections or even our vaccines and current treatments will be for that matter, but I'll stay up to date on vaccines and continue taking sensible precautions against catching or spreading the virus. It's the best I can do to help my odds given what we know currently.


There are many, many scientists who are attempting to warn us that it isn't nothing. The literature is piling up.

Paper: “Immunological dysfunction persists for 8 months following initial mild-to-moderate SARS-CoV-2 infection” – https://doi.org/10.1038/s41590-021-01113-x

Paper: “Excess risk for acute myocardial infarction mortality during the COVID-19 pandemic” – https://doi.org/10.1002/jmv.28187

Paper: “p53/NF-kB Balance in SARS-CoV-2 Infection: From OMICs, Genomics and Pharmacogenomics Insights to Tailored Therapeutic Perspectives (COVIDomics)” – https://doi.org/10.3389/fphar.2022.871583

SARS-CoV-2 directly and indirectly interferes with p53 expression and balance.

On p53: “p53, cellular tumor antigen p53 (UniProt name); p53 proteins are crucial in vertebrates, where they prevent cancer formation. As such, p53 has been described as "the guardian of the genome" because of its role in conserving stability by preventing genome mutation. Hence TP53 is classified as a tumor suppressor gene.” – https://en.wikipedia.org/wiki/P53

The literature goes on and on and on. People really are not okay after contracting this virus. I know plenty of people who simply are not recovering after COVID illness. Friends. Close family. Kids.

We are being warned.

And crucially: We can clean this crap out of the air. Nobody has to breathe in SARS-CoV-2. We absolutely must demand that something is done. There’s lots and lots that can be done.


All of these seem to say the risks of complications are extremely rare? I mean it's endemic at this point so it's really really shitty if you are one of the unlucky few but what can we even do?

> We can clean this crap out of the air. Nobody has to breathe in SARS-CoV-2.

?? What do you even mean. Maybe if the vaccine prevented spreading the virus we could but until we develop that I don't see how that is possible.


With SARS-CoV-2, complications are more common than death, and it’s still among the top reasons why people die.

Re. cleaning it out of the air: It’s literally airborne and it can literally be cleaned out of the air.

When infected, people emit tiny fluid particles which contain the virus. These are so small that they don’t fall to the ground. Filter materials like in HEPA filters and respirator masks strip these particles out of the air. Very effectively. (Surgical masks don’t work. Same kind of material, gaps at the sides. It doesn’t work.)

We can literally clean the air. We do not have to breathe this virus in. That’s just one of the things we can do.

Paper: “Effectiveness of HEPA Filters at Removing Infectious SARS-CoV-2 from the Air” – https://doi.org/10.1128/msphere.00086-22

Paper: “The Removal of Airborne Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2) and Other Microbial Bioaerosols by Air Filtration on Coronavirus Disease 2019 (COVID-19) Surge Units” – https://doi.org/10.1093/cid/ciab933

“Ten scientific reasons in support of airborne transmission of SARS-CoV-2” – https://doi.org/10.1016/s0140-6736(21)00869-2

Why am I saying all of this?

Because I have a moral obligation to do so.

Why do I know this?

I don’t know anything. I’ve just read those scientific papers.

Why did I read them?

Because high-risk pregnancy.

But does it apply outside of high-risk pregnancy?

Yes.


A national program to make every indoor space have proper ventilation / air purification.

It's shocking how much this can help with other things e.g. asthma or allergies.


I would not be surprised if, were you to throw this much research time at the normal flu, you would find similar things.


People do study that. There's known to be a significantly increased risk of heart attack and stroke in people who have recovered from influenza, but that risk returns to baseline after about two months. The big risk is in the first 7 days.[1]

[1] https://www.nejm.org/doi/full/10.1056/NEJMoa1702090


There's been a serious research effort against influenza for a century now.


I said all along it's not a big deal. Got my vaccine as soon as it was ready but didn't do anything else to avoid getting it. Now it's 2022 and I've never had COVID, or maybe I just didn't notice it. So, yes, not a big deal for me.


I'm not 100% sure if this is satire but it brought me joy.


good thing there's the safe and effective vaccine tested on mice and N number of boosters.


HN aside, most people don't have jobs that you can do more of at home than at the office / factory / lab / school. Heck, with school it's quite obvious that "teaching hours delivered" sustained during pandemic, but "education learned" dropped by probably half.


I think you’re really on the right track with caregiving, and would add the blind push to force people back into offices without any recognition of the costs of those policies (or, often, perceptible benefits). Going into the office is fairly expensive in any case but it especially pushes parents towards needing daycare and aftercare services which were already expensive before the pandemic and became more so after a non-trivial number of providers found other jobs, became too sick to work, died, or decided the health risk wasn’t worth it after seeing that happen to other people. Our local parents group has had stories about people choosing not to go back to professional jobs because the employers insisting on RTO weren’t paying enough to make up for that, especially if they weren’t accommodating when someone’s schedule is disrupted.


You're right about daycare/school closings. Even now, essentially post-covid, if our two-year-old gets COVID, that's a 10-day quarantine from daycare.

That's 10 days where one of the parents has to work from home and be horribly unproductive because they are watching a child at the same time. And you can get COVID repeatedly. It often comes from the daycare itself, but also from a sibling who is in school. Even with full, boosted vaccinations, they can still catch it. They don't get very sick, but they have to quarantine.

It's unsustainable.

I'm sure that's not the only cause, but it's definitely a factor.


This is exactly why many people intentionally avoid testing themselves or their children. If you don't have a positive test then officially you don't have COVID-19 and can continue your normal life (symptoms permitting). (I'm not claiming that this is a good practice necessarily but it's what most parents do.)


They could do home tests though. But PCR would be “on the record”


It depends if you are willing to lie about it. I won't lie.

But the huge, unavoidable disruption that comes from a positive test result certainly doesn't encourage frequent at-home testing. I think at this point most daycare parents test when the school tells them to, and don't test otherwise.


Not only that, but sometimes even the threat of an outbreak can hamper the availability of childcare. Last winter, a few staff members were exposed to a close contact, so they held them out of work as a precaution - but that resulted in one of the rooms having to close for a week.


I assume the parents and staff at my daycare just have a collective unspoken agreement to not test for covid.

The remaining Covid policies are stupidly inconsistent anyway.

RSV is far more dangerous to children, but that is allowed to go unchecked. Hell, you have to pay $250+ just to get tested for it. Rhinovirus/influenza/norovirus/rotavirus/other coronaviruses are all OK, with kids leaking from both nostrils in the classroom.

But one kid or adult gets Covid and things have to close? Covid tests are paid for by government, but testing for all the other viruses costs hundreds of dollars? What a farce.


> Covid tests are paid for by government, but testing for all the other viruses costs hundreds of dollars? What a farce.

Could you imagine if there was some test that showed you all the viruses that are circulating in your system at any given moment? If we applied the same rules as we do for covid to such a test, people would literally never be able to leave their house...


> Covid tests are paid for by government, but testing for all the other viruses costs hundreds of dollars? What a farce.

None of the other viruses have caused a pandemic, and changed the course of human history (except influenza, but that was almost a century ago).

Have people literally forgotten that entire countries' health systems were overwhelmed by this virus? Seems like it would be prudent to keep infected individuals out of our public school systems for a few more years.


RSV is overwhelming some hospitals:

https://www.reddit.com/r/medicine/comments/ydhvbs/how_are_my...

Although, the whole hospital volume thing is now wrapped up in pay shortages and the rationing of healthcare, so I don't really know how much the spread of the virus is contributing versus how much the lack of resources being put towards healthcare, specifically pediatric healthcare, is contributing.

> Seems like it would be prudent to keep infected individuals out of our public school systems for a few more years.

This is not possible unless you quarantine toddlers and young kids from each other for years, and I am not willing to pay the price of my kids’ development.

Not to mention that parents need to earn incomes to shelter and feed their kids.

Also, I have had 5 Covid vaccine injections, my kids have had 2, but there exists no RSV vaccine. And RSV is specifically bad for kids, causing a fever almost every time. Given the lack of voting power for those affected, I imagine there is not much political will to spend money on expediting R&D for an RSV vaccine. Just like how pediatric healthcare gets reimbursed at 60% to 75% of adult healthcare.


> I am not willing to pay the price of my kids’ development.

I mean, sooner or later the people working in schools are going to be fed up with parents' constant refrain of "The whole world needs to sacrifice their health so I don't have to keep my kids at home for an extra week."

Somewhat similar to the healthcare workers' issues you mention in your comment.


Between pinkeye, RSV, influenza, hand foot and mouth, it's just one more thing your kids can get at daycare.


sounds like washington state lol. one of the reasons we moved from there.

now at my kids preschool, if a kid gets sick they stay home, if they are better the next day they come back. no PCR tests, no missing 2 weeks if any member of the family was sick, its great


Would you be willing to share the preschool? I would like to avoid sending my kids to a preschool that pretends asymptomatic spread of diseases doesn't exist.


“Expected to make another huge shift back to in office working”

Well, you kids have fun.

sound of me closing the door in my pyjamas with a nice hot coffee in my other hand


Yes. I will WFH just for the perk of having good coffee!


[flagged]


Nobody asked you to make a dickhead remark


>This doesn't even account for the incredible decline in civility from customers if you work a customer-facing job.

Not sure if you're in a customer facing job, but I've personally noticed that there seem to be noticeably fewer Karens now compared to pre-pandemic levels. (There was a spike during the first lockdown, but that died down within a few months.)

Everyone seems to be used to random disruptions now, and I think all of the campaigns about retail worker abuse have really made customers stop and think.


Let's not discount the people who populate the reddit /r/antiwork forum. Nothing like wasting oxygen the rest of the world needs.


I never heard about them until I saw that news interview. What a wild ride.

https://www.youtube.com/watch?v=3yUMIFYBMnc


antiwork started off good but quickly went downhill.

I abandoned it some time ago because it is now a pro-union echo chamber.

I once advocated that serious changes to the labour laws are what is needed, and was inundated with "join a union" posts.

It really speaks to just how out-of-touch these people are.

Unions only care about large corporations because they can get a lot of members and union dues. The issue with this logic is that small businesses are often totally out of the unions' reach, while laws benefit EVERYONE. The other issue they overlook is that union contracts only apply to those in the union and can change greatly from one union to the next.


I'm not unionized (I cannot because none of them are purely work focused here and I deeply disagree with their other views), but here is an observation from Norway, 20 years ago:

AFAIK, unionized companies in Norway were statistically more profitable than non-unionized ones.

This might of course be because it is more tempting to unionize when there is a lot of money to be had, but I remember one extra detail:

In between (friendly) ribbing I also remember the union people here being focused on working efficiently so that our bonus would increase :-)


> I once advocated serious changes to the labour laws are what is needed, and was inundated with "join a union" posts.

Strong unions are how you get changes in labor laws, and how you get collective action.

If we could all just magically come together without any organizational structure in order to effect change in the government and laws we would have already gotten it done.

There isn't a way out that doesn't involve organized collective action and something that looks very much like unions. Individualism doesn't work to effect the kinds of changes that are necessary.

People were inundating you, telling you that you needed a union, because what you want is a magical pony, and unfortunately you're the one that is out of touch (although in another sense you're not at all out of touch, since most people believe that there simply MUST be some way to effect change without collective action--which continues to fail, but nobody will give up on the dream).


You want everyone to join in unison for a collective effort to change the laws for everyone, right?


I just don’t get why you wouldn’t choose someone a little… brighter to represent you.


IIRC most users and moderators of /r/antiwork were opposed to anyone doing interviews on their behalf. The person who did just unilaterally decided to do it anyway.


I did not take a position on unions one way or the other on my post there.

As i said, i just found it shocking that "join a union" was their only response.

Labour laws impact everyone, union contracts don't.


Unions have bloc power to influence laws.


I think part of the issue is that it had its reputation destroyed almost as quickly as it was built. It was fascinating to watch it unfold.

I do not believe for one second that the person selected for this interview was a good representative of the cause (although I am sure there are some goofballs who can't seem to present a coherent case even on the hostile territory that is Fox News). I did not follow the story that closely, but I believe the person who took the interview got serious blowback from their reddit friends.

The sad thing is a lot of people will form their entire opinion of it based on this one interview. To me it is OWS all over again (I do not count BLM as that, since it had the Democrat political machine, and later the business machine, backing it).


Nobody wants to work anymore, am I right?


I'm pretty sure I know why...

The working environment changed. Bit by bit I moved from an environment with gentleman's agreements and so on towards something in which everything was codified, safetyism became rampant, etc.

In the UK over the past few years I went from going into a nice office in a number of beautiful old buildings, getting my stuff done with camaraderie, having lunch together and playing board games, having a laugh in the board room, perhaps a pub visit after work - to sitting alone at home for 8 hours staring at a screen.

So I quit. My productivity - gone. And I fear it's never coming back unless that environment comes back; because I can't effectively function at a "job" with zero human interaction for an entire day, I've had to replace traditional work with other activities.

But is it just me? My friends in retail and other low skill jobs - half the workforce seems to have disappeared so they're all being asked to do far more than is realistically possible. My friends who are in education - the entire job changed, no more giggling and laughing children, you were playing a video game with half the class absent. Perhaps they're back now - but they're dysfunctional because their development was neglected for years. My friends who are in law - the entire job changed, no more travel, no dressing up, sit at home with a screen. My friends who are in medicine - christ, let's not even go there, eh?

I can't speak for those people. But I know that I need a reset, because this "new world" is one I'm just not built for.


> towards something in which everything was codified, safetyism became rampant

Those "gentleman's agreements" were not that great if you happened to be a woman, gay or any other minority. The upsides to HR, employment regulation and so on has been making the office a far better place to work for a lot of people.

While I agree that there are rampant problems in a lot of sectors, from low skilled to medical, there have been some wins. My team went fully remote for 2 years and now most people still work 3 days a week from home. We were able to build up a large and talented team during the lockdowns with most of my co-workers and those I manage 3 timezones away. We adopted more flexible working hours and we've never been happier. My manager can take time in the morning to get his kids to creche and I can take a longer lunch to check in on elderly relatives. I no longer spend 2 hours a day stuck in traffic. Our productivity has skyrocketed. We may be privileged tech workers, but the change in work styles has definitely boosted our company as a whole.


> Those "gentleman's agreements" were not that great if you happened to be a woman, gay or any other minority.

You just say that like it's a fact, but... why would that be the case?


Because one can break a gentleman's agreement with people of lower social status. Also, because low-key bullying of, or lying about, people with lower social status is acceptable among gentlemen.


Again, you say that like it's a fact. Gentlemen's agreements can be broken with anyone. Low-key bullying or lying is not acceptable toward anyone, for gentlemen. You seem to have a misunderstanding of what "gentleman's agreement" means. It's not limited to upper-class men, or men at all. Everyone can have gentlemen's agreements with anyone; it means "two or more people who implicitly agree on a thing".


But what about the so on?


[flagged]


I feel like this is a very real concern? In general I'm in favor of playing fast and loose with rules, and don't put a lot of stock into codifying anything because I feel like it's overall a huge negative when you're trying to get things done.

That all being said, this particular arena is one that's so fraught with tiny little edges that all stack up to benefit certain classes of people unequally that it unfortunately feels necessary to be explicit and precise. Ultimately this power is a zero-sum game, and a more equal world means we must take power from some people and give it to others. That's almost never a thing that happens voluntarily, even if the will is there. It's very easy to argue for a status quo that benefits you, especially if the advantages you gain are easy to lie to yourself about.

“It is difficult to get a man to understand something, when his salary depends on his not understanding it.” ― Upton Sinclair


I totally agree. Where I work, in-person work became optional. Strangely, everyone 35 or younger decided to work from home (most of them don't have kids, which would be the one decent excuse), but the older people mostly come in. At 34, I'm sort of in-between, but enjoyed bantering with people my age or younger.

So I get the worst of both worlds: my boss can still come into my office at any random time and bug me about whatever, but I don't get the social life or the ability to bounce ideas off each other when designing a new system. All-online communication simply does not work for creative or complex tasks.

The youngsters get practically nothing done -- I worked with them for several years before the pandemic, so I know for a fact that their productivity specifically went down 90% -- and guess who gets to pick up the slack? It turns out that even relatively motivated PhD students actually need in-person accountability and direction, or they just spin their wheels at best, or goof off at worst. No matter what excuses they make, it's not good for them in the long run, since it will be reflected in their CV. I'm not against fun, even during work hours, but you have to get the job done.

It's a medical research institution with a small clinic, so there are, as you suggest, additional issues there. I think science, in particular, requires in-depth, in-person conversations and that is where most of the really good ideas come from.


This has been my experience as well at 32. The commercial real estate firm I work at has a 4 days in the office policy, so we have a fairly robust social atmosphere. You can't design a building on a webinar, you need to sit together in a conference room, roll the blueprints out on the table and point to things, sketch changes, review pro formas.. it can't be replaced digitally.

The young people we're getting are like they're from another planet. They think it's fine to come in late and leave early every day, they only do the bare minimum of work assigned and show zero engagement to help the firm beyond the scope of their assigned tasks. They're all coming from colleges that were remote or jobs that were work from home. How can you learn as a young professional in a work-from-home setting? You need to sit in on meetings, phone calls and discussions, you need to absorb the whole office around you, not just sit alone at your computer.


> You can't design a building on a webinar, you need to sit together in a conference room, roll the blueprints out on the table and point to things, sketch changes, review pro formas.. it can't be replaced digitally.

just curious, why do you think that is? I spent several years working on project sharing and data visualization features for an architectural CAD program. you could show/hide/recolor all your objects on the fly to emphasize key details, sketch on top of viewports, play around with a clip cube in 3D (personally fixed a lot of bugs with that one...), and sync all your changes back to the main project file to share with colleagues. some of these features were a bit rough around the edges, admittedly, but I always got the impression they were pretty popular with our customers. I'm a little stunned to hear that you and your colleagues just want to print out a couple viewports and look at them on a piece of paper.


I agree, and adding to this another thing I find now is it's very very hard to spot young talent now if they're fully remote. In a team of 100+ people how can I remember who's performing well, who's good at certain tasks etc when all you are to me is a muted microphone and an avatar with no camera showing. Working in an office with others post COVID isn't about what we're asking you to do, it's about what you bring to the table and offer others. Communication problems among the younger workers (I mean lets say under 25 who've hardly worked in an office before because of covid) are a real problem now. Some don't know how to act, how to behave, how to muck in and be part of a professional workforce with people who have 10, 20, 30 years more experience than them.


>It turns out that even relatively motivated PhD students actually need in-person accountability and direction, or they just spin their wheels at best, or goof off at worst.

You don't have the data to show that in-person is necessary. And there are plenty of remote-only companies that make it work. It sounds like you've decided that there can only be one possible solution and then given up. That kind of thinking might explain a whole lot about the situation.


> But is it just me?

No. I go into the office every day because I feel the same. I need to get out of the house and see people. (It is OK if you, reader, do not feel this way! People are different!) There's a handful of other folks who come in every day. I'm starting up a project this week to revive our office culture a bit, to try to spring back from the COVID devastation.


I think a lot of us who prefer working from home also like (good) offices better than WFH, just not to the tune of hundreds of dollars and tens of hours lost per month.


Bus pass here costs $90/mo and I get to spend ~80 minutes per day reading books. My office is across the street from the library, it's pretty great :) It's no accident my house is next to a bus line, that was a major factor when we were buying.


I wouldn't want to force you to stay home, just don't force people to go into the office :)


Where did I say anything like that?


Hello fellow space traveler.

I appreciate your message. Good luck.


I've been working from home since about 2017. I kind of echo this. In the beginning it was nice, I had a HUGE productivity boost. After a while I became disconnected from everything. The other day I realized the only adult I talk to regularly in person is my wife. Sometimes on weekends I talk to a store clerk. Zoom meetings don't fill that missing spot, it's just work, and I have very little personal connection with any of my coworkers... Days and weeks and months seem to merge.

I'm an introverted person, I like solitude. But I guess all things in balance, it would be nice to talk to another adult once in a while about something that's not work.

I got really into Crypto, because Crypto seems to have a heavy focus on community. It was fun to fly to NY and talk to others in the community. But then I flew home, and the spot was missing again.


Exact same boat dude. I miss the before times a lot and still cannot believe how we got to this position.

Still trying to figure out what's next for me. This WFH shit is gonna stick around in our industry for a while... and it just isn't compatible with how I function. I have no clue what to do next.


I'm sorry to hear that. It really is difficult.

Just wanted you to know that you're not alone.


I don't consider myself particularly outgoing, but I'm glad to be back in the office 4 days a week. I really enjoy collaborating in person over a laptop or whiteboard, and shooting the breeze with my colleagues.

That said, I don't have an overly long commute, so I totally get it for those that are able to be more productive due to not spending hours in the car every day.

I notice that when I'm working from home, I'm much more likely to goof off and be less focused. Plus my wife being there is always distracting me with one thing or another. I'm pretty sure I'm quite a bit less productive working from home.


> Plus my wife being there is always distracting me with one thing or another.

More generically: young kids require attention, then we grow up and most adults desire attention but only get a little.

Even very high status individuals often seek attention (Elon & sinks?). I wonder how much of our status economy is about getting attention?

Giving great attention can be quite the aphrodisiac.


I love that this comment defines productivity as lunches, jokes, playing board games, and going to the pub. Though, I agree all those things are more fun than sitting at a computer.


Productivity went up during COVID. It is going down now that we're making people return to the office.


You have a very diverse group of friends!


The very important context is that productivity sharply rose in 2020, in the middle of the Covid recession. The recent drop in productivity is mostly just a return to pre-pandemic baseline.

The simplest explanation is probably just that the workers who were laid off during Covid were the least productive, and so average worker output went up, then fell as they re-entered the labor force. Or maybe there was some sort of aberration in how the complex statistic of productivity was calculated (e.g. inflation actually arrived earlier than measured by CPI, and output was deflated incorrectly).

Either way, this is much more likely a pandemic related disruption and return to normalcy, rather than an indication that anything fundamental is "broken"
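
To make that composition effect concrete, here is a minimal sketch with entirely made-up per-worker outputs; nothing below comes from the BLS data, it just shows how the average can swing when the least productive workers leave and later return.

    # Hypothetical illustration: average output per worker when the least
    # productive workers are laid off and later re-enter the labor force.
    workers_2019 = [50, 60, 70, 100, 120]   # output per worker, made-up units
    workers_2020 = [70, 100, 120]           # the least productive were laid off
    workers_2022 = workers_2019             # back to full employment

    for year, outputs in [(2019, workers_2019), (2020, workers_2020), (2022, workers_2022)]:
        print(year, round(sum(outputs) / len(outputs), 1))
    # 2019 80.0, 2020 96.7, 2022 80.0 -- the "drop" is just a return to baseline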


Maybe an even simpler explanation is that productivity is roughly calculated as GDP / employed workers.

Productivity almost always goes up in a recession, especially one accompanied with massive layoffs (like early COVID).

Productivity is going down right now compared to 2020 because we are pretty much at full employment.

These productivity measurements aren’t really tracking individual productivity at all.


I’m not sure if this is your point, or just that it’s population level, but there is certain to be lag in the measure. I.e. if I contribute $10B to GDP in Q3, and lay off 10% of the workforce, my productivity looks great, but a large portion of that contribution will stem from work done by a larger workforce in Q2, Q1, and before, perhaps well before.

So the measurement during periods of recession or expansion will always be artificially elevated or suppressed.


Are you sure it isn't the design of the GUIs in tools HNers have vague insights into?


I'd argue it was also due to lack of alternative options for leisure time. When everyone was stuck at home there's only so much Netflix the avg person can watch. People likely spent some extra time on work because it was something to do.

Now with everything reopened there are many more options.


There is a similar effect in some countries with strict labor laws which make it difficult to fire employees. Unemployment rates end up being high, but productivity is also high because companies are only willing to hire the most productive workers and won't take a chance on anyone else.


Similar things happened during the Great Depression. Many things made in that era before the war are highly sought after since the craftsmen who had jobs at the time were often among the best in their field.

Musical instruments from the 30’s are legendary.


Yale’s new employee tour was like that, too. They walked us around campus and one of the things they highlighted was how much nicer the Depression-era stuff was than the decades before or after because the amount planned suddenly bought a lot more and they apparently had a windfall in the form of top-notch Italian stone workers who’d immigrated because the job market was even worse back home.


Note that this is like, how every recession works -- we lay off people, and productivity spikes. The fact that we did the same thing during a government imposed recession shouldn't be surprising at all.


hours worked went way down but dollars made stayed the same because of the handouts. The productivity metric had nowhere to go but up.


Did you even bother to click on the article before typing your opinion of what happened?

Their chart shows we are far below any sort of pre-pandemic baseline.


it reads "annual percentage change in labor output"... which I take to mean "rate of change of per capita productivity" over time, not actual measured productivity.

So that graph does not say we've dropped below pre-pandemic levels.... according to this report (https://www.bls.gov/news.release/prod2.nr0.htm): "Output and hours worked in the nonfarm business sector are now 3.1 percent and 1.5 percent above their fourth-quarter 2019 levels, respectively."


Not OP, but your reply comes off as pretty rude, especially as I think it's pretty off base.

The chart is showing "annual percentage change in labor output", not "gross productivity", which I think supports the OP's point.

If it went up 10% in 2020, and 6.3% in 2021 (or whatever the graph is showing), just because there is a -7.4% drop in 2022 doesn't mean it's "far below any pre-pandemic baseline".

In fact it's probably ABOVE pre-pandemic levels, even with the drop. I can't be certain from the graph.
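
A quick compounding check, using those hypothetical percentages rather than the actual BLS figures, shows why a big one-year percentage drop can still leave the level above the 2019 baseline:

    # Hypothetical annual changes in the productivity level (not the BLS series).
    changes = [0.10, 0.063, -0.074]   # 2020, 2021, 2022
    level = 1.0                       # index the 2019 baseline at 1.0
    for change in changes:
        level *= 1 + change
    print(round(level, 3))            # ~1.083: still about 8% above the 2019 baseline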


The chart only shows change, not the actual value. If you look at a chart of the values you'll see that it's true we've just returned to ~2019 values: https://tradingeconomics.com/united-states/productivity


Perhaps people are at the point of revolt over the ever widening productivity vs pay gap?

https://www.epi.org/productivity-pay-gap/

Productivity kept on climbing and wages stagnated post ~1980

Time for the workers to reap some of that benefit.


It could also be a generalized disillusionment with the system. I believe what pushed Americans through for generations was the American dream. No American believed themselves to be poor; they were all simply "future millionaires".

But what if people realized that it was only a delusion, that it can never be that everyone is rich, because then who would do the dirty jobs? There is no social pyramid without a base; this system is literally designed to have a class of poor people forced to do shitty jobs to survive.

If you take away the hope of a wealthy future, there are no reasons left to slave away your life on a corporate ladder.


consider that small business people had done their daily things for thirty years, not been chatting on the Internet; many of those local biz people relied on walk-in customers, and many of those local biz people are part of the Boomer generation. Those people paid their bills and participated in the general economy.

At the same time, corporate outsourcing reached epic proportions, with the associated transfer of power in the HR and Exec realms.


> https://www.epi.org/productivity-pay-gap/

The history of that EPI report is useful to know. Early versions of it showed a large gap between rising productivity and stagnating employee compensation.

Some critics then pointed out problems with the analysis [1]. The report performed an apples-to-oranges comparison: the productivity of (A) all non-farm workers, adjusted over time with (B) a GDP deflator-based inflation index, was compared against the compensation of (A) a limited subset of employees, adjusted over time with (B) a CPI-based inflation index. A more useful comparison would use the same inflation index for both data sets and would exclude from the productivity measure the workers that are also excluded from the compensation measure. When this is done, productivity and pay rise nearly in lockstep, effectively refuting the majority of the point that the original report was trying to make.
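
To illustrate the index-mismatch issue with an admittedly invented example (the growth and inflation rates below are not from the report), deflating the same nominal series with a GDP deflator for one line and the CPI for the other manufactures an apparent gap on its own:

    # Invented series: identical nominal growth deflated with two different indices.
    years = 30
    nominal_growth, gdp_deflator, cpi = 1.04, 1.02, 1.03   # all assumed rates
    real_productivity = (nominal_growth / gdp_deflator) ** years
    real_compensation = (nominal_growth / cpi) ** years
    print(round(real_productivity, 2), round(real_compensation, 2))  # ~1.79 vs ~1.34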

Since then, the EPI report has been updated to be more nuanced, which can especially be seen when one expands the 'click here for more...' sections. The new conclusion from the report is that productivity and compensation have been increasing primarily for a subset of workers, while a broad subset of workers have not seen large productivity and compensation growth. This is true (as far as I can tell), but it's also a different story with different policy implications than the original story, which implied diverging productivity and pay within the same set of workers.

[1] https://www.heritage.org/jobs-and-labor/report/workers-compe...


It's supply and demand - the supply of workers has doubled over the last few decades while demand has remained roughly the same. I think it's a miracle that salaries are as high as they currently are.


This is actually contrary to Econ theory and empirical data. See https://news.ycombinator.com/item?id=33394486

The trend is towards less immigration and thus lower demand for goods/services and lower supply of labor.


I think he is mainly talking about women entering the workforce since 1970 or so, which massively increased the labor pool.


Assuming that's true--and I'm not sure it is--mass retirement of baby boomers, which has already begun due to forced retirement during the pandemic, is going to absolutely decimate the labour supply, which has enormous knock-on effects (including a rise in inflation).


You can go back to the year 2000 and find Fox News and CNN talking heads warning us about the impending doom of baby boomers retiring and taking the economy with them. Any year now...

What has actually transpired in the meantime has been record breaking bailouts, corporate handouts, and profits, while workers pay remains in stagnation and housing market inflation goes through the roof (because of slow development, IMO, attributed to NIMBYism, mixed with a nationwide inability to build densely or build public transportation infrastructure).

edit/ And let's not forget, there have also been 2 disastrous, major wars, one of which inarguably never should have occurred.


> You can go back to the year 2000 and find Fox News and CNN talking heads warning us about the impending doom of baby boomers retiring and taking the economy with them. Any year now...

Yeah. And now it's happening.

Back in 2000 baby boomers were roughly 35-55, far from retirement age.

Baby boomers are now roughly 55-75, and after COVID forced a ton of them out of the labour force, they're choosing not to come back.

See, you can report about a thing that's likely to happen in the future before it actually happens, and in the intervening period, while it may not be happening, that doesn't mean the reporting is wrong.

Or are you also one of those types that thinks the media was overblowing the whole global warming thing because they deigned to report about it before we saw some of the more dramatic and visible effects?

Frankly, I don't know what you're going on about in the rest of your comment. I made no claim that baby boomers aging out of the workforce explains All The Things. I certainly didn't make the claim that it explains trends in the economy up to this point. My point is that it's now a major factor in the economy going forward and we can expect major changes as a consequence.



I doubt it.


> Productivity kept on climbing and wages stagnated post ~1980

The growth of government sopped up the difference. Nothing the government does comes for free, and then there's all the additional costs of complying with regulations and doing all the paperwork.


For the people who don't like my post, where does all the money come from that funds the government? Nothing is free.


Please stop being the poster child for zero-sum thinking.

Let's make up a hypothetical example. The government says you have to install a safety rail and the amortized cost is $1 a year.

You: OMG, this is going to cost $100 over the next century, a huge loss, I am destroyed.

Reality: Johnny doesn't fall off the equipment and become paralyzed (costing you an immediate $200 in lawsuit and payout fees) and is able to produce economic product over the next few decades, bringing $400 into the economy. Net win for everyone.

That's where the money comes from. Or would you rather be like Russia, which has a giant potential economy that outputs less than Italy, doesn't give a damn about corruption, and has terrible quality of living standards/longevity?


Here ya go (not a hypothetical example):

https://slate.com/business/2022/10/san-francisco-toilet-mill...

They beat out Seattle, that spent $250,000 on a portable toilet a few years ago.

On my own street, the city water outfit installed a fire hydrant. It cost $10,000, including architectural drawings of the installation. When the crew came out to install the hydrant (the main water line runs under my property) I asked them if they'd seen the drawings. They said "what drawings?" They'd never seen nor talked to the engineer, nor had any idea there ever was one. I asked them what the hydrant cost. They said $2,000. They had a machine on the back of the truck that was able to dig the hole, drill the main, and clamp on the new hydrant in 15 minutes.

This was 20 some years ago, back when $10,000 was real money.

The IRS now requires any business that sends payments of more than $600 per year to an individual to file 1099s. The threshold used to be $20,000. A lot of ebay-ers are in for a big surprise. Do you have receipts for what you paid for items you sold on ebay?


The guy installing the hydrant was told where to install it; they are not the architect who makes sure it actually works if too many are installed, so it's not a surprise.

It's no different than me 20 years ago installing a server for a client. I would go out slap it in and turn it on. I did not architect the applications on it, nor configure the firewall rules on the router for it to work.

News articles are written about exceptional things, not the mundane.


I saw the drawings. As an engineer, I can assert they were completely custom made and completely unnecessary. You don't need architectural drawings to determine if the flow is sufficient.

Besides, everyone knew the flow was sufficient because it would be the only hydrant upstream of a flow reducer in the main line. I.e. there was plenty of pressure in the line.

The drawings were completely superfluous to a bog-standard install.

> News articles are written about exceptional things

That's true. The hydrant wasn't in the news. Want another one that wasn't exceptional enough to make the news? A nearby one mile stretch of road has been undergoing repaving for THREE YEARS now.

BTW, did you read the article about SF? It blames the permitting process which takes forever and $$$$. That affects everything in SF.


Anecdotes about corruption and mismanagement do not address the underlying point that government spending is not zero-sum.

That hydrant was probably too expensive (maybe; insufficient data to say for sure). The damage if a fire breaks out and no municipal fire system is available in a modern city is catastrophic.


> Anecdotes

I gave real examples, not hypotheticals.

I never said government spending was zero-sum.

> That hydrant was probably too expensive

The bill was $10,000 for a $3,000 job.

> The damage if a fire breaks out and no municipal fire system is available in a modern city is catastrophic.

At $10,000 a pop there'll be a lot fewer hydrants installed, and hence greater risk of catastrophic fire.


Anecdotes aren't hypotheticals; they're single data points with insufficient signal to predict a trend or pattern.


You can believe the government is efficient if you like. The fact remains that there's been massive growth in government, and that is going to be paid for out of the economy.


Maybe I've just spent too many years in government contracting, but when I hear that a government program is growing, I hear economic stimulus, not pulling something out of the economy.


I wonder why those countries that "stimulate" their economies tend to do poorly.


Do they?

China's GDP growth consistently out-paces the US. Canada continues to do well. Germany has consistently positive GDP growth with a few exceptions related to international downturns. And, of course, if you're talking about countries that use economic stimulus, you'll have to include the United States, which has the largest GDP in the world. I think your claim requires a tighter definition of "economy" and "do poorly" to hold merit.

Everybody is doing more poorly than growth estimates, but (a) COVID just happened and (b) everyone always does worse than growth estimates; growth estimates are the rewarded metric, so the incentive is to over-estimate them (i.e. aim higher than you expect to hit).


Regardless of personal beliefs, there are other issues here. First, what does "massive growth in government" even mean? Do you mean government meddling in people's personal lives, like restricting bodily autonomy for women or banning books? Yes, I agree. Do you mean government spending? Sure, agreed, the numbers agree. But you seem to be referring to some other kind of vaguely defined concept of, like... government bureaucratic-ness? I'm not really sure what you mean or how to measure it.

The second issue is that your overall thesis seems to be "the government is inefficient," which is fine to say and to criticize, it certainly is inefficient by many measures, but I feel like you're hinting at some kind of alternative that I can't possibly guess at. Usually these discussions take two routes: the capitalist one, wherein you argue that private industry is more efficient, which even if we accept that efficiency is the only valid measure of how we should choose to do things (Amazon builds a road faster and cheaper than the government: then puts a toll on it so only the rich are allowed to use it. this is bad), is probably not true[0] or possibly the opposite of the truth. Maybe you're more anarchist in your leanings, in which case you're arguing for more distributed and local management of these kinds of services? That could be a really interesting conversation, is that what you're suggesting?

[0] https://gsdrc.org/document-library/is-the-private-sector-mor...


First, what does "massive growth in government" even mean?

1. the amount of money it spends

2. the amount of interference in the workings of the marketplace, usually called "unfunded mandates" where they put burdens on businesses

> I feel like you're hinting at some kind of alternative that I can't possibly guess at.

A much more limited government.


> 1. the amount of money it spends

yes, government spending increases, always. I'd need to be convinced this is inherently bad lol

> 2. the amount of interference in the workings of the marketplace, usually called "unfunded mandates" where they put burdens on businesses

What about subsidies, when businesses only exist because of government intervention at all? What about Harley Davidson? The banking sector?

> A much more limited government.

How limited? Is it allowed to prevent monopoly? You seem to know enough about the subject to know that making the state too limited will merely lead to the establishment of state power by monopolistic corporations instead, from plenty of historical examples.


Caltech basically ran on government money so you can thank them for your entire undergraduate education (yes I know you paid tuition but it was no doubt minimal)


And, of course, we're ignoring the elephant in the room if we don't mention the Internet started as a government project.


My guess is it's a game against inflation. People understand they're getting paid less for their hard work, and they're not motivated enough to work harder every day. For comparison, I'm spending 1.5x more on groceries than 1 year ago. Some items are even 2x more expensive compared to last year.

Salaries aren't growing that fast.

Personally, I love what I do, and this fact hasn't affected my ability to work. However, I can imagine some people being seriously affected by it.


While Groceries have been impacted by inflation, the big chains in the US have also been just raising prices as indicated by their increased profits recently: https://www.wcpo.com/money/local-business-news/kroger-profit...


Profit margin would be the relevant metric, not profit. And as you can see, profit margin has not increased:

https://www.macrotrends.net/stocks/charts/KR/kroger/profit-m...

Of course, a business with sub 3% profit margins maintaining sub 3% profit margins is hardly news, but inciting anger for no reason does result in more clicks.
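
A toy illustration of why margin rather than absolute profit is the telling number (the figures below are invented, not Kroger's): if prices and costs both rise with inflation, nominal profit grows even while the margin stays flat.

    # Invented grocery numbers: inflation lifts nominal profit at a constant margin.
    def report(revenue, costs):
        profit = revenue - costs
        print(f"profit={profit:.2f}  margin={profit / revenue:.1%}")

    report(100.00, 97.00)    # before: profit=3.00, margin=3.0%
    report(112.00, 108.64)   # prices and costs both +12%: profit=3.36, margin=3.0%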


Could be a factor at tech companies where a lot of compensation is in the form of equity options. Especially smaller / medium sized tech companies have had their stocks go down by 50-80% in some cases. So now you're earning much less than you thought you were, plus inflation is out of control, further reducing your real income.


I think there's a very obvious reason for the "productivity drop": Return to Office

I remember tons of studies showing that remote work had caused unprecedented increases in productivity so it seems plausible to me that companies ending it caused the productivity drop which caused the minor recession earlier this year (the big one due to the Fed's rate hikes is, I believe, still in the future). The article itself even acknowledges that remote work increased productivity yet ignores the 50 foot tall elephant in the room that is RTO.


The BLS recently reported that at its peak, only 7% of Americans were working from home, so this demographic is probably too small to have much large scale impact on the productivity numbers. If there is stagnation in the productivity numbers, we should look at what the other 93% of workers are doing.


According to this it was 35% at the peak, and 7% is the current figure: https://www.nbcnews.com/data-graphics/data-remote-workers-de...


Damn. Was it really that low? I ran into census data stating the following:

"Between 2019 and 2021, the number of people primarily working from home tripled from 5.7% (roughly 9 million people) to 17.9% (27.6 million people), according to new 2021 American Community Survey (ACS) 1-year estimates released today by the U.S. Census Bureau."

I initially thought maybe the issue was an average reported over years, but that does not seem to match either, so I went to BLS[2] and the 7% number is there, but it is for April 2022 and not the peak.

Chart data suggests double digit average.

Please correct me if I am misreading something.

[1]https://www.census.gov/newsroom/press-releases/2022/people-w... [2]https://www.bls.gov/opub/ted/2022/7-7-percent-of-workers-tel...


How did they conduct the measure? 7% is nowhere near what the real estate picture says.


Yeah no kidding. If it were demanded of me to RTO after this time of working remote, they wouldn't get quiet quitting, they'd get active sabotage while I search for a new job. There's no excuse for torturing ICs with RTO for reasons of executive vanity.


Wait, are you saying you would actively sabotage your place of employment because they asked you to work in the office? Did you have the same feeling before WFH was even a thing?

Edit: Also, what does "torturing ICs with RTO for reasons of executive vanity" even mean? Have we come so far that working from an office is now "torture"? If so, the level of privilege is simply astounding...


I am not the parent, but I believe I can answer, because my thoughts on the few days that I do venture into the office border on postal. In my defense, I hate the people who invented mornings with a passion reserved for people who hurt puppies for amusement.

When I see phrases like "executive vanity", I immediately picture a manager who saw the productivity numbers gained and decided that those do not matter ( and some of us did go above and beyond specifically to show that there is a benefit to WFH for companies ) and instead opted for butts in chairs for no other reason than "I want to feel like I still matter". Some executives refer to this as management by walking.

<< Have we come so far that working from an office is now "torture"?

It is kinda like this. We all know WFH is possible. We know it produces good results for the company and yet management opts for RTO. They want me to commute, force me to get up extra early, almost kill people on the way to work if I do drive by car ( because I am sleepy; not because I am postal ) and somehow match the productivity that was happening precisely because we implemented WFH? Yes, by comparison it is torture. I had a glimpse of heaven and it was forcibly taken away from me ( it wasn't taken from me, but please understand that this is the mindset ).

edit: And, if you buy into one specific poem I will not name, hell is everyone but heaven and therefore torture.

Whether you agree with it or not is irrelevant. Put yourself in their shoes. See the world through their eyes.

If not, a good chunk of the people who managed to keep the companies afloat during covid will bounce ( and some did ), because those are the people that actually knew what to do when SHTF.


And yet my company likes to remind people on a daily basis that it's a privilege to be allowed to work from the offices (because they just refurbished them). Literally their wording. Completely detached from reality


It's because RTO hasn't actually happened yet. Most companies either keep WFH or go with a hybrid solution. My theory is that the productivity boost wore off as soon as WFH became the new normal, and went negative due to obvious reasons (easier to slack off). Classic honeymoon phenomenon where everyone was ecstatic to work from home initially but got used to it and now take it for granted.


I agree it is almost certainly related, in that a lot of the reporting on the rise of productivity that came at the beginning was just as explainable by noise as the current drop is.

That is, I can't rule this out. But I would also not bet against reversion to the mean. Such that the actual waterline on productivity is probably not known, just yet.


This could easily be an accounting anomaly. Productivity is defined as real GDP / hours worked. If you shrink the denominator - say, by causing 25% unemployment with lockdowns - while also boosting the numerator with government spending, you get high productivity, which we had during the pandemic. If you then shrink the numerator (say, with high inflation, which translates a given nominal GDP to a smaller real GDP) while increasing the denominator (through record low unemployment), productivity will drop.
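
A rough sketch of that accounting effect, with all of the numbers invented purely for illustration: holding everything else fixed, moving the deflator and hours worked alone swings the measured ratio a lot.

    # Toy model: measured productivity = real GDP / hours worked (all numbers invented).
    def productivity(nominal_gdp, deflator, hours):
        return (nominal_gdp / deflator) / hours

    print(round(productivity(100, 1.00, 100), 2))  # baseline: 1.00
    print(round(productivity(105, 1.00, 80), 2))   # lockdowns: stimulus GDP, fewer hours -> 1.31
    print(round(productivity(110, 1.08, 105), 2))  # reopening: inflation, full employment -> 0.97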

IMHO traditional economic metrics are not well adjusted to an economy where output is largely independent of effort or hours worked, particularly not when measured on a quarterly basis. I work for a (remarkably slow and bloated) tech company. If I choose to do nothing other than post on Hacker News, that will not become apparent in company performance for ~2 years, because that's the median time for a project to make it out to market and start having an effect on consumer behavior. The company could lay me off and it wouldn't affect the bottom line at all, but it'd boost productivity. Conversely, if I hire a new person, I don't see significant gains in output for ~2 years, but employment and hours worked has gone up, and so productivity is down.

The limiting case for this is algorithmic cryptocurrency trading with mark-to-market accounting (of which there was plenty in 2021). Here, you have computers trading virtual assets back and forth at ever increasing prices. Because prices are going up, the value of everyone's assets increases, and with mark-to-market accounting you'd show a profit. And yet nobody is employed and no real work is being done. Productivity is effectively infinite, but it means nothing.


Per-capita wealth is probably a more accurate, better indicator of how well the economy is doing than productivity.


The reason for anything large and complicated is very nuanced and probably has a lot of small causes that we might all disagree on, but I'd say that the overarching reason for this social malaise might be that a lot of people are feeling a lack of hope for their future.

If you think you're working towards something good: a family, home ownership, kids, a reasonable amount of life enjoyment in terms of leisure, a stable society, and a pleasant retirement where you can enjoy seeing your grandkids and participate in some hobbies for a decade or so before your mind or body collapse....you might be willing to push yourself to achieve as much as you possibly can, even if you're a lowly cashier or janitor.

But who wants to go the extra mile for this degenerate and hopeless society where your money is being destroyed and you have grave concerns about many things? Whether rightly or wrongly, everybody is seeing fucked up things in the world and many people are feeling much greater concern about the future than we've ever seen before. This isn't a recipe for going the extra mile at work or harnessing the energy of society to achieve something great.


You’re 100% right. That plus the fact that my employer tried to fire me for not taking an injection, I’m good on chilling.


> In the first half of 2022, productivity — the measure of how much output in goods and services an employee can produce in an hour — plunged by the sharpest rate on record going back to 1947, according to data from the Bureau of Labor Statistics.

> The productivity plunge is perplexing, because productivity took off to levels not seen in decades when the coronavirus pandemic forced an overnight switch to remote work

It also comes at a time when many employers are shifting back to hybrid schedules and RTO, despite employee claims that the flexibility remote work allowed helped them work more efficiently.

.

What a mystery.


I'm of the opinion that WFH has been a net negative for productivity. Measuring productivity was much easier for managers to do when everyone was in the office, as employees would find it more difficult to hide attempts at slacking off. Those who are less productive when WFH are less willing to admit it, and I believe that's the reason why, whenever the topic of WFH comes up, it's drowned out by the voices of all those who say it's been absolutely great for them.

I don't think anyone is going to argue about the conveniences this WFH culture has brought. And I'm certain there are quite a number of people who can prove how much better their work has been because of this shift. Those who are being far less productive, though, are kinda ruining it for the rest of us, and lots of managers know it. I think Satya is on point when he talked about the gap between what employees say about productivity and what their managers think is actually happening.


Measuring productivity is hard in and out of the office.

From personal experience, it was incredibly easy to do nothing when working from the office while remote you’re more held to your deliverables. In an office, it’s very common to see people look busy, but are just doing unrelated things.

The only difference is that in an office people mask their lack of productivity by pretending to be busy, whereas remote you don’t have to do that.


Why is measuring productivity easier to do in the office? Don’t your coworkers have deliverables? How does being in the office make it easier to tell if those deliverables are complete?


Eh. I could offer anecdotes ( as I am sure many would as well ) about office life and how much time is spent pretending to work by engaging in vaunted water cooler discussions that recent anti-WFH corporate propaganda had the balls to brand as spontaneous collaborations. And I am saying this with over 15 years of experience in corporate bullshittery.

I honestly do not count "is he in a chair staring at a screen" as productivity. I count successfully finished projects as such, which brings into question what those managers are trying to measure if they are clearly failing so badly at this. I said it before and I will say it again: managers in the US had an easy ride for the past 7 or so decades. Now they are actually asked to work instead of 'managing by walking' and they can't handle it. They have to 'throw away their skill set due to Covid'. See the twin trails of tears in my eyes?

Managers have all the tools they need and then some. The sheer amount of corporate monitoring software on corporate lappy is beyond astounding. Our manager does not seem to care, but I am horrified to think what other managers use it for. It is very, very intrusive. In other words, if managers can't measure even with those ridiculous and invasive tools at their disposal, what are they doing exactly?

<<I think Satya is on point when he talked about what employers say about productivity and what their managers think is actually happening.

Satya may be repeating what the managers are saying, but managers are saying that they have to actually work for morale and motivation. The horror.

<<Those who are being far less productive, though, are kinda ruining it for the rest of us and lots of managers know it.

Well, is it not the manager's job to motivate and coach the less productive employee?


If this is true, why was productivity so high pre-pandemic when everyone was in an office?


What everyone is ignoring is that financial speculation leaks into productivity numbers, and we went into a speculative boom during covid and have been in a speculative bust during this recent period.


This article and the chart within it reference the rate of change in productivity, but not the raw productivity number.

Without seeing those numbers, my assumption -- and what seems to be implied here -- is that productivity rose in early 2020 with remote work, and it is now dropping to pre-pandemic levels.


The "quiet quitting" strategy was not well accepted and now they are trying this.

> Since the pandemic started, “the link between hard work and reward has been broken”

More like "since the 1970s"


https://www.bls.gov/productivity/graphics/2022/graphic-4.htm

"Real compensation" is "Employer costs for wages, salaries, and employee benefits", adjusted for inflation.


my 2c is nihilism and cynicism are why.

Why bother, nothing means anything anyways? Why bother chasing the dream, if it doesn't come true?

People have increasingly been losing meaning and purpose in their lives. Old dreams like home ownership, family (including extended), close-knit friendships, and eventual financial freedom/security have been supplanted by: travel (broken by inflation and lockdowns, made fake by social media); fur babies (due to gender disparagement and estrangement); workplace politics and friends-for-a-season (as we move for work, and have increasingly tighter requirements for friend groups due to technology, i.e. you can find people who agree with you online, so no need to befriend the neighbor with slightly annoying political beliefs); and a lifetime of debt with little investment value[1].

So people are simply burned out being asked to pursue a long run strategy w/ a "promised" dream. The people delivered on their end of the bargain, but the dream makers didnt.

[1]: https://www.macrotrends.net/2324/sp-500-historical-chart-dat... From late 1999 to 2016, returns on .inx after inflation have been down or flat. People no longer believe "buy the index and you will see profits" after 20 years of contrary evidence. Over the past 30 yrs the index, using the usual "8% per year" claim, should have returned 10x, but it actually returned only half that.
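
For what it's worth, the "should have returned 10x" part of that claim is just compounding arithmetic; a quick check (this says nothing about what the index actually returned):

    # Compounding check for the usual "8% per year over 30 years" claim.
    multiple = 1.08 ** 30
    print(round(multiple, 2))   # ~10.06, i.e. roughly 10x if 8%/yr had actually compounded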

Why try, it's not going to work anyways?


In addition to the other good reasons listed here: Inflation.

Most of us are now paid less in real terms for our efforts than we were 1-2 years ago. If you convert your labor to something invariant, like carrots, you just aren't given as many for a day of hard work. So, to the degree that we are rational beings and have agency in the matter, we scale our output accordingly.
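
A small sketch of that carrot conversion, with all figures invented: divide nominal pay by the price of the invariant good and the real pay cut becomes visible.

    # Invented figures: daily pay converted into carrots as a crude real-wage measure.
    pay_then, carrot_price_then = 200.0, 0.50   # dollars per day, dollars per carrot
    pay_now, carrot_price_now = 210.0, 0.60     # a 5% raise against 20% carrot inflation
    print(pay_then / carrot_price_then)         # 400.0 carrots per day
    print(pay_now / carrot_price_now)           # 350.0 carrots per day: a real pay cut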


> Since the pandemic started, “the link between hard work and reward has been broken” for many workers, Buber said, resulting in “curbed ambition.”

Over the course of my 10+ year programming career, I've seen this effect steadily increase. Promotions seem to come or not come regardless of how hard you work. Looking back at older books about the tech industry (and comics such as dilbert), it seems like this effect isn't new, but could be something that ebbs and flows over time.

> Productivity tends to move in cycles of 10 to 20 years

See this is what I mean. Perhaps there's a megatrend going on here. Promotions go to those who don't deserve it, so companies self-destruct in their own incompetence, and a new crop of companies rise up, promoting those who are actually productive, and they reign supreme for 10-20 years, before they too become bloated and start promoting under-performers (who look good on paper). And the cycle repeats itself.


The article doesn't talk about how productivity is measured and the actual sensitivity and accuracy of the measurement(s). As others have pointed out, measuring productivity is a little hazy and probably is influenced by subjective bias (hey a recession might be coming... hmm, I feel like less work is getting done at my office).

I wonder how much bias affects the reported measurements? I doubt businesses outside of manufacturing can actually discern a 2% change in productivity, when screening for other factors, and some of them can't discern a 5% change.


> I wonder how much bias affects the reported measurements?

They don’t, because productivity is not self-reported. It’s also not measured the way you imply.


Our past productivity also came with some of the longest working hours per year of any nation, iirc. Our pay never got improved and our living conditions worsened, we work too long, and we're sick of killing ourselves for corporations to become insanely rich. Of course we're not productive, we're sick of the bullshit.


Lots of people are productive WFH, but loads are taking it easy. Often in my team Fridays people barely dial in, maybe are connected a few hours max. Thursday afternoons now are dead as people get ready for the weekend... Job market is still strong so no one really cares that much as we know we're difficult to replace.


Excellent, this provides productive evidence for a 4 day work week if everything still runs with folks checked out on Fridays and they’re not in the office performing work theater.

https://www.4dayweek.com/

Labor power in the face of a recession comes from dwindling labor supply. 360k boomers retire each month in the US. 1.8 million people over the age of 55 die every year. Not enough folks to backfill.

https://archive.ph/2022.10.27-015740/https://www.businessins...

https://www.pewresearch.org/fact-tank/2020/11/09/the-pace-of...

https://www.daytondailynews.com/local/rising-number-of-baby-...


>this provides productive evidence for a 4 day work week if everything still runs with folks checked out on Fridays and they’re not in the office performing work theater.

Not exactly. Unless 100% of workers are only working 4 day work weeks, it's impossible to account for the confounding variable of other workers picking up the slack when analyzing these trends from a macro view.


"Everything still runs" could be applied in a lot of situations, but it's not necessarily a desirable state.

Everything still runs at McDonalds if there are half as many cashiers, but it will take a lot longer to get your Big Mac.

Everything still runs at the hospital if there are half as many nurses, but the level of care is much worse.

You get the idea.



>Everything still runs at McDonalds if there are half as many cashiers, but it will take a lot longer to get your Big Mac.

With the McDs I've gone into recently, it appears everything still runs even if there are zero cashiers, given that you can tap your order into a big touchscreen.

Turns out that cooks seem more important than cashiers. Of course, we'll see how much of that can be replaced by robots in the next few decades.


i don’t think anyone doubts that certain jobs can be automated away, but what concerns me is how many people point at “lowly cashiers” and gloat “look at how stupidly greedy you’re being by asking for a livable wage you lowly cashier.”

the reality is, companies absolutely will save significantly _more_ money by automating office jobs that cost significantly more than a cashier.

i’m not a luddite. i live, breathe, play in and firmly believe technology has _potential_ to liberate us, but anyone who celebrates that job automation will harm “the poors” but fails to see that obviously companies will automate their expensive mid/high tier employees asap is allowing their class snobbishness to blind them.

everyone in tech is familiar with the joke, “don’t screw with me or i’ll write a bash script that can replace you.”


I’m a programmer and honestly really hope that this job gets automated away. Obviously I also hope we figure out how to support people who lose their jobs to automation, but sitting at a computer for 8 hours a day is not how humans were meant to spend their lives.


If this were to happen, then wouldn't people just start "checking out" on Thursdays?


I'd like to agree but the article literally says employee output fell.


My remote employers have effectively subsidized (and continue to subsidize) a lot of my personal ventures in the last decade.

Personal projects (software and art), learning skills (programming, instruments, video editing), improving at video games and sports. Not to mention leisure. All done "on the clock." To be honest, maybe the majority of the 40hrs I get "paid" for is actually used for this instead of company output?


Just one guy's opinion, but to me it sounds like you're making intelligent, healthy, and entirely logical choices about your life and how you choose to spend your limited time.


absolutely - your consciousness is your most precious resource!

the hardest part is the tension with the external guilt - but that is the point of the guilt after all


My in office employer (prior to covid) was subsidizing these things for me. Your tax dollars hard at work.


I switched jobs at the beginning of the year, and still haven't figured out whether people here just don't work on Fridays or whether that started with WFH.


Anecdote of one, but as someone whose job involves a lot of "pinging" and "chasing" people for approvals, code reviews, sign-offs and so on, I have found during the last couple of years, it's nearly impossible to get a reply out of people on Fridays, so I've learned that if I need someone's response by end-of-week on some topic, I need to do my heavy-chasing on Thursdays or I'm not going to get it.

This is very different from pre-WFH where I could physically find the person on Friday and stand there until they did whatever needed to be done.

I'm also a huge proponent of our new WFH world, but even I notice and can admit this disappearance of Fridays.


The thing is, Bob, it's not that I'm lazy. It's that I just don't care.


It’s the phones. Our attention spans and ability to focus have been destroyed by constant little hits of dopamine.


It's not just the phones themselves, it's the modern social media apps (TikTok etc.). But I agree it seems to be twisting things up.

I think a closely related issue is how everyone, everywhere, all the time seems to be arguing about political ideologies, or making every issue about red vs. blue or liberal vs. conservative or racist vs. antiracist or whatever other way people want to split others up and then talk about it all day.

I get that it's an election year in the US and people have done this for a long time, but I swear it feels like it is reaching an absolute fever pitch these past years that is different than before.

Because it seems so all-consuming, I think it distracts people from work even more than in the past.


This! All of the companies kept shortening their dopamine cycle to compete with each other and now people can’t focus in real life for more than 5-10 seconds before they are mentally searching for the scroll button to find a new interaction.


"hits of dopamine" is not really how it works. Drugs like methamphetamine or cocaine directly give you "hits" of dopamine. Perceiving a stimuli through your senses does not directly manipulate the dopaminergic neuronal populations. It's just like any other stimuli where, if the stimulus is actually intrinsically rewarding, eventually the dopaminergic populations in your brain begin to associate the stimuli with potential for reward. This leads to an increased perception of the salience of that particular set of stimuli.

This is very different from cocaine which can cause humans to perceive stimuli as more important and potentially more rewarding even without any rewarding component to the stimuli at all.

Using a computer is not a drug. Stimuli on a screen are no different from stimuli from looking at something else. Using a smart phone does not give you "hits" of dopamine. Stop conflating normal environmental stimuli with drugs that act directly in the brain. It is dangerous because the way governments deal with drugs, and the very real addictions possible, is violence. Bringing violence into this non-violent non-coercive context is immoral.


Umm, you might want to read up on dopamine and how it works. The Huberman lab podcast has a great episode.


I tend to stick to journal articles. You should checkout the review articles at the Berridge lab, https://sites.lsa.umich.edu/berridge-lab/publications/


I agree with your ultimate conclusion ("Using a computer is not a drug"), and that this is an important consideration, but to play with the forest-trees thing here, the stimulus I think most people perceive and/or are conditioned to is the alert tone and/or vibration, which I believe has been argued (not in their words) to have some inherent salience, once the user is conditioned by carrying a phone around for a while, at least.

I believe this is one of the avenues argued for "Tech Addictive"/"Screens Bad" - that the intrinsic value of bzzzzzt could, at least hypothetically, be as high as, say, nudes or an "omw" text, or even your dealer texting he's 5min away; and that this inflated value is in turn projected, however briefly, onto every once-in-a-lifetime sale and useless 3am app notification about an icon set update or something.

There's also obviously the much-written-about addictive UI/UX features employed in various places. I vaguely recall one or two unfortunate email chains, maybe, but am assuming most product teams didn't go into meetings with nefarious intentions of getting their users psychologically addicted.

Nevertheless, addictions can be triggered by adjacent things, and however little dopamine "all the little red circles all over the place" release in my grandmother's brain, it is probably very different from a chainsmoking coke user taking a swig from his bottle as he picks up his phone to see:

- 32 New Facebook notifications!
- Your dispensary order is ready for pickup!
- sexybabe_notabot69 liked your profile!
- Your bank account is overdrawn!
- 18 new Twitter notifications!
- ALL NEW SLOT MACHINES! NOW WITH DIFFERENT KINDS OF FRUIT AND SHAPES!
- You're never gonna learn Spanish if you keep doing drugs, Carl!
- DON'T MISS OUT! JIMMY BUFFET LIVE AT THE CASINO THIS WEEKEND!
- Your order has shipped!
- YOU'RE GONNA LOSE YOUR VIP STATUS IF YOU DON'T COME BACK HERE AND GAMBLE
- Re: Hey
- THIS WEEKEND ONLY!!!! ANNUAL ONCE-IN-A-LIFETIME SALE!

I think I could probably make the argument that maximizing for, say, MAUs/DAUs, is essentially an addictive cycle - a la "valueless reward" - in the business process, probably citing lots of business types who have written lots about how optimizing for the wrong metrics will leave your company broke and homeless too.

So, I guess I'm saying "Using a computer is not a drug", particularly as you used it, is nearly indisputably true, but somewhat misses the conversation being had (however dumb), and that it's worth looking at all of the links in the causal chain and examining how, for example, alarm fatigue and <sleep stuff> compare and contrast (and occur comorbidly with) actual addictive and/or depressive syndromes - for exactly the reasons you listed, like:

"We've found that homeless people using Facebook are xy.z% more likely to relapse on heroin, don't understand statistics, and therefore don't allow our clients to use the internet, except for this one from 2005 that lets them digitally sign the form we need to get reimbursed for the bed."


If you want to argue this then at least use the correct description of the proximal cause, "hits of glutamate in the shell of the nucleus accumbens and ventral tegmental area". It doesn't roll off the tongue and focusing on the neurochemistry ignores the context. So maybe it's better to just say it simply, "If you enjoy doing an enjoyable thing and you do it a lot you'll anticipate liking it more than reality provides on average."


But why now? It seems like that would have happened sooner.


I suspect that phone usage went up substantially during the pandemic, and that had somewhat of a lag effect to show up in these reports.


+1 I find watching TikTok or YouTube Shorts to be damaging to my concentration compared to what I consider (for myself) to be healthy activities like reading a book, or watching or listening to an interesting interview or educational content like philosophy, etc.

I try to fight back by only having TikTok installed on a Chromebook that I don’t often use and for YouTube Shorts, I count the number I have watched.

On my cellphone, if I want to waste time, I prefer a quick game of Chess.

I strongly recommend the https://freedom.to service as well as their podcasts.


Phones were invented this year? Or why did they have an effect only just now?


Phonebad isn't just a city in India.


Actually, this seems correct to me.


My 2 cents (although a bit of a salty take): In my last 3 roles, rewards like promotions go to those who create their own personal brand and win the popularity contest. Sometimes that correlates with actual productivity, but often it does not. So often I see folks do the bare minimum in these work cultures, which seem to be getting more prevalent.


I have a hunch: the past 10-15 years have seen a large slew of mergers in all industries (tech, telecoms, pharma, food, grocery, etc.) that have created shared monopolies in most sectors of the economy. Given that, a monopoly's function is not to deliver productivity for productivity's sake, but to extract profits from inside that sector. Most of these companies are sprawling behemoths where managers are encouraged to grow headcount (as it makes them look more important) while still keeping the profit machines flowing. I'm not the least bit surprised; if we had a country where smaller players were duking it out in their respective sectors, you would get faster innovation.

Bottom line, more competition is good for end customers, and good for productivity as well.


A couple of reasons I can see, just my opinion

* My buddies in trade-based fields say they are working harder than ever, as their backlog has only increased in size

* Less scrutiny when working from home, more fooling around, less work

* Feeling that "grass is greener" elsewhere, therefore no longer being committed

* Polling has shown that the ideal job for Gen-Z is being a CEO of a company, but few seem motivated to start their own business, as those stats have been down. Clearly there's a gap in motivation or understanding


Business formation surged since the pandemic, but Gen Z is going to have a hard time with it because they’ll be more broke than millennials


Only if they need a lot of capital.

Everything else about starting a business is far, far easier than ever. Almost every state in the US has streamlined the business formation process, made it cheaper and faster, etc.


Except for the fact that you won't have healthcare


They have access to the same health insurance, it is just expensive, which goes back to needing capital.

It is trivial to go to healthcare.gov and buy the same health insurance an employer subsidizes for employees.


> It is trivial to go to healthcare.gov and buy the same health insurance an employer subsidizes for employees.

This is so shockingly false in most states that I don't understand how you feel you have enough personal experience to state this so confidently.

Neither California nor Texas have any PPO-style plans available on healthcare.gov. For all the public / self-employed plans, "Out-of-network" means "Pay for it yourself, this is not covered at all." That's a huge barrier to care when you need an urgent care and it's not clear which doctor at which urgent care might be covered.

Additionally the rates aren't just different due to subsidy, but due to quality of the participant pool. Many large employers are self-insured / self-funded, and the insurance company just administrates the fund, reimbursements, etc. However, the unsubsidized rates (made known to us via COBRA) are still much, much lower than the healthcare.gov rates because the participants are generally healthy, wealthy, and young.

When you buy healthcare.gov you get the shitty rates. This isn't just a difference of degree ... having a $100 deductible vs. a $6,000 deductible, or a $1,000 OOP max vs. a $22,000 OOP max literally makes the difference whether I can get my gastrointestinal cancer kept in check every year or not. I can afford the COBRA premiums for that $1,000 OOP max, but I absolutely cannot afford the healthcare.gov plan with >$15,000 in premiums on top of the $22,000 OOP max that I'm guaranteed to hit every. single. year. to get the care I need.

Anyone who is pro-business, pro-entrepreneur, should generally be for good public healthcare. This would relieve businesses of a LOT of administrative burden and overhead to let them focus on their core value proposition. It would also facilitate a lot of good startups by freeing people to go build something great. A lot of potential capital growth, innovation, and disruption is being wasted because the people who can do this are stuck in place.


This is not my experience in NJ and WA. Both had PPO plans with wide networks (BCBS at least) available, and I have never had to worry about out of network providers.

Everyone can find out the cost of their health insurance including employer subsidized in box 12 code DD of W-2. Mine have been very close to the healthcare.gov prices, which NJ conveniently lists here: https://www.state.nj.us/dobi/division_insurance/ihcseh/ihcra...

Also, the individual out-of-pocket maximum is much less than $22k:

https://www.healthcare.gov/glossary/out-of-pocket-maximum-li...

>Anyone who is pro-business, pro-entrepreneur, should generally be for good public healthcare. This would relieve businesses of a LOT of administrative burden and overhead to let them focus on their core value proposition.

The current situation where businesses get to silo wealthy, young, white collar workers into healthier pools of insureds, and the ability to purchase insurance with pre tax money rather than post tax for individuals whose employer does not subsidize is all beneficial to large employers. Which is how they like it.

If the US is going to stick with insurance system, then at least everyone should be dumped on healthcare.gov and employers completely removed from the equation.


1. go to healthcare.gov

2. select a plan

3. pay for it

4. congrats, you now have healthcare.


A small anecdote:

During industrialization, "extended" rests were added before they were required by labor protection, because they increased productivity.

Multiple experiments have shown that in some situations software companies can be nearly as productive with a 32h week as with a 40h week.

As far as I can tell the US has been moving in the opposite direction, dismantling or avoiding labor protections and sometimes outright forcing people to work multiple jobs.

Similarly, having long-term health issues you can't treat because you can't afford it isn't great for productivity. One of the more successful (non-private) health insurers in Germany is also one which covers comparatively many precautionary health checks. As it turns out, making it easier for people to avoid getting seriously sick is cheaper in the long run.

Add to this that a lot of IT systems were added, but in my experience many of these IT systems are designed to look nice for middle/high-level management, instead of being designed with and for the people who use them.

Lastly, add to it that the future prospects look not so great for a lot of citizens (not just limited to the US), which kills motivation (positive motivation generally works better long term than threats).

So I'm not surprised.


This reminds me of a study where purses with less than the equivalent of 20 USD in them were dropped at random places around the world. The sociologists, psychologists and economists couldn't understand why most of them were returned, given their understanding of human nature; the answer they didn't want to accept was that their understanding was wrong.

People don't work harder because their work is unfulfilling, their pay is underwhelming, and their hours are exhausting. Inflation just went through the roof, half of all price increases in domestic product are directly corporate profits, and even the most uneducated among us are aware that every single part of this system is rigged.

And the Washington Post claims "no one is sure why", but in reality they mean "nobody in a position of privilege that we would hire or talk to has an excuse that doesn't point out the obvious things we can't say about class struggle and income inequality".


My productivity has decreased simply because of delays and shortages. I can't get things when I need them or sometimes at all. Everything takes so much longer when you can't plan for anything.


Going to speak for my organization only, but I know exactly why we have become less productive. And it isn't even a bad thing. It's security. We played fast and loose and took risks. We got lucky (as far as I know) and it never bit us, but it certainly bit adjacent organizations, and mandates started coming down from higher up to no longer play fast and loose and to prioritize security.

There is inherently a tradeoff here. Don't trust and verify takes longer than trust and don't verify.


We’re fucking exhausted and there’s never an end to work (agile). That’s probably part of it.


I like Agile in theory, but you're totally right about there always being something else: no proper breaks; you fix or make several things, and then two weeks from now you're doing the same thing again but for a different feature or fix; pretty much no change in the pace of the routine, just go go go.

At least when I worked in fast food, sure there'd be the nightly dinner rush, but that only lasted about two hours, then everything would quiet down and you'd get to take a breather, take your time, goof off with coworkers, put a movie in the VCR in the breakroom (it was a long time ago), etc.

Usually I at least take it a little easy the day we finish a sprint, but I still have to 'report what I did yesterday' in the sprint meeting the day after, so I have to have done enough to have some progress to report the next day. And then it's off to the races again. It's fucking exhausting.


Yes!!! Waterfall could solve this in many ways.


This article doesn’t do a good job of defining its basic terms to make its claims.

They admit knowledge worker productivity is 'tricky' to measure, yet are somehow sure it has decreased drastically, with no link to evidence.


Without understanding precisely what is being measured, it is useless to try to understand why the metric is moving in whatever direction it is moving. This is hand-wringing and trying to blame the workforce for a collapsing bubble.


Wittgenstein's ruler comes to mind: what is 'productivity' measuring?

A quick glance at the chart in the article suggests that the variance of this metric is huge. It is more or less a white noise source with a small DC offset. Given the formula the BLS uses, I would be hard pressed to calculate my own productivity (outside of lawyers and people working on assembly lines or at fast food establishments most people do not keep track of their hours). If I can't measure my own 'productivity' then I have no idea how the hell the BLS is going to do it.


I was wondering the same thing -- how could one measure my productivity? (as a software engineer)

The best connection I could think of is something related to the company output, but in a market downturn I could be working 12 hour days, and the company would still be doing worse...


Agreed. This seems like a newspaper trying to squeeze an article out of a weak signal.


What is this nonsense "productivity" and how does one assign a number to it?

I think it's largely a monetary artifact. A bunch of money sloshed around the system. Those transactions inflated GDP. Then, with a lag, official inflation numbers caught up. Now "productivity" is lower than it had been at the peak, just because transients pass through our measurement systems.

Something about using the word "productivity" for this particular measurement feels borderline evil. So much that is productive in the vernacular sense is not productive in the GDP sense, and vice versa.

Having and raising children is productive (of human beings!), but it only enters GDP stats insofar as you buy formula, diapers, and college tuitions.

If you care for your own ageing parents, or if you raise your own children, GDP goes down. But if you take a job and pay somebody else to be a caregiver, then GDP goes up. Consider the extreme example of two women who pay each other to raise each other's children.

In Ukraine, wheat production is down. Some unmeasurable production of defense services is way up. Since the soldiers don't pay their squadmates to lay down covering fire, that working-together isn't captured by GDP.

We call it "productivity", but it's just a measure of intermediation by money. It'd be like measuring "social activity" as aggregate Facebook likes across the country.


Anecdotally, overall morale isn't great currently. I doubt that helps.


“No one knows why”: No, lots of people know why, but many of them refuse to acknowledge it because it contradicts the orthodoxy.

The first sentence lists cause and effect (though it makes the same mistake of fact that leads to the mystification here): “As workers slowly return to offices after the pandemic, productivity has been dropping in the U.S. economy.”

It is not "after the pandemic", it is just after everyone has decided to quit acknowledging the pandemic. But COVID doesn't give a rat's ass about whether people acknowledge it, so when you stop control measures (by having "workers return to offices") and stop accommodation measures (like additional paid sick leave), you get more people catching COVID at work, and more working hours where workers are impacted by the effects of acute or long COVID.

There are lots of people out of the labor force due to long COVID (which is part of the reason for the tight labor market), but there are also even more people in the labor force, and working, with long COVID; see, e.g., https://www.brookings.edu/research/new-data-shows-long-covid...


I feel like supply chain issues are worth a mention here. The labor market has been super hot, and finding new employees is a chore. At the same time a number of sectors outputs have been rate limited by supply chain issues. Do you reduce shift work and lay off salaried staff while you wait for the supply chain to catch up, or do you use a combination of stimulus money, cheap debt, equity sales, etc... to bide your time until things are back to normal?

In the last year or two, at the peak of the employment boom, the answer was almost certainly to bide your time. If and when the supply chain returned to normal, you would then be in a good position to restore output to meet demand. If you had laid off a bunch of folks, you would have found yourself scrambling to rehire in one of the worst employment markets (for employers) in history.

However, with interest rates rising and a potential recession on the near-term horizon, that equation might change. We could see (and already are seeing) more layoffs. I think after a year or two, we are going to start seeing productivity snap back as a result.


I mean, it's obviously WFH. You might be more productive working from home but most people aren't. Any productivity boost was due to the novelty of it all but disappeared once the honeymoon was over. My friends who had never worked a single day from home before the pandemic were ecstatic and felt really motivated. Nowadays, not so much.


> The productivity plunge is perplexing, because productivity took off to levels not seen in decades when the coronavirus pandemic forced an overnight switch to remote work, leading some economists to suggest that the pandemic might spark longer-term growth.

We're talking about a 4% change? That's the "plunge"? When it spiked during the pandemic, they were ready to conclude that it would always be like that, forever and ever, if not continuing to grow higher and higher? Both these positions are ridiculous, given how small the differences in both cases are, and how squishy a measurement productivity is in the first place. "If these trends continue, nobody will be doing any work in twenty years, and we won't even be able to tell you about it because we won't be working either!!!"


What makes no sense is looking at pandemic increases in productivity and ascribing it to remote work. There were 2 obviously much larger reasons for that increase.

1. At the macro level, the increase was heavily driven by the fact that the least productive jobs were simply not being done. Blue collar workers were not working while white collar workers were.

2. At an individual level a massive increase in productivity was a result of people not having anything else to do so they were working a lot more.

Remote work might have led to an increase in productivity, but these 2 factors definitely did. Yet people have been talking as if all the productivity increase was entirely because of remote work, when its contribution almost certainly ranked behind at least these 2 factors.


Are they happier? Great.

Otherwise well, less crap produced and reaching the global market is already something nice to hear I guess.


This shouldn’t be surprising. For the majority of us workers - output no longer correlates to remuneration. Working hard just means that you have slightly more money to live paycheck to paycheck on.


if I had to guess, I might guess that it's because large (and even a lot of small) employers put profit above all else, including employee compensation. for a while, employees observed themselves working hard for zero benefit, so they simply slowed down. not everyone can just pick up and move to another job that treats them better.

if I had another guess, and I pulled from my experience as a software developer, I would say that the desire to have continuous productivity from all employees has created environments which throttle talented employees and asphyxiate those who are learning because they can't contribute immediately.

I see this all the time in my own life. I can contribute a great deal if I am left alone to do work that I see needs done, and I have demonstrated this multiple times. but if you don't understand this about me and you want daily stand-ups where I explain what I am doing, all I get is challenge from everyone on the call. "why are you doing that? we want you providing value. you should pair more." I promise, and I have delivered previously, that if you just leave me alone I can do great things, but when every last person on a call gets a say in what I work on, you render me completely ineffective, and that's where I am now. this is a direct consequence of technical leaders having MBAs and no understanding of people or the work they are doing.


At least in tech, no one tells them it's filling in all the Jira nonsense and irrelevant agile ceremony.


"The productivity plunge is perplexing, because productivity took off to levels not seen in decades when the coronavirus pandemic forced an overnight switch to remote work, leading some economists to suggest that the pandemic might spark longer-term growth. It also raises new questions about the shift to hybrid schedules and remote work, as employees have made the case that flexibility helped them work more efficiently. And it comes at a time when “quiet quitting” — doing only what’s expected and no more — is resonating, especially with younger workers."

Perplexing,really ? I do not find this perplexing at all. Maybe productivity increased at the beginning of the pandemic because people were afraid of losing their jobs. Maybe once people realized they could work from home and not worry too much about getting fired, they are simply letting gravity take its course and finding out how little they need to do to keep their jobs, or if they even care if they keep their jobs.

Look what has happened to remote education. We may now have a damaged generation.

Yes, for certain people in certain jobs, working from home can be a productivity boost. This is not true for the general population.


This headline is alarmist and the article itself is designed to manipulate people unfamiliar with the productivity metric or its recent movement. Note that the article and the graph are about the % change in productivity... which spiked sharply during the pandemic and has now returned to the growth line it was on prior.

Here's the fed numbers: https://fred.stlouisfed.org/series/OPHNFB

The productivity drop the OP is referring to is that little blip downwards at the end. Hell, here's the Fed asking a year ago if the pandemic boosted productivity: https://fredblog.stlouisfed.org/2021/07/has-the-pandemic-boo...

Basically all the chatter in these comments is irrational speculation, based on a false premise, and flatly wrong.


There should be a name for the logical fallacy where direction of travel of a statistic is taken out of context. Local fluctuation being taken as the beginning of a trend. Perhaps just "short-sightedness", but it's more specific - inference of future events based on short term signals.

It's really common in news about the economy. I guess the outcome is really important, but we only have these instantaneous signals. So something like superstition kicks in (when we're not carefully considering what is stated).


That plot in the article showing “percent change in labor output” looks like pure noise to me. Are they sure that’s measuring what they think it is?


Quoting a quote "The only way a business is successful and productive is if employees feel that sense of empowerment, that sense of energy and connection for the company’s mission and are doing meaningful work"

So companies are giving the intrinsic motivation spiel. Fine. Who doesn't want to gaze at the majesty of creation from the peak of the Maslow pyramid/hierarchy?

Yet this year has shown that at the first sign of trouble, even with billions parked in the bank, shareholders and not employees come first. Or if you are a startup then leadership mismanagement and bad decisions are absolved with layoffs (and some tiers). That bottom unsexy part of the Maslow pyramid where basic needs exist are looking very shaky. Hats off to you if you can keep the company and vision your number 1 goal at the looming expense of your family's well-being!


ITT: A bunch of people convinced they know why, with 1-2 anecdotal datapoints, mostly their own experience.

Come on people, be at least somewhat scientific.


Probably the long-term effects of COVID which causes exhaustion, brain fog, depression, etc... Combined perhaps by people having had time to reset and reflect on their life. Combined by the fact that the planet and democracy are dying.

These are theories of course, but I don't think one has to always look for very complex or technology related answers.


I haven't noticed any direct effects from my three bouts of COVID, but I have noticed a general decline after not being allowed out of my house for 8 months straight.


"less productive" compared to when? Last year? Historically? Is this a return to the mean?

Just a brief glance at the first chart in the article, and last year was a bigger increase (6%) than this year's decrease (4%), and 2020 had the largest increase ever (10%).

I didn't even bother reading the rest of the article after seeing that.


They ask just about everyone you can think of in this article about what's up with the US worker. They even ask Larry Summers, even though the only thing Larry Summers has to teach us is how to fail upwards consistently. Of course, they don't think to ask a, you know, US worker about what's up with them.


IMVHO the reason is simple: we all use IT stuff to do pretty much anything. The original IT was *information* technology in the sense of tools to manage information. Then it evolved for business purposes into crap that tries to mimic old paper-based tools, offering their limits AND NOT their advantages, but allowing those who sell crapware to profit.

The current involution is even deeper: most people nowadays own really nothing in IT terms, since they live on someone else's computers under someone else's rules.

IF we came back to classic desktops, with their document UI, their end-user programming concepts AND classic HUMAN networking and organization in a far less near-flat pyramid, then productivity would skyrocket.

Oh in other philosophical terms: free people are FAR more productive than slaves, since the modern world is more and more based on slavery...


Like everyone here I would also like to propose an explanation. I think it's because Mercury is in retrograde.


this is not the time the time to start a new love this is not the time the time to start a lease


Past US workers had the benefits of unions and collective bargaining. 40 hour work weeks. Overtime pay. Paid time off. 5 day work weeks. Decent benefits and pay. You could be the sole source of income for a family of 4 and pay for a suburban lifestyle.

Keep your worker bees happy and they will be productive.

Today, workers have shit pay compared to the CEOs and other C-level executives with multimillion dollar salaries, bonuses, and golden parachutes. Many people work multiple jobs just to pay rent. Unions mostly busted. Erosion of safety nets, and other public services. Those same CEOs even work with other companies in the same industry to artificially keep worker pay as low as possible by having an informal agreement to not recruit and hire from companies that have this agreement.


When you look at this graph, it just screams "noisy number."

In other words, "worker productivity" is a nonsense metric. It'd be more instructive to graph the variables that go into it, going deeper until you find something that makes some intuitive sense.


Quippy Answer: Bloat-tastic manager/worker ratios, plus meetings and other busywork.


Wouldn’t labor productivity be impacted by really low unemployment and a low real minimum wage?

That should mean lots of people who would not otherwise be part of the labor market are getting low wage/low productivity jobs. So in an aggregate measure of dollars of output over hours worked, you're raising the numerator at a slower rate than you're raising the denominator.

Early in the pandemic, high wage white collar workers stayed home and kept their jobs. Low wage service workers were furloughed. -> labor productivity goes up. Service workers get hired again-> labor productivity drops.
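
A toy calculation (made-up numbers, not real BLS figures) to illustrate that composition effect; nobody in either group works any less hard, yet the aggregate metric falls once the lower-output group rejoins:

    # Hypothetical output-per-hour figures, chosen only to illustrate the point above.
    white_collar = {"workers": 100, "output_per_hour": 120.0}   # dollars of output per hour
    service      = {"workers": 100, "output_per_hour": 40.0}

    def aggregate_productivity(groups, hours_per_worker=40):
        # Total dollars of output divided by total hours worked across all groups.
        output = sum(g["workers"] * hours_per_worker * g["output_per_hour"] for g in groups)
        hours = sum(g["workers"] * hours_per_worker for g in groups)
        return output / hours

    # Early pandemic: service workers furloughed, only white-collar hours counted.
    print(aggregate_productivity([white_collar]))             # 120.0
    # Reopening: service workers rehired; individual effort unchanged,
    # but aggregate output per hour drops.
    print(aggregate_productivity([white_collar, service]))    # 80.0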


War, shortages, incompetent / corrupt government, bad behavior being incentivized while good behavior is being punished, and inability to trust people are all really discouraging. What am I working for?


From the actual BLS report:

> Labor productivity, or output per hour, is calculated by dividing an index of real output by an index of hours worked by all persons, including employees, proprietors, and unpaid family workers.

This metric IMO seems to be sensitive to other macro indicators. The recent economic slow down means less real output while the tight labor market means the same or more nominal hours worked. The slowdown in any industry sensitive to interest rates means a lot of people talking and waiting on how to adapt but fewer projects and products developed and delivered.
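
A rough back-of-the-envelope sketch of that definition with invented numbers (not real BLS data), just to show how a tight labor market plus slowing output moves the metric down:

    # Labor productivity index = (index of real output) / (index of hours worked).
    base = {"real_output": 1000.0, "hours": 200.0}   # base period, indexes to 100
    curr = {"real_output": 1020.0, "hours": 210.0}   # current period

    output_index = 100 * curr["real_output"] / base["real_output"]   # 102.0
    hours_index = 100 * curr["hours"] / base["hours"]                # 105.0
    productivity_index = 100 * output_index / hours_index            # ~97.1

    # Output still grew, but hours grew faster, so measured productivity fell.
    print(round(productivity_index, 1))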


I bet my shirt that the measure is flawed.

I mean look at the chart showing the change in productivity. It's all over the place. Productivity is a cultural thing that doesn't suddenly jump up and down.


It's difficult to be productive when you're waiting on a shipment of parts, for one thing. We've been working for decades to make industries work with more advanced logistics and less stock on hand. Now there are supply chain issues, and you can't assemble and sell something if you don't have the parts whether it's a car, a computer, or a rose hip half skim gigante honey latte.

Real wages are up a bit, but revenues are way up despite the supply chain issues. People are being forced back into offices who don't need to be there. Maybe morale is low. I know of concrete instances of low morale, and I'm sure there are others.

People's lives changing because of RTO requires attention.

People are often looking to move or to change jobs recently, which requires attention.

Millions of people have been ill with a respiratory/vascular virus which sometimes causes long term damage. More than a million in the US have died from it. Survivors often have pulmonary issues and long-term mental fog which may be permanent or take years to recover. They have less energy and stamina. It's harder for many of them to concentrate. Some of those who died were in the workforce, and whatever knowledge they had about their job died with them. Other deaths were people's family and friends. Funerals, cleaning out houses, donating their belongings, and the grief itself aren't exactly good for worker productivity but they are things humans need to do.

Lots of micromanagers exist. Where people haven't returned to the office, there's no open floor plan to walk around, so many of them micromanage through a messaging app like Slack. Water cooler conversation is in Slack channels. Meetings are in Slack, Teams, or Zoom or something else, and everyone's supposed to be engaged rather than working on their laptop until it's their turn to speak. There are often more officially designated meetings because people can't drop by one another's offices. Lots of work is concentration-based work, or "flow" work. Constant interruptions are bad anyway, but when it takes 20 or 30 minutes to get all the context in your head to solve a problem and a three-minute interruption to lose all of that, more interruptions can be catastrophic for productivity.

Some types of business have minimum staffing requirements. You can cut staff and try to "right size", but you need enough staff to keep the place open if that's your goal. If orders are down enough, your staff will sometimes have less to do. If because of the shortages mentioned before your ability to fill orders is down, the same thing applies. You have a choice of eating some less profitable quarters until the supply chain levels out or just closing shop. You can't lay off 100% of your trained staff and count on rehiring them later.


I will tell you why: People are changing jobs a lot. When you start a new job you are usually slow at almost everything. You don’t know where files are, you don’t fully understand what you are working on, you don’t know who to go to for a particular question and so on. Worse you slow everyone else down too.

Companies decided employees weren’t worth raises. Employees left. Companies need to value employees more with pay if they want them to stick around and keep all that institutional knowledge.


So, the economic value measured for a piece of code drops if fewer people buy the product that it's in or its price has to be cut, entirely separate from how fast anyone is coding. Similar logic applies to a lot of sectors, not just software. Macroeconomic changes are going to work their way back to these stats eventually, independent of any changes in how workers spend their day or the concrete stuff they produce. (Kind of like stormbrew and nostrademons said!)


This is what happens when you measure GDP using services. The tail wags the dog, with demand being the tail. I suspect you would see different results if you looked at physical goods.


'Productive' is such a manipulative word in this usage. In the modern economy, products shipped per employee often has little relation to how hard an employee works. Does not this metric speak poorly of leadership's decision-making, the increase of onerous busywork and surveillance, and a dislocated world economy?

If you believe Zuckerberg chasing the Metaverse has misallocated resources, his company has shipped less 'product' per employee. Does that really mean employees have been less 'productive'?


Can I be naive and argue a simple explanation?

Cheap money the last 13 years = hire more people who do less and get paid more. Or even hiring people for positions that aren't needed.


I would propose that part of the loss of productivity is the lack of staying power in employees. Employees are getting trained working a year or two and then moving on to something else.

The efficiencies derived by working 10-20 years in the same field and even the same position are lost with the modern workforce that barely gets trained to competency before leaving let alone staying long enough to build true efficiency and thus greater productivity.


It is surprising how no one is talking about the productivity loss due to social media & entertainment. I feel most people now spend a considerable amount of time on social media and entertainment, during both personal and work time. I am not saying companies should impose restrictions on employees, but we are living in a generation of 24x7 entertainment. We are definitely less creative and productive compared to workers 20 years ago.


Just think back to that Office Space scene where he's explaining he has eight bosses

The circumstances make it so that we're incentivized to do enough to not get fired


The lowering of standards, shortages, expensive fuel, and food inflation are all reminiscent of Communism. People don't feel like working for a system when it seems to be failing them. A million dollars for a new home, when the lower bracket still makes under 50k and gets raises in cents. It just kind of kills the incentive to work hard, when even if you do starve and penny-pinch your way into some savings, inflation will eat it away anyway.


There's a strong motive to deny that people operate on incentive. If people are incentivized by profit and attaining greater things, the idea that wealth distribution doesn't disincentivize workers completely collapses.

Numerically, I make substantially more than I did 5 years ago. In terms of relative value, while I do have a fancier title, I barely make more than I did when my job was easier. If someone were to ask me "what are you career goals", I'd struggle to tell them any at this point because I have little reason to do more than what I already do. What, I'm supposed to want to take on more responsibilities only to find the economy adjust itself so my lifestyle barely changes? I know some people have a frankly ridiculous form of workaholism that allows them to persevere, but I see them as the horse from Animal Farm. Eventually they will lose the energy to do what they do effectively and the system is not going to support them in proportion for their loyalty, to say the least.


Thanks, that is interesting, and you've given me something to chew on for a while. The similarities are there, and the causes are plausible.


"Communism is when capitalism". What you're describing are literally failures of capitalism.


Costs went up like 20-30% but we didn't give out 20-30% raises. Perhaps workers are leveling their productivity to the purchasing power of their salary?


Too many things and tools to deal with. For example, my manager uses a shitload of apps/services that he keeps inputting information to, rather than spending time to work with my team. It's not to say that he is wasting time - he is just literally following the protocol.

The protocol of working and the order of things is just more fucked up these days than it was in the past.


As a CEO I have learned this one the hard way: ask the workers, like an adult, what things are wasting their time. Economists out here are like 'how do we solve the vexing problems that ail us', meanwhile the top comment is about how doctors spend 50% of their time doing paperwork. We could double the number of doctors by hiring secretaries again! Duh.


You should ask what’s wasting their time and then do some studying to confirm or correct it. People aren’t very good at objectively reporting how they actually spend/waste their time.


This is a bit abstract, but I think this has to do with processes and liability.

Whenever there is a problem people look for someone to blame.

The easiest thing is to create a new process to avoid this in the future, and be protected against blame.

If this proceeds unchecked, the process grows and grows until every little thing takes forever. The processes need to be pruned sometimes too, not just added onto.


Well, what has being more productive gotten them?


The article focuses on a productivity drop observed in 2022 as compared with the past two decades. Anecdotally, this is the first year I've been able to take a meaningful vacation since the pandemic started. Perhaps, the reduction of covid restrictions allowed individuals to improve their work-life balance in favor of more "life".


My guess is it's related to inflation. Changing costs can cause people and companies to adjust to different economic conditions, and that adjustment can hurt productivity.

Something similar happened in the mid-seventies.

https://fred.stlouisfed.org/series/OPHNFB


Look at any data comparing wage growth with productivity since the 60s and that should go a pretty long way to explaining it.


My theory: increasing dominance of bad productivity ideas that seem good on paper.

The execs love how cheap their GCCs in India are, but they're somehow unaware that these folks can take months to do tasks that should take less than an hour.

That and the exponentially growing parasite of tech-enabled bureaucracy, but we already knew that.


no carrot -- before, people thought that if they worked hard for just a little longer they would escape the trap; now, with housing doubling in 2 years, it's become hopeless. With no end goal, the only thing left is acceptance and survival, and survival as-is doesn't require me putting in 200%, or even 60%.


Perhaps it has to do with the fact that businesses run on skeleton crews these days, and stretching a low number of employees thin decreases their productivity. I'm sure that employees say the same thing if they were directly asked.

That and being forced back into offices after working remotely for 2 years.


If you directly asked me I would probably say: "Well, working from home the temptation to watch YouTube and stuff on my second monitor is much higher and sometimes I find I've wasted a lot of time procrastinating my tasks because there are essentially no consequences for doing so"


Weird that energy / natural resource costs weren't mentioned, given the way that productivity is calculated.


There is rising incompetence everywhere.


If my company doubles staff at the same pay, for internal whatever, but generates the same product, what happens to productivity?

Now, what if instead my company hires some other external business, to perform the same function with the same people at the same price? What happens to productivity?


systemic burnout, overstimulation in all dimensions of life, the rat race is getting harder and harder


The article focuses on the drop of productivity measured in 2022. Personally 2022 was the first out of 2 years I could take a meaningful vacation. Perhaps the reduction of restrictions allowed people to improve their work-life balance in favor of more "life".


It’s quiet quitting? Gallup polls suggest as much https://www.gallup.com/workplace/398306/quiet-quitting-real....


How about a giant freaking pandemic.


Lots of people know why. Academia hasn't been able to find a way to frame it that is palatable.

We live in a society where every single thing is rent seeking and nobody believes in anything besides winning, because everything of substance has been hollowed out.


Does anyone know how productivity for knowledge workers is actually measured? The article only goes so far as to mention that measuring productivity is "particularly tricky", but what datapoints exactly are these statistics even pulling from?


The economic concept of productivity has nothing to do with anything you can measure about a single employee (or kind of employee), the article is just conflating two very different meanings of the same word.

In aggregate you measure it based on how much the company spends vs how much it makes off sold product. It's a measure of the efficiency of the company to its holders of capital.


Oh, that's very interesting. Can you get more specific? What's the difference between productivity in this sense then, vs profit margin?


I'm abstracting a bit more than I should really, because yeah the way I put it comes off as too close to just profit.

But there are a bunch of ways to measure it, that generally comes down to some comparison of inputs vs outputs and those inputs and outputs have to be measurable in some way in aggregate.

If you Google for "economic productivity" or "labor productivity" you can find better explanations of the details than I'm likely to give.


The more you measure productivity, the less productivity there will be. Employees spend a good amount of time documenting their productivity for nonsense like performance reviews, and that is a lot of time that could have been spent doing actual work


Classic counterintuitive anecdote where measurement increases productivity: https://en.m.wikipedia.org/wiki/Hawthorne_effect “The Hawthorne Works had commissioned a study to determine if its workers would become more productive in higher or lower levels of light. The workers' productivity seemed to improve when changes were made, and slumped when the study ended. It was suggested that the productivity gain occurred as a result of the motivational effect on the workers of the interest being shown in them.”.


Yesterday's post about interest rates and expected returns seems related, from an aggregate and opposite side of the market:

https://news.ycombinator.com/item?id=33394486


The best way to understand this is to compare Tesla's R&D expenses to Twitter's R&D expenses. If you got into the details you would see MANY MANY meetings at Twitter where literally nothing was accomplished.


I'd be curious to know how long people have been in their current job vs. pre-pandemic. Many people are now returning to work, or finally getting around to switching to a new company. They need to be trained.


If you want to know why please read this article https://news.ycombinator.com/item?id=33395667


This might be a very stupid question, but how do they measure productivity for knowledge workers?

What does it mean that a software engineer is less productive? Or a stock trader? Or various other knowledge jobs?


For many years, for political reasons we'll never have another "great depression", and this has possibly now extended to reporting recessions.

It's worth pointing out that the economic indicator of productivity is basically GDP divided by worker-years-equivalent worked. So if it's double-plus ungoodthink to ever report a decline in GDP, we can still, for now, report a decline in productivity while also reporting that unemployment remains mostly constant, as long as no one makes the connection and cancels them for reporting disinformation.

It's not a "recession" of course, that would be badthink of the highest order; it's just a mysterious decline in the productivity metric, and nobody can talk about why while remaining politically correct. It's a good demonstration of how effective censorship can be. Why, I hear times are so good, the party is increasing the chocolate ration.


Heavy push to go from WFH to in office in the last six months.

Decrease in productivity in the last six months.

It must be all the remote workers!

Ignore the return to office push, inflation, burnout, and every other possible factor.


It's the opposite. It's the working from home that's the problem.


Productivity rose year over year while wages stagnated and layoffs were frequent. Workers have discovered the hard way there's no payoff for hard work.


My guess is Covid. I don't know about anyone else, but I was so burned out that focusing on work isn't a priority.

It’s going to take a year or more before people feel like they’ve recovered.


Manufacturing productivity has dropped due to broken supply chains; in fact, manufacturers of sub-components have their own issues with missing components, and so on.


Productivity is like pornography, we know it exists but there is no single definition or measurement of it. But people treat it like a bank balance...


>"U.S. workers have gotten less productive – no one is sure why"

U.S. media has gotten less informative -- no one is sure why


The supplied chart looks like noise. Maybe the supposed loss of productivity is cyclical. It does not seem too concerning, imho.


$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$$


Why, it's the working from home of course.


The productivity chart is very spiky, with values changing direction all the time. I'm not convinced this is noteworthy.


People going back to the office with 2 hours of commuting won't be more productive than people who work from home.


I don't know how this is measured:

Could the effect be caused by the US onshoring industries that have lower productivity?


I’d say it’s an incentive problem. Not many employees get extra pay if they’re individually more productive.


As technology develops, we need to work less. In 2050 I doubt we'll all be on a 40+ hour work week.


VC cash and cheap debt, oodles of it. Workers are reaping the benefits of 'growth.'


Covid (Long Covid) and getting poorer relatively over time will all factor in.


My paycheck only goes ¼ or less of the way it went just 5 years ago. Maybe that's why.


>washington post >bezos newspaper

of course they would be concerned about productivity of all folks


It never trickled down, if anything it trickled up. Pretty obvious why in my opinion.


“Productivity” is as useless as “GDP” when it comes to measure anything important.


Merit is no longer rewarded at my firm. Why be more productive than you have to be?


The common wisdom is more workers are "acting their wage".


Probably their terrible labour laws. You can only grind people down for so long.


>>no one is sure why HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA


productivity is scaling according to pay. system is working as intended.


The article is garbage. It just throws lots of things at the wall.


Concentration of wealth has made most work seem … pointless


Some people are sure why, they just don't listen to them...


"Rushing makes messes." - Robert C. Martin


Could the shift to hybrid/WFH be a contributing factor?


Surely, it can't be financial insecurity, an uncertain future, and not getting anywhere regardless of how much work they put in. It definitely can't be that the capitalist system has already squeezed the last drops of productivity from people in the last few decades and there is no more. It definitely can't be that what Marx predicted already happened and the majority of people don't have access to means of generating wealth/income, being unable to consume, therefore putting the economy into a crisis of growth and profitability.

No one is sure why. It's just a mystery.


We should blame it on return to the office.


People realize endless growth is impossible on a planet with finite resources and are tired of working for peanuts pretending like it is.


This is not a mystery. 2022 is the peak year of retirement for the large baby boomer generation. People approaching retirement are at their most skilled and thus most highly productive. As they retire the average age and hence average skill level drops. I suspect if you separated the statistics into blue collar jobs and white collar jobs, the blue collar worker output began dropping a decade ago as the boomers aged out of their physical fitness prime.


...and yet everyone is sure why.


Yawn. We all know this stems from the open discrimination against straight white men in the hiring process


Because burnout is pervasive?


Seriously?

Fuck off, Jeff.

Pay your FUCKING TAXES, Jeff.

You are an asshole, JEFF.


RTO


There are not enough workers


It's simple. The healthy people moved away.

It's obvious that much of the GDP is based on low aptitude/low skilled products and services (I call it fat juice and fat juice filtering, looking at Buffett's investing advice). An unhealthy environment is not something to grow old with.

In the states, we can see that the illegal migration dumping is caused by the unskilled labor needed to support archaic GDP numbers.

As a technology native, I was too young to work when the H1-B economy started to spring up, but the linear from that is easy to see.

And somewhat off-topic, it looks like with the UK's PM, we get to see more of the effects that Infosys has had on the economy.

That businesses which outsourced gave away their whole business model is an easy conclusion, so now they all have lower quality/lower priced competition (obviously speculation, but plausible remains entertaining).

Looking at Walmart, I could never understand why they left their workforce in servitude, without paving the way for a future. The irony that "merica'" is filled with products from a communist country still astounds. They have no prospects for the future and supplement incomes with foodstamps and welfare.

Even quiet quitting is easy to decipher. Its an unhealthy environment.

It's simple. The healthy people moved away.


Maybe i'm too stupid to decipher your message entirely but the healthy people moved away? To where?


Not to you, obviously. It was a reference to participation (the lack thereof)

And I should have linked a source, but the business outsourcing point was more referring to manufacturing in China and the counterfeit goods. Software is obviously easier to hijack too.

It is cool seeing the karma go up and down on this post. It's back to (1) :)

A global forum is always fun :p

It's simple. The healthy people moved away.


Your post is a mish-mash of thoughts that might be coherent independently but put together is just indecipherable. Consider re-writing your comment to get the point across.

But let me try to respond anyway even though I am unsure of your point:

>In the states, we can see that the illegal migration dumping is caused by the unskilled labor needed to support archaic GDP numbers.

The native population's costs to maintain those jobs locally would make the underlying business untenable. Perhaps this is the end state of all these businesses taken by illegal immigration. Maybe robotics might save the day. People are not going to pay for 14-20$/hr picked strawberries. Maybe they will but that 20$/hr will eventually become 30$/hr and so on.

>As a technology native, I was too young to work when the H1-B economy started to spring up, but the linear from that is easy to see.

H1-B was a miracle program that allowed numerous non-white people a sliver of a chance to experience the fruits of the western empire. A large portion of the white population came over to the new world without any papers and/or none of the roadblocks that current immigrants face. After they established themselves, they slammed the door on everyone else. At the same time, the third world is recovering at a snails pace from decades of exploitation and wealth extraction by the western nations. Put yourself in the shoes of people who want to experience a kind of life that westerners live. What options do they have? They can't easily just waltz into the western countries like their white counterparts did...meanwhile they were born into an environment that was fundamentally destroyed and stripped bare by those same western powers. They can't fight back to take that wealth back because the western military might will crush them, they can't improve their countries in any reasonable timeframe because there are too many forces working against them. But then comes along H1-B and it gives many ambitious people an opportunity to get exploited by the ownership class in exchange for an economy class seat into the first world. Who are the healthy people in this scenario? The white American working class whose ancestors walked into abundance with much less effort than people today and whose descendants couldn't muster up the effort to fill those in demand tech jobs? From the point of view of the ownership class, they were never in the game to begin with. From my point of view, the healthy people are the H1-B people.

>Businesses that outsourced gave away their whole business model is an easy conclusion, so now they all have lower quality/priced competition (obviously speculation, but plausible remains entertaining).

Maybe the ownership class is smarter than you. China and the rest of the Asian workshops are aging much faster than the US. China is projected to drop to 600 Million by the end of the century. The current state of them taking all the healthy people is a mirage.


Again, with a global forum, the title was about productivity in the states. You do not sound like the intended audience.

Did the Buffett joke about purchasing soda and dialysis stocks even make you laugh?

We are also on the cusp of automation, so unskilled is just an afterthought.

You can get 3 slivers for the price of one. Economics was never pretty words, it's just a number.

https://www.sdrplay.com/beware-fake-sdrplay-devices/

I happen to look at their devices and see where increased support costs and possibly lower sales impacted their business model. I'm sure there are more if I look.

The world keeps on spinning. I can't say you even understood the "no one is sure why" speculation in the title.

It's simple. The healthy people like to laugh :p


We need to urgently optimize Boomer and Gen X know-how transfer to millennials or we're toast.


Come on… we know why.


Just came into the thread for the comments and am not disappointed.


https://www.smbc-comics.com/comic/productivity-2 Mystery solved by a comic author with an economics degree :)



Smartphones.


Smartphones.


Anecdote:

I'm a midwesterner and half of my siblings, most of my wife's siblings, and some of my friends' siblings are in their late 20s, have no jobs (or <10 hrs a week), and live with their parents.

From the outside, they look nervous / afraid to try to get into the job market or to date people. Men & women, but it leans men. The ones in poorer families stay home all day and play video games, and the richer ones venture out to spend their parent's money at restaurants or on trips but otherwise do the same. Half of them were doing minimum-wage work and left at the start of the pandemic and the other half have never had jobs.

I could just be in a local pocket of people like this, but I'm worried about how many people must have fallen off of the wagon and will never get back on.

Trying to get a skilled entry-level job after having done literally nothing for 5 years is hard for a lot of reasons. One being the mental hurdle you have to get over: you know you'll face a lot of rejection, you're out of practice, and your work peers will be a lot younger.

--------------

Controversial opinion: The whole thing has burnt me on UBI. I am afraid that the average American doesn't have the discipline to be productive if we introduce too much free income / free state of subsistence.

A great counter-argument is "well why does someone have to be productive? why should anyone have to work?". I don't know anything, but I suspect that we get to ask that question because of how dominant/rich the US is, and that is bound to end if we aren't more than competitive against countries that work their asses off.

Another counter is something like: exceptional people are responsible for the 10x-1000x outcomes that carry the economy, but those individuals are only the catalyst and do it on the backs of the rest of us. Takes all parts to make the machine work.

--------------

Back on topic. Here's some graphs:

FRED Graphs:

Hours worked by full-time and part-time employees by year: https://fred.stlouisfed.org/series/B4701C0A222NBEA

US Pop: https://fred.stlouisfed.org/series/POPTOTUSA647NWDB

Median weekly real earnings: https://fred.stlouisfed.org/series/LES1252881600Q

Employment-Population Ratio - 25-54 Yrs.: https://fred.stlouisfed.org/series/LNS12300060

Real gross domestic product per capita: https://fred.stlouisfed.org/series/A939RX0Q048SBEA

--------------

US Employment rate by age 2000-2021: https://www.statista.com/statistics/217899/us-employment-rat...
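
A minimal sketch for pulling a few of those series and indexing them so the trends are comparable. Only the series IDs come from the links above; the use of pandas_datareader and everything else here is my own assumption:

    # Fetch a few of the FRED series linked above and index them to 100
    # at the first date where all of them are available.
    from pandas_datareader import data as pdr

    codes = [
        "LES1252881600Q",   # median weekly real earnings
        "LNS12300060",      # employment-population ratio, 25-54
        "A939RX0Q048SBEA",  # real GDP per capita
    ]

    df = pdr.DataReader(codes, "fred", start="2000-01-01")
    base = df.dropna().iloc[0]
    print((df / base * 100).tail())

The other series IDs from the links drop in the same way.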


Not In The Labor Force is a huge phenomenon and has been since before the pandemic. I don't think it's just pockets; I think there is a widespread cultural aversion to sacrificing now for yourself and your future kids.

If your local economy is in decline, you see more people demoralized and jobs dry up in a cycle.

I think I read the term “futureless generation” recently and that stuck with me.


> no one knows why

$7.25


My local Aldi is advertising $16.50/hr + benefits. Average rent for a 1BR in the general area is $1000-1200ish.
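
Back-of-the-envelope, assuming full-time hours: $16.50/hr x ~173 hrs/month is roughly $2,860 gross, so that rent works out to about 35-42% of gross pay.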

As for me personally, I don't think I could be more productive if I tried. I finish work and then look to pick something up, but there isn't anything ready. If I try to make something ready, people want everything run past a committee to gain consensus. Twice in the past 6 months, I've spent 2 weeks developing a plan, running things past people, and then I come to a single question and the response is "hmm, you know, let's not work on this right now".

This morning I was on an email chain in which one executive had committed us to paying $$$$ for some consultants to do something, and he wanted the CTO to do some checks to make sure our system could handle it. But what exactly? The CTO had no idea what he was asking for. They went back and forth, and I'm nearly positive that they were both acting in good faith, but for the life of them they couldn't communicate. And so a bunch of workers are going to sit around with nothing to do, collecting their paycheck.

I've been doing this now for over 20 years, I'm fairly senior, and I deal with executive-level people a lot. I've never seen anything like what's going on, and it's totally unfair to blame this entirely on workers (they aren't blameless either, tbh).


I mean, $7.25 is completely inaccurate, and that's my bad. What I really meant to say is that it feels like the whole non-C-level part of the workforce is disincentivized to try hard, because C-level pay is hyper-inflated. The workforce gets stiffed on ownership of the companies they participate in. Completely.

1) The first startup I ever joined, I ended up owning 10% of the platform and grinding frequent 70+ hr work weeks. I desperately wanted to spread my wings and make suggestions (you know, career-advancement-type bullshit), but I was consistently treated like I had only been hired because there was a talent shortage and was expected to follow orders and shut the fuck up. Execs sold the company 3 years later for $50M. I got $6,000 from the deal.

2) Amazon is a multi-trillion-dollar company with huge talent-sourcing issues. Why? Employees are grist for the mill. When I interviewed, beginning SEs were salary-capped and offered at most $40k in stock options that vested over 4 years. Most devs don't last more than 2.

These can’t be unique stories. I’m of the opinion that money and control need a fierce decoupling. I’d give a shit about implementing big visions if I were treated like it matters that I care and if I were presented a fair stake in the company. Until then, I’m going to worry about pursuing my personal projects more often than not.


react


wfh


facebook



