
I had a conversation recently with a friend who is a doctor. He was telling me that he'd had three patients in a fairly short period of time (all in their 70s) who had come in with mysterious symptoms and died soon after being admitted. While he knew he had technically done everything by the book to treat them, he worried that he had made a mistake along the way that could have contributed to their deaths.

It made me appreciate that, most of the time, the code I produce as a web developer would never have me facing those same ethical dilemmas -- there are very few ways a bug in a web app could cause someone to die or be financially ruined.

In this case, however, you can see what happens when an ambitious startup tries to apply practices web developers have used for years (A/B testing, "design" decisions) to areas of life where people's livelihood or health is at stake -- people can end up getting hurt. "Fail hard and fast", the motto many people apply to startups, cannot be embraced with the same arrogance. If you want to change the world in ways that touch people on a personal level, be ready to be very careful with the trust your customers place in you.




Absolutely.

I've been in healthcare for a bunch of years (I dunno, sheesh, I guess like 14?). For a big stint of that I built systems that aimed to prevent medication errors in hospitals by telling clinicians when doses were due, and so on.

I know of at least two incidents where code I wrote failed (due to an edge-case concurrency bug in one case, and daylight saving time handling in the other) in a way that didn't make it clear to a nurse that another nurse had already given a medication, and the patient received a double dose of a pretty severe drug.
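
For the curious, here's a minimal sketch of the DST failure mode -- not the actual system, and every name and number here is hypothetical. Comparing naive local timestamps across the "spring forward" transition inflates the apparent time since the last dose, so a recently given medication can show up as due again:

    from datetime import datetime, timezone

    DOSE_INTERVAL_HOURS = 1.0

    def dose_is_due(last_dose, now):
        # Flags the next dose as due once the interval has elapsed.
        elapsed = (now - last_dose).total_seconds() / 3600
        return elapsed >= DOSE_INTERVAL_HOURS

    # Buggy path: naive local wall-clock times on the US spring-forward
    # night (2024-03-10, when 02:00 jumps to 03:00). The real gap is
    # 30 minutes, but the wall clock says 90.
    print(dose_is_due(datetime(2024, 3, 10, 1, 45),
                      datetime(2024, 3, 10, 3, 15)))   # True -- wrongly shown as due

    # Safer path: the same two instants stored in UTC (01:45 EST and
    # 03:15 EDT), where elapsed time is unambiguous across transitions.
    print(dose_is_due(datetime(2024, 3, 10, 6, 45, tzinfo=timezone.utc),
                      datetime(2024, 3, 10, 7, 15, tzinfo=timezone.utc)))  # False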

Imagine getting that call. It's every bit as fucking terrible and humbling as you'd think.

Far more numerous (thankfully) were the calls reporting that we had stopped double dosages, or even order-of-magnitude label misreadings, on drugs. I know of more than a handful of instances where code I wrote may have literally saved a life.

I've been lucky to have found myself at companies that take this stuff really, really seriously. It's a really hard balance to strike when failure can be as devastating as it is in healthcare, but the status quo is pretty terrifying too. In general, every health IT shop's culture lands somewhere between making decisions out of fear and chasing a velocity that lets problems be fixed quickly and makes the greatest possible positive impact on the problem they're solving.

I talk to a lot of people who want to work on "big/real problems" and list "healthcare" as one of them, but who, when it comes down to it, are often pretty freaked out about the stakes. It's weird to be used to the gravity of it, and I certainly am now. I can't even wrap my head around what it would mean to work on something that doesn't hurt people when it fails.


Also in the industry, but on the HL7 interface side.

I think you've hit the nail on the head. I've seen studies that suggest, for example, a 75-90% reduction in med error rates in hospitals that have gone to computerized barcode systems. So there's a tremendous amount of good to be done in the field.
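
To make the mechanism behind those numbers concrete, here's a hypothetical sketch of the core check a barcode system performs -- scan the patient's wristband and the drug's barcode, then match against active orders. All identifiers are made up, and real systems verify far more (dose, route, timing, and so on):

    def verify_scan(patient_id, drug_code, active_orders):
        # active_orders maps patient IDs to the drug codes
        # currently ordered for them.
        if drug_code not in active_orders.get(patient_id, set()):
            return "ALERT: drug not on this patient's active orders"
        return "OK to administer"

    orders = {"MRN-1001": {"NDC-0002-8215"}}
    print(verify_scan("MRN-1001", "NDC-0002-8215", orders))  # OK
    print(verify_scan("MRN-1001", "NDC-9999-0000", orders))  # ALERT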

But, as you point out, the potential damage that can be done by mistakes makes for a fair bit of anxiety at times. Fortunately, it's been my experience that people in the field tend to take testing very seriously as a result.

Another interesting thing on that front: I'm one of the only people in my department who doesn't have a prior clinical background. I almost wonder if the fact that they're less trusting of the technology at times is a good thing to some extent.


>I'm one of the only people in my department who doesn't have a prior clinical background. I almost wonder if the fact that they're less trusting of the technology at times is a good thing to some extent.

I totally agree. Over time I've come to really value working with teams made up of people from a variety of backgrounds. It's incredibly valuable to avoid group-think on either end of the spectrum when it comes to approaching problems, assessing risk, and keeping a healthy cynicism about technology.


And when the code does matter, too many don't realize the importance of getting their shit straight.

Privacy and user information is a "light" example, but a very basic one that should instill a bigger sense of responsibility than most developers have.


I recall hearing a discussion on the radio with a medical administrator re: the rate of autopsies. IIRC, it has dropped dramatically in the past few decades. There are many short-term administrative, financial, and legal reasons to skip an autopsy, but it seems like a bad trend. While it's not ethical to A/B test human medical treatment in regular care, autopsies would seem like a critical data source for improving health care -- the systematically declining rates look like yet another variant of the "tragedy of the commons" problem.



