Absolutely.

I've been in healthcare for a bunch of years (I dunno, sheesh, I guess like 14?). For a big stint of that I built systems that aim to prevent medication errors in hospitals by telling clinicians when doses were due, etc.

I know of at least two incidents where code I wrote failed (edge-case concurrency in one case, daylight saving time in the other) in a way that didn't make it clear to a nurse that another nurse had already given a medication, and the patient received a double dose of a pretty severe drug.

Imagine getting that call. It's every bit as fucking terrible and humbling as you'd think.
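For the curious, the DST one is a classic pitfall. This isn't our actual code, and the timestamps are made up, but a toy Python version of the general failure looks something like this: during the fall-back hour, wall-clock arithmetic can make the previous dose look like it happened in the future, so it drops out of any "recent doses" view.

    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo  # Python 3.9+

    tz = ZoneInfo("America/New_York")

    # Fall-back night: the 01:00-02:00 hour happens twice. fold=0 marks
    # the first pass through the hour (EDT), fold=1 the second pass (EST).
    last_dose = datetime(2021, 11, 7, 1, 45, fold=0, tzinfo=tz)  # 01:45 EDT
    now       = datetime(2021, 11, 7, 1, 30, fold=1, tzinfo=tz)  # 45 real minutes later

    # Python subtracts same-zone datetimes by wall clock, so the dose
    # appears to be 15 minutes in the future...
    print(now - last_dose)  # -1 day, 23:45:00

    # ...while converting to UTC first gives the true elapsed time.
    print(now.astimezone(timezone.utc) - last_dose.astimezone(timezone.utc))  # 0:45:00

A record that appears to be from the future is exactly the kind of thing a "doses in the last hour" query silently skips.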

Far more numerous (thankfully) were the calls we got reporting that we'd stopped double doses or even order-of-magnitude label misreadings on drugs. I know of more than a handful of instances where code I wrote may have literally saved a life.

I've been lucky to have found myself at companies that take this stuff really, really seriously. It's a hard balance to strike when failure is as devastating as it can be in healthcare, but the status quo is pretty terrifying too. In general, every health IT shop's culture lands somewhere between making decisions out of fear and pursuing the velocity that lets problems get fixed quickly and makes the biggest possible dent in the problem they're solving.

I talk to a lot of people who want to work on "big/real problems" and list "healthcare" as one of them, but when it comes down to it, they're often pretty freaked out by the stakes. It's weird to be used to the gravity of it; I certainly am now. I can't even wrap my head around what it would mean to work on something that doesn't hurt people when it fails.

Also in the industry, but on the HL7 interface side.

I think you've hit the nail on the head. I've seen studies that suggest, for example, a 75-90% reduction in med error rates in hospitals that have gone to computerized barcode systems. So there's a tremendous amount of good to be done in the field.
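The core of the barcode check itself is almost boringly simple to sketch (this is a made-up illustration, not any real vendor's code, and all the names are hypothetical): scan the wristband, scan the drug, and refuse to proceed unless both match the active order.

    from dataclasses import dataclass

    # Entirely hypothetical structures -- just to show the shape of the
    # check a barcode med-administration system automates at the bedside.
    @dataclass
    class Order:
        patient_id: str   # medical record number on the wristband
        drug_code: str    # e.g. the code on the package barcode
        dose_mg: float

    def verify_scan(order: Order, scanned_patient: str, scanned_drug: str) -> list[str]:
        """Return a list of problems; an empty list means OK to administer."""
        problems = []
        if scanned_patient != order.patient_id:
            problems.append("wrong patient")
        if scanned_drug != order.drug_code:
            problems.append("wrong drug")
        return problems

    order = Order(patient_id="MRN-1234", drug_code="0002-1433-80", dose_mg=5.0)
    print(verify_scan(order, "MRN-1234", "0002-1433-80"))  # [] -> proceed
    print(verify_scan(order, "MRN-1234", "0002-9999-99"))  # ['wrong drug'] -> hard stop

The check itself is mechanical and never tired at 3 AM; the hard part is the interface plumbing that keeps the active orders accurate, which is where my side of the work comes in.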

But, as you point out, the potential damage that can be done by mistakes makes for a fair bit of anxiety at times. Fortunately, it's been my experience that people in the field tend to take testing very seriously as a result.

Another interesting thing on that front: I'm one of the only people in my department who doesn't have a prior clinical background. I almost wonder if the fact that they're less trusting of the technology at times is a good thing to some extent.


>I'm one of the only people in my department who doesn't have a prior clinical background. I almost wonder if the fact that they're less trusting of the technology at times is a good thing to some extent.

I totally agree. Over time I've come to really value working with teams made up of a variety of backgrounds. I think it's incredibly valuable not to end up in groupthink at either end of the spectrum when it comes to how we approach problems, assess risk, and maintain a healthy cynicism about technology.
