Hacker News

I don't think he's advocating that he should be setting the rules or self-evaluating. He just wants the rules to be sensible and sensibly enforced. He is justified in complaining if the modifications they requested violated safety policies, risked patient well-being, or defeated the purpose of the study. Policy that doesn't add any benefit deserves to be questioned.

Not to mention that the questions were already being asked. He just wanted to ask them earlier to compare against the eventual diagnosis. It's not the kind of study that should have been abandoned in frustration after two years, as happened here.




I think for his next study he should do one into the mental health of those who interact with Institutional Review Boards. If nothing else, it would have a nice recursive effect on the authors.


So, an example of how the author doesn't understand the point of these requirements is the 'encryption' (anonymisation) process, coupled with 'the files had to be kept together!'. The anonymisation process isn't about stolen files. It's about data leakage, and keeping identifying data out of result sets. If you're pulling an all-nighter to prep for the big talk tomorrow, for example, you can't accidentally include identifying data in your results if it doesn't exist in your results in the first place.

Edit: even more important for the scientific process (as opposed to privacy) than 'data leakage' is 'data sharing', as yread points out below. Anonymised data can be quickly and safely shared, and others can run their own analyses on your results. Non-anonymised data can't be shared. If you're interested in publishing a robust scientific paper, why would you be against opening your data for inspection?
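To make the idea concrete, here is a minimal sketch of pseudonymisation (my own illustration, not anything from the article; the field names and records are invented): identities are replaced with random codes, and the code-to-identity key is kept as a separate artifact, so the result set itself can be shared or shown without leaking who the patients are.

```python
import secrets

def pseudonymise(rows, id_field="patient_name"):
    """Replace the identifying field in each record with a random code.

    Returns (anonymised_rows, key_map). The key map would be stored
    separately (and access-controlled), so the results themselves
    never contain identities.
    """
    key_map = {}
    out = []
    for row in rows:
        identity = row[id_field]
        if identity not in key_map:
            key_map[identity] = secrets.token_hex(4)
        clean = dict(row)
        clean[id_field] = key_map[identity]
        out.append(clean)
    return out, key_map

# Invented example records.
rows = [{"patient_name": "Alice", "score": 7},
        {"patient_name": "Bob", "score": 3},
        {"patient_name": "Alice", "score": 9}]
anon, keys = pseudonymise(rows)
```

Note that the same identity always maps to the same code, so repeated measures on one patient can still be linked in the shared data, which is what makes the anonymised version useful for reanalysis.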

> Not to mention that the questions were already being asked. He just wanted to ask them earlier to compare against the eventual diagnosis. It's not the kind of study that should have been abandoned in frustration after two years, as happened here.

Then he could have just asked them and done his own informal study. Nothing is stopping the doctor from saying "are you happy, then sad?" on first meeting a patient. But if you want to do a formal, publishable study, then you should have all your ducks in a row. Make sure your independent variables are properly controlled, make sure any ethical issues have been externally vetted, so on and so forth.

While the IRB certainly had some annoying concerns, much of this author's frustration simply wouldn't exist if he understood why those questions were being asked.


> make sure any ethical issues have been externally vetted, so on and so forth.

Sure, but the IRB complaints mentioned (and the auditor's) seem to go far beyond ensuring that practice is ethical. Instead, they seem to focus on following process purely for the sake of process.

Why should the consent form have the title of the study? Why should the consent form contain a list of risks when there are none? What is wrong with having consent forms signed in pencil when pens aren't allowed? Why should the data integrity plan require periodic review (i.e. why should we need a data-integrity-plan integrity plan)? These are all indicative of a bureaucratic system that places too much emphasis on 'process', losing sight of 'outcome' in the end.


> These are all indicative of a bureaucratic system that places too much emphasis on 'process', losing sight of 'outcome' in the end.

Reminds me of something that Jeff Bezos (of Amazon.com) wrote in his 2016 Letter to Shareholders:

> Resist Proxies

> As companies get larger and more complex, there’s a tendency to manage to proxies. This comes in many shapes and sizes, and it’s dangerous, subtle, and very Day 2.

> A common example is process as proxy. Good process serves you so you can serve customers. But if you’re not watchful, the process can become the thing. This can happen very easily in large organizations. The process becomes the proxy for the result you want. You stop looking at outcomes and just make sure you’re doing the process right. Gulp. It’s not that rare to hear a junior leader defend a bad outcome with something like, “Well, we followed the process.” A more experienced leader will use it as an opportunity to investigate and improve the process. The process is not the thing. It’s always worth asking, do we own the process or does the process own us? In a Day 2 company, you might find it’s the second.

https://www.amazon.com/p/feature/z6o9g6sysxur57t


Ethics committees are there to protect both the institution and the subjects of the study. One of those protections is avoiding misleading the subjects, unless it is absolutely necessary and beneficial. How many times have you seen people here on HN bitch about Company X's misleading marketing? It's exactly the same with human studies - people feel used and abused when they find out they were lied to. Similarly, jancsika points out below that the author is working with a patient population that includes literally paranoid people; they're not likely to respond well if they find out a questionnaire was for a different purpose than stated.

Sticking to a common set of rules and deviating only when there's very good reason is one way to help protect subjects. What's the 'outcome' here? A doctor wants to do a study. Why is that more important than the rights of the subjects? Yes, everyone who does a study thinks it's going to cure cancer and solve the national debt. They'll promise the moon in order to get their way. These processes are put in place to protect people against poorly planned studies. And there's no way to know ahead of time that a study is 'trivial' - if you're working on humans, you need to be vetted.

"But we already do this to patients anyway" is beside the point; if you let doctors bypass vetting on that argument, you'd see all sorts of horrific stuff happening. Ethics committees didn't come about because bureaucracy invented them for its own sake; they came about because people were being unknowingly tested on by medicos who promised that the study was 'beneficial for the common good'.

And what you find distressing is not what other people find distressing. Search for mncharity's comment elsewhere on this page, where people are distressed simply by being asked about viruses. Yeah, sure, that's not typical, but the counterpoint is: is the research beneficial enough to warrant causing distress to people who would otherwise have been left alone?

In short, this 'needless bureaucracy' is there to protect both the institutions and innocent people from researchers going 'rogue'.


No one is saying the vetting process should be done away with. They are saying it should be reformed to not be so god damn stupid.

There are like 50 wall-of-text posts in here that don't seem to understand that.


I think it's quite easy to see the benefit of the author's study, and their frustration is completely understandable - however, I can't help but feel that much of it stemmed from the rules impeding their progress, not from the rules actually being useless.

My major gripe with long pieces that decry the bureaucracy of regulatory boards is that the complaints tend to be about how the bureaucracy is a personal inconvenience. Some of the gripes I'll absolutely grant: pen versus pencil, and inflexibility about handing potentially violent persons a weapon, probably need some sort of leeway. But I think protective measures absolutely should be a brick wall - a surmountable one, sure, but only if you actually try a bit and demonstrate that your intended actions aren't going to do exactly what the regulation is trying to prevent. The burden of that demonstration should be on the researcher using human participants.

The idea of easily avoidable and mutable regulatory functions seems contradictory - a researcher shouldn't be declaring what should and should not apply to them. This isn't fear that they're Hitler and going to inject people with nastiness; it's fear of the dumb mistakes that every human makes, and of our often poor ability to predict the outcome of certain actions. I get it - they want to help people and the regulations are inconvenient for them. But having gone through many IRB processes myself, none of it is insurmountable, least of all anything the author listed.



