
I thought I remembered a proposal to create some new categorical exemptions from IRB review, one of which was somehow related to studies where you just talk to people. (I've been interviewed by several social scientists and very often their "protocols", consisting exclusively of informal interviews, had clearly been through extensive IRB reviews, which felt super-odd because journalists wouldn't need any ethical review at all in order to carry out exactly the same interview in exactly the same way.)

On the other hand, you clearly can harm people in various ways by or as a result of interviewing them, for example by breaching confidentiality, by being forced to breach confidentiality, or by making them feel bad about themselves by insulting them during the interview or just by bringing up painful and distressing memories. And also it's hard for researchers who aren't doctors to extend the same level of legal protection to what their research subjects tell them:

https://www.socialsciencespace.com/2017/01/social-science-ne...

So maybe one idea would be to create a categorical IRB exemption for studies that combine non-invasive observations, or existing datasets, with interviews not reasonably expected to be distressing to the subjects, and conducted subject to otherwise existing norms on patient confidentiality and medical privacy? (recognizing that limited legal protection for that confidentiality still poses problems for some studies)

(I'm certainly happy to apply the Chesterton's fence principle and learn more about the history of unethical experimentation before seriously advocating this.)




> interviews not reasonably expected to be distressing to the subjects

I did some guerrilla street usability testing of animated science education video fragments[1] about the sizes of objects - sort of like Powers of Ten - shown during red lights at busy street crossings.

Even with a very small test population, I saw people distressed by surprising things.

Mention of millimeters reminded an elderly Brit of unpleasant childhood experiences when learning metric.

A college student expressed distress over a character breaking the head off an (enlarged to arm-sized) T4 bacteriophage.

I had someone run away across the street, saying "how could you show me something so disgusting!", after a quiet background sound effect - a child's hacking cough - accompanying the word "virus". When I later stopped the video immediately after that point to get feedback, most people's reaction was "what cough?".

Many people were variously distressed by the mention or discussion of viruses and bacteria. Which I didn't expect, but in retrospect seems unsurprising - for many, they have strong and negative associations. They get very bad press. Think "fun story about viruses!" having the emotional flavor of "fun story about genocide!".

So at this point, I'm unclear on what can reasonably be expected to not distress people.

"How about this nice weather we've been having?"... "Houston has been bringing back painful memories of my family losing its home and business to flooding. :(" Somewhat tongue in cheek, but if you are aiming for do no harm...?

[1] There are some bits of the test videos in the "How to remember sizes" section of http://www.clarifyscience.info/part/Atoms .


I'm not sure if it's politically acceptable to say this these days, but maybe it's ok if people experience discomfort sometimes. It's ok to see icky things, or to be 'forced' to remember an unpleasant memory. It's not the responsibility of the people around you to protect your feelings and your bubble.

There is often more value in science existing than in some marginal perceived harms incurred from a handful of people getting interviewed. Your emotional wellbeing is your responsibility, not the responsibility of the people around you.

And on a personal level, we don't grow from being comfortable. The opposite - we grow through discomfort. We grow through facing our demons and realising that they don't actually kill us.


I dunno about you but I think myoviruses look creepy as shit. https://i.pinimg.com/236x/cb/e0/59/cbe0591ea18e0afa4686bd3f4...


Creepy? The flailing sticky legs that glom on to you? The teeth that punch a hole in your side? The high-pressure vessel driving high-speed injection? Which only gets the DNA string halfway in, so it squirms the rest of the way in, ratcheting as molecules attach and begin transcribing your corruption? The minute from bite, to being doomed?

As you frolic in the summer waves, every cup of seawater is a war zone, filled with massive and deadly warfare between bacteria and viruses. Ten billion combatants. Bacteria shedding extracellular vesicles like anti-missile flares. Mass dumping of chemical weapons. Suicidal sacrifices. Exploding victims. Two-day bacterial survival: 50%. And most of them haven't been cultured, haven't been sequenced - we haven't even given them names.


The problem is "reasonably expected." For a recent example, consider the 70,000 Tinder (?) profiles now on The Pirate Bay. The guy who did that was likely preoccupied with questions like which scraper to use, and just never thought about "edge cases" like homosexuals in Saudi Arabia. Clearly someone is not reasonable here, but is this a reasonable oversight on the researcher's part?


Other issues also still apply even when there is no physical risk. For example, vulnerable populations like prisoners or children or your students may feel that they are (and may really be) forced to participate, that they can't say no. Some may be unable to give proper consent, such as weakened hospital patients on morphine or severe psychiatric patients.

> journalists wouldn't need any ethical review at all in order to carry out exactly the same interview in exactly the same way.)

Journalists have codes of ethics and laws they are subject to.


Here's a famous example of a "just talk" experiment that went wrong (it literally created the Unabomber):

https://www.theatlantic.com/magazine/archive/2000/06/harvard...


From the article: "Murray subjected his unwitting students, including Kaczynski, to intensive interrogation — what Murray himself called 'vehement, sweeping, and personally abusive' attacks, assaulting his subjects’ egos and most-cherished ideals and beliefs."

I guess that's technically "just talk" but it's not what most people would think of as "just talk".


Didn't they dose him with LSD?


> which felt super-odd because journalists wouldn't need any ethical review at all in order to carry out exactly the same interview in exactly the same way.

Journalists don't have a history of injecting you with syphilis just to see what happens, though.


Every large group will have some people who commit horrible acts, including the media. "Hate radio", for example, was a major contributor to the Rwandan genocide. https://en.wikipedia.org/wiki/Radio_T%C3%A9l%C3%A9vision_Lib...


People who are intentionally committing a genocide are out of scope for IRBs, since they are already refusing any kind of oversight because they know what they are doing is wrong.


Neither do anthropologists or computer scientists, who are both now subject to IRB review in universities.


Computer scientists will do stuff like expose people in repressive regimes to internet censors (http://conferences.sigcomm.org/sigcomm/2015/pdf/papers/p653....) or hack your Facebook account and slurp all your data (http://www2013.wwwconference.org/companion/p751.pdf)

Though ENCORE was done with IRB approval, the community is mostly unsure how that was possible...


Thanks for the examples. I'd agree that they show that computer science research is capable of harming people.

Edit: although I don't think they change my intuition that it's strange that high-risk invasive procedures that could be expected to cause grave injury are dealt with by the same oversight mechanism as interviews (even though I'm very convinced of the ethical importance of strong confidentiality protections for interviews).

Second edit: I'm also aware that IRBs aren't only inspired by Tuskegee and Nazi experiments, but also by stuff like the Milgram and Zimbardo experiments which didn't involve invasive interventions.


In IRBs that I've participated in, low-risk interviews can go through expedited review, and though it's still happening under the IRB, it's far from the same process as something involving, say, injections, imaging, gathering information about potential criminal involvement, or deception.

Though, every IRB is different. Which is another problem!


Who has ever done that?


As mentioned elsewhere in this thread, https://en.wikipedia.org/wiki/Tuskegee_syphilis_experiment is one of the most important parts of the history of IRBs.


Tuskegee syphilis experiment, ok. However, as bad as that was, no one was injected with syphilis.


I thought I remembered that the Public Health Service intentionally infected people in order to see what would happen, but Wikipedia seems to say that they intentionally withheld treatment from people who were already infected and deceived them about this, rather than causing the infections themselves (although some of the study subjects spread the disease to other people during the course of the study, which were new infections that could have been prevented by treatment).


I believe a modern design for this study would compare different kinds of treatment (but not the elimination of treatment... yikes).


Modern policy for medical research is that everyone gets a best available known treatment, and some people also get an experimental unknown treatment.


There must be cases where this isn't practical. E.g., if the normal treatment for greyscale is to cut off the arm before it spreads, how could you try any alternative treatment?


It isn't, and the answer is "you can't get good data." A good example would be immunotherapy for cancers, which has to be tested alongside treatments with adverse impact on immune function.


Isn't the net result similar anyway?

1. People who should have been treated weren't.

2. As a result of #1, more people would have been subsequently infected.

Additionally:

3. Distrust in healthcare is engendered amongst a marginalized population, with potential to lead to further incidences of undiagnosed ailments.

The net result is similar, more people are infected than would otherwise have been. Just because the infection mechanism is less obvious doesn't necessarily make it less bad.



