
Are you FDA approved? You’re distributing a medical product and making claims that 86% of people feel less stress, depression or anxiety. In the US at least, I believe you either need FDA approval, or a disclaimer that you’re not approved.



It's an excellent point, and it's why we're very careful with the claims we make. The claim I think you are referring to is "86% of users report feeling better after their first session with Iona Mind", which is based on feedback users give within the app; it does not relate to any specific measurements of conditions within the DSM-5, the standard manual that defines mental health disorders in the United States. We are indeed not a medical device, and we make it clear to users during sign-up that Iona Mind is a self-help tool.

If we were making the claim that we treat specific conditions such as Major Depressive Disorder (MDD) or Generalised Anxiety Disorder (GAD), then you are correct that this would require FDA approval as a medical device. However, for the duration of the COVID-19 pandemic, the FDA has published the following guidance on low-risk self-help and general wellness apps: https://www.fda.gov/media/136939/download

which states: "... Given these public health benefits, for the duration of the COVID-19 public health emergency, FDA does not intend to object to the distribution and use of computerized behavioral therapy devices and other digital health therapeutic devices for psychiatric disorders, which are described in Section III.A, without compliance with the following regulatory requirements, as applicable, where such devices do not create an undue risk in light of the public health emergency:..."


Your answer makes me believe you really know what you are talking about and that you have de-risked a lot of stuff from an investor's point of view. But with that answer, as a customer I now have trust issues with the app, as I am not sure whether what you claim is actually true, and it feels like you have hacked around a regulation that is meant to keep unfair and false apps and services in check. It gives me snake-oily vibes.


That's good feedback, and there is a lot of snake oil in the industry in general, which is why we base our app around evidence-backed practices such as CBT (as mentioned in the post, there's much evidence that CBT-based apps are beneficial, e.g. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5727354/ ). I think you're getting at an open question for regulators, which is the ongoing regulation of "wellness" apps, a category this app currently falls into. The EU actually published a green paper on the digital health space several years ago, in which they touch on "wellness apps", with an eye to potential stricter regulation in future, although they are yet to enact anything in this space. https://digital-strategy.ec.europa.eu/en/library/green-paper...

The ultimate long-term aim for the company is to work towards regulation as a medical device, as this also unlocks more ways in which the app can be distributed (e.g. through healthcare systems).


A much more articulate and informed take than I'm capable of was recently written by a practicing psychiatrist, about the medicalization of CBT apps and how absurd it is.

https://astralcodexten.substack.com/p/welcome-to-the-terribl...

> feels like you have hacked around a regulation that is meant to keep unfair and false apps and services in check.

Blind faith in every excess of the FDA's obstructionism and other bureaucratic pathologies was always insane to me, but it really, really shouldn't have survived for anyone paying even casual attention to the global pandemic we're in the midst of. Agencies like the FDA are taking on the herculean task of standardizing the messy landscape of scientific knowledge into bite-size, oversimplified answers that the dumbest layman can make use of. This is hard enough to begin with, and gets even worse when you remember that they're as subject as any institution to politics, inertia, bad organizational culture, etc etc etc. I get that most people have a powerful, almost-foundational urge to avoid critical thinking, but it's a fantasy to think that you can outsource 100% of your cognitive function to bureaucracies that can provide maximally-simplified, crystal-clear access to The Truth. When it comes to your health or that of those you care about, it's downright negligent to do so.

To be clear: I think the FDA does a massive amount of necessary, fundamental work that the medical system depends on, and I'm not suggesting people go hog-wild trying every random substance without regard to efficacy and safety guidelines.

But the approach you take in your comment is a dramatic and farcical example of bureaucracy-as-religion. Consider what we're talking about: you're concerned about non-medically-approved use of a chatbot that (reductively) helps you think about things more positively. Is there a principled reason that you're (presumably) okay with people watching yoga videos at home without a prescription? Or picking up a book on meditation? Or reading Couch-to-5K and starting jogging? Is there any difference between these activities and a CBT chatbot, beyond the fact that the latter has been medicalized by our regulatory bureaucracy?


Some excellent points. As you say, what is the difference between a book on CBT (as mentioned in the original post, "bibliotherapy" is known to be effective in peer-reviewed studies) and a CBT app? And at what point do you draw the line between the two (assuming the claims made are equal)? I feel this is a question regulators globally are grappling with currently, and it's clearly going to be a focus going forward in this space.


Fair enough. I understand your distrust of the FDA and its regulations. Yoga videos do not harm you. In the worst case you will tear a ligament.

A person with depression and suicidal tendencies (because we are talking about CBT), on the other hand, is not comparable to a yoga video. It is more dire than you think. Redirecting them to an app (because it is more easily accessible) is a pretty bad idea when they should be going to an actual therapist.

Having lived with a person with chronic depression for 5 years, I feel your comment trivializes a disorder that many people in the world are dealing with.


> A person with depression and suicidal tendencies (because we are talking about CBT), on the other hand, is not comparable to a yoga video. It is more dire than you think.

> Having lived with a person with chronic depression for 5 years, I feel your comment trivializes a disorder that many people in the world are dealing with.

Bold of you to assume that I have no personal or otherwise direct experience with mental disorders, or that I'm trivializing them. Were I in a grumpier mood, I'd tell you to go fuck yourself : D

I've seen plenty of friends give up on finding therapists because of the seemingly-universal nightmare of delays and bureaucratic incompetence, and I've watched the medical system spend _decades_ chronically mismanaging the mental healthcare of a severely-mentally-ill immediate family member until I took over his care.

> Redirecting them to an app (because it is more easily accessible) is a pretty bad idea when they should be going to an actual therapist.

The logic you express in this comment also seems unsound. You claim that access to an effective and highly-available treatment should be severely reduced because it prevents people who need more intensive, less-available treatment from seeking it. It's quite an extraordinary assertion that we should _reduce_ access to care, leaving only the highest-cost, highest-hassle, lowest-availability options, and then to further claim that this will _help_ patient outcomes overall. What's your reasoning here?

Again, I refer you to my previous examples. Do you think information on yoga, mindfulness, and jogging should be banned? Why do you assume that access to a chatbot will displace more intensive mental healthcare but access to yoga and mindfulness won't? If anything, yoga and mindfulness are far more widely-known as a form of treatment for mood disorders; the average person probably hasn't even heard of CBT.

You can extend this even further. Should cardiovascular exercise be medically-gated so that the severely obese don't avoid gastric bypass surgery? Should we make it more difficult to get SSRIs (and hell, therapy itself!) so that the severely mentally ill don't avoid electro-convulsive therapy? Doesn't following your logic imply that both of those moves would increase patient welfare?

Truly, I'd like to understand how you think this logic fits together. I'm quite baffled by it.


Sorry for the previous comment. I understand your distrust of the medical system and its bureaucratic processes.


> Yoga videos do not harm you. In the worst case you will tear a ligament.

Ligament tearing is definitely harm.


I can see you have a (probably justifiable) beef with FDA regulations and the politics that surround them. But the poster's worry is real. CBT is offered as a therapy for many mental disorders. It can indeed be done by yourself, so in theory it makes sense that you don't need to regulate it, but it sits right at the border of that demarcation if you ask me. The question of whether it should be regulated is valid, and the answer offered is also somewhat valid, though perhaps not opportune.


My rhetorical questions apply to your point too. Do you think information on how to do yoga, how to be mindful, and how to get started running should also be medicalized, made inaccessible without going through our byzantine medical system and paying a thousand dollars (or trying to convince insurance to do so)? How about books on how to eat and cook healthily? Or a self-help lecture on positive thinking?

If not, why? All of these things are common parts of a treatment plan for mood disorders, and yoga and jogging bring with them infinitely more risk of harm than self-CBT does.

Your claim is that if something can help with a disorder, people should be prevented from accessing it easily. This seems precisely backwards to me! Something being "offered as a treatment" is a terrible, terrible basis for throwing up substantial medical barriers, and causes much more harm from reduced access than it provides benefits. As I said, I'm not imagining away the need for the FDA: The risks of accidental misuse or other safety concerns are a legitimate basis for medical barriers, costly though they may be. But I think the onus is on the ban-happy folks to explain why a chatbot that helps you think positively is closer to Fentanyl than it is to a YouTube video on beginner's yoga.

The answer is simply path-dependence. The reason you categorize CBT differently is that it was developed through the formal scientific establishment, while (e.g.) yoga and mindfulness were developed externally and validated post hoc by the establishment. This is decidedly irrelevant to the best interests of the patient, and there are a trillion and one examples of the FDA[1] acting in ways that are explicitly harmful to patient welfare.

My ultimate point here is that if your concern is the health of yourself and your loved ones, the barriers the medical system throws up around treatments serve as a weak prior that often needs working around, and should never be blindly and slavishly followed without at least a bit of basic research giving you an understanding of _why_ the barrier exists. My sister and brother-in-law are doctors, and a decade+ of conversation with them about medical culture has given me a healthy respect for how pants-on-head stupid and/or ignorant some patients can be[2]. These poor folks have no real choice but to follow the FDA blindly, even despite the heavy tax on their physical and mental wellbeing that they pay for doing so. But if you've got anywhere near an average IQ and/or basic, basic cognitive ability[3], it's simply negligent to follow this strategy.

[1] I'm using the FDA as an evocative metonymy for the medical system writ large, as everything from med school culture to our insurance infrastructure contributes to this status quo. The FDA isn't a fully agentic actor in this scenario, and while it has severe cultural flaws that I'd love to see improved upon, it and especially its staffers can't be fully blamed for the status quo.

[2] This is where many physicians' God complex comes from. They have to deal with the masses day-in and day-out, and have too keen a sense of just how rock-bottom cognitive ability can get. Most of the rest of us end up with social interactions that are pretty substantially segregated by cognitive ability.

[3] to the point that anyone who can read and understand this comment thread more than qualifies


It's a CBT app - if it helps people feel better that's good enough for most. Perhaps a message at the beginning saying "Remember this isn't a replacement for licenced therapy so if you're feeling particularly bad contact your GP or <suicide hotline>" could be useful.

But for those who haven't got access to anything, something is better than nothing, and there is evidence that even journaling to track your feelings helps (which this appears to be, albeit in a more advanced form).


Indeed, the general availability of something like an app means we can reach people who otherwise wouldn't have been reached and who may never have learned about this kind of self-help skill. There is also a good evidence base that journalling and tracking feelings can really help people.

Great feedback on the message at the beginning. In fact, the next version will have an upgraded "crisis line" / SOS section in the app for people who need it. While we make it clear on the opening screens that Iona is not a crisis service, it also makes sense to have appropriate signposting throughout the app in case people ignore or don't read the warnings.


I haven't fact-checked the article yet, but the fact that this is an area with low-hanging fruit, combined with the apparent expertise of the team, leads me to think we should have some confidence here.

Cf. from the OP:

> Our clinical director, Professor Paul Farrand, was the architect of using Low-Intensity Cognitive Behavioural therapy at scale in the UK’s National Health Service (NHS), and the author of one of the first international textbooks on the subject.



