
Are Mental Health Apps Garbage? No, There's Just a Secret to Finding a Good One

"It’s the equivalent of having a therapist next to you when you get anxious."

by Kastalia Medrano

Mental health apps are everywhere — more than 165,000 of them claim to help people living with conditions including depression, schizophrenia, post-traumatic stress disorder, and many others. But with no medical oversight, the vast majority of available apps aren’t founded on any sort of good science, and they range from merely useless to actively harmful. Researchers at the University of California, Davis want to change that. Since mental health apps clearly aren’t going away, there needs to be a way to sort the wheat from the chaff.

Can clinicians harness the immense popularity of these apps in a way that’s both safe and effective? Inverse spoke with Dr. Peter Yellowlees, a professor of psychiatry at UC Davis and an expert in integrating technology into clinical settings. Yellowlees recently created a framework for psychiatrists to judge the merits of various mental health apps. He calls it ASPECTS, a mnemonic for assessing whether an app is actionable, secure, professional, evidence-based, customizable, and transparent.

So, how did you decide to design this framework?

We have a large clinical practice at UC Davis, and we already use apps here on a daily basis — quite a number of them. As we use more and more of them, we’ve become more and more aware of the shortcomings many of them have. So we thought we’d try to put together a basic algorithm to assess the quality of the apps, because there really isn’t anything like that currently available.

And are there specific apps you’ve found to be useful, that pass the ASPECTS test and have actual clinical merit?

Quite a few that the U.S. Department of Veterans Affairs has created. They have the original one, which was PTSD Coach, and they’ve developed five or six more since then, and they’ve done a really good job. Another one I use a lot is called Virtual Hope Box, which is a nice relaxation, meditation, cognitive-therapy app that patients can use themselves. It doesn’t collect data or anything; it’s the equivalent of having a therapist next to you when you get anxious. And there are a number of different mood trackers — with those, you just need to look at the amount of detail they have … That’s a small selection of what I use, but in reality, of course, the apps are used primarily by people who have anxiety, stress, PTSD, and substance-related disorders.

How did you decide on the six criteria within ASPECTS?

These are core criteria for quality for essentially any type of software. And unfortunately they’re not criteria a lot of people stick to [when designing apps]. You look at a number of apps out there and they clearly haven’t been built with a clinician; they don’t make sense from a clinical point of view; they don’t collect useful data. Or, potentially, they’re not secure. The biggest problem with most apps, though, is that they don’t integrate with other systems … so if you’re in the VA and you’re using VA apps, you can get some of that data, but when I’m seeing a patient I can’t download data from a patient’s app to my records, which is what would be really useful. Clearly, though, apps will integrate in time.

The key thing with these apps is that they’re not devices, they’re not hardware. They’re all software. That’s why the FDA doesn’t want to be involved. You wouldn’t want to try to regulate Microsoft Word.

How are apps like this generally marketed? Is there any oversight at all right now, or can people just make all kinds of fantastical miracle claims with no legal consequences?

You’ll find most of them just have a mini disclaimer and that’s about all they do. There are 160,000 or so of these, and if I had to guess I’d say 90 percent are not likely to be useful.

Obviously people network and share opinions. It’s generally word of mouth. What you see with apps is a familiar pattern: a lot of people get excited about using them for the first few weeks, like with a Fitbit, and then it becomes less exciting over time and they abandon it. So for app developers, how do they create a sticky app that’s genuinely useful? If you don’t meet all the criteria we’re defining [with ASPECTS], you’re unlikely to have a sticky app.

So you think we’ll see apps being widely used as clinical tools in psychiatry?

A lot of psychiatrists, and physicians in general, are already using apps, and they’re comfortable with their patients using apps that encourage people to, say, get exercise. Clearly that’s going to stay. We’re going through this process over the next five years or so, and we’ll eventually have the equivalent of drug formularies — the total grouping of well-tried and tested medications for a particular disease. We will, I’m sure, gradually move to having app formularies: 20 or 30 for depression, 30 for, maybe, diabetes. And the apps will gradually get recommended and rise to the top. We’re already developing formularies [at UC Davis], and at the end of the day we’ll probably come up with the 50 or 60 apps that are best, and those are the apps we’ll recommend to physicians.

This interview has been edited for brevity and clarity.
