Mental healthcare is going digital, with new apps offering online therapy—and even teaching clients how to do it themselves. Can a robot be a good therapist? Who is liable when things go wrong? Dr. Joe Ruzek, a psychologist who specializes in web- and phone-based psychological interventions, Zach Harned, a third-year Stanford Law student, and Alison Darcy, CEO and founder of Woebot, join Pam and Joe on Stanford Legal to discuss.
This episode originally aired on SiriusXM on May 25, 2019.
“You know Pam, most people never go to a therapist. It’s too much, and frankly they don’t want to. But everybody’s got some problems. And the great attraction to digital mental health is that you can get on your PC, or even better yet, on your phone, and it’s always there at your convenience and it’s infinitely scalable. So if we get the right program for anxiety or depression—Joe [Ruzek] is going to talk about PTSD—we might be able to treat a million people,” says Stanford Legal co-host Joe Bankman, setting up a recent discussion about mental health and new apps and programs aimed at providing digital services.
“There’s some evidence that sometimes people are more honest and more open with a machine than they are with a person. And most of what’s offered on these technologies is pretty basic. It’s a first level of self-care. It has low potential, I think, in most cases for harm and therefore can be stepped up and added to with human support if it’s needed,” explains Ruzek, who has been developing mental health apps, particularly for veterans and others who suffer from PTSD. “I also will argue that the digital technologies we have now could make traditional face-to-face care more effective by getting outside of the office and reaching into the life of the person.”
Alison Darcy, PhD ’11, CEO and founder of Woebot, a new app for cognitive behavioral therapy, explains that Woebot is an emotional assistance guide that delivers a DIY version of cognitive behavioral therapy through brief, text-based conversations.
“We know that a guided self-help version of certain approaches to therapy is more effective than just pure self-help. Woebot is like a guide, but an automated guide, which is really sort of new for this field,” says Darcy.
Co-host Pam Karlan raises liability concerns for this new kind of mental health service, asking about the legal ramifications when a Woebot user threatens self-harm or even suicide.
“This is something that’s not particular to Woebot. Large platforms are now having to consider this as a content moderation issue for a variety of other sorts of mental health apps,” says Harned, who is working at Woebot. “Woebot, like many other products, is very upfront in noting that it is not designed to assist people with high levels of suicidal ideation or severe levels of depression. But yes, there’s still a sort of responsibility issue of when something like this becomes clear, does the program make the appropriate referrals, which Woebot obviously does in those cases.”
“But nobody really knows what the liability rules are in digital mental health,” adds Bankman, noting how new the area is.
“That’s absolutely right. And saying there aren’t a lot of cases is putting it generously. I would say probably zero is the number that are out there now. There’s a remarkable dearth of data on that sort of application,” says Harned. “The FDA seems to be the primary player at this point in terms of providing the correct oversight. Because as Joe mentioned, there’s tremendous potential for growth assisting people who are really in need. There’s also potential for abuse and people being injured by inappropriate apps. And they’re one of the main players to look into this space regarding oversight.”