How Does the FDA Regulate Mental Health Apps?
The rise of mental health apps promises an accessible option for diagnosis and treatment as more individuals seek to take an active role in their own healthcare decisions. The global market for mobile health apps reached $8 billion in 2018, with a projected CAGR of 38.26 percent over the next few years. Currently, the FDA has few guidelines for mobile medical applications (MMAs) aimed at treating mental health concerns and prefers to regulate such products on an as-needed basis. That said, safety and efficacy concerns raised by psychiatric professionals about this type of software may mean that more regulations are coming down the road. Medical device manufacturers and developers looking to enter this space can get healthcare providers on board and rise above the competition by providing evidence-based backing for their mental health apps.
In 2018, a study found an overwhelming number of mental health apps available for download on various mobile platforms. A percentage of these apps are provider-facing and serve as telehealth conduits. However, countless others are direct-to-consumer products aimed at meeting a self-identified mental health need. Some apps include AI chatbot therapists that deliver text-based therapy scripts, while others use algorithms to connect patients to human therapists and support groups. Many also function as self-diagnostic quizzes or self-help programs that offer practical suggestions based on a person's self-reported mood. A few even use biometrics, tracking a user's breath and heart rate data and using facial recognition software to map expressions.
What Does the Current FDA Policy Say?
The FDA has focused most of its regulatory efforts for premarket certifications and approvals on medical device apps that control and connect to physical counterparts, preferring to take a more as-needed approach to oversight for mental health apps. This enforcement discretion policy applies to apps that influence behavior or coach users with diagnosed health issues through their daily lives. Within the guidance document, the FDA even cites an app that helps patients with diagnosed psychiatric conditions through simple exercises and reminders as the archetype of a product subject to its enforcement discretion.
An in-depth look at FDA guidelines shows that the agency still expects apps to meet GMP guidelines and to address software bugs in a timely manner. Developers should also note that the FTC can investigate and take action against any false claims a medical app might make. Without evidence-based proof of efficacy, potential users are forced to rely on anecdotal evidence from online reviews, which may or may not put a company in a positive light. For those looking to distribute their app on a prescription basis, clinical trial data will likely be needed to obtain that classification. For example, Akili Interactive and Click Therapeutics are awaiting FDA review of their respective cognitive behavioral therapy games for treating pediatric ADHD, depression, and other conditions.
Headspace, one of the most profitable and popular mental health apps, is another product currently seeking FDA prescription approval. By 2020, the company hopes that its randomized controlled clinical trials will deliver the evidence to support its claims that the program can address up to 12 conditions. One trial is focused on the effects of three months of Headspace usage on HbA1c levels in diabetic patients, and the software is also being studied in relation to pelvic pain and work stress. Company executives have recognized that the app would reach its maximum value in conjunction with, not in place of, provider supervision.
Quality and Regulatory Factors to Consider When Designing Mental Health Apps
A major concern professionals raise about mental health apps is the lack of data security and privacy measures. If an app is not distributed through a HIPAA-covered healthcare partner, HIPAA protections do not apply. One study that examined apps for dementia care found that fewer than half had a written privacy policy, and many of those that did admitted to sharing data with third parties. This lack of security could lead some users to worry that their data could be used against them for discriminatory purposes in the workplace or when receiving healthcare. Developers should consider being proactive in their data security measures in order to gain the trust of potential users and healthcare partners.
Since only 41 percent of people experiencing mental health issues report receiving help, these apps may provide another avenue when time, accessibility, or cost presents a roadblock to treatment. However, professionals worry that the self-service approach may lead to over-diagnosis and over-treatment. Without the input of a provider, an individual may not have anyone to challenge or confirm what they believe to be true about their condition. Some professionals believe that content on these platforms could lead people to conclude that they have a mental health issue when their attitude or behavior may be a more typical reaction to stress.
Considering that the top ten wellness apps saw a combined 170 percent increase in revenue from 2017 to 2018, it's safe to say that mental health-oriented mobile apps are a potential source of growth and revenue for medical device and other life science companies. As people look for ways to improve their healthcare outcomes, downloading a mental health app may be the answer for some. Without strict government regulations, it's up to manufacturers and developers to find evidence-based ways to demonstrate the safety and efficacy of their products.
Coauthored with contributions by Sabrina Zirkle