THE ROAD TO WELLVILLE?: AI-powered health apps are proliferating. We're not ready for them.

The mental stresses of the pandemic have fueled a boom in wellness devices that track speech, facial expressions and even eye blinks to assess emotional states. This kind of "affective computing" can replicate therapy or detect depression when in-person care isn't available — and even remotely monitor workers and children. But the burst of interest is heightening concerns about whether there's enough government oversight of the technology, known as "emotion AI."

Future Pulse spoke about the tension points with Alexandrine Royer, a doctoral candidate studying the field and the digital economy at the University of Cambridge and a student fellow at the Leverhulme Centre for the Future of Intelligence. The conversation was edited for length and clarity.

Emotion AI is in its infancy and can capture only a limited range of human emotions, at best. Does it have any constructive use right now?

Emotion AI might be beneficial when it is used for the day-to-day management of symptoms. It can also encourage individuals to take further steps toward improving their mental well-being, and the wide availability of tools can reduce the stigma surrounding mental health treatment. But it remains a supplementary tool, not a cure.

It's also worth noting the wide pool of devices that fall under affective computing and mental health. Sensor-equipped virtual reality devices can help expose patients to stress-inducing scenarios within the safety of their home or psychologist's office. The essential element is that there is ultimately some form of clinical oversight over these self-guided treatments.

You've written that the wellness industry is eager to profit from the digitization of health care and that people who can't afford in-person therapy may be referred to bot-powered counseling. Where should the government set guardrails?

The government doesn't require that digital tool providers have a "duty of care" toward their users. More stringent regulation of the safety and reliability of these digital mental health solutions would allow gimmicky apps to be removed from app stores. The FDA should reconsider how it classifies AI-powered therapies as "minimal risk" and ask that such digital tools provide accurate information on how to reach person-based resources. Some applications have no protocols in place if a user is at risk of suicidal thoughts. Prolonged clinical testing of an application, and peer-reviewed findings, should be required before these chatbot therapists are released onto the market. Users should also be made aware of how the credentials of digital tool providers compare to those of licensed professionals.

Our personal devices already track the way we sleep, exercise and eat. Why the concern over tracking people's emotions?

We have yet to reach a scientific consensus on how to recognize human emotions. Not only is the information at risk of being inaccurate, but it can also be tinged with certain biases. Emotion AI systems that incorporate facial recognition have been found to poorly identify the emotions of people of color. Digital tracking can have far-reaching consequences for individuals, as employers increasingly look to integrate mood-analysis software into recruitment strategies and the monitoring of daily worker performance.

You argue the push for affective computing can widen health disparities. How so, and what can be done to avoid that?
Digital tools are attractive because they are cheaply sourced, infinitely scalable, always available and cannot experience the emotional exhaustion that overworked therapists face. For health insurance providers and employers, it can be tempting to limit coverage to digital solutions. An estimated 100 million people in the U.S. live in communities with shortages of mental health professionals. As the cost of therapy remains a barrier for many, health care providers may look to gradually replace in-person mental health care with automated therapists. Digital solutions can thus appear to be a panacea rather than a spur for further investment in early interventions and preventive measures in vulnerable communities.