The coronavirus pandemic has introduced a new set of challenges to the workforce. With hundreds of thousands of new positive cases across the globe every day, doctors and scientists are focused on maximizing care and treatment. The problem is that while the physical impact of COVID-19 can be assessed and treated, the mental impact is often ignored. As the mental illness crisis has worsened over the past 7-8 months, some organizations are turning to robotics and related technologies that offer their employees hope of improving their mental health.
This has also been substantiated by a new study conducted by Oracle and Workplace Intelligence, which shows how robots offer unique opportunities to support the mental health of the workforce in a scalable, cost-effective way. The global study, which polled over 1,000 CXOs and HR managers in India alone, finds that amid increased workplace stress, anxiety, and burnout, more people are looking to robot therapists, rather than humans, for mental well-being and support.
The study shows that many employees prefer robots over humans to support their mental health: they believe robots provide a judgment-free zone, serve as an unbiased outlet for sharing problems, and can offer quick answers to health-related questions, as opposed to human therapists, especially in a crisis such as a pandemic. Over two-thirds of respondents also said they would prefer to talk to a robot rather than their manager about stress and anxiety at work, adding that they are open to having a robot as a therapist or counselor.
Mental Health Support from Robot Therapists
Researchers have observed that people are more comfortable talking to avatars than to a therapist or online service provider. A good example is the AI chatbot, which helps in areas where physical access to care is not possible. Generally, these chatbots are fed with mock transcripts from counselors and physicians, which allow them to deal with a wide array of issues. Beyond pandemic-induced mental illness, the WHO (World Health Organization) lists depression, bipolar affective disorder, schizophrenia and other psychoses, dementia, intellectual disabilities, and developmental disorders such as autism among the common mental disorders.
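As a loose illustration of how a transcript-seeded chatbot can work (a minimal sketch, not the implementation of any real product), a response can be retrieved by matching the user's message against question-answer pairs drawn from mock counselor transcripts; the transcript snippets below are invented for the example:

```python
# Toy retrieval chatbot: picks the canned counselor reply whose
# prompt shares the most words with the user's message.
# Illustrative only -- real systems use trained NLP models,
# not simple word-overlap heuristics.
TRANSCRIPTS = [
    ("I can't sleep because I keep worrying",
     "Worry often feels loudest at night. Could we write one worry down and set it aside until morning?"),
    ("I feel anxious before meetings",
     "That is a common feeling. A slow breath before you join can help lower the spike of anxiety."),
    ("I feel hopeless",
     "I'm sorry you feel that way. If these feelings persist, please reach out to a professional or a helpline."),
]

def respond(message: str) -> str:
    words = set(message.lower().split())
    best, best_overlap = None, 0
    for question, answer in TRANSCRIPTS:
        overlap = len(words & set(question.lower().split()))
        if overlap > best_overlap:
            best, best_overlap = answer, overlap
    # Fall back to an open-ended prompt when nothing matches.
    return best or "Tell me more about how you're feeling."

print(respond("I keep worrying and can't sleep at night"))
```

Production chatbots replace the word-overlap step with language models trained on far larger corpora, but the retrieve-and-respond shape is the same.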
Woebot, a chatbot created by a team of Stanford psychologists and AI experts, is built on a platform of cognitive-behavioral therapy (CBT). It uses structured exercises to encourage a person to question and change their habits of thought, like a step-by-step manual. Woebot helps patients manage mental health conditions by changing the way they think and behave: it enables them to reframe negative thoughts into positive ones using natural language processing, clinical expertise, and light-hearted daily talk intended to create a therapeutic experience for the user.
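In broad strokes, the CBT reframing loop can be thought of as spotting a pattern associated with a cognitive distortion and prompting the user to question the thought. The sketch below is a heavily simplified illustration of that idea, not Woebot's actual implementation; the keyword list and labels are assumptions made for the example:

```python
# Toy CBT-style reframing prompt: flags common distortion keywords
# ("always", "never", "everyone") and asks the user to test the
# thought. Illustrative only; real CBT apps rely on clinically
# designed exercises and NLP, not keyword matching.
DISTORTIONS = {
    "always": "all-or-nothing thinking",
    "never": "all-or-nothing thinking",
    "everyone": "overgeneralization",
    "nobody": "overgeneralization",
}

def reframe_prompt(thought: str) -> str:
    for word, label in DISTORTIONS.items():
        if word in thought.lower().split():
            return (f"That sounds like {label}. "
                    f"Can you think of one time when '{word}' wasn't true?")
    # No distortion pattern found: ask an open evidence-testing question.
    return "Thanks for sharing. What evidence supports that thought?"

print(reframe_prompt("I always mess things up"))
```

The design point is that the app never diagnoses; it only prompts the user to re-examine the thought, which is the core move of a CBT exercise.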
Another chatbot is Wysa, developed by Touchkin in collaboration with researchers from Columbia and Cambridge universities, using Facebook Messenger as the user interface. This AI-based mental and emotional wellness app responds to the emotions a user expresses and uses evidence-based CBT, dialectical behavior therapy (DBT), meditation, breathing, yoga, motivational interviewing, and micro-actions to help users build mental resilience skills.
While the use of robots in treating mental health is relatively new, some apps, such as Let’s Meditate from the Heal Me Team, Wysa, Now&Me and others, have proved to be an asset during the current COVID-19 pandemic. Thanks to the proliferation of smartphones and higher-speed Internet connectivity, these apps are becoming a viable and effective way of getting mental health services while overcoming the barriers of mental health stigma.
In fact, employees are looking to their organizations to provide more mental health support, with the Oracle study substantiating that it can have a profound impact on global productivity as well as on the personal and professional lives of the workforce.
“How well HR leaders respond to the current crisis will come down to two factors – their own capabilities as leaders, and the quality of the digital tools available to them. Certainly, their ability to be effective will be greatly enhanced should many of their routine tasks be automated, allowing them to focus on the human aspects of their roles,” Shaakun Khanna, head of HCM applications, Asia Pacific, Oracle, said.
The Dark Side of Robot Therapists
Last year, the Technical University of Munich (TUM) published an initial study into how AI robot therapists could be used in the future to treat mental illness.
However, the TUM researchers are keenly aware of the possibility of human bias being coded into an AI as complex and nuanced as one capable of treating mental health. Several researchers have shown that human biases can be built into algorithms, which then end up “reinforcing existing forms of social inequality” by coding in “data-driven sexist or racist bias”, to name a few.
Such mental health devices could then cause harm in unintended ways, complicating the possibility of successful treatment.
“Therapeutic AI applications are medical products for which we need appropriate approval processes and ethical guidelines. For example, if the programs can recognize whether patients are having suicidal thoughts, then they must follow clear warning protocols, just like therapists do, in case of serious concerns,” Alena Buyx, TUM professor and co-author, said, speaking about the risks.
To prevent this, it may be necessary for management and families to understand how the algorithm works in connection with the treatment. Explaining this properly may take time before treatment commences.
Another researcher noted the dangers of having a robot therapist treat mental health, observing that robots which “aim to alleviate loneliness or provide emotional comfort” risk making the patients they support highly dependent on them. This is a serious worry for long-term use of AI interventions, as the AI is not meant to be a permanent presence in someone’s life but a tool to help change their behaviors.
Experts agree that a robot, no matter how smart, simply can’t replace a human therapist’s ability to do such things as provide deep insight into past events, hold space, empathize and provide a nurturing relationship built on mutual trust.
“Even if AI gets to a level of sophistication where it can be a wholesale alternative to traditional mental-health treatments, I believe that to be successful, it will need to be part of an integrated care pathway,” Alison Darcy, founder and CEO of Woebot, said in a statement.
Researchers do not deny that humanoid robots and chatbots can be used to improve mental health, and the approach has great potential. However, as a fairly new field with its share of risks, it calls for much more research and careful execution.