What do we have in store for us?
Can a chatbot or a Freud app fully replace a therapist? As of today, no. Still, it seems plausible that the future of psychotherapy will involve significant advances in AI, with systems that administer evidence-based, effective treatments to clients and profoundly influence psychological practice. Although applying AI technologies to mental health care holds real potential to increase access to care in innovative ways, it comes with its own risks and drawbacks.
AI Therapy Systems:
There have already been major breakthroughs that point to an interesting future for therapy. They range from ELIZA in the 1960s, a simple computer program that mimicked non-directive conversation and drew a surprising number of personal disclosures, even from the staff who administered it, to the currently popular Woebot, a chatbot delivered through Facebook Messenger that responds to maladaptive thought patterns using the principles of cognitive behavioural therapy (CBT) and engages in more than 2 million conversations a week.
Paro, a robotic stuffed harp seal, was created for the elderly and for those in care homes or unable to responsibly look after a pet. Although non-living, Paro can provide comfort similar to a pet's by responding to touch and voice, activating the parasympathetic nervous system and thereby reducing stress.
Ellie, created by USC researchers, helps people with depression and veterans suffering from PTSD; a webcam and microphone enable her to analyse emotional cues, including subtleties of speech such as pauses, and provide feedback. Similarly, Karim, a psychotherapy bot, aids workers and refugee communities in the Middle East.
Therapeutic video games and mobile applications are on the rise, with benefits such as improving self-confidence (e.g. Mindbloom), increasing adherence to treatment, reducing the stigma surrounding mental health and enhancing social skills: for example, Second Life, an online virtual world used with children with autism, or Sosh, a mobile app for individuals with Asperger's Syndrome.
A bright future?
AI therapy can, in fact, prove to be extremely favourable.
Patients often lack the motivation to follow up with their therapists or to stick to the plans and techniques advised for their betterment. Technology now in development aims to provide tailored mental health treatment that helps clients stay committed to therapy.
AI therapy can also spot suicidal patterns and thoughts that humans may miss, by analysing a patient's behaviour against curated databases of clinical knowledge. This is a huge plus, as it can help prevent self-harm and reduce the number of deaths caused by suicide.
The future may also hold implanted AI technologies that repair general cognitive abilities or restore function to areas of the brain damaged by strokes or injuries.
A self-help paradigm that can be personalised to an individual's needs, that can express the emotions counselling requires, such as empathy, and that can recognise and respond to a patient's emotions while taking cultural differences into consideration does indeed sound almost too good. However, it can have threatening downsides.
Chatbots and AI therapy systems aren't protected by medical data privacy and security law. The early ELIZA program was shut down when its creator came to see it as a threat, after outraged users discovered that all their conversations had been recorded and were accessible. Similarly, although Woebot keeps identities anonymous, Facebook still owns the logs of every conversation Woebot has with its users. Confidentiality and privacy become blurred with the use of autonomous AI systems.
At the same time, certain AI therapy systems can be very expensive: Paro, for example, costs almost $7,000. This defeats the purpose of AI therapy as a way to increase access to mental health care. Though less certain, AI therapy could also have economic implications for the field of psychology, leading to job losses in a knowledge-based profession if systems develop to the point where they can provide a full range of mental health services.
Therapists frequently encounter ethical dilemmas. To tackle these, AI systems will be expected to make value judgements that involve complex reasoning, yet they remain technology, which is always vulnerable to errors. An advance in AI that enables systems to develop their own values and beliefs is possible but risky, as those values may conflict with the creator's. Weizenbaum, the creator of ELIZA, said, "Computers should not be allowed to make important decisions because computers lack the human qualities of compassion and wisdom."
An important aspect of therapy is the "human element" that builds the therapeutic bond between client and therapist, something futurists believe AI systems will lack. Along with such genuine connection, consequential change in a patient's life through therapy comes with the fallibility and tension inherent in the process. There is, however, a good chance that AI systems will come to exceed humans' sensory capabilities.
And lastly, unlikely but possible: a patient's positive transference toward an AI system would be problematic and baffling to resolve, as depicted in the film Her.
In any case, therapy may change radically. It is safe to anticipate that face-to-face counselling will not be practised everywhere at all times, as more convenient therapy systems supplant it. There may be a rise of "surrogate counsellors" offering protocol-packaged treatments for common mental disorders (e.g. depression, anxiety) that clients can access directly. Practice settings are bound to change, and payment transactions may become mostly electronic.
Advanced AI technology is already found in almost every sector today, including mental health care. AI and the future of psychotherapy seem ideal in many ways, opening doors to possibilities that could scarcely be imagined in the past. Technological singularity may be near, and it is something to be thrilled about, but it also raises a series of new professional, legal and ethical complications. Considering how technology-dependent we become each day, it is easy to assume that the paradigms described above will become reality. It is harder to predict whether they will ultimately affect us favourably, which makes it imperative to build a framework aimed squarely at improving healthcare and lifestyle.
Shruti Venkatesh is a Research Associate (Mental Health) at One Future Collective.