AI Therapy & Therapists Using AI to Make Therapy Better
Maria Szandrach, Co-founder and CEO
There is growing interest in using artificial intelligence (AI) in therapy and mental healthcare, on both the therapist's and the client's side. Especially since the pandemic, we hear a lot about clients turning to AI chatbots for therapy instead of seeing a human therapist to address their mental health conditions. So the question arises: is AI going to help clinicians, or put them out of their jobs?
But chatbots are not the only use case for AI in mental health. There are plenty of other AI systems and therapeutic tools based on machine learning. Some of these are designed to assist mental health professionals in providing better mental health care, and they may not even be visible to clients. Let’s review some of these technologies.
Have Your Progress Notes Automatically Written For You!
✅ 100% HIPAA Compliant
✅ Insurance Compliant
✅ Automated Treatment Plans
✅ Template Builder
✅ SOAP, DAP, BIRP, EMDR, Intake Notes and More
✅ Individual, Couple, Child, Group, Family Therapy Types
✅ Recording, Dictation, Text & Upload Inputs
There are bots that schedule visits and answer requests for therapy. Most businesses now use such bots to manage inbound requests from potential clients, and therapists can do so as well; HIPAA-compliant options are available.
Some companies offer AI-enhanced diaries, where clients are encouraged to enter their thoughts between sessions and some simple AI analysis is performed, usually based on sentiment and keywords. Limbic, for example, provides such a product.
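To make the idea concrete, here is a toy sketch of the kind of keyword-based sentiment scoring such diary tools might perform. This is purely illustrative (the word lists and scoring are assumptions, not Limbic's actual method):

```python
# Toy illustration of keyword-based mood scoring for a diary entry.
# The word lists are hypothetical examples, not a clinical instrument.

NEGATIVE = {"hopeless", "anxious", "tired", "worthless", "panic"}
POSITIVE = {"calm", "hopeful", "proud", "grateful", "rested"}

def score_entry(text: str) -> dict:
    """Count positive/negative keywords and return a crude mood score."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return {"positive": pos, "negative": neg, "score": pos - neg}

print(score_entry("Felt anxious all day, but proud I went for a walk."))
```

Real products use far more sophisticated language models, but the underlying output, a simple signal a clinician can glance at between sessions, is similar in spirit.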
Well-known companies such as Woebot and Wysa are familiar to almost everyone. The interaction with them is quite streamlined: AI is not heavily involved, as it is primarily a scripted dialog built on well-established CBT interventions. Clients sometimes complain that it does not feel tailored or personal, but alongside therapy it could be a valuable addition, a modern form of homework.
This is a more experimental field, and it is questionable whether it can even be classified as clinical or evidence-based. These bots simply discuss topics with the client based on statistical patterns learned from large amounts of internet content. Replika AI is an example: the client creates their own avatar, a friend who is supposed to help with their well-being. Replika positions itself as a therapy proxy, but it is unclear whether the bot provides any value beyond easing symptoms of loneliness in the short run.
Between-session communication is also vital, and documenting it effectively is just as critical. Tools like Mentalyc focus on streamlining essential clinical documentation, enabling mental health professionals to spend less time on notes and more time on care. By automating documentation and organizing records, Mentalyc helps ensure accuracy and compliance without extra administrative burden.
Some of the benefits of using chatbots in therapy are as follows.
AI-based therapies can be accessed remotely, which can be especially helpful for people who live in rural areas or have mobility issues.
Some people may feel more comfortable discussing sensitive issues with an AI than with a human therapist.
AI-based therapies can provide consistent and standardized care, which can be helpful for people who have difficulty forming a rapport with a human therapist or who have had negative experiences in therapy.
However, the most popular opinion on the topic is that AI-based therapies are not a replacement for human therapy. They can be a valuable supplement to traditional therapies, but they should not be used as the sole form of treatment. It's also important to carefully consider the ethical implications of using AI in therapy, as well as the potential risks and limitations.
There is one tool out there so far that does this, called Mentalyc. With Mentalyc, you can easily input patient information into its AI program and have it generate a personalized progress note automatically. This not only saves time and energy but also helps ensure the accuracy of patient data.
Plus, with Mentalyc's built-in natural language processing capabilities, the generated progress notes are tailored to each specific patient, making them both comprehensive and easy to read. The ability to quickly create accurate therapy progress notes helps make private practices more efficient while improving overall patient care.
Eleos provides a baseline for a note, but its core value proposition is to label interventions during sessions.
Lyssn, which grew out of academic research, offers tools aimed at training young clinicians. Lyssn also provides verbatim transcripts of whole sessions, but not session notes.
AI note-taking in psychotherapy involves using artificial intelligence to automatically transcribe and analyze the content of therapy sessions. This can be done using voice recognition software or other AI-based tools.
Some potential benefits of using AI for note-taking in psychotherapy include:
AI can help therapists save time by automatically transcribing and organizing their notes, allowing them to focus on providing care to their clients.
Artificial Intelligence can help ensure that notes are transcribed accurately and consistently, which can be especially helpful if a therapist has difficulty writing legibly or works with clients with strong accents.
Artificial Intelligence also can help therapists analyze the content of their sessions in ways that may not be possible for a human, such as identifying patterns and trends over time.
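As a rough illustration of the "patterns and trends over time" point, a system could track a simple per-session signal and smooth it to surface a trend. The data and method below are toy assumptions, not any vendor's actual algorithm:

```python
# Toy illustration: flag a trend across sessions with a simple moving average.
# The counts are hypothetical; real tools would derive richer signals.

def moving_average(values, window=3):
    """Smooth a series with a fixed-size trailing window."""
    return [sum(values[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(values))]

distress_mentions = [2, 3, 2, 5, 6, 8]  # hypothetical counts per session
trend = moving_average(distress_mentions)
print(trend)  # a rising curve suggests a pattern worth the clinician's attention
```

The point is not the arithmetic but the scale: a human therapist cannot easily hold dozens of sessions' worth of detail in mind, while software can surface such trajectories automatically.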
AI such as Mentalyc's focuses on demonstrating medical necessity and organizing notes in a way that helps them pass audits.
The average time spent on a note outside of a session is 15–20 minutes. With AI, it takes up to 3 minutes to read, adjust, and sign.
However, it's important to carefully consider the ethical implications of using Artificial Intelligence for note-taking in psychotherapy, as well as the potential risks and limitations. Therapists should also be transparent with their clients about the use of AI in therapy and ensure that their clients are comfortable with this approach.
Mentalyc obtains client consent and safeguards data security by anonymizing transcripts and not storing raw session data. It has been reviewed by ethics experts, lawyers, psychology professors, and clinicians.
Advancements in technology have made it possible to approximate the presentation of certain diagnoses (mostly depression and anxiety) from different data streams such as voice, mobile phone information, or interactions with games.
Many tools have emerged on the market, including Kintsugi and Sonde Health. These tools mostly claim to detect depression or anxiety from voice features, or at least that's what they say. It is questionable how insightful and useful that can be for a therapist.
A growing number of mobile apps claim that they can detect depression and anxiety based on mobile phone data. For example, geolocation can be used to see whether the client spends most of their time at home or takes some walks. Mindstrong, led by Thomas Insel, pioneered this approach about a decade ago. Since then, there have been many projects, but the use cases are not very clear.
Most of these tools are promoted as a way to monitor the mental health of employees, which raises ethical concerns, and most of the apps are still in the research phase. Approximating PHQ-9 and GAD-7 scores from simple phone data comes across as simplistic and potentially privacy-violating. But it is possible that, with time, this technology will become more precise and companies with high security standards will manage to succeed.
Simple games that allow monitoring the state of depression and other disorders have recently hit the market, though they have been the subject of research for some time. Clients can engage with them between sessions or while waiting for therapy to start, and the game can check and alert the clinician if there is a higher risk. Some companies, like Thymia, aim eventually to treat disorders using games.
Amelia VR (previously called Psyrius) is an example of a company doing this, and a German company, PsyCurio, offers the same for the German market. They typically provide a range of programs for simulating situations and exposing clients to their phobias. These tools should be used in a clinical setting.
AI is proving beneficial to psychotherapy, just as it is to many other professions. New AI-based systems and software are replacing slow, inefficient ways of running a practice, and they have proved effective so far. Psychotherapists use them to make their practices better and faster, and they are encouraged to take advantage of these resources rather than overlook them and be left behind in this era of artificial intelligence.
So, if you haven't yet started using AI in your private practice and are looking to begin, Mentalyc is the way to go. This powerful platform allows you to quickly and easily set up an AI-driven system that can help you provide better care for your patients.
Maria Szandrach
Maria is an experienced entrepreneur with over 10 years of experience and an MSc degree from London Business School. She co-founded Mentalyc as her third startup (the previous ones were solving problems in the Mental Health and Insurance industries). As a teenager, Maria went to therapy for an eating disorder. She switched therapists 5 times before she eventually recovered. She devoted her career to making therapy more effective and efficient.