Editorial
Artificial intelligence in the era of ChatGPT - Opportunities and challenges in mental health care
Om P. Singh1,2
1WBMES, Kolkata, West Bengal, India, 2AMRI Hospitals, Kolkata, West Bengal, India
Chat Generative Pre-trained Transformer (ChatGPT)[1] is a powerful AI-based chatbot system launched on November 30, 2022, by San Francisco-based OpenAI. It gained massive attention and currently has over 100 million users, making it the fastest-growing consumer application.[1]

It is a transformer-based neural network system that produces the human-like language through which it communicates. Such AI-based programs are trained on vast amounts of text data to understand the context and relevance of human communication.

There is massive competition in this segment, as multiple similar or more advanced apps are on the verge of being launched, such as Google Bard,[2] Microsoft Bing AI, the Chinese Ernie Bot, the Korean SearchGPT, the Russian YaLM 2.0, Chatsonic, Jasper Chat, Character AI, Perplexity AI, and YouChat.

ChatGPT and other AI platforms hold enormous potential in many fields, including mental health, and carry vast possibilities for utilization. From chats and games to writing computer programs, music compositions, songs, and teleplays, and from essays and letters to scientific papers, abstracts, and introductions, they will affect one and all. It is not hard to predict that they will make a massive difference in the mental healthcare delivery system.

There is a huge treatment gap in mental health care in developing, low-, and lower-middle-income countries. According to the WHO, the treatment gap for mental disorders in developing countries is 76%-85%. According to the National Mental Health Survey, the treatment gap reported in India for any mental disorder is as high as 83%. A huge deficit of mental health professionals, far below the specified norms, and an inequitable distribution of resources make the gap more prominent.[3] AI and digital interfaces are emerging as viable alternatives for reducing this gap and making psychiatric diagnosis and treatment accessible and affordable.

The ability of ChatGPT and other AI-based chatbots to generate human-quality responses can provide companionship, support, and therapy for people who face barriers of accessibility and affordability in terms of time, distance, and finances. The ease, the convenience, and the simulation of talking to another human being make such apps well suited to delivering psychotherapies. ChatGPT and AI-based chatbots are programmed and trained with vast knowledge about psychiatric conditions and respond with empathy. Still, they cannot diagnose specific mental health conditions or provide treatment details reliably and accurately.

Though there is a lot of excitement about the use of AI in various psychiatric conditions, there are several areas of concern. To start with, ChatGPT and other AI systems are trainable: they are trained on web-based information and refined through reinforcement learning from human feedback. If they are not prepared with proper responses drawn from authentic sources, they can provide wrong information about a condition and inappropriate advice, which may be potentially harmful to persons with mental health problems.

Confidentiality, privacy, and data safety are significant areas of concern.[4] Any person using an AI-based app for a mental health condition and its therapy is bound to share important personal details about themselves and their family members, making them potentially vulnerable in the event of a breach of confidentiality or of the data itself.

Other concerns are the lack of proper standardization and monitoring, the universality of applications, misdiagnosis, wrong diagnosis, inappropriate advice, and the inability to handle crises.[4] There are also concerns regarding their safety, efficacy, and tolerability.

Together, these raise significant ethical issues related to the use of ChatGPT and AI-based apps in academics, diagnosis, treatment, and therapy.

There is a definite need to regulate and monitor AI-based apps. The American Psychiatric Association (APA) has formulated a digital psychiatry task force to evaluate and monitor AI and mental health-related apps for their efficacy, tolerability, safety, and potential to provide mental health care.[5]

The APA has also come up with an innovative App Evaluation Model called App Advisor.[5] This model has been adopted and replicated by several other healthcare organizations, e.g., the Division of Digital Psychiatry at BIDMC, Harvard University (App Evaluation), the Health Navigator App Evaluator Model and assessment tools, and the NYC Department of Health and Mental Hygiene's NYC Well App Advisor, to name a few.[6]

Given the vast differences in awareness, education, language, and level of understanding in the Indian population, the Indian Psychiatric Society and other stakeholders should start to evaluate and regulate AI-based global and local apps for their safety, efficacy, and tolerability, and guide the general public in their proper and safe use.

Address for correspondence: Prof. Om P. Singh, Department of Psychiatry, WBMES, AMRI Hospitals, Kolkata - 700 091, West Bengal, India. E-mail: [email protected]
Submitted: 21-Feb-2023, Accepted: 21-Feb-2023, Published: 03-Mar-2023
DOI: 10.4103/indianjpsychiatry.indianjpsychiatry_112_23

References

1. Roose K. The brilliance and weirdness of ChatGPT. New York Times, Dec 5, 2022. Available from: https://www.nytimes.com/2022/12/05/technology/chatgpt-ai-twitter.html. [Last accessed on 2023 Feb 03].
2. Schechner S. Google opens ChatGPT rival Bard for testing. Wall Street Journal, Feb 2023. Available from: https://www.wsj.com/articles/google-opens-testing-of-chatgpt-rival-as-artificial-intelligence-war-heats-up-11675711198. [Last accessed on 2023 Feb 24].
3. Singh OP. Chatbots in psychiatry: Can treatment gap be lessened for psychiatric disorders in India. Indian J Psychiatry 2019;61:225. doi: 10.4103/0019-5545.258323.
4. Singh S, Sagar R. Time to have effective regulation of the mental health apps market: Maximize gains and minimize harms. Indian J Psychol Med 2022;44:399-404. doi: 10.1177/02537176221082902.
5. Nick Z. APA task force reviews digital tools for mental health care. Dec 2022. Available from: https://psychnews.psychiatryonline.org/doi/10.1176/appi.pn.2023.01.12.11. [Last accessed on 2023 Feb 16].
6. App Advisor. American Psychiatric Association, 2023. Available from: https://www.psychiatry.org/psychiatrists/practice/mental-health-apps. [Last accessed on 2023 Feb 20].

How to cite this article: Singh OP. Artificial intelligence in the era of ChatGPT - Opportunities and challenges in mental health care. Indian J Psychiatry 2023;65:297-8.

This is an open access journal, and articles are distributed under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 License, which allows others to remix, tweak, and build upon the work non-commercially, as long as appropriate credit is given and the new creations are licensed under the identical terms.

© 2023 Indian Journal of Psychiatry | Published by Wolters Kluwer - Medknow