
PERSPECTIVE
published: 18 October 2019
doi: 10.3389/fpsyt.2019.00746

Key Considerations for Incorporating Conversational AI in Psychotherapy

Adam S. Miner 1,2,3*, Nigam Shah 4, Kim D. Bullock 1, Bruce A. Arnow 1, Jeremy Bailenson 3 and Jeff Hancock 3

1 Department of Psychiatry and Behavioral Sciences, Stanford University School of Medicine, Stanford, CA, United States, 2 Department of Epidemiology and Population Health, Stanford University School of Medicine, Stanford, CA, United States, 3 Department of Communication, Stanford University, Stanford, CA, United States, 4 Stanford Center for Biomedical Informatics Research, Stanford University School of Medicine, Stanford, CA, United States

Conversational artificial intelligence (AI) is changing the way mental health care is delivered. By gathering diagnostic information, facilitating treatment, and reviewing clinician behavior, conversational AI is poised to impact traditional approaches to delivering psychotherapy. While this transition is not disconnected from existing professional services, specific formulations of clinician-AI collaboration and migration paths between forms remain vague. In this viewpoint, we introduce four approaches to AI-human integration in mental health service delivery. To inform future research and policy, these four approaches are addressed through four dimensions of impact: access to care, quality, clinician-patient relationship, and patient self-disclosure and sharing. Although many research questions are yet to be investigated, we view safety, trust, and oversight as crucial first steps. If conversational AI isn't safe, it should not be used, and if it isn't trusted, it won't be. In order to assess safety, trust, interfaces, procedures, and system-level workflows, oversight and collaboration are needed between AI systems, patients, clinicians, and administrators.

Keywords: natural language processing, artificial intelligence, expert systems, psychotherapy, conversational AI, chatbot, digital assistant, human–computer interaction

Edited by: Michelle Burke Parish, University of California, Davis, United States
Reviewed by: Stefanie Kristiane Gairing, University Psychiatric Clinic Basel, Switzerland; Donald M. Hilty, UC Davis Health, United States; Peter Yellowlees, University of California, Davis, United States
*Correspondence: Adam S. Miner, [email protected]

Specialty section: This article was submitted to Public Mental Health, a section of the journal Frontiers in Psychiatry
Received: 09 December 2018; Accepted: 17 September 2019; Published: 18 October 2019
Citation: Miner AS, Shah N, Bullock KD, Arnow BA, Bailenson J and Hancock J (2019) Key Considerations for Incorporating Conversational AI in Psychotherapy. Front. Psychiatry 10:746. doi: 10.3389/fpsyt.2019.00746

INTRODUCTION

Clinicians engage in conversations with patients to establish a patient-therapist relationship (i.e., alliance), make diagnoses, and provide treatment. In traditional psychotherapy, this conversation typically involves a single patient and a single clinician (1). This model of psychotherapy is being modified because software programs that talk like people (i.e., conversational artificial intelligence, chatbots, digital assistants) are now beginning to provide mental health care (2). Conversational artificial intelligence (AI) is gathering diagnostic information (3, 4) and delivering evidence-based psychological interventions (5–7). Additionally, conversational AI is providing clinicians with feedback on their psychotherapy (8) and talking to young people about suicide, sex, and drug use (9, 10).

Conversational AI appears unlikely to achieve enough technical sophistication to replace human therapists anytime soon. However, it does not need to pass the Turing Test (i.e., be able to hold human-seeming conversations) to have a significant impact on mental health care (2). A more proximal challenge is to plan and execute collaborative tasks between relatively simple AI systems and human practitioners (11–13). Although AI in mental health has been discussed broadly (for a review
see 14), specific formulations of clinician-AI collaboration and migration paths between forms remain vague.

Articulating different forms of collaboration is important, because the deployment of conversational AI into mental health diagnosis and treatment will be embedded within existing professional services. Conversational AI will likely interact with traditional workers (i.e., clinicians), but how these roles and responsibilities will be allocated between them has not been defined. To guide future research, we outline four approaches and dimensions of care that AI will affect.

Within the four approaches of AI-human integration in mental health service delivery, one extreme is a view that any involvement by conversational AI is unreasonable, putting both patients and providers at risk of harmful unintended consequences. At the other extreme, we explore how conversational AI might uniquely serve a patient's needs and surpass the capacity of even the most experienced and caring clinician by overcoming entrenched barriers to access. Although embodiment (e.g., virtual avatars or robots) can have a significant impact on interactions with virtual systems, we focus exclusively on the potential benefits and challenges of verbal and written language-based conversation and ignore the implications of embodiment or presence (15). Table 1 summarizes the four approaches and our related assumptions.

CARE DELIVERY APPROACHES

It is unclear whether the path forward will involve simultaneous experimentation with all four degrees of digitization, or progression through these approaches. We first briefly describe how these compare to the way individual psychotherapy is most often delivered today. Perhaps surprisingly, laws, norms and the ethics of data sharing represent a nonobvious but critical factor in how these alternative approaches can operate now or develop in the future.

Currently, psychotherapy sessions are rarely recorded except in training institutions for supervision. When they are, for example during training or to assess clinician fidelity during clinical trials, trained human clinicians with prescribed roles and responsibilities are the listeners and provide oversight. With few exceptions, such as immediate risk of serious harm to the patient or others, clinicians need explicit permission to share identifiable patient information. When one of these exceptions is invoked, there is an obligation to limit the sharing strictly to the extent needed to provide effective treatment and ensure safety (16, 17). Against this backdrop, having conversational AI listen to psychotherapy sessions or talk directly with patients represents a departure from established practice.

In the "humans only" approach, psychotherapy remains unchanged. Most psychotherapy sessions are heard only by the patient and clinician who are in the room. If a session were recorded, the labor intensiveness of human review would ensure most sessions would never be analyzed (8). The second approach, "human delivered, AI informed," introduces into the room a listening device connected to software that detects clinically relevant information (18) such as symptoms or interventions (19), and relays this information back to the patient or clinician. Quantitative analysis of recorded psychotherapy is in its early stages, but it shifts to software programs the burden of extracting relevant information from audio or text. In the third approach, "AI delivered, human supervised," patients speak directly to a conversational AI with the goal of establishing diagnoses or providing treatment (20). A human clinician would either screen patients and hand off specific tasks to conversational AI or supervise conversations between front-line conversational AI and patients. The fourth approach, "AI only," would have patients talk to a conversational AI with no expectation of supervision by a human clinician.
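To make the second approach concrete, the sketch below illustrates in simplified form the kind of pipeline that "human delivered, AI informed" care implies: transcribed patient utterances are scanned for symptom-related language, and a brief summary is relayed back to the clinician. The transcript format, symptom phrase list, and relay step are hypothetical placeholders chosen for illustration; they are not the methods used in the systems cited above (18, 19), which rely on trained language models rather than keyword matching.

# Illustrative sketch only: a keyword-based stand-in for the "human delivered,
# AI informed" approach. The lexicon, transcript format, and relay step are
# hypothetical and are not drawn from the cited systems.
from collections import Counter

# Hypothetical symptom lexicon; a deployed system would use validated
# instruments or trained classifiers rather than phrase matching.
SYMPTOM_PHRASES = {
    "low mood": ["feel hopeless", "feel down", "no interest"],
    "sleep disturbance": ["can't sleep", "waking up at night"],
    "anxiety": ["constantly worried", "on edge", "panic"],
}

def flag_symptoms(utterances):
    """Count symptom-phrase matches in patient utterances from one session."""
    counts = Counter()
    for speaker, text in utterances:
        if speaker != "patient":
            continue
        lowered = text.lower()
        for label, phrases in SYMPTOM_PHRASES.items():
            if any(phrase in lowered for phrase in phrases):
                counts[label] += 1
    return counts

def relay_to_clinician(counts):
    """Stand-in for surfacing the session summary in the clinician's workflow."""
    for label, n in counts.most_common():
        print(f"{label}: mentioned in {n} patient utterance(s)")

if __name__ == "__main__":
    session = [
        ("clinician", "How have you been sleeping?"),
        ("patient", "Honestly I can't sleep, and I feel down most days."),
        ("patient", "I'm constantly worried about work."),
    ]
    relay_to_clinician(flag_symptoms(session))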
One of the less developed but more alluring ideas in AI psychotherapy is "AI delivered, human supervised." Even the most ardent supporters of AI will acknowledge that there are certain things humans do better than computers. Combining people and algorithms may build on the best of both approaches, and AI–human collaboration has been suggested as a way to address limitations in planning treatment in other medical areas such as oncology (21). Indeed, the prevailing opinion among expert systems researchers in the 1980s was that computer–human collaboration would outperform either people or computers alone (for a review see 22).

In assessing any system that augments the practice of psychotherapy, the first consideration should be ensuring that patients and clinicians are helped and not harmed (23, 24). In the discussion below, we consider salient issues that affect the potential value and harm of different delivery mechanisms by focusing on four dimensions of impact: access to care, quality, the clinician-patient relationship, and patient self-disclosure.

TABLE 1 | Delivery approaches and dimensions of impact for conversational AI.

Care delivery approach | Access to care | Quality | Clinician-patient relationship | Patient self-disclosure and sharing
Humans only | Unchanged | Established | No disruption | Unchanged
Human delivered, AI informed | Unchanged | Potentially improved | Potentially disrupted (a) | Unknown
AI delivered, human supervised | Improved, but limited scalability | Unknown | Likely disrupted | Unknown
AI only | Improved, not restrained by human attention | Unknown | Nonexistent | Unknown

(a) By "disrupt" we do not mean to signal that the result will be necessarily good or bad.


DIMENSIONS OF IMPACT

Access to Care

Limited access to mental health treatment creates a demand for scalable and non-consumable interventions (25, 26). Despite the high costs and disease burden associated with mental illness (27), we have a decreasing number of clinicians per capita available to provide treatment in the US (28). Increasing the number of human clinicians is not currently feasible, in part because of the decline from 2008 to 2013 per capita for both psychologists (from a ratio of 1:3,642 to 1:3,802) and psychiatrists (from a ratio of 1:7,825 to 1:8,476) (28). Conversational AI has the potential to help address insufficient clinician availability because it is not inherently limited by human clinician time or attention. Conversational AI could also bridge one of the current tensions in care delivery: although clinicians value patient conversations, they have no financial incentive to engage in meaningful but lengthy conversations (29).

The decreasing amount of time spent in meaningful conversations exacerbates the shortage of psychiatrists and psychologists. Psychiatrists' use of talk therapy has been consistently and steadily declining, meaning fewer patients are receiving talk therapy during psychiatric visits (30). In contrast to a human clinician's time and attention, conversational AI is relatively non-consumable, making it an attractive alternative to delivery of care by a human. If conversational AI is effective and acceptable to both patients and clinicians, it may address longstanding challenges to mental health access. These include the ability to accommodate rural populations and to facilitate increased engagement from people who may experience traditional talk therapy as stigmatizing (31).

Quality

Technology has been highlighted as a way to better understand and disseminate high quality psychotherapy (32, 33). Clinicians are already using texting services to deliver mental health interventions (34), which demonstrates a willingness by patients and clinicians to test new approaches to patient-clinician interaction. These new approaches facilitate novel measures of intervention quality. For example, innovations in computer science (e.g., natural language processing and machine learning) are being used to assess language patterns of successful crisis interventions in text-based mental health conversations (18, 35). Computational analysis of psychotherapy is encouraging researchers and companies to identify patterns of patient symptomology and therapist intervention (36, 37). This approach may improve psychotherapy quality by better understanding what effective clinicians actually do. This assessment has historically occurred through clinicians' self-reports or time-intensive human audits (e.g., 38).
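As a simplified illustration of the kind of measure such computational analysis starts from, the sketch below computes two crude session-level signals from a transcript: each speaker's share of the words spoken and the therapist's question rate. The transcript format is a hypothetical list of (speaker, utterance) pairs, and the measures are surface counts only; the systems cited above (18, 35–37) rely on far richer natural language processing and machine learning models.

# Illustrative sketch only: crude session-level language measures computed
# from a hypothetical (speaker, utterance) transcript. Not the methodology of
# the cited systems, which use trained NLP models rather than surface counts.
def session_measures(utterances):
    """Return per-speaker word shares and the therapist question rate."""
    word_counts = {}
    therapist_turns = 0
    therapist_questions = 0
    for speaker, text in utterances:
        word_counts[speaker] = word_counts.get(speaker, 0) + len(text.split())
        if speaker == "therapist":
            therapist_turns += 1
            if text.strip().endswith("?"):
                therapist_questions += 1
    total_words = sum(word_counts.values()) or 1
    measures = {
        f"{speaker}_word_share": count / total_words
        for speaker, count in word_counts.items()
    }
    measures["therapist_question_rate"] = (
        therapist_questions / therapist_turns if therapist_turns else 0.0
    )
    return measures

if __name__ == "__main__":
    transcript = [
        ("therapist", "What would you like to focus on today?"),
        ("patient", "The panic attacks are back and I have been avoiding the train."),
        ("therapist", "Tell me about the last time that happened."),
    ]
    print(session_measures(transcript))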
Although its efficacy is not definitively established, there are reasons to expect that conversational AI could constructively enhance mental health diagnosis and treatment delivery (39, 40). A diagnostic interview aids the patient and clinician in understanding the patient's presenting problem and provides a working model of how problems are being maintained. Approaches vary from highly structured diagnostic interviews [e.g., the Structured Clinical Interview for DSM-5 (41)] to unstructured interviews in which the conversation develops based on the clinician's expertise, training, and the patient's features. Conversational AIs have interviewed patients about symptoms of PTSD with a high level of patient acceptance (20). Conversational AI has been piloted across numerous clinically relevant groups, such as people with clinical depression (6) and adolescents experiencing stress (42). In a study in which students believed they were speaking with a conversational AI, the students reported feeling better after talking about their problems following the encounter (43). Although these early findings point to potential benefits, there is a lack of rigorous clinical trial data and uncertainty about regulatory oversight (2).

Yet while there is reason for optimism, inflated or unsubstantiated expectations may frustrate patients and weaken their trust in psychotherapeutic interventions (44, 45). Many current computational methods can be used to search for specific dialogue acts, but additional work is needed to map theoretically important constructs (e.g., therapeutic alliance) to causal relationships between language patterns and clinically relevant outcomes. Psychotherapy quality will be difficult to assess without disentangling causal inferences and confounding factors. Beyond computation, patients' attitudes matter in psychotherapy because those who have a negative experience compared with their expectations have worse clinical outcomes (46). If a patient loses trust in a conversational AI, they may be less likely to trust human clinicians as well. As conversational AI becomes more sophisticated and expectations of benefit increase, there are growing concerns that users will transition from feeling let down to feeling betrayed (47). These factors suggest that careful experimentation on the sub-processes of AI-mediated communication merits research attention.

Clinician–Patient Relationship

Modern medicine views the patient–clinician relationship as critical to patient health (48) and provider wellness (49). Indeed, appreciation of the importance of the patient–clinician relationship in modern medicine can be traced back to the influence of clinical psychology (50). Therapeutic alliance develops from clinicians' collaborative engagement with patients and reflects agreement on treatment goals, the tasks necessary to achieve such goals, and the affective bond between patient and provider (51). Therapeutic alliance is consistently associated with symptom improvement in psychotherapy (52–54). Numerous approaches exist to create alliance during psychotherapy, including the use of supportive language, mirroring emotions, and projecting warmth. Although originally conceptualized for human-to-human conversations, users have reported experiencing a sense of therapeutic alliance when speaking directly with conversational AI, suggesting this bond may not necessarily be restricted to human-human relationships (3). If conversational AI can create and maintain a therapeutic alliance, the provision of psychotherapy will not necessarily be limited by human clinicians' time and attention.

Establishing therapeutic alliance with conversational AI may benefit both patients and providers. By allowing conversational AI to take over repetitive, time-consuming tasks, clinicians' attention
and skill could be deployed more judiciously (55). Allowing clinicians to do less of the work that contributes to burnout, such as repetitive tasks performed with little autonomy, may improve clinicians' job satisfaction (56). Clinician burnout is associated with worse patient outcomes and is increasingly recognized as a problem that must be more adequately addressed (57, 58).

At the same time, software that augments clinical duties has been criticized for distancing clinicians from patient care (59). In mental health, this risk is especially salient because the content of therapy is often quite intimate. Some of the repetitive, time-consuming tasks clinicians engage in with patients, such as reviewing symptoms or taking a history, are precisely the vehicles by which clinicians connect with and understand their patients' experiences and develop rapport. It is unknown whether having a conversational AI listen in on psychotherapy will significantly impact patients' and clinicians' sense of therapeutic alliance. This area merits further research.

Patient Self-Disclosure and Sharing

Patient self-disclosure of personal information is crucial for successful therapy, including sensitive topics such as trauma, substance use, sexual history, forensic history, and thoughts of self-harm. Patient self-disclosures during psychotherapy are legally and ethically protected (24), and professional norms and laws have been established to set boundaries for what a clinician can share (60). Unauthorized sharing of identifiable patient information can result in fines, loss of license, or even incarceration. Moreover, because of the natural limitations of human memory, patients are unlikely to expect a human clinician to remember entire conversations perfectly in perpetuity. This is in stark contrast to conversational AI, which has a near-limitless capacity to hear, remember, share, and analyze conversations for as long as desired. Because humans and machines have such different capacities, patient expectations of AI capabilities may impact treatment decisions and consent to data sharing (23).

In mental health, conversational AI has been shown to both facilitate and impede disclosure in different contexts. For example, users were more open with a conversational AI than with a human listener in reporting mental health symptoms (20), and conversational agents have been used successfully to treat persecutory delusions in people with psychosis (61). Conversely, users were more reluctant to disclose sensitive information, such as binge drinking behavior, to a conversational AI than to a non-responsive questionnaire (62). Because personal disclosures are central to diagnosis and treatment in psychotherapy, users' expectations of and behavior towards technology-mediated conversations merit further assessment (63–65).

Certain disclosures in a psychotherapy context carry specific ethical and legal mandates, such as reporting suicidal or homicidal ideation. In 1969, a therapist at the University of California did not share the homicidal ideation of a patient with the intended victim. The patient subsequently killed the named victim, and the victim's family sued. This case (Tarasoff v. Regents of the University of California, 1974) established clinicians' duty not only to protect the confidentiality of their patients but also to notify individuals their patient might harm. A failure to warn leaves a clinician liable to civil judgment (66). Most case law and norms have been established on the premise of a dyadic relationship between patient and clinician. The extent to which conversational AI inherits liability for harm is untested. As conversational AI takes on clinical duties and informs clinical judgment, expectations must be clarified about how and when these systems will respond to issues related to confidentiality, safety, and liability.

DISCUSSION

Experts in AI, clinicians, administrators, and other stakeholders recognize a need to more fully consider safety and trust in the design and deployment of new AI-based technologies (67, 68). A recent Lancet commission on global mental health states that "technology-based approaches might improve the reach of mental health services but could lose key human ingredients and, possibly, lower effectiveness of mental health care" (33). To inform future research directions, we have presented four approaches to integrating conversational AI into mental health delivery and discussed the dimensions of their impact.

Because conversational AI may augment the work of psychotherapy, we seek to encourage product designers, clinicians, and researchers to assess the impact of new practices on both patients and clinicians. Other areas of medicine have seen success with AI, such as lung cancer imaging and building diagnostic or prognostic models (69–73), but conversational AI for health is an emerging field with limited research on efficacy and safety (40, 63, 74).

Before we deploy AI-mediated treatment, workflow changes must be considered in the context of other demands on clinician time and training. Clinicians are already being asked to be familiar with telehealth (75), social media (76), and mobile health (77), while simultaneously being reminded of the need for self-care in light of clinician burnout (58). Before we insert new devices into clinical care, it will be crucial to engage clinicians and to design evaluation strategies that appreciate the skills, attitudes, and knowledge of affected workers. Just as we can't expect technology companies to easily understand healthcare, we can't expect medical professionals to intuit or work in harmony with new technology without thoughtful design and training.

A limitation of this work is that we do not set out a specific research agenda, and some important considerations are beyond its scope (e.g., the cost and feasibility of each approach). We propose instead that initiatives using conversational AI anticipate challenges and leverage lessons learned from existing approaches to deploying new technology in clinical settings, approaches that involve clinician training and patient protections from the start (32, 77). We encourage those proposing to put AI into care settings to directly consider and measure impact on access, quality, relationships, and data sharing.

The potential benefits for mental health are clear. If diagnosis or treatment can be done by conversational AI, the societal burden of treating mental illness could be diminished. Additionally, conversational AI could have a longer-term relationship with a patient than clinicians who rotate out of training centers. Despite these potential benefits, technology carries risks related to privacy,
bias, coercion, liability, and data sharing that could harm patients in expected (e.g., denial of health insurance) and unintended ways (33, 44, 74, 78, 79–81). Conversations are valuable for patients and clinicians, and it is crucial to make sure they are delivered safely and effectively, regardless of who or what does the talking.

AUTHOR CONTRIBUTIONS

ASM and JH contributed to the initial conceptualization and design of the manuscript. ASM wrote the first draft. NS, KDB, BAA, and JB contributed to manuscript revision, and read and approved the submitted version.

ACKNOWLEDGMENTS

This work was supported by grants from the National Institutes of Health, National Center for Advancing Translational Science, Clinical and Translational Science Award (KL2TR001083 and UL1TR001085), the Stanford Department of Psychiatry Innovator Grant Program, and the Stanford Institute for Human-Centered Artificial Intelligence. The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIH. We thank Nicole Martinez-Martin JD PhD, Victor Henderson MD MS, and Stan Fisher for their valuable feedback. Reference formatting was assisted by Charlesworth Author Services.

REFERENCES

1. Goldfried MR, Greenberg LS, Marmar C. Individual psychotherapy: process and outcome. Annu Rev Psychol (1990) 41(1):659–88. doi: 10.1146/annurev.ps.41.020190.003303
2. Miner AS, Milstein A, Hancock JT. Talking to machines about personal mental health problems. JAMA (2017) 318(13):1217–8. doi: 10.1001/jama.2017.14151
3. Bickmore T, Gruber A, Picard R. Establishing the computer–patient working alliance in automated health behavior change interventions. Patient Educ Couns (2005) 59(1):21–30. doi: 10.1016/j.pec.2004.09.008
4. Rizzo A, Scherer S, DeVault D, Gratch J, Artstein R, Hartholt A, et al. Detection and computational analysis of psychological signals using a virtual human interviewing agent. 10th Intl Conf Disability, Virtual Reality & Associated Technologies; Gothenburg, Sweden (2014). Available at: http://ict.usc.edu/bibtexbrowser.php?key=rizzo_detection_2014&bib=ICT.bib (Accessed 15 Oct. 2018).
5. Bickmore TW, Puskar K, Schlenk EA, Pfeifer LM, Sereika SM. Maintaining reality: relational agents for antipsychotic medication adherence. Interact Comput (2010) 22(4):276–88. doi: 10.1016/j.intcom.2010.02.001
6. Fitzpatrick KK, Darcy A, Vierhile M. Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): a randomized controlled trial. JMIR Ment Health (2017) 4(2):e19. doi: 10.2196/mental.7785
7. Oh KJ, Lee D, Ko B, Choi HJ. A chatbot for psychiatric counseling in mental healthcare service based on emotional dialogue analysis and sentence generation. Mobile Data Management (MDM), 2017 18th IEEE International Conference; IEEE (2017) pp. 371–375. doi: 10.1109/MDM.2017.64
8. Imel ZE, Steyvers M, Atkins DC. Computational psychotherapy research: scaling up the evaluation of patient–provider interactions. Psychotherapy (2015) 52(1):19. doi: 10.1037/a0036841
9. Crutzen R, Peters GJY, Portugal SD, Fisser EM, Grolleman JJ. An artificially intelligent chat agent that answers adolescents' questions related to sex, drugs, and alcohol: an exploratory study. J Adolesc Health (2011) 48(5):514–9. doi: 10.1016/j.jadohealth.2010.09.002
10. Martínez-Miranda J. Embodied conversational agents for the detection and prevention of suicidal behaviour: current applications and open challenges. J Med Syst (2017) 41(9):135. doi: 10.1007/s10916-017-0784-6
11. Bailenson JN, Beall AC, Loomis J, Blascovich J, Turk M. Transformed social interaction: decoupling representation from behavior and form in collaborative virtual environments. Presence: Teleop Virt Environ (2004) 13(4):428–41. doi: 10.1162/1054746041944803
12. Luxton DD. Recommendations for the ethical use and design of artificial intelligent care providers. Artif Intell Med (2014) 62(1):1–10. doi: 10.1016/j.artmed.2014.06.004
13. Hyman L. Temp: how American work, American business, and the American dream became temporary. New York, NY: Penguin Random House (2018). ISBN: 9780735224070.
14. Luxton DD. Artificial intelligence in behavioral and mental health care. Elsevier/Academic Press (2016). doi: 10.1016/B978-0-12-420248-1.00001-5
15. Rehm IC, Foenander E, Wallace K, Abbott JA, Kyrios M, Thomas N. What role can avatars play in e-mental health interventions? Front Psychiatry (2016) 7:186. doi: 10.3389/fpsyt.2016.00186
16. American Psychiatric Association. The principles of medical ethics with annotations especially applicable to psychiatry. Washington, DC: Author (2001).
17. American Psychological Association. Ethical principles of psychologists and code of conduct. Am Psychol (2002) 57(12):1060–73. doi: 10.1037//0003-066X.57.12.1060
18. Althoff T, Clark K, Leskovec J. Large-scale analysis of counseling conversations: an application of natural language processing to mental health. Trans Assoc Comput Lingu (2016) 4:463. doi: 10.1162/tacl_a_00111
19. Xiao B, Imel ZE, Georgiou PG, Atkins DC, Narayanan SS. Rate my therapist: automated detection of empathy in drug and alcohol counseling via speech and language processing. PloS One (2015) 10(12):e0143055. doi: 10.1371/journal.pone.0143055
20. Lucas GM, Gratch J, King A, Morency LP. It's only a computer: virtual humans increase willingness to disclose. Comput Hum Behav (2014) 37:94–100. doi: 10.1016/j.chb.2014.04.043
21. Goldstein IM, Lawrence J, Miner AS. Human–machine collaboration in cancer and beyond: the centaur care model. JAMA Oncol (2017) 3(10):1303–4. doi: 10.1001/jamaoncol.2016.6413
22. Metaxiotis KS, Samouilidis JE. Expert systems in medicine: academic illusion or real power? Inform Manage Comput Secur (2000) 8(2):75–9. doi: 10.1108/09685220010694017
23. Martinez-Martin N, Dunn LB, Roberts LW. Is it ethical to use prognostic estimates from machine learning to treat psychosis? AMA J Ethics (2018) 20(9):804–11. doi: 10.1001/amajethics.2018.804
24. Roberts LW. A clinical guide to psychiatric ethics. Arlington, VA: American Psychiatric Pub (2016). ISBN: 978-1-61537-049-8.
25. Kazdin AE, Rabbitt SM. Novel models for delivering mental health services and reducing the burdens of mental illness. Clin Psychol Sci (2013) 1(2):170–91. doi: 10.1177/2167702612463566
26. Kazdin AE, Blase SL. Rebooting psychotherapy research and practice to reduce the burden of mental illness. Perspect Psycholog Sci (2011) 6(1):21–37. doi: 10.1177/1745691610393527
27. Dieleman JL, Baral R, Birger M, Bui AL, Bulchis A, Chapin A, et al. US spending on personal health care and public health, 1996–2013. JAMA (2016) 316(24):2627–46. doi: 10.1001/jama.2016.16885
28. Olfson M. Building the mental health workforce capacity needed to treat adults with serious mental illnesses. Health Affairs (2016) 35(6):983–90. doi: 10.1377/hlthaff.2015.1619
29. Kaplan RS, Haas DA, Warsh J. Adding value by talking more. N Engl J Med (2016) 375(20):1918–20. doi: 10.1056/NEJMp1607079
30. Mojtabai R, Olfson M. National trends in psychotherapy by office-based psychiatrists. Arch Gen Psychiatry (2008) 65(8):962–70. doi: 10.1001/archpsyc.65.8.962
31. Perle JG, Langsam LC, Nierenberg B. Controversy clarified: an updated review of clinical psychology and tele-health. Clin Psychol Rev (2011) 31(8):1247–58. doi: 10.1016/j.cpr.2011.08.003
32. Mohr DC, Schueller SM, Montague E, Burns MN, Rashidi P. The behavioral intervention technology model: an integrated conceptual and technological framework for eHealth and mHealth interventions. J Med Internet Res (2014) 16(6):e146. doi: 10.2196/jmir.3077
33. Patel V, Saxena S, Lund C, Thornicroft G, Baingana F, Bolton P, et al. The Lancet Commission on global mental health and sustainable development. Lancet (2018) 392(10157):1553–98. doi: 10.1016/S0140-6736(18)31612-X
34. Schaub MP, Wenger A, Berg O, Beck T, Stark L, Buehler E, et al. A web-based self-help intervention with and without chat counseling to reduce cannabis use in problematic cannabis users: three-arm randomized controlled trial. J Med Internet Res (2015) 17(10):e232. doi: 10.2196/jmir.4860
35. Dinakar K, Chen J, Lieberman H, Picard R, Filbin R. Mixed-initiative real-time topic modeling & visualization for crisis counseling. Proceedings of the 20th International Conference on Intelligent User Interfaces; Atlanta, GA: ACM (2015) pp. 417–426. doi: 10.1145/2678025.2701395
36. Owen J, Imel ZE. Introduction to the special section "Big 'er' Data": scaling up psychotherapy research in counseling psychology. J Couns Psych (2016) 63(3):247. doi: 10.1037/cou0000149
37. Iter D, Yoon J, Jurafsky D. Automatic detection of incoherent speech for diagnosing schizophrenia. Proceedings of the Fifth Workshop on Computational Linguistics and Clinical Psychology: From Keyboard to Clinic; New Orleans, LA (2018) pp. 136–146. doi: 10.18653/v1/W18-0615
38. Cook JM, Biyanova T, Elhai J, Schnurr PP, Coyne JC. What do psychotherapists really do in practice? An Internet study of over 2,000 practitioners. Psychotherapy (2010) 47(2):260. doi: 10.1037/a0019788
39. Haque A, Guo M, Miner AS, Fei-Fei L. Measuring depression symptom severity from spoken language and 3D facial expressions. Paper presented at the NeurIPS 2018 Workshop on Machine Learning for Health; Montreal, Canada (2018).
40. Laranjo L, Dunn AG, Tong HL, Kocaballi AB, Chen J, Bashir R, et al. Conversational agents in healthcare: a systematic review. J Am Med Inform Assoc (2018) 25(9):1248–58. doi: 10.1093/jamia/ocy072
41. First MB, Williams JBW, Karg RS, Spitzer RL. Structured clinical interview for DSM-5 disorders, clinician version (SCID-5-CV). Arlington, VA: American Psychiatric Association (2016).
42. Huang J, Li Q, Xue Y, Cheng T, Xu S, Jia J, et al. Teenchat: a chatterbot system for sensing and releasing adolescents' stress. International Conference on Health Information Science; Queensland, Australia. Springer, Cham (2015) pp. 133–145. doi: 10.1007/978-3-319-19156-0_14
43. Ho A, Hancock J, Miner AS. Psychological, relational, and emotional effects of self-disclosure after conversations with a chatbot. J Commun (2018) 68(4):712–33. doi: 10.1093/joc/jqy026
44. Aboujaoude E. Telemental health: why the revolution has not arrived. World Psychiatry (2018) 17(3):277. doi: 10.1002/wps.20551
45. Miner AS, Milstein A, Schueller S, Hegde R, Mangurian C, Linos E. Smartphone-based conversational agents and responses to questions about mental health, interpersonal violence, and physical health. JAMA Int Med (2016) 176(5):619–25. doi: 10.1001/jamainternmed.2016.0400
46. Watsford C, Rickwood D. Disconfirmed expectations of therapy and young people's clinical outcome, help-seeking intentions, and mental health service use. Adv Ment Health (2013) 12(1):75–86. doi: 10.5172/jamh.2013.12.1.75
47. Brooker N. "We should be nicer to Alexa." Financial Times (2013). https://www.ft.com/content/4399371e-bcbd-11e8-8274-55b72926558f (Accessed October 15, 2018).
48. Martin DJ, Garske JP, Davis MK. Relation of the therapeutic alliance with outcome and other variables: a meta-analytic review. J Consult Clin Psychol (2000) 68(3):438. doi: 10.1037//0022-006X.68.3.438
49. Rosenthal DI, Verghese A. Meaning and the nature of physicians' work. N Engl J Med (2016) 375(19):1813–5. doi: 10.1056/NEJMp1609055
50. Szasz TS, Hollender MH. A contribution to the philosophy of medicine: the basic models of the doctor–patient relationship. AMA Arch Int Med (1956) 97(5):585–92. doi: 10.1001/archinte.1956.00250230079008
51. Horvath AO, Greenberg LS. Development and validation of the working alliance inventory. J Couns Psych (1989) 36(2):223. doi: 10.1037//0022-0167.36.2.223
52. Flückiger C, Del Re AC, Wampold BE, Symonds D, Horvath AO. How central is the alliance in psychotherapy? A multilevel longitudinal meta-analysis. J Couns Psychol (2012) 59(1):10. doi: 10.1037/a0025749
53. Horvath AO, Del Re AC, Flückiger C, Symonds D. Alliance in individual psychotherapy. Psychotherapy (2011) 48(1):9. doi: 10.1037/a0022186
54. Norcross JC, ed. Psychotherapy relationships that work: therapist contributions and responsiveness to patient needs. New York: Oxford University Press (2002).
55. Jha S, Topol EJ. Adapting to artificial intelligence: radiologists and pathologists as information specialists. JAMA (2016) 316(22):2353–4. doi: 10.1001/jama.2016.17438
56. Harvey SB, Modini M, Joyce S, Milligan-Saville JS, Tan L, Mykletun A, et al. Can work make you mentally ill? A systematic meta-review of work-related risk factors for common mental health problems. Occup Environ Med (2017) 74(4):301–10. doi: 10.1136/oemed-2016-104015
57. Delgadillo J, Saxon D, Barkham M. Associations between therapists' occupational burnout and their patients' depression and anxiety treatment outcomes. Depress Anxiety (2018) 35:844–50. doi: 10.1002/da.22766
58. Panagioti M, Panagopoulou E, Bower P, Lewith G, Kontopantelis E, Chew-Graham C, et al. Controlled interventions to reduce burnout in physicians: a systematic review and meta-analysis. JAMA Int Med (2017) 177(2):195–205. doi: 10.1001/jamainternmed.2016.7674
59. Verghese A. Culture shock: patient as icon, icon as patient. N Engl J Med (2008) 359(26):2748–51. doi: 10.1056/NEJMp0807461
60. Edwards G. Doing their duty: an empirical analysis of the unintended effect of Tarasoff v. Regents on homicidal activity. J Law and Econ (2014) 57(2):321–48. doi: 10.1086/675668
61. Craig TK, Rus-Calafell M, Ward T, Leff JP, Huckvale M, Howarth E, et al. AVATAR therapy for auditory verbal hallucinations in people with psychosis: a single-blind, randomised controlled trial. Lancet Psychiatry (2018) 5(1):31–40. doi: 10.1016/S2215-0366(17)30427-3
62. Schuetzler RM, Giboney JS, Grimes GM, Nunamaker JF. The influence of conversational agents on socially desirable responding. Proceedings of the 51st Hawaii International Conference on System Sciences; Waikoloa Village, Hawaii (2018) pp. 283–292. ISBN: 978-0-9981331-1-9. doi: 10.24251/HICSS.2018.038
63. Bickmore T, Trinh H, Asadi R, Olafsson S. Safety first: conversational agents for health care. In: Moore R, Szymanski M, Arar R, Ren GJ, editors. Studies in Conversational UX Design. Human–Computer Interaction Series. Springer, Cham (2018a). doi: 10.1007/978-3-319-95579-7_3
64. French M, Bazarova NN. Is anybody out there?: understanding masspersonal communication through expectations for response across social media platforms. J Comput Mediat Commun (2017) 22(6):303–19. doi: 10.1111/jcc4.12197
65. Liu B, Sundar SS. Should machines express sympathy and empathy? Experiments with a health advice chatbot. Cyberpsychol Behav Soc Netw (2018) 21(10):625–36. doi: 10.1089/cyber.2018.0110
66. Swerdlow BA. Tracing the evolution of the Tarasoff duty in California. J Sociol Soc Welfare (2018) 45:25.
67. Bhugra D, Tasman A, Pathare S, Priebe S, Smith S, Torous J, et al. The WPA-Lancet Psychiatry Commission on the future of psychiatry. Lancet Psychiatry (2017) 4(10):775–818. doi: 10.1016/S2215-0366(17)30333-4
68. Stone P, Brooks R, Brynjolfsson E, Calo R, Etzioni O, et al. "Artificial Intelligence and Life in 2030." One Hundred Year Study on Artificial Intelligence: Report of the 2015-2016 Study Panel. Stanford, CA: Stanford University (2016). http://ai100.stanford.edu/2016-report (accessed October 15, 2018).
69. Avati A, Jung K, Harman S, Downing L, Ng A, Shah NH. Improving palliative care with deep learning. IEEE International Conference on Bioinformatics and Biomedicine; Kansas City, MO: IEEE (2017) pp. 311–316. doi: 10.1109/BIBM.2017.8217669
70. Jung K, Covington S, Sen CK, Januszyk M, Kirsner RS, Gurtner GC, et al. Rapid identification of slow healing wounds. Wound Repair Regen (2016) 24(1):181–8. doi: 10.1111/wrr.12384
71. Gulshan V, Peng L, Coram M, Stumpe MC, Wu D, Narayanaswamy A, et al. Development and validation of a deep learning algorithm for detection of diabetic retinopathy in retinal fundus photographs. JAMA (2016) 316(22):2402–10. doi: 10.1001/jama.2016.17216
72. Pusiol G, Esteva A, Hall SS, Frank M, Milstein A, Fei-Fei L. Vision-based classification of developmental disorders using eye-movements. International Conference on Medical Image Computing and Computer-Assisted Intervention; Athens, Greece: Springer, Cham (2016) pp. 317–325. doi: 10.1007/978-3-319-46723-8_37
73. Yu KH, Zhang C, Berry GJ, Altman RB, Ré C, Rubin DL, et al. Predicting non-small cell lung cancer prognosis by fully automated microscopic pathology image features. Nat Commun (2016) 7:12474. doi: 10.1038/ncomms12474
74. Bai G, Jiang JX, Flasher R. Hospital risk of data breaches. JAMA Int Med (2017) 177(6):878–80. doi: 10.1001/jamainternmed.2017.0336
75. Maheu MM, Drude KP, Hertlein KM, Lipschutz R, Wall K, Hilty DM. An interprofessional framework for telebehavioral health competencies. J Technol Behav Sci (2017) 2(3–4):190–210. doi: 10.1007/s41347-017-0038-y
76. Zalpuri I, Liu HY, Stubbe D, Wrzosek M, Sadhu J, Hilty D. Social media and networking competencies for psychiatric education: skills, teaching methods, and implications. Acad Psychiatry (2018) 42(6):808–17. doi: 10.1007/s40596-018-0983-6
77. Hilty DM, Chan S, Torous J, Luo J, Boland RJ. A telehealth framework for mobile health, smartphones, and apps: competencies, training, and faculty development. J Technol Behav Sci (2019) 1–18. doi: 10.1007/s41347-019-00091-0
78. Bickmore TW, Trinh H, Olafsson S, O'Leary TK, Asadi R, Rickles NM, et al. Patient and consumer safety risks when using conversational assistants for medical information: an observational study of Siri, Alexa, and Google Assistant. J Med Int Res (2018b) 20(9):e11510. doi: 10.2196/11510
79. Caliskan A, Bryson JJ, Narayanan A. Semantics derived automatically from language corpora contain human-like biases. Science (2017) 356(6334):183–6. doi: 10.1126/science.aal4230
80. De Choudhury M, Sharma SS, Logar T, Eekhout W, Nielsen RC. Gender and cross-cultural differences in social media disclosures of mental illness. Proceedings of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing; Portland, OR (2017) pp. 353–369. doi: 10.1145/2998181.2998220
81. Martinez-Martin N, Kreitmair K. Ethical issues for direct-to-consumer digital psychotherapy apps: addressing accountability, data protection, and consent. JMIR Ment Health (2018) 5(2):e32. doi: 10.2196/mental.9423

Conflict of Interest: The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Copyright © 2019 Miner, Shah, Bullock, Arnow, Bailenson and Hancock. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
