Conversational Agents/AI Chatbots
About
Artificial intelligence (AI) is significantly transforming healthcare delivery, particularly through the emergence of AI-driven conversational agents or chatbots designed for empathetic interactions with individuals facing mental health concerns. These chatbots offer a promising avenue for convenient and accessible behavioral healthcare support. However, the rapid adoption of AI in mental healthcare raises critical concerns about responsible implementation, necessitating careful consideration from various stakeholders, including designers, manufacturers, policymakers, and users, such as patients and healthcare providers.
The current landscape lacks standardized federal or clinical regulations governing the ethical use of AI-driven mental health tools, creating potential risks for all users, particularly vulnerable populations with traumatic experiences. While chatbots can provide immediate access to support, they are limited in their ability to deliver the highly sensitive, contextually appropriate responses this demographic requires. Instances in 2018, when popular mental health chatbots such as Wysa and Woebot failed to identify contexts of child sexual abuse, underscore the urgency of addressing these gaps. Moreover, in June 2023, the National Eating Disorders Association's chatbot, Tessa, was found to encourage harmful behaviors, emphasizing the profound risks posed by chatbots that are not trauma-informed.
Motivated by this gap in technology design, we seek to understand how well these chatbots align with the trauma-informed care framework and how its in-person principles translate to virtual interactions that ensure user safety, support, trust, empowerment, collaboration, and cultural sensitivity. Leveraging insights from healthcare providers and users, our goal is to develop guidelines outlining strategies and design considerations for patient-centered chatbots. These guidelines may lay the groundwork for accountability in chatbot interactions in the absence of formal regulation, potentially paving the way for future policy development in healthcare.
Partners
Funding
- United States National Science Foundation (NSF) Computer and Information Science and Engineering (CISE) / Division of Information and Intelligent Systems (IIS): 2348691 [link]
Publications and presentations
Press