Bipin Indurkhya on the Ethics of Using AI in Psychotherapy

On Tuesday, January 18, 2022, AUP’s 60th-anniversary Presidential Lecture Series, titled Technology and the Human Future, hosted its second event: Bipin Indurkhya, Professor of Cognitive Science at the Jagiellonian University in Krakow, Poland, spoke on the topic of “Faking Emotions and a Therapeutic Role for Robots: Ethics of Using AI in Psychotherapy.” The Presidential Lecture Series, organized by the office of AUP President Celeste M. Schenck, invites speakers to participate in live online events, so that they might engage with both theory and practice in responding to the question of how technology will continue to affect our lives beyond the Covid-19 pandemic.

President Schenck opened the lecture by commenting on the possible unintended consequences of rapid technological advancement, highlighting how universities could play a role in policy discussions relating to disruptive technologies. She argued that AUP’s new MSc in Human Rights and Data Science, which emphasizes the ways in which technology can be used to advance human rights, was a key example of this process in action. 

Schenck then introduced Professor Indurkhya, who began his presentation by discussing AI programs that can fake emotion. The Eliza program, one of the world’s first chatbots, could do this as far back as the 1960s: it responded to keywords in user input by mimicking emotional responses, and people immediately anthropomorphized it, treating the chatbot as if it were a real person. Eliza would later be integrated into robotic systems such as Sony’s XDR, which mimicked both human emotion and human movement. Contemporary examples include Boston Dynamics’ Atlas, which excels at humanlike movement but does not replicate humanlike emotion, and the Ameca robot, which debuted at CES 2022 and can mimic high-level human emotion. The possible applications of Eliza-based robots and chatbots include job automation and augmentation, though Indurkhya focused his lecture mainly on the therapeutic context.

One of the first applications of robots as therapeutic tools was Paro, an animatronic seal adopted by nursing homes, first in Japan and later around the world. These robotic seals can play and interact with users; they have proved effective in combating loneliness in older people and can positively impact people with dementia. Chatbots have also been created to help people who have mental illnesses or who have experienced trauma. Replika, for example, can mimic a deceased person, and Woebot has been found to sometimes outperform human therapists, as users feel more comfortable talking to a robot: the conversation is anonymous, user-centric and constantly available. The novelty of the system was also considered a factor.

Indurkhya went on to explain that the use of these robots raises multiple questions and ethical issues: Who can access the data? Will the code be public? Who is responsible in the case of physical or mental harm? There are also broader societal considerations: What are the consequences of reduced human contact? What effect does the AI’s avoidance of political or moral topics have on the user? In the final section of the talk, “AI and Deception,” Indurkhya asked whether AI should be allowed to lie. Humans often lie to provide emotional support (a phenomenon called social lying). Even if an AI should not be allowed to lie, should it be allowed to deceive in other ways, for example by changing the subject? Indurkhya argued that, given the speed with which AI is developing, the answers to such questions need to be agreed on soon. After all, he noted, AI holds the potential to learn to lie all on its own.

The next event in the Presidential Lecture Series will take place on January 25, 2022, when AUP's own Professor Georgi Stojanov will speak on the topic: "Your Personal Diary is No Longer Private and You Are Not Even the Primary Author." You can register for the event online.

Significant contributions to this news piece were made by Jackson Vann, a graduate student in AUP’s MSc in Human Rights and Data Science.