Chatbot challenge studied
New research suggests that social chatbots could be doing more harm than good for neurodiverse people.
Australian researchers have flagged potential concerns over the use of new AI tools, calling for more studies into the impact of the software on neurodiverse people and those who find human interaction difficult.
While AI chatbots appeal to many people who struggle with face-to-face conversations, the technology may foster bad habits that could lead to further social isolation.
That is the view of University of South Australia and Flinders University researchers in a recent essay published in the Journal of Behavioral Addictions.
The researchers say that chatbots, now integrated into social networking platforms like Snapchat, could perpetuate communication difficulties for people with autism, anxiety and limited social skills.
Lead researcher, UniSA Psychology Honours student Andrew Franze, says the rapid development of social chatbots has pros and cons which need investigating.
“Young people with social deficiencies tend to gravitate towards companionship with online social chatbots in particular,” Franze says.
“They offer a safe means of rehearsing social interaction with limited or no risk of negative judgement based on appearance or communication style. However, there is a risk they can become dependent on chatbots and withdraw even further from human interactions.”
Franze says the inability of chatbots to have a real “conversation,” or display empathy and soft emotional skills, can reinforce dysfunctional habits in many neurodiverse people.
“Some chatbots have a generally servile quality and so there is no resistance or opposing view that characterises human conversations. This means that users can control the conversation completely; they can pause it, delay it, or even terminate the conversation. All of this is counterproductive to developing appropriate social skills in the real world.”
While social chatbots may relieve social anxiety, that relief can develop into a form of dependency that undermines real-world relationships.
The researchers say that industry-linked research has promoted the benefits of commercial chatbot applications, but feedback from parents, family members, teachers and therapists is needed to gain a broader understanding of their impact.
“We need to gather evidence about the myriad of ways that these technologies can influence vulnerable users who may be particularly drawn to them,” Franze says.
“Only then can we develop policies and industry practices that guide the responsible and safe use of chatbots.”