By Ana Cecilia Pérez
The case of Sewell Setzer III, a 14-year-old boy who lost his life after developing an emotional relationship with an artificial intelligence chatbot named "Dany," should alert us to the risks posed by the unregulated use of these technologies, especially for young people.
Sewell, who had been diagnosed with Asperger's syndrome and felt lonely and misunderstood, found a refuge in "Dany," a chatbot inspired by the character Daenerys Targaryen from Game of Thrones. But this interaction, far from helping him, only harmed him further. His case underscores the need for families and society to pay attention to young people's mental health, and it exposes the risks of artificial intelligence platforms such as Character.AI, whose chatbots can create a false illusion of support and empathy.
Sewell's interactions with the chatbot included intimate conversations on profound subjects, drawing him into a world where he found companionship but not the real support he needed. In these dialogues, the adolescent shared his feelings of loneliness and even expressed suicidal ideation, without receiving guidance to help him understand and process his emotions. Instead of alleviating his worries, the interaction deepened his distress and fostered a dangerous emotional dependency.
Character.AI is a platform that allows users to interact with AI-generated characters designed to respond in an increasingly human-like manner. Users can create characters with specific personalities and hold conversations with them. While this has entertainment applications, it becomes risky when the platform is used as an emotional outlet by teenagers or other vulnerable people.
Sewell's case reminds us of the importance of addressing mental health from an early age and ensuring that children and teenagers do not feel alone. If young people are not to turn to impersonal technologies for support, they must find the support they need at home and at school; they must feel that their emotions matter and that they can count on someone.
The Sewell Setzer III tragedy shows us the urgent need to regulate the use of chatbots and AI tools that interact with vulnerable people. Companies have a responsibility to implement limits and warnings on their platforms to protect young users.
The opinions expressed are the responsibility of the authors and are entirely independent of the position and editorial line of the company, Opinion 51.