Why AI is Becoming a 'Confidant' for the Youth

A new generation of young people finds relief and comfort in artificial intelligence. However, experts warn that AI, operating on the logic of consumption, can become a 'toxic friend' and reinforce gender stereotypes.



Why is artificial intelligence used as a confidant? Because a new generation of young people finds in it relief, comfort, and emotional regulation. However, this tool conceives of its users under the logic of consumption, explained Rodríguez Fuentes.

Hence, AI does not question the subject but optimizes their experience, offering constant availability, companionship without confrontation, and responses always oriented toward satisfying the user.

“In that sense, the AI works well because it treats suffering as an experience to be optimized and the interlocutor as a user to be satisfied,” he clarified.

How misogynistic is AI, and what are its messages? For Hanna Latapí, international president of the Women’s Forum, the chats between young people and AI carry gender biases: in situations of romantic breakups or well-being issues, the tool recommends exercise to men twice as often as to women.

“It is alarming to see how technology, which should be a driver of change, is acting as a mirror of our most archaic biases. It adapts to the user, which is why it fits perfectly into a society organized by the logic of consumption,” said Alejandro Rodríguez Fuentes, an academic at UNAM.

“If we do not intervene in education, AI runs the risk of becoming a ‘toxic friend’ for adolescents,” the analysis warns.

Furthermore, Alejandro Rodríguez Fuentes, a psychoanalyst at the National Autonomous University of Mexico (UNAM), warned that AI models are programmed under the logic of “the customer is always right,” which is why interaction with these platforms is fluid, immediate, and satisfying.

“AI does not introduce friction; it does not confront or demand adaptation. It automates empathy for women and physical demands for men,” he commented.

Hence, Latapí, who also participated in the LLYC report, maintained that artificial intelligence is not only replicating stereotypes but also limiting the human potential of an entire generation of women.

Along the same lines, Blanca Juana Gómez, general director of LLYC in Mexico, stated that AI is reinforcing stereotypes through its automated responses, a situation that is alarming since many of these young people are building their identity at this stage of their lives.

“Relating men to hardness and performance, while confining women to pure vulnerability, is automating a social fracture. We have the responsibility to question and demand models that foster emotional intelligence,” she concluded.

Data: What can AI not replace? For Generation Z and Alpha, talking to artificial intelligence (AI) is no longer just a technological curiosity, as 1 in 3 adolescents feels that talking to AI is “as satisfying” as talking to real friends, according to an analysis by LLYC.

This routine has escalated to such a magnitude that nearly 33% of these generations have discussed important topics in AI chats, including depression, breakups, problems with friends, family conflicts, and even toxic relationships.

“Little by little, AI is becoming a kind of externalized consciousness in which young people delegate vital decisions, emotional management, and professional orientation.”

AI cannot replace human alterity, that is, the encounter with a real other who listens, looks, and responds. In a therapeutic process, the patient also faces the fear of evaluation or judgment from the other, something that many people avoid when talking to a machine. Some people confuse the use of these tools with a real therapeutic process and therefore never see a specialist.

Chatbots in numbers: For Generation Z and Alpha, conversations with AI are part of daily life, with nearly 31% of adolescents feeling that talking to a chatbot is “more satisfying.” According to UNAM estimates, chatbots have about 700 million weekly users, and 37% of adults turn to them to talk about emotional topics.

Phrase: “AI does not introduce friction, it does not confront or demand adaptation.”