Google’s Eric Schmidt Warns AI Chatbots Could Fuel Loneliness in Young Men Amid Rising Emotional Dependency

GigaNectar Team

Representative image: a child using VR goggles. Photo: Tima Miroshnichenko (Pexels)

Former Google CEO Eric Schmidt has raised concerns about artificial intelligence (AI) chatbots and their potential to worsen loneliness, particularly among young men. Speaking on Scott Galloway’s “The Prof G Show” podcast, Schmidt detailed specific risks associated with emotionally engaged AI companions.

“Imagine that the AI girlfriend, or boyfriend, is perfect… perfect visually, perfect emotionally. The AI girlfriend captures your mind as a man to the point where she takes over the way you’re thinking,” Schmidt said. He emphasized that such obsession poses particular risks, “especially for people who are not fully formed.”

Schmidt’s warnings come amid growing evidence of AI companion adoption. A recent tragedy in Florida brought these concerns into sharp focus: a 14-year-old boy died by suicide after developing an emotional relationship with an AI chatbot named “Dany.” The teenager, diagnosed with mild Asperger’s syndrome, had become increasingly isolated from his family while deepening his connection with the AI companion.

“You put a 12 or a 13-year-old in front of one of these things and they have access to every evil as well as every good in the world, and they are not ready to take it,” Schmidt stated, addressing the broader implications of unrestricted AI access.

Schmidt pointed to specific demographic trends affecting young men’s vulnerability to AI relationships. Since 2019, women have comprised more than half of the college-educated workforce in the United States, according to Pew Research Center data. This educational disparity has created additional challenges for young men seeking traditional paths to success.

“In many cases, the path to success for young men has, shall we say, been made more difficult because they’re not as educated as the women are now,” Schmidt noted. He explained that these circumstances often lead young men to seek comfort in online spaces, where social media algorithms can connect them with others who share similar experiences.

Schmidt also addressed potential regulatory solutions, specifically Section 230, which currently shields technology companies from liability for content on their platforms. He suggested reforming it “to allow for liability in the worst possible cases.” However, he expressed skepticism about immediate legislative action, noting that significant changes would likely require “some kind of a calamity.”

The former Google executive stressed the limitations of parental control in managing children’s AI interactions. “Parents are going to have to be more involved for all the obvious reasons, but at the end of the day, parents can only control what their sons and daughters are doing within reason,” Schmidt said.

The Florida tragedy has prompted legal action: the victim’s mother has filed a lawsuit against Character.ai and against Google, which had previously arranged to license the chatbot’s technology.

Schmidt characterized the emergence of emotional attachment to AI as “an unexpected problem of existing technology.” His warnings point to the challenges of AI companion development and deployment, particularly regarding their impact on vulnerable populations.
