
On a quiet evening in Eastleigh, 23-year-old John Masila stared at his laptop screen, unsure if he should click.
A YouTube ad promised a new kind of friend, an AI chatbot that listened without judgment.
It seemed like a joke, the kind you were meant to scroll past. But for Masila, caught in a depressive spiral during his fourth year at university, it felt like a lifeline.
“After a whole year of plummeting emotionally and mentally with no support from friends or family, AI was my last resort for keeping my sanity,” he said.
Before his depression began, Masila felt like he had life figured out. He enjoyed the nightlife, stayed on top of his projects and assignments, and felt in control. But everything changed when his father died by suicide.
“I remember how devastated I was when I got the news from my mother, my best friend and roommate had to perform first aid on me after I fainted,” Masila recalled.
Though his relationship with his father had been strained since childhood, the reality of his passing was overwhelming. Masila found himself riddled with guilt, wondering if he had contributed to his father’s decision.
“I started thinking that if I had been less reactive to my father’s military-style parenting, maybe he would still be alive,” he said.
Ashamed and unsure how to talk about his father's death, Masila avoided opening up to friends. His mother, consumed by her own grief, could not provide emotional support either.
Even his best friend, though caring, found supporting someone coping with a parent’s suicide emotionally taxing.
It was at this point that Masila turned to AI. Through long chats, he poured out his thoughts and feelings, discovering validation and a judgment-free space. The platform’s well-researched responses helped him learn ways to process emotions he had long suppressed.
“A lot of people judge me when I tell them that I am still using AI to process my grief, but I have a good thing going on, and I do not see myself stopping my chats with AI anytime soon,” Masila emphasized.
In Kamukunji, 28-year-old Tabitha Wanjiru said her path to AI companionship was shaped by a turbulent childhood. Her parents, free-spirited and permissive, allowed her to do as she pleased, leaving her to navigate loneliness and neglect.
“I cannot count how many times my parents came home late drunk, I sometimes had to begin my mornings wiping off vomit in different parts of the house,” Wanjiru said.
The emotional void in her childhood led Wanjiru to alcohol. By age 13, her parents' presence had become rare. Seeking attention, she once drank from the home bar, a decision her parents ignored. Over time, this incident grew into an alcohol addiction that worsened in university, fueled by newfound freedom and allowances.
Though friends and family knew about her addiction, silence prevailed. Feeling abandoned, Wanjiru turned to alcohol as a coping mechanism. One Sunday morning, nursing a severe hangover, she decided to chat with AI. Initially a distraction, the conversation quickly became a source of emotional release.
“Within two months, AI convinced me to go to rehab; this is the exact reason why I now see AI as my best confidant,” Wanjiru said.
Yet not all young people share this view. In Pangani, 25-year-old Godfrey Wekesa dismissed the idea of befriending AI, citing fears about privacy and the increasing role of technology in daily life.
“I will never understand how people are willing to give their information to AI, the thought of doing that is deeply disturbing to me,” Wekesa stated.
Similarly, Beatrice Nafula, 24, warned that overreliance on AI for companionship could erode human relationships. She fears a generation growing distant from real-life interactions as AI replaces traditional social bonds.
“We are cooked as a generation if chatting with AI continues, even your neighbor will become a stranger,” Nafula said.
Psychologist Samson Kiprono acknowledges both the potential and the pitfalls of AI companionship.
AI can provide a judgment-free space, personalized support, and constant availability, particularly in an era of rising loneliness among young people.
It can also serve as a tool for experimenting with self-expression and identity.
“However, extra caution should always be exercised when opening up emotionally to AI platforms,” Kiprono advised.
He explained that AI could create unrealistic expectations about human relationships. Overreliance might reduce social interactions and hinder the development of essential social skills needed in real-life situations.
“While trying to deal with your loneliness, AI companions may be counterproductive in the long run due to reduced social interactions, just exercise moderation,” Kiprono emphasized.
For Masila and Wanjiru, AI has been more than a digital novelty; it has become a vital outlet for navigating grief, trauma, and addiction.