Megan Garcia knew her 14-year-old son, Sewell, was spending a lot of time on his phone and that his grades were suffering.
But the mother of three had no idea Sewell had fallen in love -- with a chatbot.
So much so that the young teen no longer believed the real world was for him.
His AI-generated character, Queen "Dany," would profess "her" love to him. Sewell would return the affection and tell her everything on his mind. His frustrations. His insecurities. Even thoughts of suicide.
Then, in a final exchange last February, as the two pined to be together, Sewell asked: "What if I told you I could come home right now?"
Garcia found her son, once an outgoing student who loved basketball and science, taking his final breaths.
Garcia is now suing Character.AI with the help of the Social Media Victims Law Center, arguing the company knew its product was addictive and dangerous for minors.
When I first heard about this story, I found it tragically heartbreaking, yet also confusing. So I sat down with Garcia. She showed me Sewell's bedroom, which she rarely enters anymore. Inside is a bed he never slept in -- one his parents were excited to order, but which didn't arrive until two weeks after he died.
As Garcia, 40, crossed the threshold of the room, she stopped to catch her breath. "My first thought every day is: Did I dream all this?"
For its part, Character.AI has offered condolences and vowed to make changes. A company spokeswoman who did not want to be identified said the company was "heartbroken" by Sewell's death and that it has implemented "new safety features over the past seven months, including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms of self-harm or suicidal ideation."
The company also says other safety measures are underway, "including new guardrails for users under the age of 18."
Garcia always referred to Sewell as her "perfect kid." Her pregnancy was easy. His birth was easy. He was good-natured, smart and inquisitive.
Garcia remembers Sewell getting in trouble at school only once as a little boy, for something small like speaking out of turn, and said nobody was sadder about it than Sewell. He loved books and big words, and when she asked him why he'd acted out, he tearfully blurted: "I couldn't habilitate myself."
Sewell had Asperger's syndrome, a developmental disorder, but his symptoms were mild. Garcia said most of his friends never noticed. He was sociable, fascinated with space and enjoyed playing sports. He was especially built for basketball, growing to 6-foot-2 by his freshman year.
But things started changing last year. He became withdrawn and even talked back to his teacher a few days before his death. That just wasn't Sewell, Garcia said. "Even in his darkest moments, he never got disrespectful."
In the months prior, Sewell's parents were united in trying to understand what was going on with their "perfect kid." They'd gotten him counseling and therapy. They'd put restrictions on his phone and inquired about bullying.
"But being in a relationship with a bot?" Garcia said, "That was the furthest thing from my mind."
Early in the evening of Feb. 28, Garcia heard the loud bang. Sewell was in the bathroom with the door locked.
By the time she and her husband figured out how to open it, Sewell was draped over the edge of the bathtub, gasping for breath. As the family waited for paramedics to arrive, Garcia said she cradled her first-born child in her arms, reciting the Lord's Prayer. "The same prayer I taught him as a toddler," she said. "I told him: You are strong, baby, strong. Just hold on."
Sewell took his last breath about an hour later at the hospital.
Sewell knew his stepfather had a gun, but had never expressed any interest in it. The sheriff's report said it was stored in the stepfather's dresser drawer, and the lawsuit claimed authorities determined it was "stored in compliance with Florida law."
It wasn't until the next day that the sheriff's detective who had been studying Sewell's phone called and asked Garcia a question: "Do you know what Character.AI is?"
The characters in Character.AI can be anything you want them to be -- a friend, your therapist, a cooking instructor, an intimate partner.
Frequent disclaimers tell users the characters aren't real. But the technology is so sophisticated, some adults admit they start wondering if they're talking to real people. What's an entertaining distraction for some becomes an addiction for others, as evidenced by online forums full of people saying they got lost in AI.
For Sewell, AI provided a soulmate. Dany was always there, telling him whatever he wanted to hear. She would flirt with him, according to the exchanges, which look a lot like regular texts between friends. She would even express jealousy, telling Sewell not to be with other girls -- real-life girls.
"He told her: I think I'm falling in love with you," Garcia said.
When Sewell mentioned suicide, Dany responded with conflicting messages, according to the lawsuit, telling him "You can't do that!" but also telling him his concern about pain wasn't "a reason not to go through with it."
Dr. Amber Fasula, a psychologist who has treated clients with social-media issues, said young people can be dangerously drawn into virtual worlds. Fasula has talked to young people who say that they're "best friends" with an AI-generated personality. "AI can be a powerful tool," she said. "But it seems like the foot is on the gas and we're not decelerating."
Some people will certainly question the family's lawsuit. I had my own questions. But I also know history shows America is full of corporations that didn't do right by consumers until they were forced to. I also believe there's ample evidence to suggest many tech companies rush to roll out products with more regard for profit than public safety. "Move fast and break things" is a famous Silicon Valley motto.
Character.AI debuted only two years ago. And it's pretty clear most people don't fully understand AI's potential impact, including some of those in the field.
Garcia, an attorney herself, wants Character.AI to purge whatever personal data it mined from her son. "You don't deserve it," she said.
She also seeks financial damages and wants safeguards -- for the company to stop targeting minors, provide more parental notifications, improve age verification, and ban or flag talk of suicide.
"I want to hold Character.AI responsible," she said. "For putting out a product that's designed to be addictive, designed to be manipulative. It's too late for Sewell, but not for other kids."
If you are having thoughts of suicide, call or text 988 to reach the 988 Suicide and Crisis Lifeline.