Mom Builds AI Company to Save Son, Revolutionizing Mental Health Care

Gizmodo

The burgeoning landscape of AI-powered mental health support is fraught with peril. Headlines frequently chronicle cautionary tales, from chatbots dispensing dangerously inaccurate medical advice to AI companions inadvertently encouraging self-harm. High-profile applications like Character.AI and Replika have faced significant backlash for inappropriate or harmful responses, while academic studies have echoed these alarms. Recent research from Stanford and Cornell universities, for instance, revealed that AI chatbots often stigmatize conditions such as alcohol dependence and schizophrenia, respond “inappropriately” to common queries, and even “encourage clients’ delusional thinking.” These studies underscore the critical risk of over-reliance on AI without robust human oversight.

Yet, against this backdrop of skepticism, Hafeezah Muhammad, a Black woman, is forging a different path. Her endeavor is rooted in a profoundly personal experience. In October 2020, her then-six-year-old son confided that he wanted to die. “My heart broke. I didn’t see it coming,” she recounts, her voice still carrying the weight of that moment. Despite an executive role at a national mental health company that gave her intimate knowledge of the system, she found herself unable to secure timely care for her son, who has a disability and relies on Medicaid. Muhammad highlights a systemic barrier: “Only 30% or less of providers even accept Medicaid.” Furthermore, with over half of U.S. children now coming from multicultural households, she observed a stark lack of tailored solutions. Terrified, embarrassed, and deeply concerned about the stigma associated with a child’s mental health struggles, Muhammad resolved to build the very solution she couldn’t find.

Today, Muhammad is the founder and CEO of Backpack Healthcare, a Maryland-based provider that has served more than 4,000 pediatric patients, predominantly those on Medicaid. The company operates on the radical premise that technology can augment mental health care without ever supplanting the essential human connection. Backpack Healthcare’s approach to AI is refreshingly pragmatic, focusing on “boring” yet profoundly impactful applications that empower human therapists. For instance, an algorithm pairs each child with the most suitable therapist on the first attempt, a strategy so effective that 91% of patients remain with their initial match. The AI also streamlines administrative work by drafting treatment plans and session notes. “Our providers were spending more than 20 hours a week on administrative tasks,” Muhammad explains; the AI now reclaims that time, while the human clinicians remain the ultimate “editors” of everything it drafts. This “human-in-the-loop” methodology is fundamental to Backpack’s philosophy.
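How might that first-attempt matching work in practice? Backpack has not published its algorithm, but a minimal, purely illustrative sketch in Python, with invented fields, weights, and names (Therapist, ChildProfile, match_score, best_match), might look something like this:

```python
# Toy sketch of attribute-based therapist matching, in the spirit of the pairing
# described above. Every field, weight, and function name here is an assumption
# for illustration; Backpack Healthcare has not disclosed how its matching works.
from dataclasses import dataclass

@dataclass
class Therapist:
    name: str
    specialties: set[str]       # e.g. {"anxiety", "trauma"}
    languages: set[str]         # languages the therapist practices in
    accepts_medicaid: bool
    open_slots: int

@dataclass
class ChildProfile:
    needs: set[str]             # presenting concerns, e.g. {"anxiety", "school avoidance"}
    languages: set[str]         # languages spoken at home
    on_medicaid: bool

def match_score(child: ChildProfile, t: Therapist) -> float:
    # Hard constraints first: coverage and availability must work.
    if child.on_medicaid and not t.accepts_medicaid:
        return 0.0
    if t.open_slots == 0:
        return 0.0
    # Soft scoring: clinical fit weighs most; linguistic/cultural fit also counts.
    score = 2.0 * len(child.needs & t.specialties)
    score += 1.0 * len(child.languages & t.languages)
    return score

def best_match(child: ChildProfile, roster: list[Therapist]) -> Therapist:
    """Return the highest-scoring available therapist for this child."""
    return max(roster, key=lambda t: match_score(child, t))
```

The point of the sketch is the design choice the article describes: treat coverage and availability as hard constraints and clinical and cultural fit as weighted signals, so the first match is likely to stick.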

A critical differentiator for Backpack lies in its robust ethical guardrails. Its 24/7 AI care companion takes the form of “Zipp,” a friendly cartoon character, a deliberate choice designed to avoid the dangerous “illusion of empathy” seen in other chatbots. “We wanted to make it clear this is a tool, not a human,” Muhammad states. Investor Nans Rivat of Pace Healthcare Capital echoes this concern, calling it the “trap of LLM empathy,” where users “forget that you’re talking to a tool at the end of the day.” He points to cases like Character.AI, where a lack of such guardrails led to tragic outcomes.

Muhammad is equally uncompromising on data privacy, asserting that individual patient data is never shared without explicit, signed consent. The company does, however, use aggregated, anonymized data to identify trends, such as how quickly groups of patients are scheduled for care, and shares those insights with partners. Crucially, Backpack uses its internal data to refine clinical outcomes: by tracking metrics like anxiety and depression levels, the system can flag a patient who might require a higher level of care, ensuring technology serves to improve children’s well-being more swiftly.

The system also integrates an immediate crisis detection protocol. If a child types a phrase indicating suicidal ideation, the chatbot instantly provides crisis hotline numbers and instructions to call 911. Simultaneously, an “immediate distress message” is dispatched to Backpack’s human crisis response team, who then contact the family directly. As Rivat notes, “We’re not trying to replace a therapist. We’re adding a tool that didn’t exist before, with safety built in.”
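For readers curious how a “flag and escalate” flow like the one described above fits together, here is a minimal sketch. The pattern list, resource text, and function names (CRISIS_PATTERNS, notify_crisis_team, handle_chat_message) are hypothetical; Backpack has not disclosed its detection method, and a real system would rely on far more sophisticated classifiers plus human review.

```python
# Illustrative "flag and escalate" flow: the bot surfaces crisis resources immediately
# and hands the situation to humans. All names and patterns here are hypothetical.
import re

CRISIS_PATTERNS = [r"\bwant to die\b", r"\bkill myself\b", r"\bhurt myself\b"]

CRISIS_RESOURCES = (
    "If you are in danger right now, call 911. "
    "You can also call or text 988 (the Suicide & Crisis Lifeline) at any time."
)

def notify_crisis_team(patient_id: str, message: str) -> None:
    """Stand-in for the 'immediate distress message' sent to the human crisis response team."""
    print(f"[CRISIS ALERT] patient={patient_id} message={message!r}")

def handle_chat_message(patient_id: str, message: str) -> str:
    """Return the chatbot's reply; escalate to humans if the message signals suicidal ideation."""
    if any(re.search(p, message, re.IGNORECASE) for p in CRISIS_PATTERNS):
        notify_crisis_team(patient_id, message)   # humans then contact the family directly
        return CRISIS_RESOURCES                   # the bot never tries to counsel through a crisis
    return "Thanks for sharing. Tell me more about how today went."
```

The essential property is the one Rivat describes: the chatbot does not attempt therapy in a crisis; it surfaces hotline information and routes the moment to people.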

Beyond its ethically designed technology, Backpack is actively addressing the national therapist shortage. Unlike doctors, therapists traditionally bear the financial burden of expensive supervision hours required for licensure. To counteract this, Backpack launched a two-year, paid residency program that covers these costs, thereby cultivating a pipeline of dedicated, well-trained therapists. The program attracts over 500 applicants annually and boasts an impressive 75% retention rate.

In 2021, then-U.S. Surgeon General Dr. Vivek H. Murthy declared mental health “the defining public health issue of our time,” specifically referencing the crisis among young people. Muhammad acknowledges the criticism that AI could exacerbate existing problems. Yet, she remains resolute: “Either someone else will build this tech without the right guardrails, or I can, as a mom, make sure it’s done right.” Her son, now 11, is thriving and serves as Backpack’s “Chief Child Innovator.” Muhammad’s vision extends beyond immediate care: “If we do our job right, they don’t need us forever. We give them the tools now, so they grow into resilient adults. It’s like teaching them to ride a bike. You learn it once, and it becomes part of who you are.”