

In classrooms and dorms across the United States, immigrant teens are quietly carrying two languages, and two worlds, inside them. They translate for their parents, for their teachers, and sometimes for themselves. But when it comes to emotions, words don’t always make the journey. Anxiety, loneliness, and exhaustion can feel untranslatable. For many, silence becomes a survival strategy.
This silence reflects a deeper gap in the country’s mental health landscape. Immigrant teens face significant mental health challenges, but they often do not seek the help they need. Language is one barrier. Culture is another. In many families, emotional struggle is dismissed as weakness or something to “sleep through.” Mental health care can feel foreign, inaccessible, or even shameful. For young people navigating this space between cultures, support is often lost in translation.
Across the nation, schools and community organizations are working to fill this gap. Yet even the best-intentioned programs often assume English fluency and Western attitudes toward therapy. Digital mental health tools like apps, chatbots, and text lines have made care more accessible, but they rarely speak in the emotional language of immigrant families. A literal translation of words cannot convey tone, slang, or the cultural subtext that gives language its heart. When help doesn’t sound like home, many teens choose silence instead.
A student’s insight sparks a collaborative solution
That was the challenge brought forward by a high-school student named Kunqi Wang, founder of the nonprofit MindBridge, who had lived this experience firsthand. Kunqi immigrated from Shanghai to Seattle as an infant. Although he personally felt safe and supported at home, he saw a very different reality among many of his immigrant friends. In families where academic expectations ran high, where scores of 95 and above were considered the baseline, conversations about stress or insecurity were rarely welcomed.
Kunqi would later recall how one of his close friends, the child of Chinese American parents, tried to open up about feeling overwhelmed in school. His parents responded by reminding him of the exacting standards they had faced growing up in China, insisting that his struggles were minor in comparison. Their words were meant to toughen him, but they had the opposite effect. His friend, in tears, vowed that he would never raise the topic again with his parents.
For teens from less affluent immigrant families, the situation was even harder. Wealthier classmates could turn to private support, but many immigrant families simply didn’t have access to those resources. Financial constraints, cultural stigma, and language barriers combined to leave too many young people without an outlet.
As Kunqi watched these moments accumulate around him, he began to see a pattern: the silence usually stemmed from a lack of understanding. From that insight emerged a question that would guide his work: what if technology could listen with cultural sensitivity, in the language teens trust most?
This question became the starting point for LUCIA, a multilingual, culturally adaptive AI companion for youth mental wellness. Developed through a collaboration between MindBridge and Centific, LUCIA was designed to do something deceptively simple: help teens find their voice in their own language, and in their own cultural context.
Centific provided the AI engineering foundation for LUCIA, integrating localization-aware large language models into a safety-first conversational framework. LUCIA combines adaptive prompting with cultural style guidance to deliver emotionally attuned responses across languages. It uses domain classification to filter out unsafe or off-topic queries, and a multilingual crisis-detection module to flag distress signals and connect users with verified hotlines.
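To make that flow concrete, here is a minimal sketch of how such a safety-first pipeline might be ordered: crisis detection runs first, domain classification gates what the system will engage with, and only then is a culturally styled reply generated. Every name, keyword list, and message below is an illustrative assumption, not Centific’s actual implementation.

```python
# Minimal sketch of a safety-first conversational pipeline, assuming
# hypothetical names throughout; not Centific's actual implementation.
from dataclasses import dataclass


@dataclass
class TurnResult:
    reply: str
    escalated: bool = False


IN_SCOPE_TOPICS = {"stress", "school", "family", "loneliness", "sleep"}


def detect_crisis(message: str, language: str) -> bool:
    """Stand-in for a multilingual crisis-detection model."""
    crisis_markers = {
        "en": ["hurt myself", "no way out"],
        "es": ["hacerme daño", "sin salida"],
    }
    return any(m in message.lower() for m in crisis_markers.get(language, []))


def in_domain(message: str) -> bool:
    """Very rough stand-in for a domain classifier that keeps the
    conversation on wellness-related topics."""
    return any(topic in message.lower() for topic in IN_SCOPE_TOPICS)


def respond(message: str, language: str) -> TurnResult:
    if detect_crisis(message, language):
        # The crisis path takes priority over everything else.
        return TurnResult(
            reply="You deserve support right now. Here is a verified "
                  "hotline in your language: ...",
            escalated=True,
        )
    if not in_domain(message):
        return TurnResult(reply="I can best help with school, family, "
                                "and wellbeing. What's on your mind?")
    # In the real system an LLM generates this turn, guided by adaptive
    # prompts and the cultural style pack for the user's language.
    return TurnResult(reply=f"(culturally attuned reply in {language})")
```

Running crisis detection before any other gate reflects the design priority described above: distress signals route to verified hotlines before the model attempts to converse at all.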
Honorifics, tone shifts, and cultural expressions, like collectivist phrasing in Spanish or indirectness in Korean, are built in using Centific’s language and cultural packs, which are refined in partnership with native speakers and mental health advisors. The platform is web-based, privacy-preserving, and modular by design, allowing schools, nonprofits, and healthcare organizations to localize the experience for the communities they serve.
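A pack like this is essentially structured data rather than code, which is one plausible reason the platform can stay modular: partners can supply new packs without retraining the core model. The sketch below assumes a simple dictionary schema; the field names and the Spanish phrasing are illustrative, not Centific’s actual format.

```python
# Illustrative shape of a language and cultural pack. Field names and
# values are assumptions for this sketch, not the production schema.
spanish_pack = {
    "language": "es",
    "tone": {"directness": "medium", "warmth": "high"},
    "expressions": {
        # Collectivist phrasing: frame wellbeing around family and home.
        "check_in": "¿Cómo están las cosas en casa y con tu familia?",
    },
    "honorifics": {"formal": "usted", "informal": "tú"},
    "reviewed_by": ["native_speaker", "mental_health_advisor"],
}
```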
Designing for safety, not just fluency
From the outset, the Centific and MindBridge teams approached LUCIA with a clear understanding: general-purpose language models are not built for safe, culturally attuned mental health conversations, especially with adolescents. Although LLMs can mimic empathy, they often fall short when faced with nuanced expressions of distress, and can produce responses that are vague, inappropriate, or even harmful.
For immigrant teens, the stakes are even higher. Cultural phrasing, indirect communication, and context-specific descriptions of emotion often go unrecognized by AI chat assistants. That blind spot increases the risk of misunderstanding or false reassurance, which can delay access to real support.
That’s why LUCIA was never designed to rely on general-purpose AI. Instead, LUCIA is built from the ground up with localization-first architecture, multi-layered safety controls, and culturally specific tuning. Its crisis-detection engine is trained to recognize distress signals in multiple languages and dialects. Its responses are bounded by design: it does not attempt clinical interpretation or advice. And its language and cultural packs are co-developed with native speakers and mental health experts to help ensure emotional fidelity, not just translation accuracy.
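One way to read “bounded by design” is as a hard post-generation check: any candidate reply that drifts into diagnosis or clinical advice is replaced with a supportive, non-clinical fallback. The pattern list and function names below are assumptions for illustration, not the actual guardrail.

```python
# Hedged sketch of response bounding: block candidate replies that drift
# into clinical territory. Patterns and names are illustrative assumptions.
FORBIDDEN_FRAGMENTS = (
    "you have depression",
    "your diagnosis",
    "take this medication",
)

NON_CLINICAL_FALLBACK = (
    "I can't give medical advice, but I can listen, and I can point you "
    "to people who are trained to help."
)


def bound_reply(candidate: str) -> str:
    """Return the candidate reply only if it stays non-clinical."""
    lowered = candidate.lower()
    if any(fragment in lowered for fragment in FORBIDDEN_FRAGMENTS):
        return NON_CLINICAL_FALLBACK
    return candidate
```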
Localization as a foundation for empathy
LUCIA’s foundation lies in localization-first design: the belief that empathy requires understanding, not just translation. Its natural language model was trained using a hybrid process that combines machine learning with human cultural insight. Native speakers, mental health advisors, and community partners review and annotate language data to capture subtleties that algorithms often miss: humor, slang, politeness, and the emotional weight carried by certain phrases. Each refinement makes LUCIA sound less like a machine and more like someone who understands where you’re coming from.
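What a human annotation in that hybrid process might capture is sketched below. The record structure, field names, and the Spanish example are hypothetical; the point is that a native speaker can flag emotional weight that a literal sentiment label misses.

```python
# Hypothetical annotation record from the human review step described
# above. Field names are assumptions; the Spanish example is illustrative.
annotation = {
    "text": "estoy bien, no te preocupes",   # "I'm fine, don't worry"
    "language": "es",
    "literal_sentiment": "neutral",
    # Native-speaker judgment: minimizing language often deflects concern.
    "annotated_sentiment": "possible distress",
    "notes": "Deflection phrasing; follow up gently rather than take "
             "it at face value.",
    "annotator_roles": ["native_speaker", "mental_health_advisor"],
}
```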
LUCIA was built to live within communities. MindBridge and Centific intend to work with local schools, nonprofits, and medical students to identify untranslated or culturally specific resources, which are then localized and integrated into the platform. When a person asks about stress, burnout, or depression, LUCIA can respond in context and connect them with appropriate local resources in their native language. This partnership model ensures that AI acts as a bridge between technology and trusted community support.
Ethics at the center of design
Ethics is central to that design. Every dataset undergoes human review to minimize bias or misinterpretation. Multi-layered safety filters detect harmful or distressing content and respond with appropriate next steps, including verified hotlines and professional resources. LUCIA clearly communicates that it is not a therapist; it is a first step toward help, not a replacement for it. That clarity is what earns users’ trust.
In pilot surveys, 94% of immigrant users said LUCIA successfully captured the nuances of their language, and 26% said it felt like speaking to someone raised in their culture. Those numbers matter less as performance metrics than as indicators of something deeper: a sense of belonging. When a young person says, “It feels like talking to someone who understands me,” technology has fulfilled its highest purpose.
A bridge to belonging and support
The potential of this collaboration lies in reach. Because LUCIA is modular, new languages and cultural frameworks can be added quickly through community co-training. That flexibility allows it to grow through partnerships with schools, hospitals, and nonprofits, which already serve multilingual families but lack the tools to engage them fully. The goal is not simply to add users, but to expand trust networks.
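A small sketch of what that modularity could look like in practice: a registry that community partners extend with new co-trained packs, with human review enforced before a pack goes live. All names here are illustrative assumptions.

```python
# Sketch of the modularity described above: a registry that community
# partners extend with new co-trained packs. All names are illustrative.
PACK_REGISTRY: dict[str, dict] = {}

REQUIRED_REVIEWERS = {"native_speaker", "mental_health_advisor"}


def register_pack(pack: dict) -> None:
    """Accept a new language/cultural pack only after human review."""
    if not REQUIRED_REVIEWERS <= set(pack.get("reviewed_by", [])):
        raise ValueError("Pack must be reviewed by native speakers "
                         "and mental health advisors before release.")
    PACK_REGISTRY[pack["language"]] = pack


# For example, a Vietnamese pack contributed by a community partner:
register_pack({
    "language": "vi",
    "reviewed_by": ["native_speaker", "mental_health_advisor"],
})
```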
This approach offers a quiet but important lesson about how AI can evolve to understand human context. As technology advances, the measure of progress should include how well we honor the diversity of lived experience, and how well we teach machines to listen with respect. And the localization community is responding favorably. Kunqi was recently invited to present the LUCIA project at the LocWorld conference. The presentation from the then-15-year-old student received strong interest and encouraging feedback from leaders in the localization community, an early sign that this work is resonating far beyond the classroom.
The collaboration behind LUCIA reminds us that empathy can be designed, but it must also be earned. It takes the perspective of those who have lived through the problem and the discipline of those who build responsibly. It requires both data and conscience, both precision and care.
For Centific, this work affirms a broader belief: that effective localization is the result of a collaboration between people and technology. A leading provider of AI services, Centific has deep experience with multilingual AI, and has been cited as a leader in the field by organizations such as Nimdzi. Centific has long advocated for combining AI with human oversight. When we localize empathy by teaching technology to hear emotion in another language, we move closer to a world where help is never lost in translation. For every teen who finds comfort in a familiar word, or courage in being understood, the silence begins to break. And that is where healing starts.