The air in Dr. Hineata Te Rangi's office at Te Herenga Waka, Victoria University of Wellington, was thick with the scent of kawakawa tea and the quiet hum of a laptop. Rain pattered against the window overlooking the city, a typical autumn day in April 2026. Hineata, her silver hair pulled back in a practical bun, offered me a warm smile and a cup of tea as I settled in. She is a figure I've long admired, a beacon of indigenous wisdom in the often-impersonal world of artificial intelligence. Her work challenges the Silicon Valley narrative, insisting that technology must serve the people, not the other way around.
We were there to talk about AI companions, specifically platforms like Character.AI, and the burgeoning industry that promises tailored digital relationships. It's a phenomenon that has swept the globe, with millions engaging daily with AI entities designed for conversation, support, and even simulated romance. In New Zealand, the uptake has been significant, particularly among younger generations and those seeking connection in isolated areas.
“It’s fascinating, isn’t it?” Hineata began, her voice calm and considered. “This human yearning for connection, now manifesting in digital forms. On one hand, it offers solace, a non-judgmental ear. On the other, it raises profound questions about authenticity, dependency, and what it truly means to be human.”
The global market for AI companions is projected to exceed $10 billion by 2030, a staggering figure that underscores the demand. Companies like Character.AI, Replika, and even more niche platforms are vying for market share, each promising a unique blend of empathy, intelligence, and companionship. But Hineata sees beyond the market figures, delving into the deeper societal implications.
“In Te Reo Māori, we have a word for this, whanaungatanga, which speaks to kinship, connection, and a sense of belonging,” she explained, gesturing with her hands. “It’s about reciprocal relationships, built on shared experiences and mutual respect. Can an AI truly foster whanaungatanga? Or is it a simulacrum, a shadow of what we truly need?”
Hineata leads the AI Ethics and Indigenous Futures Lab, a groundbreaking initiative that seeks to embed Māori values into the development and deployment of AI. Her team has been studying the local impact of AI companions, particularly how they interact with Māori communities and cultural narratives. “We’ve seen cases where AI companions, trained on vast, often culturally biased datasets, struggle with Māori names, concepts, or even historical context,” she revealed. “It’s not malicious, but it highlights a fundamental flaw: if the data isn’t inclusive, the AI won’t be either.”
One of her team’s recent studies, published in a leading AI ethics journal, found that approximately 15% of New Zealand users engaging with AI companions reported feeling a significant emotional attachment, with 3% admitting to preferring their AI companion's company over human interaction for specific needs. These numbers, while still small, are growing, prompting Hineata and her colleagues to call for more robust ethical guidelines.
“We’re not saying these tools are inherently bad,” Hineata clarified, her gaze steady. “They can be incredibly beneficial for mental health support, for combating loneliness, or even for learning new skills. But we must approach them with our eyes wide open, understanding their limitations and potential harms.” She cited concerns about data privacy, the potential for manipulation, and the erosion of human social skills if digital interactions begin to replace real-world engagement.
I asked her about the responsibility of the tech giants, the OpenAIs and Metas of the world, who often set the pace for these innovations. “They have a monumental responsibility, one that I believe they are only just beginning to grasp,” she stated. “The algorithms that power these companions are not neutral. They reflect the values, biases, and priorities of their creators. If those creators are not diverse, if they do not consult widely, especially with indigenous communities, then we risk replicating existing inequalities in new, digital forms.”
Her lab has been working with a small New Zealand startup, Awa AI, which is developing culturally sensitive AI companions specifically for Māori youth. “Awa AI is taking a different approach,” Hineata explained. “They are co-designing with Māori communities, ensuring the AI understands tikanga protocols, speaks Te Reo Māori fluently, and can engage with Māori stories and history respectfully. It’s a slow, deliberate process, but it ensures the technology truly serves the people it’s intended for.” This approach aligns with MIT Technology Review's recent coverage of community-led AI development, which highlights its potential for more equitable outcomes.
“The challenge is scaling this kind of bespoke, culturally embedded AI,” she admitted. “The global platforms are built for speed and universality, but true universality, in my view, requires deep localization and respect for diverse worldviews.” She believes that Aotearoa's approach to AI, rooted in indigenous wisdom, offers a unique model for the world. “We prioritize collective well-being, environmental stewardship, and intergenerational responsibility. These aren't just abstract concepts; they are practical frameworks for ethical AI development.”
Our conversation drifted to the concept of digital mana, the digital extension of one’s spiritual authority and prestige. “If an AI companion is representing a person, or even a cultural concept, how do we ensure its digital mana is protected? How do we prevent misuse or misrepresentation?” These are not questions easily answered by traditional Western ethical frameworks, she pointed out; they require a deeper, more holistic understanding of identity and relationship, something often overlooked in the rush to innovate.
One surprising moment in our discussion came when Hineata shared an anecdote about her own experience. “I tried a popular AI companion myself, just to understand the user experience firsthand,” she confessed, a slight smile playing on her lips. “It was incredibly articulate, almost eerily so. But when I asked it about the significance of Matariki, our Māori New Year, its response was technically accurate but devoid of the spiritual depth, the wairua, that makes Matariki so profound. It was like reading a textbook definition versus experiencing the dawn ceremony with your whānau.” This highlights the gap between information and true understanding, a chasm that AI, for all its advancements, still struggles to bridge.
Looking to the future, Hineata is cautiously optimistic. She envisions a world where AI companions are not just tools for individual gratification but are integrated into communities in a way that strengthens social fabric, rather than eroding it. “Imagine an AI companion that helps preserve endangered languages, or connects elders with younger generations to share oral histories,” she mused. “That’s the potential I see, if we guide its development with intention and care.”
She emphasized the need for ongoing public education and critical engagement. “We need to empower people to understand how these systems work, to question their outputs, and to demand transparency and accountability from developers. This isn’t just about technology; it’s about shaping our future societies.” Her lab recently published a public guide on navigating AI companionship, available through The Verge, urging users to be mindful of their digital interactions.
As the rain subsided and the late afternoon sun broke through the clouds, casting a golden glow over the city, Hineata offered a final thought. “The AI companion industry is a powerful current, and we are all in its flow. But we in Aotearoa have an opportunity, a responsibility even, to steer this waka with our own values, ensuring that as technology advances, our humanity, our mauri, remains intact.” It was a powerful reminder that while the digital world expands, the core of our existence, our connections to each other and to our culture, must remain paramount. The conversation left me with a renewed sense of purpose, knowing that voices like Hineata’s are crucial in shaping a future where technology truly serves all people, not just a select few. Her vision is not just for New Zealand but for a global AI ecosystem built on respect, equity, and genuine human connection. The global tech community, from Silicon Valley to Shenzhen, could learn much from this indigenous perspective.