🌈 Indigenous Edition


What's Actually Running in Village Today


Series: To Hapori, To AI — Digital Sovereignty for Indigenous Communities (Article 4 of 5)
Author: My Digital Sovereignty Ltd
Date: March 2026
Licence: CC BY 4.0 International


Early Days

This article is about what exists today — not what we plan to build, not what we hope to achieve, but what is running right now in production. Where something is planned but not yet live, we will say so plainly.

Village AI has been in production since October 2025. It is a young system. Some parts work well. Some parts are still rough. We believe in telling you both, because a community that adopts a platform based on clear information will be a more resilient partner than one that adopts based on marketing.

What Village AI Can Do for Your Community Today

Answer questions about your community's content. When a whanau member asks "When is the next hui?" or "What was decided about the marae restoration?", Village AI searches your community's actual records — announcements, stories, event descriptions, published documents — and provides an answer grounded in that content. It does not guess or infer from general knowledge. If it cannot find the answer in your records, it says so.
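The "answer only from your records, otherwise say so" behaviour can be sketched in a few lines. This is an illustrative sketch, not Village's actual code: the `Record` class, the naive keyword search, and the wording of the fallback message are all assumptions standing in for the real retrieval pipeline.

```python
# Minimal sketch of retrieval-grounded answering: the assistant may only
# answer from the community's own records, and says so when nothing matches.
# Record, search_records, and answer are illustrative names, not a real API.
from dataclasses import dataclass


@dataclass
class Record:
    title: str
    body: str


def search_records(records: list[Record], question: str) -> list[Record]:
    """Naive keyword overlap, standing in for the real retrieval step."""
    terms = {w.lower().strip("?.,") for w in question.split()}
    return [r for r in records if terms & set(r.body.lower().split())]


def answer(records: list[Record], question: str) -> str:
    matches = search_records(records, question)
    if not matches:
        # No grounding found: say so rather than guess from general knowledge.
        return "I could not find this in your community's records."
    return f"Based on '{matches[0].title}': {matches[0].body}"


records = [Record("Hui notice", "The next hui is on 12 April at the marae.")]
print(answer(records, "When is the next hui?"))
print(answer(records, "What was decided about fundraising?"))
```

The point of the sketch is the control flow: a grounded answer is always traceable to a named record, and the no-match branch refuses rather than inferring.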

Help with drafting. Village AI can help draft community announcements, event notices, and correspondence. Because it has been trained on your community's previous content, its drafts reflect your community's tone and style — not a generic corporate template. A moderator reviews and edits every draft before it reaches the community.

Summarise long documents. A lengthy set of hui minutes or a series of community announcements can be summarised into key points. This is useful for whanau members who want to stay informed but do not have time to read everything.

Translate between languages. Village supports five languages — English, German, French, Dutch, and Te Reo Maori. The AI assists with translation of community content, though human review is recommended for important communications. For communities maintaining te reo, having platform-level support for Te Reo Maori means the interface itself can operate in the language of the community.

Triage member feedback. When a whanau member submits feedback through the platform — a question, a suggestion, a report of something not working — the AI classifies it, investigates where possible, and notifies the member when it has been addressed. This happens automatically, freeing the moderator from manually sorting every piece of feedback.
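The triage step described above follows a classify-then-route shape. Here is a hedged sketch of that shape; the categories, keyword rules, and field names are assumptions for illustration, not the platform's real schema or classifier.

```python
# Toy sketch of feedback triage: classify an incoming message, attach a
# next step, and keep the member's name so they can be notified on
# resolution. Categories and routing rules are illustrative assumptions.
NEXT_STEP = {
    "question": "answer from community records",
    "suggestion": "queue for moderator review",
    "fault": "log an issue and notify the member when fixed",
}


def classify(text: str) -> str:
    """Keyword heuristic standing in for the real classification model."""
    if "?" in text:
        return "question"
    if any(w in text.lower() for w in ("broken", "error", "not working")):
        return "fault"
    return "suggestion"


def triage(text: str, member: str) -> dict:
    category = classify(text)
    return {"member": member, "category": category,
            "next_step": NEXT_STEP[category]}


print(triage("The photo upload is not working", "Mere"))
```

In the real system the classifier would be the AI model rather than keyword rules, but the routing logic, and the record of who to notify, works the same way.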

What the AI Does Not Do

It does not make decisions for your community. When a question involves tikanga, values, cultural protocols, or judgment, the AI stops and routes it to a human. Your moderator, your kaumatua, your runanga — the people your community trusts with these decisions.

It does not access content it was not given. Private content stays private. Content from other communities stays with those communities. The AI cannot reach across boundaries, because those boundaries are structural, not policy-based. For indigenous communities managing sensitive cultural knowledge, this structural separation is essential — it is not a setting that can be accidentally changed.

It does not operate without oversight. Every AI response passes through Guardian Agents — the four mathematical verification layers described in the previous article. No response reaches a whanau member without being checked against your community's actual records.

It does not pretend to know things it does not know. When the AI's confidence is low, it says so. Every response carries a confidence indicator. Members can see at a glance whether the AI is drawing on solid records or venturing into less certain territory.
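A confidence indicator like the one described can be thought of as a simple mapping from a numeric score to a label members see. The thresholds and label wording below are assumptions for illustration, not Village's actual values.

```python
# Sketch of mapping a per-response confidence score (0.0-1.0) to the
# indicator shown to members. Thresholds and labels are illustrative.
def confidence_label(score: float) -> str:
    """Map a confidence score to a member-facing indicator."""
    if score >= 0.85:
        return "high: grounded in your community's records"
    if score >= 0.60:
        return "medium: partially supported, review suggested"
    return "low: ask a kaitiaki or moderator"


print(confidence_label(0.92))
```

Whatever the real thresholds are, the design point holds: the score is always surfaced, so a member can tell solid records from uncertain territory at a glance.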

How Bias Is Addressed: The Vocabulary System

One of the subtlest forms of bias in AI is linguistic. When a system trained on corporate data calls your whanau "users" and your community announcements "posts," it is imposing a worldview — one where communities are consumer platforms and communication is content marketing.

Village addresses this through a vocabulary system that adapts the entire platform to your community type.

When you set up a Village for an indigenous community using the whanau product type, the system does not show you generic labels. It shows you the language of your community: whanau instead of users, hui instead of meetings, kaitiaki instead of admin.

This is not cosmetic. The vocabulary shapes how the AI understands and responds to your community. When the AI has been trained with the term "whanau" rather than "user," it processes questions and generates responses within a community frame of reference. It understands that "How do we welcome new whanau?" is a different question from "How do we onboard new users?" — even though a generic AI system would treat them identically.

Each community type has its own vocabulary. An Episcopal parish sees "parishioners" and "vestry governance." A sports club sees "club members" and "season fixtures." An indigenous community sees its own terms. The platform is the same, but the language — and therefore the AI's understanding — is specific to your community.

For communities engaged in language revitalisation, this vocabulary system has additional significance. It means the digital platform your community uses daily is itself a site of language use — not a space where English is the unquestioned default and your language is an afterthought.

How the AI Learns and Improves

Village AI is not static. It improves over time through three mechanisms:

Scheduled retraining. The AI is periodically retrained on your community's latest content. During the beta programme, this happens weekly. New announcements, new stories, new event descriptions — they enter the AI's knowledge base so it stays current with your community's life.

Moderator feedback. When a moderator flags an AI response as inaccurate or unhelpful, that correction feeds back into the system. Over time, the AI learns what works for your community and what does not. This is not generic improvement — it is improvement specific to your community.

Guardian Agent learning. The fourth Guardian Agent — the adaptive learner — adjusts verification thresholds based on patterns of accuracy and error. If the AI consistently gets a certain type of question right, the guardian eases verification intensity for that type. If it consistently struggles with another type, the guardian tightens scrutiny. The system becomes more efficient without becoming less careful.
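The adaptive-learner idea can be sketched as a per-question-type threshold that eases after correct answers and tightens after errors. The update rule, step sizes, and bounds below are assumptions for illustration; the real Guardian Agent's mechanism is not published in this article.

```python
# Sketch of an adaptive verification threshold: one instance per question
# type. Correct answers ease scrutiny slightly; errors tighten it faster.
# All constants (start, step, bounds, 3x error penalty) are assumptions.
class AdaptiveThreshold:
    def __init__(self, start: float = 0.80, step: float = 0.02,
                 lo: float = 0.50, hi: float = 0.95):
        self.value, self.step, self.lo, self.hi = start, step, lo, hi

    def record(self, was_accurate: bool) -> float:
        """Update the verification threshold after each checked response."""
        if was_accurate:
            self.value = max(self.lo, self.value - self.step)
        else:
            # Errors tighten scrutiny faster than successes ease it,
            # so the system stays conservative by default.
            self.value = min(self.hi, self.value + 3 * self.step)
        return self.value


t = AdaptiveThreshold()
t.record(True)   # a correct answer eases the threshold a little
t.record(False)  # an error tightens it by more
```

The asymmetry is the key design choice: efficiency is gained slowly, while caution is restored quickly, which matches the article's claim that the system becomes more efficient without becoming less careful.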

What Is Still a Work in Progress

The 8B deep reasoning model is trained and deployed, but the routing system that decides which questions go to the faster model and which go to the deeper model is still being refined. Some questions that would benefit from deeper processing are currently handled by the faster model.

Individual personalisation — where the AI learns individual whanau member preferences — is planned but not yet built. For now, the AI knows your community as a collective, not your individual members as individuals (unless they interact with it directly).

The moderator accreditation path — structured training for community members who take on the moderator role — is designed but being rolled out progressively. During the beta programme, founding communities have direct access to the founder for support.

Indigenous-specific training data is an area where the platform is candid about its limitations. The base AI model, like all current language models, carries a Western bias in its training data. The vocabulary system, the community-specific training layer, and the Guardian Agents mitigate this bias structurally — but they do not eliminate it entirely. Deeper alignment with specific indigenous knowledge systems would require partnerships with those communities that have not yet been established.

We mention these plainly because we believe you should know what you are adopting. This is a platform in its early months, built by a small team, used by a small number of communities. It is functional, it is improving, and it is clear about where it stands.

What This Means for Your Community

If your community is considering Village, here is what you are choosing:

Village is a platform where AI knows your community's actual content — your announcements, your stories, your events — not the internet's idea of what an indigenous community might be. Every AI response is mathematically verified against your records by independent watchers. The vocabulary reflects your language: whanau, not users; hui, not meetings; kaitiaki, not admin.

Your data stays within your community's boundary, is not used to train external AI systems, and can be exported or deleted at any time. The system is transparent about its limitations, improves from your moderators' corrections, and stops to ask a human when a question requires judgment rather than information.

You would also be one of 20-25 founding communities — whanau, parishes, clubs, and businesses — shaping the platform during its first year.

If that interests you, applications for the beta programme are open until 30 March 2026.


This is Article 4 of 5 in the "To Hapori, To AI" series. For the full technical architecture, visit Village AI on Agentic Governance.

Previous: Why Rules and Training Aren't Enough — The Governance Challenge
Next: The Village Beyond AI — What Your Community Actually Gets

Published under CC BY 4.0 by My Digital Sovereignty Ltd. You are free to share and adapt this material, provided you give appropriate credit.