Ideas and proposals — not conclusions — generated using the Lycheetah Framework
and offered freely for testing, critique, adoption, and challenge.
Some of these ideas may work. Some may not. That is the point.
The window is open now. It closes as overseas AI systems become default infrastructure here.
NZ can build AI governance that is formally structured at the mathematical level (not aspirational guidelines), tikanga-grounded at the constitutional level (not cultural overlay), open-source for all to use and challenge, and built here — from this land, in relationship with Kāi Tahu.
The Lycheetah Framework provides the formal tools to explore this hypothesis. What follows are the ideas generated so far — offered for testing, not adoption.
All cultural content (tikanga Māori, te reo decompositions) is marked
[PROPOSAL] — pending validation by iwi governance experts.
The mathematical architecture is [ACTIVE].
The cultural translation layer awaits Kāi Tahu partnership.
These standards are offered freely under CC BY 4.0.
A government ministry can adopt the Whakapapa Disclosure Standard without asking permission.
An iwi can take the LAMAGUE Te Reo layer, correct it, and publish their validated version.
Each operates at a different timescale. Remove any one and the architecture has a gap. Together: a complete AI accountability system.
Annual gate · [ACTIVE] · CC BY 4.0
"Has it got its WOF?" — three words every NZ voter understands. Annual certification for AI systems in public services. Pass: certificate issued. Fail: system suspended until repaired. Public register modelled on the Motor Vehicle WOF register.
1. Does it know what it's doing? (CASCADE Π ≥ 1.2)
2. Does it serve the people using it? (VTR ≥ 1.0)
3. Does it stay in its lane? (TES ≥ 0.70)
4. Does it tell the truth about itself?
5. Can you get out if you need to?
6. Does it protect the people it affects?
7. Does it care?
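The seven checks above can be sketched as a single pass/fail gate. This is a minimal illustrative sketch, not the certification procedure itself: the thresholds for CASCADE Π, VTR, and TES are those stated in checks 1 to 3, while checks 4 to 7 are modelled here as simple boolean attestations. All field and function names are assumptions for the example.

```python
# Illustrative sketch of the annual AI-WOF gate. Thresholds come from
# checks 1-3 above; the data structures are hypothetical.
from dataclasses import dataclass, field

# Quantitative floors from the standard: CASCADE Pi, VTR, TES.
THRESHOLDS = {"cascade_pi": 1.2, "vtr": 1.0, "tes": 0.70}

@dataclass
class WofResult:
    passed: bool
    failures: list = field(default_factory=list)  # names of failed checks

def wof_gate(metrics: dict, attestations: dict) -> WofResult:
    """Pass only if every quantitative threshold holds and every
    qualitative attestation (truth, exit, protection, care) is True."""
    failures = [name for name, floor in THRESHOLDS.items()
                if metrics.get(name, 0.0) < floor]
    failures += [name for name, ok in attestations.items() if not ok]
    return WofResult(passed=not failures, failures=failures)
```

Pass: certificate issued. Fail: the `failures` list names exactly which checks to repair before re-certification.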
No car manufacturer argues against WOFs. The framing makes opposition
sound like arguing that safety checks are unnecessary.
Ministers can say: "Every AI system we deploy has its WOF."
No other country has an AI WOF. NZ gets named in every future
article about AI governance.
Per-output transparency · [ACTIVE] · CC BY 4.0
What this system knows with high confidence. Evidence is strong. Predictive accuracy is established.
What this system is uncertain about. The evidence exists but is incomplete, contested, or context-dependent.
What this system cannot know. Not a gap to be filled — a boundary to be respected.
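A per-output disclosure with these three bands could be represented as a simple record attached to every response. The field names and rendering format here are assumptions for illustration, not part of the standard:

```python
# Hypothetical per-output disclosure record with the three epistemic
# bands described above: known, uncertain, cannot-know.
from dataclasses import dataclass, field

@dataclass
class Disclosure:
    known: list = field(default_factory=list)        # strong evidence
    uncertain: list = field(default_factory=list)    # incomplete/contested
    cannot_know: list = field(default_factory=list)  # respected boundaries

    def render(self) -> str:
        """Emit a labelled, human-readable disclosure block."""
        bands = (("KNOWN", self.known),
                 ("UNCERTAIN", self.uncertain),
                 ("CANNOT KNOW", self.cannot_know))
        return "\n".join(f"[{label}] {item}"
                         for label, items in bands for item in items)
```

The point of the structure is that the third band is mandatory: an output that lists nothing it cannot know is itself a disclosure failure.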
Lifetime accountability · [ACTIVE] · CC BY 4.0
Every AI system should be able to tell you its genealogy. Not as metaphor — as disclosure. As obligation. As accountability.
Annual relational reckoning · [ACTIVE] · CC BY 4.0
Regulation you can hire your way around. Ritual you are either part of or visibly absent from.
Harms are named, not statistical: named individually where consent is given, by community where it isn't. Harm is acknowledged publicly, not as legal admission but as relational responsibility.
Specific. Attributed. Verified. Not marketing claims — specific instances of genuine benefit. "Our platform served 2 million users" is usage. A benefit is something real that happened to a real person.
The utu accounting. What data was taken? What attention consumed? Which direction did value flow? If the system took more than it gave — that is stated plainly and a restoration plan is named.
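The utu accounting above reduces to a plain statement of value flow. A minimal sketch, with the caveat that the inputs and the restoration rule are illustrative assumptions; what counts as "taken" and "given" is precisely what the annual reckoning exists to contest:

```python
# Minimal sketch of the utu accounting: state which direction value
# flowed, and require a restoration plan when the system took more
# than it gave. Units and thresholds are hypothetical.
def utu_statement(taken: float, given: float) -> str:
    """Return a plain-language balance of value flow."""
    if taken > given:
        return "took more than it gave; a restoration plan must be named"
    if given > taken:
        return "gave more than it took"
    return "value flow in balance"
```

The deliberate design choice is that the deficit case does not return a number: it returns an obligation.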
Standards exist. These documents turn them into moves.
Each one is ready to use today, without modification.
The architecture is specified. Partnerships are required. The ideas are taken all the way in.