The Weight of Ethics and the Reality of the Sprint

Last week, I opened the door to this eight‑week journey, a way to keep myself accountable as I navigate the final sprint of my master’s thesis while working full‑time in tech, being a mum, and being Māori. What I didn’t expect was how sharply I would feel the weight of the ethical questions sitting beneath my research this week.

When you spend months immersed in Indigenous data, language protection, and AI ethics, something shifts. You stop seeing AI as a tool and start seeing it as a space where power, history, culture, and technology collide. My ethics process, especially grounded in kaupapa Māori and Te Ara Tika, constantly reminds me that research isn’t neutral; it’s relational, political, and deeply personal.

This week has been about confronting that responsibility.

AI systems are accelerating faster than our ability to question them. They collect data without context. They reproduce patterns without whakapapa. They process language without understanding the cultural and spiritual threads that give it life. And yet these same systems are shaping the digital environments our tamariki will grow up in.

As Māori, we know that language and storytelling are not just content, they’re identity, memory, and connection. So when AI models absorb Indigenous narratives without tikanga, or when our stories become just another dataset, the risks are not abstract. They are cultural. They are historical. They are ongoing.

This week, I found myself returning again and again to a single question:
Who are these systems really built for — and who gets left out?

Not from a place of despair, but from clarity. The more I sit with this research, the more I understand the importance of Indigenous voices in shaping AI futures. And the more I feel the urgency of protecting our stories, language, and data from being misinterpreted, exploited, or flattened into something that no longer holds our meaning.

The thesis writing is hard; I won’t pretend otherwise. Balancing whānau, work, and late‑night thinking is a juggle. But the kaupapa gives this work its fuel. It’s not just a project. It’s part of a larger movement toward Indigenous autonomy in digital spaces, a theme woven through my learning agreement and my research aims: to elevate ethical frameworks that honour our worldview.

So Week 2 has been less about word count and more about grounding. Remembering why this matters. Reconnecting to the responsibility that comes with writing about culture, data, and technology. And acknowledging that while AI may be the subject, the heart of this kaupapa is us, our people, our stories, our future.

Next week, I’ll share reflections on how the literature is reshaping my thinking and how much unlearning is needed when navigating Western AI narratives through an Indigenous lens.

For now, I’m ending Week 2 with a simple truth:
Ethics isn’t something you add to a project. It’s the foundation you build from.
