Published 2026
A governance framework for business leaders who refuse to trade judgment for speed.
Amazon link to be added on publication.
AI is in your room now. The pressure to adopt it is real — from your board, your competitors, your team, and the relentless signal that everyone else is moving, and that standing still means falling behind.
Most leaders respond to that pressure by acting. They approve tools, sign off on pilots, and delegate implementation to people closer to the technology than to the decisions that actually define their business.
The result is not transformation. It is a slow erosion of the judgment, discretion, and human authority that their clients actually pay for — without anyone naming what is happening until something goes wrong.
"That pressure is the problem. Not the technology."
Twenty years of working in industries where reputation is the product and trust is the foundation produced one consistent observation: leaders do not fail because they lack ambition or information. They fail because pressure compresses their judgment.
AI has made that pattern faster, more consequential, and — for the first time — largely invisible. The decisions that most need deliberation are being made at the speed of adoption, not at the speed of judgment.
This book is the result of that observation, tested across twenty years of practice, and applied to the specific conditions of leaders who are under pressure right now. It does not tell you what to do. It gives you the structure to decide well.
At the centre of the book is the Protypa Council™ — a five-stage human-led decision governance system. Not a checklist. A decision governance system that holds when you are under pressure and the instinct is to act before thinking.
After reading this book, you will be able to define exactly where AI belongs in your business and where it does not. You will be able to name the person — not the function — responsible for every AI decision currently in motion. You will have the language and the structure to slow down the choices that most deserve deliberation, without the organisation experiencing it as obstruction.
You will also understand that refusal is a complete outcome — not a failure of vision, but an exercise of judgment. And you will have the framework to defend that position when the pressure to adopt returns.
1. Define the right problem before AI enters the room.
2. Just because AI can be used — should it be?
3. Use AI without losing control of what it produces.
4. What guardrails keep this defensible over time?
5. Stewardship, not surveillance — decisions stay revisable.
Not ready to buy?
Download the Authority Drift Signals document — a free governance reference that identifies the seven signals that AI decisions are quietly eroding leadership authority. It is the diagnostic at the heart of the book.