AI Governance Foundations

Oversight in Weeks

The Problem

AI is already part of how your organisation operates. People are drafting communications, preparing reports, and getting through work faster. Nobody told them to — they just found it useful. In a small team, that initiative is usually a strength.

But each person is judging what is safe, appropriate, and good enough. Without a shared understanding, those decisions add up. A gap opens between what your organisation says about quality and ethics, and what is actually happening.

A client asks how you handle their data when AI is involved. A board member wants assurance through evidence. Legislation is catching up. There is time to get ahead.

Executive Blindness

Most leaders know AI is being used but can't see where, how, or what risks it creates. When a client asks how their data is handled, there is no clear answer to give. Legislation is catching up: the EU AI Act is in force, UK regulators are issuing sector guidance, and stakeholders are asking "How do you govern AI?" before contracts are signed. Without documented governance, you're exposed.
Sources: KPMG/University of Melbourne — Trust in AI Global Report 2025

Operational Chaos

When people adopt AI tools without shared guidance — shadow AI — quality becomes unpredictable. Research shows 40% of AI-generated workplace content is "workslop": outputs that look finished but need significant rework. The cost is real — 4.5 hours per employee per week spent correcting AI outputs. Without shared standards, AI creates more work, not less.
Sources: BetterUp Labs/Stanford — AI Workslop Research · Zapier — AI Mistakes Survey, Jan 2026

Culture Impact

The people most engaged with AI are often the ones asking for clear rules. Without them, there is no safe way to innovate. 44% of tech employees plan to change jobs within 12 months, many seeking organisations that take AI seriously. Only 52% of employees trust their employer to implement AI responsibly — trust is lowest where guidance is absent. At your size, losing even one good person over that frustration is felt immediately.
Sources: Korn Ferry — Tech Worker Exodus · Workday — AI Trust Gap 2025

The Solution | Proportional Oversight

Your people want to innovate responsibly with AI. That takes a shared understanding, embedded in your culture, of how your organisation uses AI well. We build it with you.

We call it Minimum Viable Governance. It gives your organisation proportional policies, processes, and oversight to innovate with AI confidently — with clear sight of risk, sized for where you are now, with room to grow. A launchpad, not a ceiling.

Your values become your boundaries, and your board gets assurance through evidence. You have it in weeks, not months.

Your team gains AI literacy — a shared language for using AI well and the confidence to innovate safely. Your clients, board, and stakeholders see an organisation that takes AI seriously. And when your team wants to try something new, you have a clear way to say yes.

When someone asks how you manage AI, you have a direct answer.

The Proportional AI Governance Framework includes:

  • AI System Inventory

  • AI Portal: a single source of truth

  • Regulatory, Social and Ethics Impact Analysis

  • AI Risk Register

  • Uncover Shadow AI

  • And more

A practical, agile framework for establishing AI governance, built on the concept of Minimum Viable Governance. It focuses on rapid deployment of key policies, roles, and processes to manage risk without stifling innovation.


Proportional Investment

AI governance doesn't need to break the bank. We only charge for the work we do: if it takes three weeks instead of four, that's the cost. A discovery call shapes the scope, and we agree on the weeks and deliverables that fit your organisation, from a focused sprint through to a comprehensive deep dive. Here are two examples of what that looks like.

£1,500 per week, minimum commitment 2 weeks. Underfold undertakes client work 3 days a week, leaving 2 days for research and development, in keeping with our principle of Always Evolving.

⚡️ 3-Week Sprint - £4,500

Everything you need to get governance in place and start innovating with confidence.

  • Rapid discovery and AI use form

  • AI System Inventory

  • AI Principles & Core Policy

  • Top 5 Risk Identification

  • Lightweight Approval Process

  • Brief Legislative & People Impact Report

  • Self-Guided AI Literacy Plan

🔍 8-Week Deep Dive - £12,000

A thorough review of your AI landscape with governance built around what we find.

  • Comprehensive discovery with stakeholder interviews

  • Full AI System Inventory & shadow AI amnesty

  • AI Principles, Policy & Optional Public Charter

  • Risk Register with owners and mitigations

  • Legislative, Regulatory & Ethics Impact Analysis

  • Full Approval Process with workflows

  • AI Portal setup

  • People Impact Assessment & Change Management Plan

  • AI Literacy Partner Recommendations & Training Plan

  • AI Roles & Governance Board (where needed)

Want to know more?