
This article is based on the thesis I am completing this month for the Supervisory Board program (NCC 73) at Nyenrode University. I set out to answer a practical question: how can supervisory boards close the digital competency gap so that their oversight of digitalization and AI is effective and value-creating?
The research combined literature, practitioner insights, and my own experience leading large-scale digital transformations. The signal is clear: technology, data, and AI are no longer specialist topics—they shape strategy, execution, and resilience. Boards that upgrade their competence change the quality of oversight, the shape of investment, and ultimately the future of the company.
1) Business model transformation
Digital doesn’t just add channels; it rewrites how value is created and captured. The board’s role is to probe how data, platforms, and AI may alter customer problem–solution fit, value-creation logic, and ecosystem position over 3-, 5-, and 10-year horizons. Ask management to make the trade-offs explicit: which parts of the current model should we defend, which should we cannibalize, and which new options (platform plays, data partnerships, embedded services) warrant small “option bets” now?
What to look out for: strategies that talk about “going digital” without quantifying how revenue mix, margins, or cash generation will change. Beware dependency risks (platforms, app stores, hyperscalers) that shift bargaining power over time. Leverage scenario planning and clear leading indicators—so the board can see whether the plan is working early enough to pivot or double down.
2) Operational digital transformation
The strongest programs are anchored in outcomes, not output. Boards should ask to see business results expressed in P&L and balance-sheet terms (growth, cost, capital turns), not just “go-live” milestones. Require a credible pathway from pilot to scale: gated tranches that release funding when adoption, value, and risk thresholds are met; and clear “stop/reshape” criteria to avoid the sunk-cost trap.
What to look out for: “watermelon” reporting (status reports that stay green while progress and adoption fall behind); vendor-led roadmaps that don’t fit the architecture; and under-resourced change management. As a rule of thumb, ensure 10–15% of major transformation budgets are reserved for change, communications, and training. Ask who owns adoption metrics and how you’ll know, early on, that teams are actually using what’s been built.
3) Organization & culture
Technology succeeds at the speed of behaviour change. The board should examine whether leadership is telling a coherent story (why/what/how/who) and whether middle management has the capacity to translate it into local action. Probe how AI will reshape roles and capabilities, and whether the company has a reskilling plan that is targeted, measurable, and linked to workforce planning.
What to look out for: assuming tools will “sell themselves,” starving change budgets, and running transformations in a shadow lane disconnected from the real business. Look for feedback loops—engagement diagnostics, learning dashboards, peer-to-peer communities—that surface resistance early and help leadership course-correct before adoption stalls.
4) Technology investments
Oversight improves dramatically when the board insists on a North Star architecture that makes trade-offs visible: which data foundations come first, how integration will work, and how security/privacy are designed in. Investments should be staged, with each tranche linked to outcome evidence and risk mitigation, and with conscious decisions about vendor lock-in and exit options.
What to look out for: shiny-tool syndrome, financial engineering that ignores lifetime Total Cost of Ownership (TCO), and weak vendor due diligence. Ask for risk analysis (e.g., cloud and vendor exposure) and continuity plans that are actually tested. Expect architecture reviews by independent experts on mission-critical choices, so the board gets a clear view beyond vendor narratives.
5) Security & compliance
Cyber, privacy, and emerging AI regulation must be treated as enterprise-level risks with clear ownership, KPIs, and tested recovery playbooks. Boards should expect regular exercises and evidence that GDPR, NIS2, and AI governance are embedded in product and process design—not bolted on at the end.
What to look out for: “tick-the-box” compliance that produces documents rather than resilience, infrequent or purely theoretical drills, and untested backups. Probe third-party and supply-chain exposure as seriously as internal controls. The standard is not perfection; it’s informed preparedness, repeated practice, and learning from near-misses.
Seven structural moves that work
- Make digital explicit in board profiles. Use a competency matrix that distinguishes business-model, data/AI, technology, and cyber/compliance fluency. Recruit to close gaps or appoint external advisors—don’t hide digital under a generic “technology” label.
- Run periodic board maturity assessments. Combine self-assessment with executive feedback to identify capability gaps. Tie development plans to the board calendar (e.g., pre-strategy masterclasses, deep-dives before major investments).
- Hard-wire digital/AI into the agenda. Move from ad-hoc updates to a cadence: strategy and scenario sessions, risk and resilience reviews, and portfolio health checks. Make room for bad news early so issues surface before they become expensive.
- Adopt a board-level Digital & IT Cockpit. Track six things concisely: run-the-business efficiency, risk posture, innovation enablement, strategy alignment, value creation, and future-proofing (change control, talent, and architecture). Keep trends visible across quarters.
- Establish a Digital | AI Committee (where applicable). This complements—not replaces—the Audit Committee. Mandate: opportunities and threats, ethics and risk, investment discipline, and capability building. The committee prepares the ground; the full board takes the decisions.
- Use independent expertise by default on critical choices. Commission targeted reviews (architecture, vendor due diligence, cyber resilience) to challenge internal narratives. Independence is not a luxury; it’s how you avoid groupthink and discover blind spots in time.
- Onboard and upskill continuously. Provide a digital/AI onboarding for new members; schedule briefings with external experts; and use site visits to see real adoption. Treat learning like risk management: systematic, scheduled, and recorded.
Do you need a separate “Digital Board”?
My reflection: competence helps, but time and attention are the true scarcities. In digitally intensive businesses—where data platforms, AI-enabled operations, and cyber exposure shape enterprise value and are moving fast—a separate advisory or oversight body can deepen challenge and accelerate learning. It creates space for structured debate on architecture, ecosystems, and regulation without crowding out other board duties.
This isn’t a universal prescription. In companies where digital is material but not defining, strengthening the main board with a committee and better rhythms is usually sufficient. But when the operating model’s future rests on technology bets, a dedicated Digital Board (or equivalent advisory council) can bring the needed altitude, continuity, and specialized challenge to help the supervisory board make better, faster calls.
What this means for your next board cycle
The practical message from the thesis is straightforward: digital oversight is a core board responsibility that can be institutionalised. Start by clarifying the capability you need (the competency matrix), then hard-wire the conversation into the board’s rhythms (the agenda and cockpit), and raise the quality of decisions (staged investments, independent challenge, real adoption metrics). Expect a culture shift: from project status to value realization, from tool choice to architecture, from compliance as paperwork to resilience as practice.
Most importantly, treat this as a journey. Boards that improve a little each quarter—on fluency, on the sharpness of their questions, on the discipline of their investment decisions—create compounding advantages. The gap closes not with a single appointment or workshop, but with deliberate governance that learns, adapts, and holds itself to the same standard it asks of management.