Frameworks · DIAGNOSE

AI Maturity Spectrum™

Map your organisation across six dimensions and four maturity levels. Replace assumptions with evidence before making investment decisions.

The Problem This Solves

Most organisations operate on assumptions about their AI maturity. That's why investment decisions fail.

Without a diagnostic baseline, leadership makes AI investment decisions based on hope, vendor promises, or what competitors claim to be doing. The Spectrum maps reality before decisions are made.

More than 80% of mid-market prospects score in the Developing band: enough investment to have pain, not enough discipline to have results. This is the gap the Spectrum reveals.

Four Maturity Levels

Level 0 through Level 3

Level 0 · Score 18–23

Shadow Adoption

AI tools are being used informally by individuals. No governance, no ownership, no visibility into what's happening.

Level 1 · Score 24–29

Individual Tools

Some AI tools adopted but not integrated into workflows. Pilots may exist but nothing reaches production.

Level 2 · Score 30–43

Workflow Integration

AI is being built into operational workflows. Ownership exists but the operating model hasn't caught up. Pilots funded but stalling.

Most mid-market organisations sit here. This is where the Decision Discipline Program has the greatest impact.

Level 3 · Score 44–54

Agentic Systems

AI handles end-to-end workflows with human oversight at key checkpoints. Governance is mature. Delivery compounds quarter over quarter.

Six Dimensions

Where the Spectrum measures

Each dimension is scored independently. Your overall level is the composite — but the dimension-level view reveals exactly where to focus.

1. Strategic Alignment

Emerging

Limited understanding. No documented strategy. No pilots shipped.

Developing

Moderate understanding. Ad hoc plans. Pilots running but none in production.

Mature

Deep expertise embedded in planning. Board-approved strategy with milestones. Multiple workflows live.

2. Organisational Focus

Emerging

AI investment from existing budgets. Nobody owns it. Siloed efforts.

Developing

Partially dedicated budget. Senior champion but not primary role. Some coordination.

Mature

Dedicated budget with board visibility. Named team with exec sponsor. Systematic governance.

3. Policies & Governance

Emerging

No usage policy. Data governance not addressed. Would struggle with regulators.

Developing

Guidelines exist but inconsistent. Basic data governance with gaps. Partial documentation.

Mature

Comprehensive policy, actively enforced. Integrated data governance. Full regulatory documentation.

4. Staff Skills

Emerging

Under 10% daily adoption. No formal training. No tool evaluation process.

Developing

10–50% adoption, inconsistent. Some training, not systematic. Informal evaluation.

Mature

Over 50% embedded adoption. Structured training with benchmarks. Formal evaluation: security, privacy, ROI.

5. Deployment

Emerging

No autonomous workflows. Not measuring ROI. Shutting off AI wouldn't change much.

Developing

1–2 partial automations. Some metrics, no framework. A few workflows would slow down.

Mature

Multiple end-to-end workflows. Clear before/after ROI. AI is load-bearing — disruption if removed.

6. Continuous Improvement

Emerging

No review process. Underperformance ignored. Low 12-month confidence.

Developing

Occasional reviews. Ad hoc response to issues. Moderate confidence.

Mature

Quarterly reviews with scale/kill criteria. Defined process for iteration. High confidence, evidence-based.
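The composite scoring described above can be sketched in a few lines. This is an illustrative sketch, not the assessment tool itself: the level bands (18–23, 24–29, 30–43, 44–54) come from the level definitions, while the per-dimension scale of 3–9 is an assumption inferred from the fact that six dimensions yield a composite of 18–54. All function and variable names are hypothetical.

```python
# Illustrative sketch of mapping six dimension scores to a maturity level.
# Assumption: each dimension is scored 3-9, so the composite spans 18-54,
# matching the level bands in the text. Names here are illustrative only.

DIMENSIONS = [
    "Strategic Alignment",
    "Organisational Focus",
    "Policies & Governance",
    "Staff Skills",
    "Deployment",
    "Continuous Improvement",
]

# (low, high, label) bands taken from the level definitions above.
LEVELS = [
    (18, 23, "Level 0 · Shadow Adoption"),
    (24, 29, "Level 1 · Individual Tools"),
    (30, 43, "Level 2 · Workflow Integration"),
    (44, 54, "Level 3 · Agentic Systems"),
]

def composite_level(scores: dict[str, int]) -> str:
    """Sum the six dimension scores and return the matching level label."""
    total = sum(scores[d] for d in DIMENSIONS)
    for low, high, label in LEVELS:
        if low <= total <= high:
            return label
    raise ValueError(f"Composite {total} outside expected range 18-54")

# Example: mid-range (5) on every dimension gives a composite of 30,
# which lands in the Level 2 band - where most mid-market organisations sit.
print(composite_level({d: 5 for d in DIMENSIONS}))
```

The dimension-level scores matter as much as the composite: two organisations with the same total can have very different gaps, which is why the text stresses the dimension-level view.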

Start with a free assessment

Find out where you sit on the Spectrum — in 5 minutes, for free.