
About this framework

AI is truly adopted when it is woven into how work gets done, and removing it would mean losing a capability your team now depends on. Most organisations track logins. That tells you nothing useful. This framework shows you what real adoption looks like, how to measure it, and how to move your team forward. Use it to guide your team over the next 12 weeks and beyond.

Understand signals

Use the six signals to understand what real adoption looks like in your team

Track health

Follow the five steps to track adoption health consistently

Advance maturity

Use the maturity scale to place your team, then use the transition playbook to move forward

Part 1: The Health Check

Six signals to watch

These signals tell you whether AI is genuinely changing how your team works. They go far beyond usage reports.
1. AI is the first thing people think of before starting a task. They do not need to be reminded. It is automatic.
2. People share wins and teach each other without being asked. Knowledge flows organically, not through formal channels.
3. People work together inside shared AI tools and projects. It stops being a solo activity.
4. People feel more capable over time, not less. Confidence is growing across the team.
5. Work is happening that simply was not possible before. Not just faster, but fundamentally new.
6. A recurring piece of work now has a step that runs through AI. This is the most concrete signal of real adoption.

What to actually track

Quantitative

1. Reasoning model usage. Are people using capable models for real work?
2. Login frequency trending up. Directional only. Combine with other data.
3. Workflows with AI in them. Count recurring processes with an AI step. Your strongest number.

Qualitative

1. Confidence pulse. Quarterly 1 to 5 survey. Is the team growing in confidence?
2. Peer sharing. Is knowledge flowing without you organising it?
3. Workflow stories. Can people describe how AI changed a specific piece of work?
The rule: You need both. Usage without confidence is forcing it. Confidence without change is wishful thinking.
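The rule above can be sketched in code. This is a minimal illustration, not a prescribed tool: the field names, thresholds, and the 3.5 confidence cut-off are all assumptions you would tune to your own data.

```python
from dataclasses import dataclass

@dataclass
class AdoptionSnapshot:
    """One period's adoption-health data (all field names are illustrative)."""
    reasoning_model_sessions: int   # sessions on capable models this period
    logins_trend: float             # e.g. +0.15 = logins up 15% vs last period
    workflows_with_ai: int          # recurring processes with an AI step
    confidence_avg: float           # mean of the quarterly 1-to-5 pulse
    peer_shares: int                # unprompted wins shared this period
    workflow_stories: int           # concrete "AI changed this work" stories

def health_summary(s: AdoptionSnapshot) -> str:
    """Apply the rule: usage without confidence is forcing it,
    confidence without change is wishful thinking."""
    usage_ok = s.workflows_with_ai > 0 and s.reasoning_model_sessions > 0
    confidence_ok = s.confidence_avg >= 3.5 and s.workflow_stories > 0  # assumed threshold
    if usage_ok and confidence_ok:
        return "healthy: usage and confidence both present"
    if usage_ok:
        return "warning: usage without confidence (forcing it)"
    if confidence_ok:
        return "warning: confidence without change (wishful thinking)"
    return "early: neither usage nor confidence yet"
```

The point of the sketch is the pairing: neither branch of the data is allowed to declare health on its own.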

Five steps to adoption health

Step 1: Run the sentiment pulse survey

Targeted questions at week 6 and week 12. Measures confidence and how AI fits into real work. Pathfindr reviews your results and provides guidance.

Step 2: Pull usage data from your AI platform

Login frequency and feature depth. A directional signal, not the full picture. Take a 30-day snapshot from your admin console.
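Most admin consoles can export login events as CSV. As a hedged sketch of the 30-day snapshot, the following assumes an export with `user` and `timestamp` (ISO 8601) columns; your platform's column names will differ, so adjust accordingly.

```python
import csv
from collections import Counter
from datetime import datetime, timedelta

def login_frequency(csv_path: str, days: int = 30) -> Counter:
    """Count logins per user over the last `days` days from an admin-console
    export. Assumes 'user' and 'timestamp' columns; both are assumptions
    about your platform's export format."""
    cutoff = datetime.now() - timedelta(days=days)
    counts: Counter = Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            when = datetime.fromisoformat(row["timestamp"])
            if when >= cutoff:
                counts[row["user"]] += 1
    return counts
```

Remember this number is directional only: compare the same snapshot period to period rather than reading anything into a single month.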

Step 3: Track AI wins and opportunities

Capture where AI is creating value. Run monthly. Use the Opportunity Tracker template.

Step 4: Combine all sources in one project

Bring survey results, usage data, and opportunity tracking into a single AI project for analysis.

Step 5: Run the analysis prompt

Use the Pathfindr adoption health analysis prompt with your data. Produces a one-page leadership update covering four dimensions.

Part 2: The Maturity Scale

Place your team honestly. This is not aspirational. It is diagnostic.

Level 1: Curious

Aware of AI. Have not tried it meaningfully. Most tasks still happen the old way.

Level 2: Exploring

Trying AI for one-off tasks. Some enthusiasm, some hesitation. Usage is inconsistent.

Level 3: Embedded

AI is part of recurring workflows. Teams share knowledge. Managers use AI visibly.

Level 4: Integral

Removing AI would break named deliverables. Use cases are expanding. New starters learn from the team.
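The four levels can be read as an ordered checklist: each level subsumes the one below it. A minimal sketch, with the three yes/no questions paraphrased from the level descriptions above (adapt them to your own evidence):

```python
def maturity_level(tried_ai_meaningfully: bool,
                   ai_in_recurring_workflows: bool,
                   deliverables_depend_on_ai: bool) -> int:
    """Place a team on the four-level scale. Check from the top down:
    the highest condition that holds determines the level."""
    if deliverables_depend_on_ai:
        return 4  # Integral: removing AI would break named deliverables
    if ai_in_recurring_workflows:
        return 3  # Embedded: AI is part of recurring workflows
    if tried_ai_meaningfully:
        return 2  # Exploring: one-off tasks, inconsistent usage
    return 1      # Curious: aware of AI, not tried meaningfully
```

Answer the questions honestly, per the note above: the scale is diagnostic, not aspirational.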

Transition playbook

Three actions per level. That is it.
Level 1 to Level 2

First, check this: people have clear guidelines on what is approved and what data is safe, and they know AI use is encouraged.
  1. Assign one real task per person per week that starts in AI
  2. Pair each person with a peer one level ahead for a 15-minute weekly check-in
  3. Address hesitation directly: starting slow is expected and supported
It is working when: people stop asking “am I allowed?” and start asking “how do I do this better?”

Level 2 to Level 3

First, check this: managers are visibly using AI. If the team never sees their manager use it, adoption stalls.
  1. Each manager shares one AI workflow they personally use with their team
  2. Each team member selects one recurring workflow and rebuilds it with AI handling one step
  3. Run a fortnightly show-and-tell: 30 minutes, screen share, real examples only
It is working when: the conversation shifts from “I tried AI for this” to “this is how I do it now.”

Level 3 to Level 4

First, check this: the team is expanding beyond its original use cases. If the same workflows have been unchanged for a month, you are still at Level 3.
  1. Select one team deliverable and rebuild it as a shared AI workflow the whole team contributes to
  2. Each person identifies the specific deliverable that would not be possible without AI
  3. Shift measurement from usage metrics to outcome metrics: what work is now possible that was not before
It is working when: removing AI would break named deliverables.

Part 3: Beyond the first 12 weeks

The first 12 weeks build habits. This section makes them stick.

Ongoing health tracking

The first 12 weeks used a baseline survey to understand where your team started. From here, adoption health needs ongoing measurement. Use the ongoing adoption health survey to track confidence, workflow integration, and where your team needs support.
Run the adoption health survey every quarter. Share results openly with the team. Celebrate movement, even small shifts.

Huddles

Every month. Short, focused check-ins on what is working and where people are stuck. Programme lead facilitates.

Lunch and learns

Every month. Team members share AI wins and workflows. Volunteers present. Real examples only, no slides.

Adoption health survey

Every quarter. Run the survey. Track confidence and workflow integration. Share results openly. Celebrate movement.

“What if” refresh

Every quarter. Re-ask: what would you build if AI did 80% of the execution? Prevents Level 3 complacency.

Three shifts that matter over time

As your programme matures, three things need to change.

Ownership

The team runs it themselves. The Driver role shifts from leading to facilitating. Adoption becomes self-sustaining.

Metrics

Track outcomes, not usage. The question moves from “are people logging in?” to “what work is now possible that was not before?”

Onboarding

New starters learn from the team, not from training materials. AI knowledge transfers through culture, not through courses.

The ultimate test

“What would break if we turned off AI tomorrow?” If things would fall apart, that is real adoption. If nothing changes, you are not there yet. Use this framework to close the gap.

Download templates and resources

Get the templates you need to run your AI rollout