
What you will learn

A Champions group without a rhythm becomes a group that meets once and fades. This page gives you the cadence, channels, and accountability structures that keep AI adoption alive in your organisation long after this cohort ends.

Set your rhythm

Establish weekly, fortnightly, monthly, and quarterly touchpoints

Build channels

Create async spaces for sharing progress between meetings

Run showcases

Use peer learning to spread adoption across teams

Drive with KPIs

Tie AI adoption to performance so it becomes part of the culture

The programme lead cadence

Your Driver needs a clear rhythm. Without it, AI adoption competes with the day job and loses every time.

Weekly

Quick check. Did people start the task in AI? Yes or no. One minute per person. Keep it lightweight.

Fortnightly

Show-and-tell. 30 minutes. Champions screen-share a real workflow. Real examples only, no slides. Steal ideas from each other.

Monthly

Workflow count. Count recurring AI workflows. This is your leading indicator. If the number grows, adoption is working.

Quarterly

Confidence pulse. Run a 1 to 5 score. Place the team on the maturity scale again. Review governance and retire what is not working.

Monthly Champion Huddles

This is your core accountability meeting. The Driver facilitates. Champions attend. It should take 30 to 45 minutes.
For a detailed huddle agenda, engagement tips, and facilitation guide, see Run Effective AI Huddles in the Accelerator resources.
Each Champion shares one win or one thing they tried. It does not need to be polished. The point is to create a regular rhythm of sharing, not a formal presentation.

The Driver captures every use case in the Opportunity Tracker (tool, team, use case, status, outcome). Over time, this becomes the evidence base for scaling and budget decisions.
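The Opportunity Tracker could live in a spreadsheet, but as a sketch of the shape of the data, here is a minimal Python version. The field names come from the text; the class, function names, and example status values are illustrative assumptions, not a prescribed format:

```python
from dataclasses import dataclass

# One Opportunity Tracker row, using the fields named in the text:
# tool, team, use case, status, outcome.
@dataclass
class Opportunity:
    tool: str
    team: str
    use_case: str
    status: str       # e.g. "trialling", "adopted", "retired" (assumed values)
    outcome: str = "" # filled in once the result is known

tracker: list[Opportunity] = []

def log_use_case(tool, team, use_case, status, outcome=""):
    """Capture one Champion's use case at the monthly huddle."""
    tracker.append(Opportunity(tool, team, use_case, status, outcome))

def adopted_count():
    """Roll-up for scaling and budget conversations: adopted use cases."""
    return sum(1 for o in tracker if o.status == "adopted")
```

Even this small a structure forces the huddle to record the same five facts every month, which is what makes the tracker usable as evidence later.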
Peer accountability is what keeps people experimenting when their day job gets busy. Knowing you will be asked “what did you try this month?” creates just enough pressure to keep moving.

The social element matters too. Seeing a colleague solve a problem with AI that you also have creates a spark. People learn more from seeing someone else’s approach than from a training manual.
Do not turn huddles into status reports. Do not invite the entire organisation. Do not skip months because “nothing happened”. If nothing happened, that is the conversation.

Async channels

Not everything needs a meeting. Set up a dedicated channel in Teams or Slack for your Champions group.

What to post

Quick wins. Screenshots of AI helping with a task. Links to new features or tools. Questions for the group. Prompts that worked well.

How to keep it active

The Driver posts a prompt weekly or fortnightly. Something like: “What is one thing AI helped you with this week?” or “Has anyone tried the new feature in Claude?” A small nudge keeps the channel alive.
Assign one real task per person per week that starts in AI. This is the simplest weekly check: did they start the task in AI? Yes or no. It builds the habit of reaching for AI first.

Peer showcases

Showcases are different from huddles. Huddles are for Champions. Showcases are for the broader organisation. Once your Champions group has a few solid wins, start running fortnightly or monthly all-hands showcases. A Champion presents a real workflow. No slides. Just a screen share showing what they built, how it works, and how much time it saves.
People do not change their behaviour because you tell them AI is useful. They change when they see a colleague solving a problem they also have. The reaction you want is: “I want to try that” or “I want to beat what they did.”

Cross-pollination is the real value. Someone in operations sees a finance workflow and realises it applies to their own process. That connection only happens when people share in front of each other.
Keep it to 30 minutes. One or two presenters maximum. Real examples from real work. Leave time for questions. Record it for people who cannot attend.

The shift you are looking for: people move from “I tried AI for this” to “this is how I do it now.” When that language changes, you are on the right track.

Quarterly resets

Every quarter, step back and review the programme as a whole.
Use the AI Fitness Score to track adoption health across your organisation. It covers the six signals, the measures that matter, and a five-step health check process.
Is your AI policy still fit for purpose? Have new tools emerged that need evaluation? Are there use cases that should be retired? Governance is a living document.
The AI landscape changes fast. A tool that was the best option three months ago might have been overtaken. Use your quarterly reset to decide whether to renew, switch, or add tools to the mix.
Survey your team with a simple 1 to 5 confidence score. Compare it to last quarter. If confidence is growing, your programme is working. If it is flat or declining, something needs to change.
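The quarter-on-quarter comparison can be sketched as a few lines of Python. The function names and the ±0.2 threshold for calling a change meaningful are assumptions for illustration; the inputs are the raw 1 to 5 scores from each survey round:

```python
# Hedged sketch: compare this quarter's 1-5 confidence scores with last
# quarter's. Threshold and names are illustrative assumptions.
def average_confidence(scores):
    return sum(scores) / len(scores)

def pulse_verdict(last_quarter, this_quarter):
    """Rough read on programme health from two survey rounds."""
    delta = average_confidence(this_quarter) - average_confidence(last_quarter)
    if delta > 0.2:
        return "growing"    # programme is working
    if delta < -0.2:
        return "declining"  # something needs to change
    return "flat"           # flat is also a signal to change something
```

A flat result is treated the same as a decline here, matching the point above: if confidence is not growing, something needs to change.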
Re-ask: what would you build if AI did 80% of the execution? This prevents Level 3 complacency, where things are faster but nothing has fundamentally changed.

Tying adoption to KPIs

If AI adoption is optional, it stays optional. The organisations seeing the fastest uptake are the ones that build it into performance expectations.

KPI approach

Each team member identifies one process in their role to optimise with AI. They agree the scope with their manager, then deliver the improvement. Progress is measured against time saved, quality improved, or new value created.

Why it works

The process of going out, learning, trying, and delivering creates the cultural shift you need. People start thinking “how can AI help?” automatically, not just for the KPI task, but for everything.
Refresh job descriptions every six months to include AI-related expectations. As the landscape evolves, so should the expectations. This keeps adoption as a living part of how people work, not a one-off initiative.

Staying current

The hardest part of AI adoption is not getting started. It is staying current when the landscape moves every week. Your Champions group is your first line of defence, but they cannot be expected to track everything. Consider bringing in external support periodically, whether that is a consultant, a training partner, or a dedicated data scientist who can upskill the team on what has changed and what it means for your business.

Quick checkpoint

Rhythm set

You know the weekly, fortnightly, monthly, and quarterly touchpoints

Channels planned

You have identified where your async communication will live

Showcases understood

You know how and when to run peer showcases for the broader team

KPIs considered

You have a starting point for tying AI adoption to performance

Next: the AI Adoption Framework

Use Pathfindr’s framework to assess your team’s maturity and plan the next move