Your rollout toolkit

These templates give you a starting point for each part of your AI rollout. Copy them, customise them for your organisation, and iterate as you learn what works.

Governance

Champions charter and AI policy templates

Operations

Operating rhythm planner and opportunity tracker

Evaluation

Tool evaluation scorecard and adoption health prompt

Champions Group Charter

Use this to formalise your Champions group. Share it with the group on day one so everyone knows the purpose, expectations, and rhythm.
Champions Group Charter — [Your Organisation]

Purpose
Test AI tools and use cases in the context of our business, evaluate what works, and make recommendations for broader rollout.

Members
- Driver: [Name, Role]
- Champion: [Name, Role / Department]
- Champion: [Name, Role / Department]
- Champion: [Name, Role / Department]
- Champion: [Name, Role / Department]

Responsibilities
- Test use cases in your own workflows each month
- Compare tools against agreed criteria
- Log wins and learnings in the Opportunity Tracker
- Attend monthly huddles and share progress
- Flag risks, limitations, and governance gaps
- Help onboard the broader team when scaling

Meeting rhythm
- Monthly huddle: [Day, Time] — 30 to 45 minutes
- Async channel: [Teams/Slack channel name]
- Quarterly reset: [Month] — governance review and tool evaluation

Decision rights
- Champions recommend. Leadership approves.
- Tool trials under [dollar amount] can be approved by the Driver.
- Tool trials above [dollar amount] require leadership sign-off.

Review date
This charter will be reviewed every [3/6] months.

Operating Rhythm Planner

Map out your cadence for the first 12 weeks. Adjust the frequency based on what your team can sustain.
Operating Rhythm Planner — [Your Organisation]

Weekly
- Quick check: did each Champion run at least one task through an AI tool this week? (Yes/No)
- Driver posts a prompt in the async channel

Fortnightly
- Show-and-tell: one Champion screen-shares a real workflow (30 mins)
- Champions group only for the first 4 weeks, then open to the broader team

Monthly
- Champion Huddle: wins, blockers, new use cases (30–45 mins)
- Update the Opportunity Tracker
- Count recurring AI workflows (your leading indicator)

Quarterly
- Run the confidence pulse survey (1–5 score)
- Re-place the team on the maturity scale
- Review governance: retire what is not working, evaluate new tools
- Run the "what if" refresh: what would you build if AI did 80% of the work?

Key dates
- Week 1: [Date] — Champions kickoff
- Week 4: [Date] — First showcase to broader team
- Week 6: [Date] — First sentiment pulse survey
- Week 12: [Date] — Second sentiment pulse survey and quarterly reset

Opportunity Tracker

Track every AI use case your Champions test. This becomes your evidence base for scaling decisions and budget conversations.
The Accelerator has a full guide on setting up and using the Opportunity Tracker, including a Tracker Overview and a step-by-step guide to creating a SharePoint submission form.
Opportunity Tracker — Column Structure

1. Use Case — What is the task or workflow?
2. Tool — Which AI tool was used?
3. Team / Department — Who tested it?
4. Owner — Who is the Champion responsible?
5. Size — Small (saves minutes), Medium (saves hours), Large (changes a workflow)
6. Status — Testing / Working / Scaled / Retired
7. Outcome — What happened? Time saved, quality improved, or new capability?
8. Notes — Anything else worth capturing (limitations, risks, next steps)

Instructions
- Add a new row for every use case tested, even if it did not work
- Update the status column monthly
- The Driver reviews this tracker before every Champion Huddle
- Use this as the basis for quarterly scaling decisions
Set this up in SharePoint, Notion, or whatever tool your team already uses. Do not create a new system. Put it where people already work.
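If your team wants a lightweight starting point before building the SharePoint or Notion version, the same column structure works as a plain CSV. A minimal sketch (the example row values are illustrative, not part of the template):

```python
import csv
import io

# Columns mirror the Opportunity Tracker structure above.
COLUMNS = [
    "Use Case", "Tool", "Team / Department", "Owner",
    "Size",    # Small / Medium / Large
    "Status",  # Testing / Working / Scaled / Retired
    "Outcome", "Notes",
]

def new_tracker_row(**fields):
    """Build a row dict, leaving any unspecified columns blank."""
    unknown = set(fields) - set(COLUMNS)
    if unknown:
        raise ValueError(f"Unknown columns: {unknown}")
    return {col: fields.get(col, "") for col in COLUMNS}

# Log a use case even if it did not work, per the instructions above.
row = new_tracker_row(**{
    "Use Case": "Summarise weekly sales calls",
    "Tool": "Copilot",
    "Status": "Testing",
})

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=COLUMNS)
writer.writeheader()
writer.writerow(row)
print(buffer.getvalue())
```

Whatever format you choose, keep the column names identical across tools so rows can move into SharePoint or Notion later without remapping.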

Tool Evaluation Scorecard

Use this when your Champions are comparing AI tools against specific business criteria.
Tool Evaluation Scorecard — [Tool Name]

Evaluation criteria (rate each 1–5)

1. Research and analysis
   How well does the tool handle research tasks, summarisation, and data analysis?
   Score: [ ] / 5

2. Content creation
   How well does the tool draft, edit, and format written content?
   Score: [ ] / 5

3. Workflow integration
   How easily does the tool fit into existing tools and processes?
   Score: [ ] / 5

4. Enterprise security
   Does the tool meet your data handling and compliance requirements?
   Score: [ ] / 5

5. Ease of use
   How quickly can a non-technical team member get productive?
   Score: [ ] / 5

Total score: [ ] / 25

Strengths:
Limitations:
Recommendation (keep / trial further / retire):
Evaluated by: [Name]
Date: [Date]
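The scoring arithmetic is simple: five criteria, each rated 1 to 5, summed to a total out of 25. A small sketch of that calculation — note the recommendation cut-offs here are hypothetical, not part of the template, so agree your own with the Champions group:

```python
# The five criteria from the scorecard above.
CRITERIA = [
    "Research and analysis",
    "Content creation",
    "Workflow integration",
    "Enterprise security",
    "Ease of use",
]

def score_tool(scores):
    """Sum five 1-5 ratings into a total out of 25."""
    if set(scores) != set(CRITERIA):
        raise ValueError("Score every criterion exactly once")
    if any(not 1 <= s <= 5 for s in scores.values()):
        raise ValueError("Each score must be between 1 and 5")
    return sum(scores.values())

def recommend(total):
    """Hypothetical cut-offs -- replace with thresholds your group agrees."""
    if total >= 20:
        return "keep"
    if total >= 13:
        return "trial further"
    return "retire"

total = score_tool({
    "Research and analysis": 4,
    "Content creation": 5,
    "Workflow integration": 3,
    "Enterprise security": 4,
    "Ease of use": 4,
})
print(total, recommend(total))  # 20 keep
```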

AI Rollout Plan (90-day)

Use this to map out your first 90 days after the cohort ends.
90-Day AI Rollout Plan — [Your Organisation]

Month 1: Foundation
- [ ] Finalise Champions group (Driver + 3–5 Champions)
- [ ] Share the Champions Charter with all members
- [ ] Set up async channel in Teams or Slack
- [ ] Run first Champion Huddle
- [ ] Each Champion identifies one use case to test
- [ ] Run sentiment pulse survey (baseline)

Month 2: Momentum
- [ ] Run first showcase for broader team
- [ ] Update Opportunity Tracker with all use cases tested
- [ ] Complete tool evaluation scorecards for primary tools
- [ ] Each Champion tests a second use case
- [ ] Driver reviews and updates governance if needed

Month 3: Scale
- [ ] Run second sentiment pulse survey
- [ ] Place team on maturity scale
- [ ] Run the adoption health analysis prompt
- [ ] Make scaling decisions: which use cases roll out to broader team?
- [ ] Quarterly reset: review governance, tools, and rhythm
- [ ] Present adoption health update to leadership

Success measures
- Number of recurring AI workflows: [target]
- Confidence pulse average: [target]
- Use cases in Opportunity Tracker: [target]
- Team members actively using AI weekly: [target]

Adoption Health Analysis Prompt

This is the prompt from the Measuring Adoption page. Keep it here for easy reference.
You are an AI adoption health analyst for a programme lead. I have uploaded my team's adoption data from three sources: our sentiment pulse survey results, AI platform usage data, and AI win/opportunity tracking.

Analyse this data and produce a one-page leadership update covering four dimensions:

Confidence and Skill — Rate as Strong, Developing, or Needs Attention.
Impact and Habit — Rate as Strong, Developing, or Needs Attention.
Culture — Rate as Strong, Developing, or Needs Attention.
Business Outcomes — Rate as Strong, Developing, or Needs Attention.

Then provide:
- Overall maturity placement: Curious, Exploring, Embedded, or Integral
- Overall health summary in one sentence
- Top three wins this period with evidence
- Top five AI use cases ranked by frequency or impact
- Top risk that leadership needs to be aware of
- One recommendation that requires senior support or a decision

Rules: plain language, no jargon, one page maximum, evidence-based only.

Quick checkpoint

Charter ready

You have a Champions Group Charter to share with your team

Rhythm planned

You have mapped your weekly, monthly, and quarterly cadence

Tracker set up

You know how to set up and maintain the Opportunity Tracker

90-day plan

You have a clear plan for the first three months after this cohort
