Agile Practice

Agile Maturity Assessment

Are you doing Agile, or being Agile?

Many teams follow Agile ceremonies without the underlying values and behaviours that make them work. This assessment evaluates five dimensions of genuine Agile maturity: how deeply values and mindset are embedded, how well practices are executed, how smoothly work flows, how seriously learning is taken, and how effectively teams collaborate with stakeholders.

Two versions available: one for teams, one for organisations. Answer 15 questions on a 1–5 scale and get an instant radar chart with targeted guidance.
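The scoring described above — 15 answers on a 1–5 scale, averaged into five dimension scores for a radar chart — can be sketched roughly as follows. This is an illustrative assumption about how the tool might aggregate answers (consecutive groups of three questions per dimension), not its actual implementation.

```python
# Illustrative scoring sketch: 15 answers (1-5), three per dimension,
# averaged into one score per dimension for plotting on a radar chart.
# The grouping of answers into dimensions is an assumption.

DIMENSIONS = ["Values & Mindset", "Practices", "Flow", "Learning", "Collaboration"]

def score(answers: list[int]) -> dict[str, float]:
    """Average each consecutive group of three 1-5 answers into a dimension score."""
    if len(answers) != 15 or any(not 1 <= a <= 5 for a in answers):
        raise ValueError("expected 15 answers, each between 1 and 5")
    return {
        name: sum(answers[i * 3 : i * 3 + 3]) / 3
        for i, name in enumerate(DIMENSIONS)
    }

# Example: a team strong on Flow but weak on Learning
print(score([3, 4, 3, 4, 4, 5, 5, 5, 4, 2, 2, 1, 3, 3, 4]))
```

A per-dimension average (rather than a single total) is what makes the radar chart useful: it surfaces imbalances, such as polished ceremonies masking weak learning loops.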

Values & Mindset

Assess whether the team genuinely lives the Agile values (individuals and interactions, working software, customer collaboration, responding to change) rather than merely complying with a process.

  1. When the team faces a difficult trade-off, do they default to Agile values (e.g. customer value over following the plan) or to whatever is easiest or safest politically?

     1: We follow the path of least resistance
     5: We consistently apply Agile values under pressure
  2. How psychologically safe is it to raise concerns, challenge estimates, or admit that something isn't working during ceremonies like standups and retrospectives?

     1: People stay quiet to avoid conflict or judgement
     5: Honest, uncomfortable truths are welcomed and acted on
  3. Does the team treat Agile as a mindset that shapes how they think and communicate, or primarily as a set of rituals and meetings to complete?

     1: Agile is a checklist of ceremonies we run
     5: Agile values genuinely shape our daily decisions and interactions

Practices

Assess whether the team's ceremonies (daily standup, sprint planning, sprint review, and retrospective) are well-facilitated, purposeful, and genuinely useful rather than obligatory rituals.

  1. After your sprint planning, does every team member leave with a clear, shared understanding of the sprint goal and how their work connects to it?

     1: Planning ends with tasks assigned but no shared goal
     5: Every member can articulate the sprint goal and their role in it
  2. How useful is your daily standup as a coordination and impediment-surfacing event? Would the team miss it if it disappeared?

     1: It is a status report to the Scrum Master that nobody values
     5: It drives real coordination and surfaces blockers same-day
  3. Does your sprint review create a genuine feedback loop with stakeholders, or is it a demo where the team shows work and stakeholders nod along?

     1: It is a one-way demo with no meaningful stakeholder input
     5: Stakeholders engage, challenge, and influence the next sprint

Flow

Assess whether work moves smoothly through the team's system: WIP is managed, bottlenecks are visible and addressed, and quality is built in rather than inspected in at the end.

  1. Does the team actively manage work-in-progress limits, or does everyone have multiple items in flight simultaneously, creating context-switching overhead and invisible bottlenecks?

     1: No WIP limits; everyone juggles multiple items
     5: WIP limits are respected and flow is actively managed
  2. When a bottleneck appears (a blocked story, a slow review process, a missing dependency), how quickly does the team identify and address it?

     1: Bottlenecks stay invisible for days or are treated as someone else's problem
     5: Bottlenecks are visible within hours and the team swarms to resolve them
  3. Does the team have a meaningful definition of done that includes quality criteria such as automated tests, code review, and acceptance criteria sign-off, and do they actually enforce it?

     1: Definition of done exists on paper but is routinely bypassed under pressure
     5: Definition of done is non-negotiable and quality is genuinely built in

Learning

Assess whether the team's retrospectives drive real, sustained improvement, whether the team experiments deliberately, and whether feedback loops are short enough to inform decisions before they become expensive.

  1. Looking back over the last three months, can you name specific, measurable improvements that came directly from your retrospectives? Or do the same problems keep surfacing?

     1: The same issues recur; nothing materially changes
     5: We can name concrete changes that came from specific retrospectives
  2. Does the team deliberately run experiments (trying a new practice for a sprint, changing a process, testing a hypothesis) and then review the results with evidence?

     1: We rarely experiment; change happens by mandate or accident
     5: We run deliberate, time-boxed experiments and review outcomes with data
  3. How short is the feedback loop between delivering an increment and learning whether it solved the problem it was intended to solve?

     1: Weeks or months; we rarely close the loop
     5: Days; feedback from real usage reaches the team quickly and informs the next sprint

Collaboration

Assess the quality of the team's stakeholder engagement, cross-functional working, and use of real customer feedback to drive decisions.

  1. Does the team regularly engage with real customers or end-users, not just internal proxies, and does that feedback directly influence what the team builds next?

     1: We never speak to real users; a product owner relays what they think users want
     5: We regularly talk to or observe users, and it meaningfully shapes our backlog
  2. When the team needs input from another discipline (design, security, operations, legal), is that collaboration smooth and built into the workflow, or is it a source of delay and hand-offs?

     1: Cross-functional dependencies are a consistent source of delays and hand-offs
     5: Cross-functional collaboration is embedded in our team and workflow
  3. Do stakeholders know what the team is working on, why, and when they can expect to review it, and do they actively participate in sprint reviews?

     1: Stakeholders are surprised by what we deliver and rarely attend reviews
     5: Stakeholders are engaged partners who shape what we build through regular collaboration