Where Are You in the AI Adoption Transition?

What stage is your team actually in?

Most B2B SaaS teams already have AI activity across product, design, engineering, and QA. The issue is not whether AI is present. The issue is whether the gains are reliable, whether usage is consistent enough to trust, and whether any of that value survives the handoffs between teams. McKinsey's 2025 State of AI survey found that workflow redesign has the biggest effect on whether organizations see EBIT impact from gen AI, while Productboard reports that AI adoption is already widespread in product teams, though maturity still varies widely.

We have talked to hundreds of product and engineering leaders at mid-market B2B SaaS companies, and the same pattern keeps appearing: most companies fit one of four stages right now, and most are in the messy middle rather than at zero. Activity is visible, but value is still scattered. The move that matters is from tool rollout into workflow change, and then into operating-model change.

The four stages we see most often

Stage 1

Shadow and scattered AI

Individuals are experimenting. Learning is private. Leadership has low visibility.

Stage 2

Tool rollout

The company has bought access and created visible activity, but wins are still scattered and fragile.

Stage 3

Workflow infusion

AI is starting to improve recurring work across the PDLC. Gains begin to survive the handoffs.

Stage 4

System shift

The company is changing decisions, intake, measurement, and operating rhythms, not just workflows.

AI Catalyst helps teams move from tool rollout to workflow infusion faster, and helps later-stage teams strengthen what is already starting to work.

See where the gains are breaking down

Score each question from 1 to 5 based on what is most true in your organization today. You'll see what stage you're in, what pattern is holding you back, and what has to change next.

1. Functional spread of AI usage
Across product, design, engineering, and QA, how broadly has AI spread into day-to-day work?
This is about how far AI has spread across product, design, engineering, and QA, not whether every individual is using it.
1 AI use is isolated to a few individuals or experiments.
2 AI is showing up in one function, but barely outside it.
3 AI is active in two functions, but not broadly across the PDLC.
4 AI is active across most major functions, though unevenly.
5 AI is active across product, design, engineering, and QA as part of normal work.
2. Consistency of usage and output
Within the functions where AI is already present, how consistent is usage and output quality across teams and individuals?
This is not about how many functions are using AI. It is about whether usage is consistent enough inside those functions to create reliable value instead of hidden rework.
1 Very inconsistent. A few standout people drive most of the value.
2 Uneven. Some teams are using AI well, others barely use it or get weak results.
3 Mixed. There are pockets of consistency, but outcomes still depend heavily on who is doing the work.
4 Mostly consistent. Most teams have a workable baseline, though some variation remains.
5 Consistent. Capability is broad enough that results do not depend heavily on a few standout individuals.
3. Shared workflows
How many recurring workflows have shared AI practices, not just individual use?
1 None. AI is mostly personal and ad hoc.
2 One or two workflows have informal shared patterns.
3 A few recurring workflows have shared templates, expectations, or review norms.
4 Several important workflows have shared practices across teams.
5 Shared AI workflows are established across multiple recurring parts of the PDLC.
4. Handoff survival
When AI helps one team move faster, how often does that gain survive the handoff to the next team?
1 Almost never. The gain dies in the transition.
2 Rarely. Some speed is created, but the next team usually has to redo or reinterpret the work.
3 Sometimes. Some gains survive, but handoffs are still inconsistent.
4 Often. Handoffs usually preserve the value created upstream.
5 Consistently. AI gains carry through transitions and improve end-to-end flow.
5. Measurable outcomes
Can you point to clear PDLC outcomes that improved because of AI?
Think less rework, faster review cycles, better handoffs, fewer defects, faster release readiness, or stronger requirement quality.
1 No clear outcomes. Mostly activity and anecdotes.
2 Weak evidence. Some local wins, but no strong proof.
3 A few visible improvements, but not yet repeatable or broadly trusted.
4 Clear improvements in one or more PDLC outcomes.
5 Clear, repeatable improvements in multiple PDLC outcomes tied to delivery or business impact.

Your stage and pattern


If the gains are real but they are not surviving the system, that is the work.

This is where AI Catalyst helps teams shorten time to value.

Research behind this diagnostic

McKinsey, The State of AI: How Organizations Are Rewiring to Capture Value (2025): workflow redesign had the biggest effect on whether organizations saw EBIT impact from gen AI.

Workday, AI Friction Into Flow (2026): for every 10 hours of productivity gained, about 4 hours were paid back in rework: correcting, clarifying, and refining AI output.

Section, AI Proficiency Report (2026): significant perception gaps exist between leadership and individual contributors over whether AI is saving time and how people feel about it.