In 2025, 90% of developers use AI daily, spending about two hours a day on coding, documentation, and automation. Teams with strong practices, such as version control, observability, and small-batch delivery, are seeing real benefits: 80% report faster delivery and 59% report better outcomes. But AI also magnifies weaknesses: poor processes translate into friction, burnout, and rework. The takeaway is simple: AI boosts performance, and paired with discipline and continuous learning, DevOps practices multiply it.
In this episode of AppDevANGLE, I spoke with Nathen Harvey, who leads DORA at Google Cloud, about the 2025 report (now focused on AI-assisted software development) and what leaders should change right now to capture the upside without breaking stability or people.
AI as Amplifier, Not a Silver Bullet
We started with the core finding: AI accelerates delivery, but only within healthy systems.
“As you use more AI in the SDLC, throughput and stability don’t automatically rise together,” Nathen told me. “In the 2024 report we saw throughput and stability fall with more AI use. This year, throughput turned around, but instability persists. If we only use AI to generate code, we ignore systemic concerns.”
Nathen pointed at code review as a recurring bottleneck: "It's a place where AI can lean in: faster, higher-quality reviews that shorten the feedback loops that matter."
Roles Are Shifting From Creators to Curators
With 90% using AI for coding, docs, and workflow automation, developers’ roles are evolving.
"My job isn't just to write code," Nathen said. "It's to satisfy user needs. That means thinking more like an architect or team lead: understanding the problem, breaking it down, and guiding the solution."
As AI handles more generation, engineers spend more time on prompting, reviewing, integrating, and deciding: the human-judgment layer that aligns work with users and business outcomes.
Team Archetypes and the AI Capabilities Model
This year's DORA study clustered nearly 5,000 respondents into seven archetypes, from "foundational challenges / legacy bottlenecks," through "productivity pragmatists," to "harmonious high achievers."
“Leaders should assess their teams across performance, product outcomes, delivery, and individual effectiveness, including friction and burnout,” Nathen said. “Then apply the AI capabilities model. It’s not if you use AI, but how. If product performance is weak, the model highlights the 2–4 enabling capabilities that will amplify AI’s impact.”
Build Speed and Stability the Boring Way
The practices that worked pre-AI still work, arguably more than ever.
“One fundamental capability is working in small batches,” Nathen explained. “Ship smaller, frequent changes. They’re easier to reason about, easier to roll back, and less risky when something goes wrong.”
"Pair that with strong version control. Check in very frequently. As AI generates more code, insist that it lands in small segments so humans can review and understand it. That's where the role change shows up: more editing and reviewing, fewer raw keystrokes."
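That small-batch, frequent-check-in guidance can be sketched as a plain git workflow. Everything below, including the repository, branch, and file names, is illustrative rather than drawn from the report:

```shell
# Minimal sketch of a small-batch workflow: one narrow change per branch,
# committed as soon as it is coherent, so every diff stays reviewable.
git init --quiet small-batch-demo && cd small-batch-demo
git -c user.name=demo -c user.email=demo@example.com \
    commit --quiet --allow-empty -m "init"

git switch --quiet -c feature/email-validation   # one focused branch
printf 'def is_email(s):\n    return "@" in s\n' > validators.py
git add validators.py
git -c user.name=demo -c user.email=demo@example.com \
    commit --quiet -m "Add basic email check"    # small, easy to review or revert

git log --oneline    # two tiny commits, each simple to reason about
```

The point of the sketch is the shape of the history it produces: small commits that a human reviewer can understand quickly, and that can be rolled back individually when AI-generated code misbehaves.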
Governance That Guides, Then Adapts
AI policy can’t be a one-and-done PDF.
“We need a clear, communicated stance on acceptable use. What data can be shared, where AI is allowed,” Nathen said. “But the space moves fast. Policies must be flexible and re-evaluated as tech and practices evolve. Too rigid, and you’ll stall learning; too vague, and you’ll invite risk.”
Psychological safety matters, too. Make space for skill development so juniors can still become seniors. “Technology has always changed,” Nathen reminded me. “We created paths for cloud engineers; we need paths for AI-assisted engineers.”
Path to Adoption
- Map your archetype: Use DORA’s seven profiles to identify where your team is today.
- Enable the right capabilities: Apply the AI capabilities model to select 2–4 leverage points (e.g., small batches, version control rigor, faster code review).
- Shorten feedback loops: Use AI for review and test assistance, not just generation. Instrument PRs, reviews, and rollbacks.
- Ship smaller, more often: Enforce small-batch changes and frequent commits to couple AI speed with stability.
- Codify policy, then iterate: Publish clear AI usage guidance; revisit quarterly as tools and risks evolve.
- Invest in skills and safety: Budget time for learning, mentoring, and safe experimentation to avoid burnout and hollowed-out career ladders.
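One way to start instrumenting the feedback loops named above is to mine git history itself. The sketch below fabricates a tiny repository (all commit messages are hypothetical) so it is self-contained; against a real repository, only the two queries at the end are needed to surface a rollback signal:

```shell
# Hedged sketch: derive a simple rollback-rate signal from git history.
# The demo repo and its commits are fabricated for illustration.
git init --quiet feedback-demo && cd feedback-demo
G="git -c user.name=demo -c user.email=demo@example.com"
$G commit --quiet --allow-empty -m "Add login endpoint"
$G commit --quiet --allow-empty -m "Tighten input validation"
$G commit --quiet --allow-empty -m "Revert \"Tighten input validation\""

# Reverts as a share of total commits: a crude but honest stability signal.
total=$(git rev-list --count HEAD)
reverts=$(git log --oneline --grep='^Revert' | wc -l)
echo "reverts: $reverts of $total commits"
```

Tracking this number over time, alongside review turnaround on PRs, gives teams a concrete baseline to judge whether AI-accelerated output is arriving with or without stability.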
Analyst Take
AI is proving to be the ultimate amplifier of software delivery practices; it accelerates the good and magnifies the bad. The 2025 State of AI-Assisted Software Development report makes it clear that performance gains don’t come from AI alone, but from pairing AI with DevOps discipline.
The most successful teams share three defining traits:
- System Over Snippets – AI’s value compounds when it’s embedded into the full lifecycle, not isolated to code generation. Teams that apply AI to reviews, testing, deployment, and observability see measurable improvements in stability and speed.
- Small Batches and Strong Control – The hallmark of elite performers remains small, frequent changes under tight version control. This model gives AI the space to iterate safely, shortens learning cycles, and ensures that automation can be rolled back quickly when it misfires.
- People, Policy, and Psychological Safety – Governance is not about restriction but rather about direction. Clear, evolving policies define what “good AI” looks like in context, while psychological safety ensures developers can experiment, learn, and upskill without fear of failure.
The research also signals a structural evolution inside DevOps teams:
- Developers are shifting from creators to curators, focusing more on orchestration, prompt design, and architectural oversight.
- Leadership is becoming system-centric, using DORA’s new AI Capabilities Model to tailor strategies across seven team archetypes.
- AI governance is becoming a DevOps competency, sitting alongside CI/CD, observability, and reliability as a pillar of performance.
Ultimately, AI’s impact on engineering performance mirrors the evolution of DevOps itself. The organizations that thrive will treat AI not as a shortcut, but as a continuous improvement engine that is measured, instrumented, and guided by human judgment.
In the next phase of software delivery, AI-assisted DevOps will define what operational excellence means. The best teams won’t just deliver faster; they’ll learn faster, adapt faster, and build cultures resilient enough to keep improving as AI keeps evolving.

