The Reality of AI in Engineering: Why Productivity Gains Are Capturing System Issues

The adoption of AI in engineering organizations is accelerating rapidly, yet many organizations struggle to translate early-stage testing into meaningful production results. In the latest SD Times Live! webinar, Will Lytle, CEO of Plandek, said the challenge is not the tools, but “how they are used in the system.” The most effective teams recognize that AI’s benefits are often absorbed by system constraints, preventing the gains from reaching delivery.

The AI Adoption Surge and the Perception Gap

AI adoption across engineering organizations has become nearly universal. Plandek’s polling data shows a sharp increase: in a survey six months ago, 30% of respondents had deployed AI to at least half of their engineering teams; in a survey conducted last month, that number jumped to 93%. In addition, 48% of organizations have now deployed AI to 90% or more of their teams, up from 12% six months earlier. These deployments span developers, product owners, and product teams, each using AI in their respective roles.

Despite this growth in adoption, Lytle pointed out a major disconnect: developers often feel they are faster, producing code and running tests more quickly, but this does not consistently translate into organizational speed. In fact, an MIT study found that while developers estimated they were about 20% faster, an analysis of actual delivery showed they were 19% slower.

Shifting Bottlenecks: Why AI’s Benefits Are Being Absorbed

The main problem is that AI does not automatically correct for team weaknesses or system flaws. “It’s because AI doesn’t fix the team, right? AI augments what’s already there,” Lytle explained.

Historically, the constraint has often been engineering capacity, but AI has changed that. Delivery performance often remains flat because the bottlenecks now sit in parts of the system where AI has yet to have a direct impact. Lytle notes that these new constraints are exposed by AI’s acceleration effect: “AI accelerates the way individuals deliver. But the constraints are now shifting to review cycles, scheduling, dependencies, ideas as part of the product development life cycle, and other aspects as part of your continuous delivery and continuous integration ecosystem,” he said.
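The shifting-constraint effect can be shown with a toy calculation (the stage names and durations below are illustrative assumptions, not Plandek data): if AI halves coding time but review and scheduling stages are unchanged, end-to-end lead time improves only modestly and the bottleneck simply relocates.

```python
# Toy model of a delivery pipeline: per-stage durations in days.
# Stage names and numbers are hypothetical, for illustration only.
before = {"coding": 5.0, "review": 3.0, "scheduling": 2.0, "deploy": 1.0}

# Suppose AI halves coding time but leaves every other stage untouched.
after = dict(before, coding=before["coding"] / 2)

def lead_time(stages):
    """End-to-end lead time is the sum of all stage durations."""
    return sum(stages.values())

def bottleneck(stages):
    """The constraint is the single longest stage."""
    return max(stages, key=stages.get)

print(lead_time(before), bottleneck(before))  # 11.0 coding
print(lead_time(after), bottleneck(after))    # 8.5 review
```

A 50% speedup in coding yields only about a 23% improvement in lead time here, and the constraint moves from coding to review, which matches the pattern Lytle describes.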

Measuring Success: The Four Pillars of Productivity

For organizations to drive meaningful change, they must first develop a standardized way to measure productivity. Plandek uses a framework called the four pillars of productivity to measure software engineering performance. These pillars are:

  • Focus: Ensuring investment and energy is directed to things that drive the business forward, such as new revenue or customer satisfaction, while accounting for time spent on support and maintenance.
  • Flow: Driving an efficient flow environment using metrics such as lead time to value, cycle time, and the new output and PR quotients introduced in the 2026 benchmarks.
  • Prediction: Measuring reliability and consistency, ensuring that delivery meets customer expectations using metrics such as sprint capacity accuracy and speed flexibility.
  • Quality: Focusing on building a quality product, and more importantly, driving feedback loops faster to reduce the time a bug or feature spends in the backlog. Addressing quality is directly related to improving time spent on support and maintenance.
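As a concrete illustration of the Flow pillar’s metrics, lead time to value and cycle time can be computed from work-item timestamps. This is a minimal sketch; the field names and dates are hypothetical, not Plandek’s actual schema.

```python
from datetime import datetime

def days_between(start, end):
    """Elapsed time in days between two ISO-8601 timestamps."""
    fmt = "%Y-%m-%dT%H:%M:%S"
    delta = datetime.strptime(end, fmt) - datetime.strptime(start, fmt)
    return delta.total_seconds() / 86400

# Hypothetical work item; field names are illustrative only.
item = {
    "created":  "2025-06-01T09:00:00",   # request raised
    "started":  "2025-06-03T09:00:00",   # work begins
    "released": "2025-06-10T09:00:00",   # value delivered
}

# Lead time to value: request raised -> delivered.
lead_time_days = days_between(item["created"], item["released"])   # 9.0
# Cycle time: work begins -> delivered.
cycle_time_days = days_between(item["started"], item["released"])  # 7.0
```

The gap between the two numbers (time spent queued before work starts) is itself a useful flow signal.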

Addressing System Issues

Identifying issues requires combining quantitative and qualitative data. Quantitative metrics (cycle time, KPIs) reveal where the system slows down, while qualitative indicators (developer frustration, stakeholder feedback) explain why.
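Pairing the two signals can be as simple as flagging stages where both agree. The sketch below is illustrative only; the stage names, metric values, and thresholds are assumptions, not real survey data.

```python
# Hypothetical per-stage data: median cycle time (days) and a 1-5
# developer-sentiment score from surveys. Values are illustrative only.
stages = {
    "code review":  {"cycle_days": 4.2, "sentiment": 2.1},
    "build & test": {"cycle_days": 1.1, "sentiment": 3.9},
    "deployment":   {"cycle_days": 0.5, "sentiment": 4.4},
}

# Flag stages where the quantitative signal (slow) and the qualitative
# signal (frustrated developers) agree -- candidates for investigation.
hotspots = [
    name for name, s in stages.items()
    if s["cycle_days"] > 2.0 and s["sentiment"] < 3.0
]
print(hotspots)  # ['code review']
```

A stage that is slow but well-liked, or fast but frustrating, warrants a different conversation than one that fails on both counts.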

Lytle outlined seven common categories of barriers, stressing that the leading barriers have evolved. They are governance and compliance, workflow and process, codebase and architecture, tools, documentation, training, and, finally, culture.

The most notable change over the past six months has been the rise of governance and compliance, and of workflow and process, as the leading categories of issues, reflecting increased regulatory requirements and complex processes. Codebase and architecture issues have also surged, as modern AI tools expose the difficulty of working in legacy or non-modular codebases.

Ultimately, Lytle advises organizations to change their operating model rather than engage in slow, multi-year change-management programs. The focus should instead be on pace and a tight feedback loop to quickly assess the impact of changes.

“I would say lead through change, rather than trying to change the management of everything in a one-year, 2-year, 3-year plan,” concluded Lytle.

View the full webinar here.
