
AI Exposed the Process Debt You've Been Ignoring

6 mins
Justin Smestad

Code has never been cheaper to produce. AI tools can scaffold a feature in minutes that would have taken a day. They write tests, generate boilerplate, refactor patterns, and autocomplete their way through problems that used to eat an afternoon.

And teams with broken processes are shipping broken software faster than ever.

That’s the part nobody warned you about. AI is an accelerant. If your team has clear scope, honest estimates, and rituals that actually surface problems, AI makes you dramatically faster. If your team has vague tickets, fictional estimates, and standups where everyone just reads off the board, AI helps you produce more of the wrong thing in less time.

The hard part of software was never writing the code. It was figuring out what to build, agreeing on what “done” means, and keeping everyone pointed in the same direction. Those skills got treated as optional for years. They weren’t, but teams could get away with it when the cost of writing code was high enough to slow everything down naturally. That natural brake is gone now. If your process has cracks, AI is going to blow them wide open.

The system, not the pieces

Walk into a busy restaurant kitchen. Every cook on the line is talented. They can each make great food. But a great kitchen isn’t great because of any one thing. It’s the prep list that feeds the line. The ticket system that sequences the orders. The calls between stations that keep timing in sync. The expo checking every plate before it leaves the window. Remove any one of those and the kitchen still functions, just poorly enough that nobody can pinpoint why. The food is good but the timing is off. The timing is fine but the wrong dish goes to the wrong table. Each piece is simple on its own. The system is what makes them a restaurant instead of six people cooking in the same room.

Engineering teams work the same way. Scoping should feed estimation. Estimation should set expectations. Expectations should drive what you talk about in standup. Standup should surface blockers. The board should reflect all of it. When one link is missing, the whole chain drifts. And the drift is slow enough that you don’t notice until someone asks “what are we even working on?” and nobody has a confident answer.

Now add AI to that drifting team. The code shows up faster, but the drift accelerates too. You’re merging PRs that don’t match the ticket because the ticket was vague. You’re shipping features that don’t solve the problem because nobody scoped the problem clearly. You’re moving tickets to “done” without anyone agreeing on what “done” meant. The velocity chart looks great. The product doesn’t.

What actually breaks

Most teams don’t have a broken process. They have half a process. Standups that don’t connect to the board. Tickets that don’t describe what “done” looks like. Estimates that nobody trusts, including the person who gave them.

The problem is that each piece was adopted in isolation. Someone read a blog post about standups. Someone else copy-pasted a ticket template from a previous job. Now you’ve got a Frankenstein workflow where nothing feeds into anything else. And when something slips, nobody can point to where the system broke, because there was no system. Just a collection of practices duct-taped together.

Before AI, that duct tape held. Barely. The slowness of writing code gave teams time to course-correct. A developer would start building, realize the ticket was vague, walk over (or Slack over) and ask clarifying questions. That back-and-forth was inefficient, but it was also a safety net. AI removes the safety net. The developer prompts, the code appears, the PR goes up, and nobody paused long enough to ask if this was the right thing to build. And when someone does pause, when a developer actually stops and asks “what are we solving here?”, the answer comes back vague or conflicting. That’s not an AI failure. That’s the process failing at speed.

The skills that matter now

Every blog post about standups tells you to answer three questions. Very few tell you what to do when the answer to “what are you working on?” doesn’t match what the board says. That’s the interesting part. That’s where the system either works or falls apart.

The same is true for estimation. Everyone argues about story points vs. hours vs. t-shirt sizes. Almost nobody talks about what you’re actually measuring, which is confidence and risk, not time. And once you frame it that way, the entire conversation changes. You stop arguing about whether something is a 3 or a 5, and start asking “what don’t we know yet?”

That question matters more now than it ever has. When code is cheap, the expensive mistakes are building the wrong thing, solving the wrong problem, and discovering too late that nobody agreed on the goal. Those are all process failures, not technical ones. And they’re the failures that AI accelerates.

You’ll hear people say that models will get better at this. That AI will learn to scope, prioritize, and ask the right questions on its own. Maybe it will. But even perfect AI-generated tickets don’t fix a team that doesn’t agree on what “done” means, or doesn’t trust its own estimates, or runs standups that nobody listens to. The system that connects scoping to estimation to communication to accountability is a human system. Better models don’t build that for you. And when multiple engineers are prompting AI against overlapping or poorly scoped tickets, the agents collide. You get duplicate work, conflicting implementations, and more rework than before. The coordination problem gets worse, not better.

The skills that separate effective teams from fast-but-chaotic ones were never optional. They just felt optional when everything moved slowly enough to self-correct. Clear scoping. Honest estimation. Rituals that earn their keep. A board that reflects reality. Communication that replaces the hallway conversations remote teams don’t have.

Those are what separate teams that use AI effectively from teams that use AI to ship chaos faster. The code is the easy part. It always was. The difference is that now there’s nowhere left to hide.

Building the kitchen

I’ve spent more than a decade running into this problem at companies from seed stage to Shopify-scale. The details change every time. The principles don’t. Write clear tickets. Estimate confidence, not hours. Run rituals that earn their keep. Make the board reflect reality. Hold each other accountable without making it personal.

I wrote all of it down in a Dev Handbook. It starts with a parable about expectations and works through scoping, rituals (standups, planning, retros, estimation), the daily routine that keeps it all in sync, and templates you can use tomorrow.

You don’t need to adopt all of it. But if you adopt any of it, the handbook explains how each piece connects to the rest. Because the goal isn’t to add more practices. It’s to build a kitchen, not just hire better cooks.