Capacity planning has always been a high-stakes exercise in customer service. If you get it wrong, you’ll feel it quickly in backlogs and SLAs.
AI changes the dynamics of capacity planning because it changes the work your team does. It resolves the bulk of your volume, increases the speed at which work gets done, and makes the work your human teammates do harder and higher-value.
This is part five of our five-part series on customer service planning for 2026. We’ll be sharing all five editions on our blog and on LinkedIn.
If you’d rather have them emailed to you directly as they’re published, drop your details here.
This year’s challenge is the balance: you need to be ambitious about how much your AI Agent will handle so you can plan your team’s responsibilities and system-level work. But if those automation assumptions are wrong, you risk being understaffed.
This final edition in our 2026 planning series is about that tension. We’ll unpack how AI changes the logic of capacity planning, what we’ve learned from going through this exercise for the past few years at Intercom, and the traps to avoid.
How AI changes traditional capacity planning
Traditional planning rests on relatively stable assumptions:
- Volume grows at a predictable rate.
- Work types are relatively consistent.
- Handle times don’t change dramatically.
- Productivity (or “output”) can be held flat or improved over time.
In an AI-first model, none of that is guaranteed. AI changes the fundamentals:
- The mix of work changes. AI absorbs a growing share of simpler conversations. What reaches humans is more complex, more time-consuming, and often requires more human-to-human connection.
- Demand can increase. When you remove friction, customers contact you more. AI can both resolve more and attract more volume at the same time.
- Human time is split differently. Your human teammates must solve customer problems, and also review AI behavior, give feedback, improve content, and support system-level work.
- Performance is dynamic, not fixed. Automation rate isn’t a one-time number. It can go up as you improve the AI, and down if you neglect it.
If you plan for 2026 using a pre-AI model, i.e., assuming similar productivity, similar work mix, and a simple linear relationship between volume and headcount, you’ll underestimate what it takes to run a high-performing support organization.
There are many metrics you can track, but the one you should really focus on is “automation rate” (AI Agent involvement rate × AI Agent resolution rate).
It tells you:
- What share of your total volume AI is actually resolving.
- How much work is left for humans to manage.
- How much additional volume humans can absorb if demand increases.
- How ambitious you can be with your headcount plan.
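As a quick sketch, the automation rate calculation looks like this (the figures below are made up for illustration; substitute your own involvement and resolution metrics):

```python
# Hypothetical figures for illustration only.
involvement_rate = 0.85  # share of conversations the AI Agent participates in
resolution_rate = 0.70   # share of those conversations it fully resolves

# Automation rate = involvement rate x resolution rate
automation_rate = involvement_rate * resolution_rate
print(f"Automation rate: {automation_rate:.1%}")
```

Note that a high involvement rate with a modest resolution rate (or vice versa) can produce the same automation rate, which is why the two levers matter separately depending on where you are in the journey.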
For teams earlier in the journey, the priority is often raising involvement (getting the AI involved in more conversations). For teams further along whose involvement rate is already high, the focus should be on moving resolution on the hardest remaining work, where each additional 1% of automation can represent several people’s worth of capacity.
In your 2026 plan, automation rate should sit alongside projected inbound volume, average “output” per person (for the more complex work that remains), and occupancy (how much time is allocated to customer-facing interactions vs. other operational and strategic work).
Together, these inputs give you a more realistic picture of how many people you need, and where they should spend their time.
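To make that concrete, here’s a minimal back-of-the-envelope model using the inputs above. All numbers are hypothetical placeholders, not benchmarks:

```python
# Hypothetical planning inputs for illustration; plug in your own data.
monthly_volume = 50_000     # projected inbound conversations per month
automation_rate = 0.60      # involvement rate x resolution rate
output_per_person = 400     # complex conversations resolved per person per month
inbox_occupancy = 0.70      # share of time on customer-facing work (vs. system work)

# Volume left for humans after AI resolution
human_volume = monthly_volume * (1 - automation_rate)

# Effective monthly capacity per person, adjusted for occupancy
effective_output = output_per_person * inbox_occupancy

headcount = human_volume / effective_output
print(f"Human-handled volume: {human_volume:,.0f} conversations/month")
print(f"Estimated headcount needed: {headcount:.1f}")

# How much capacity each additional 1% of automation frees up
freed_per_point = (monthly_volume * 0.01) / effective_output
print(f"Capacity freed per +1% automation: {freed_per_point:.1f} people")
```

The last calculation illustrates the earlier point: at meaningful volume, each additional percentage point of automation can be worth one or more full-time people of capacity.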
Here’s our advice for capacity planning heading into next year.
1. Plan boldly on automation, but match it with investment
One of the biggest questions leaders wrestle with is “How bold can we be with automation assumptions?”
Don’t be afraid to plan for high automation rates, as long as you’re willing to invest in hitting them.
It’s tempting to be conservative and cap your automation assumptions at 40–50% “because AI is new.” But in practice, many teams are already planning for much higher automation rates in 2026 – 60%, 70%, even 80%+ – because they’ve invested properly in AI ownership and content.
The investment element here is crucial. To hit those numbers, you need:
- Named ownership for AI performance (AI ops, knowledge management, conversation design).
- Clear automation targets by work type (e.g. informational vs. personalized vs. actions vs. deep troubleshooting).
- Realistic expectations for what’s “easy” to automate and what’s not.
- A plan for how you will raise automation over time (monthly or quarterly steps, rather than a single jump).
For teams earlier in their journey, you should dig into your data to find these primed areas for investment in your own business:
- Start by looking at your biggest volume drivers.
- Separate issues that are mostly content-based from ones that depend on data or complex procedures.
- Assume higher resolution potential for content-led topics once your knowledge is in shape.
- Assume more modest initial resolution for complex, system-dependent flows, and build up from there.
Bold automation goals are great, but they have to go hand in hand with investment in the team structure and systems you need to realistically achieve them.
2. Expect human “output” per person to go down
Whether you call it “productivity,” “output,” or even just “cases closed,” the traditional volume-based metrics for support teams need to change in 2026.
This is one of the hardest mindset shifts for support leaders, because historically, capacity plans assume that individual productivity will either stay flat or improve slightly as processes, tools, and training get better.
In an AI-first model, the opposite is more realistic. As AI takes on more of the work, humans are moving into roles that are harder, more time-consuming, more complex, and more cross-functional. So even though they’re handling fewer conversations, they’re creating more value.
If you don’t factor this into your capacity plan, it won’t accurately reflect the work your team is actually doing.
In your 2026 plan, it’s safer to:
- Model a lower “cases closed per person” than your baselines in previous years.
- Explicitly assume that the remaining work will be more complex and time-consuming.
- Recognize that “productivity” now includes system-level work like AI Agent improvements, not just conversations or cases closed.
3. Rethink occupancy: more time off the queues, on higher-value work
In most capacity plans, occupancy is treated as what percentage of time agents spend in the inbox versus in training, meetings, and on breaks. But your team now has a growing list of “out-of-inbox” or “off-the-queue” responsibilities that directly affect AI performance and overall capacity, like:
- Reviewing AI-handled conversations.
- Improving AI Agent triaging and handovers.
- Contributing to content and procedures.
- Feeding insights back to product and engineering.
- Supporting system changes that reduce future volume.
This means you’ll likely need to set lower inbox occupancy targets than before. And while it might feel strange to lower these, just remember (and communicate up) that it’s not that people are working less; they’re working differently.
When you’re planning capacity for 2026:
- Assume more time spent on improvement and system work, not less.
- Make that visible in the plan (e.g. X% time in inbox, Y% time on AI and system improvement).
- Treat this work as critical, not “nice to have when there’s spare time.”
If you don’t proactively allocate time for this, it won’t be prioritized (and your automation and performance targets will suffer).
4. Work with the finance team early, and treat your plan as a set of assumptions
Capacity planning with AI is ultimately a set of bets you’re willing to make.
Those bets are based on automation rate, human output, demand growth, occupancy, and where surplus capacity (if any) will go. You need to make these calls in collaboration with your finance partners.
Here’s our advice:
- Bring the finance team in early. Help them understand that this plan is different: more dynamic, more assumption-driven, and directly tied to AI performance.
- Be clear that these are assumptions. Automation, demand, and complexity will ebb and flow, so you’ll need flexibility in your plan to adjust.
- Commit to a quarterly review cadence. Revisit the plan with finance every quarter (at least) to compare assumptions vs. reality and adjust headcount, targets, and investment as needed.
This matters because if you don’t plan openly and honestly with finance, you take on real risks. For example:
- If you assume automation will grow faster than it does and cut or stop backfilling too early, you can end up understaffed for months.
- Hiring and onboarding take time; by the time you realize you’re short, it’s often too late to catch up without heavy strain.
- On the other hand, if your plan produces a surplus of people, you need a clear strategy to reallocate them to higher-value work rather than defaulting to reductions.
This is where your broader AI transformation story comes in: repurposing people to improve systems, feed insights back, support new channels, or drive proactive CX.
Set your team up for success in 2026
- Put ‘automation rate’ at the center of your plan. Use it to anchor your assumptions about how much of the work AI will handle.
- Plan for humans to handle fewer, harder conversations. Align targets with the reality of post-AI work.
- Protect time for system improvement. Treat “out of inbox” work as essential, not optional.
- Work with finance early and often. Align on assumptions, review quarterly, and keep the plan alive.
- Avoid shrinking too fast. Be ambitious on AI, but don’t put your customers at risk by cutting headcount before your automation performance is proven.
- Have a plan for surplus capacity. If AI over-delivers, know how you’ll redeploy people into work that compounds value.
If AI is going to handle the majority of your customer conversations, your plan has to be designed to help it do that well, and to keep your team set up for meaningful, sustainable work.
A 2026 plan built on adaptable assumptions, not fixed predictions, is one that will hold up as your team’s work, the system around it, and your customers’ expectations continue to change.
If you’d like to subscribe to future series like this, drop your details here.
