In 2025, “doing more with less” became a mandate. Health plans invested heavily in platform optimization—care management systems, analytics, interoperability, and workflow automation—expecting efficiency gains and cost relief.
As teams look toward 2026, a quieter question is emerging:
Why didn’t these efforts deliver the outcomes we expected?
The issue wasn’t technology. In most cases, platforms performed as configured. The gap was execution: systems were optimized faster than the operating models around them.
Across plans, the same patterns surfaced repeatedly.
⚙️ 1. Optimization Was Treated as a Technical Event
Optimization efforts moved quickly from a systems perspective, but operational change lagged. Adoption was assumed, not engineered. Workflows stayed inconsistent, and users reverted to familiar habits.
💡 What worked instead:
- Defining a small number of measurable operational outcomes upfront
- Engaging frontline users early
- Treating adoption as a deliverable, not a byproduct
The question shifted from “Was it configured?” to “Did the work actually change?”
📋 2. Governance Didn’t Keep Pace with Configuration
As platforms became more flexible, decision-making became more fragmented. Decision rights were unclear, documentation was thin, and configuration drift followed.
By year-end, many teams said the same thing:
“We optimized it last year, but everyone is doing it differently now.”
💡 What worked instead:
- Clear, documented decision rights
- A single source of truth for configuration rationale
- Governance treated as an operating function, not a committee
The goal wasn’t control—it was continuity.
🧩 3. Staffing Masked Structural Issues
Temporary resources helped stabilize operations, but often delayed fixing root causes. Knowledge stayed with contractors, and durability suffered once they rolled off.
💡 What worked instead:
- Using staffing to transfer knowledge, not just absorb volume
- Fixing workflows before adding headcount
- Identifying which capabilities needed to be internal and sustained
Capacity supported change—it didn’t replace it.
🧪 4. Testing Was Compressed
Testing happened, but often too late or too narrowly. Real-world clinical scenarios and downstream impacts surfaced after go-live, eroding confidence and ROI.
💡 What worked instead:
- Scenario-driven testing tied to real operations
- Earlier visibility into downstream impacts
- Enough time to build user confidence before adoption
The objective wasn’t perfection. It was trust.
🥇 What the Plans Seeing ROI Did Differently
Plans that realized value didn’t rely on better tools. They relied on better execution.
They treated optimization as an operating model change—not a project. Governance held. Testing reflected reality. Knowledge transfer was intentional. Accountability extended beyond go-live.
They didn’t just optimize platforms.
They optimized how work actually gets done on the platform.
🛠️ Looking Ahead
The lesson from 2025 is clear: optimization success has far less to do with tools and far more to do with execution, ownership, and follow-through.
As health plans refine priorities for 2026, many are rethinking not just which platforms they use, but how they operate around them. Rather than leaning on single vendors or loosely coordinated support models, we’re seeing plans focus on tighter alignment between platforms, operating models, and accountable execution.
The organizations making the most progress aren’t adding complexity. They’re simplifying how technology, people, and decision-making come together — and being more intentional about where partners fit into that equation.
In January, we’ll share more on how this shift is taking shape, including how deliberate partnerships are being used to close execution gaps and sustain results well beyond go-live.
