Category: Project Management.


Karthick Nethaji Kaleeswaran
Director of Products | Strategy Consultant


Published Date: April 1, 2026

TLDR

Project variance reports fail when the baseline is overwritten by the latest plan. Approved scope changes get misclassified as execution overruns, and PMs take the blame for decisions made elsewhere. The fix is governed baseline management: preserve the original commitment, track approved changes through re-baselining, and isolate true execution performance.

The quarterly review summary reads something like this: “Project Atlas Budget: GREEN. Schedule: GREEN. Cost variance: 0%. Within plan.” On paper, everything looks fine.

In this scenario, consider a senior PM, let’s call her Maya. The original project was approved with a $1M budget. In Month 3, two scope additions were approved. The sponsor signed off. The PMO Director reviewed and agreed. The client’s CFO formally authorized an additional $200K. The revised working budget is now $1.2M.

Because the plan was overwritten instead of baselined, the reporting now compares actuals against the revised number. Actual spend is $1.2M. Variance shows as zero. The dashboard stays green. But this is where governance breaks.

The original commitment was $1M. The project has effectively run at a 20% cost increase, along with corresponding shifts in scope and potentially schedule. None of that shows up in the variance report because the baseline has been lost. Finance sees a project that is on track. The steering committee sees no issue. There is no signal that a material change has occurred.

And yet, from a portfolio perspective, this is exactly the signal that matters.

Without a preserved baseline, every approved change rewrites history. Variance disappears, not because performance improved, but because the reference point moved. The same pattern applies to scope and schedule. Milestones shift, scope expands, and everything continues to be reported as aligned with the plan. This is the real governance failure. It is not about overruns being flagged incorrectly. It is about overruns not being visible at all.
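The mechanics of this failure are simple arithmetic. A minimal sketch, using the article's $1M → $1.2M numbers (illustrative only, not any tool's implementation):

```python
# Why overwriting the plan hides variance, using the article's example.
original_baseline = 1_000_000   # approved commitment at kickoff
approved_change = 200_000       # scope addition signed off in Month 3
revised_plan = original_baseline + approved_change
actual_spend = 1_200_000

# Reporting against the overwritten plan: variance vanishes.
variance_vs_plan = (actual_spend - revised_plan) / revised_plan
print(f"vs revised plan:    {variance_vs_plan:+.0%}")       # +0%

# Reporting against the preserved baseline: the real signal.
variance_vs_baseline = (actual_spend - original_baseline) / original_baseline
print(f"vs locked baseline: {variance_vs_baseline:+.0%}")   # +20%
```

Same actuals, same spend; the only thing that changed between the two numbers is the reference point.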

Across enterprise portfolios, this plays out every quarter. Projects appear healthy in execution reports while quietly drifting away from their original investment case.

Plan vs. Baseline: They Are Not the Same Thing

Here is the conflation that breaks almost every variance report in the industry.

  • A project plan is a living document. It gets updated constantly as conditions change, risks materialize, and priorities shift. That is exactly what it is supposed to do.
  • A project baseline is a locked, time-stamped snapshot of approved scope, schedule, and cost at a defined point in time. Its entire purpose is to serve as a permanent reference for measuring performance.

When you measure variance against a plan that changes every week, you are not measuring variance. You are measuring drift from a moving target, and the number is meaningless.

According to PMI’s Pulse of the Profession, only 34% of organizations report high benefit realization maturity, a figure that has barely changed in five years despite widespread investment in PM tools. Baseline mismanagement is one of the most significant, underacknowledged contributors to that gap.

The 4-Date Model: The Only Honest Way to Report Progress

Most enterprise PPM implementations track two dimensions of project data, then wonder why the numbers tell a confusing story. Credible performance reporting requires four distinct dimensions working together.

| Dimension | What It Represents | What Breaks Without It |
| --- | --- | --- |
| Planned | What was scheduled at this point in time | Cannot distinguish disciplined replanning from unmanaged drift |
| Actual | What has been spent and delivered to date | No visibility into execution reality or efficiency |
| Baseline | Locked original commitment or last approved rebaseline | Variance becomes meaningless because the reference point keeps moving |
| Forecast | Current projection based on actuals and remaining work | No forward visibility, only hindsight reporting |
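The four dimensions above can be sketched as a simple data structure. This is a hypothetical shape, with illustrative field names rather than any specific PPM tool's schema:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ProjectSnapshot:
    planned: float    # what the current plan schedules at this point
    actual: float     # spend and delivery to date
    baseline: float   # locked commitment: original or last approved rebaseline
    forecast: float   # projection: actuals plus estimate of remaining work

    def variance_vs_baseline(self) -> float:
        """Backward-looking variance against the locked reference point."""
        return (self.actual - self.baseline) / self.baseline

    def forecast_overrun(self) -> float:
        """Forward-looking variance that the Forecast dimension enables."""
        return (self.forecast - self.baseline) / self.baseline

# Illustrative mid-project snapshot of the $2.4M example below.
snap = ProjectSnapshot(planned=2_800_000, actual=1_900_000,
                       baseline=2_400_000, forecast=2_832_000)
print(f"forecast vs baseline: {snap.forecast_overrun():+.0%}")  # +18%
```

The point of the structure is that no single field is "the number": each derived metric names which pair of dimensions it compares.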

To make this concrete, consider a hypothetical systems integration project with an original budget of $2.4M and a six-month timeline. In Month 4, the client requests two additional integration points. The PMO Director approves a $400K addition and a six-week extension.

Here is how the same project reads without baseline locking and with a baseline + rebaseline.

| Metric | Without Baseline Locking | With Baseline + Rebaseline |
| --- | --- | --- |
| Plan structure | Original plan ($2.4M) is overwritten by revised plan ($2.8M) | Original baseline ($2.4M) is preserved; revised plan becomes the approved rebaseline ($2.8M) |
| Budget variance | $2.8M vs $2.8M = 0% variance | vs Baseline: $2.8M vs $2.4M = +17%; vs Rebaseline: $2.8M vs $2.8M = 0% |
| Schedule status | Compared only to revised dates → appears on track (GREEN) | vs Baseline: 6 weeks late (RED); vs Rebaseline: on track (GREEN) |
| Scope visibility | Scope increase absorbed into plan, not explicitly visible | Scope change of +$400K is explicitly visible and governed |
| PM performance narrative | "On track, within plan." | "Delivered approved scope increase; executing to revised commitment." |
| True execution variance | Not visible because the plan keeps shifting | +$32K (1.1% of rebaseline), isolated and manageable |

Without baseline locking, the system rewrites the plan and hides variance. With baseline and rebaseline, you can separate three things clearly: original commitment, approved change, and execution performance.
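The three-way separation can be made explicit in a few lines, using the article's $2.4M systems-integration numbers (a sketch, not a product calculation):

```python
# Separating original commitment, approved change, and execution performance.
original_baseline = 2_400_000
approved_change = 400_000                           # CR approved by the PMO Director
rebaseline = original_baseline + approved_change    # $2.8M approved rebaseline
forecast_at_completion = 2_832_000                  # rebaseline + $32K overrun

approved_variance = rebaseline - original_baseline          # governed decision
execution_variance = forecast_at_completion - rebaseline    # PM accountability

print(f"total drift vs original: "
      f"{(forecast_at_completion - original_baseline) / original_baseline:+.1%}")
print(f"approved change:         ${approved_variance:,.0f}")
print(f"execution variance:      ${execution_variance:,.0f} "
      f"({execution_variance / rebaseline:+.1%} of rebaseline)")
```

Each number now has one owner: the approved change belongs to whoever signed the CR, and only the execution variance belongs to the PM.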

The Change Request Gap

Most enterprises are reasonably disciplined about approving change requests. Where governance collapses is in the step that comes next: updating the baseline to reflect the approved change.

The gap is institutional, not procedural. Change request forms exist. Approval routing exists. But the connection between an approved change and a system-enforced baseline update is, in most organizations, manual, inconsistent, and dependent on the PM remembering to do it.

An approved change request without a baseline update is just a paper trail. Governance is incomplete until the numbers actually change and an audit record is attached.

What a governed baseline change workflow actually requires:

  1. A change request must explicitly quantify impact on budget, schedule, scope, and resources. Without this, it cannot be evaluated, approved, or baselined.
  2. Approval routing by materiality threshold: PM authority for minor changes, PMO Director for mid-tier changes, Sponsor or Investment Committee for material changes.
  3. Baseline lock as a system event: once approved, the new parameters lock automatically, and the previous baseline is archived, not overwritten.
  4. An automatic audit record capturing who approved, what changed, when, the rationale, and the financial authorization reference.
  5. Variance reports that recalculate against the new baseline automatically, not via a manual refresh.

When any of these steps is missing, PMs get blamed for decisions they did not make.

Baseline Maturity: Where Does Your PMO Stand?

Most organizations sit at one of three levels. Be honest about which one describes yours.

| Maturity Level | How Baselines Work | The Symptom |
| --- | --- | --- |
| Level 1 — No Baseline | The plan is the baseline. Updated continuously, never locked. | PMs blamed for approved changes. Finance has no confidence in PPM data. |
| Level 2 — Static Baseline | Baselines set at kickoff, rarely updated. All changes are absorbed as variance. | Approved changes look like overruns. PMO mediates recurring Finance-vs-PM disputes. |
| Level 3 — Governed Dynamic Baseline | Change request workflows integrated with baseline locking. Rebaselining requires governed approval. | Approved variance visible separately. True execution issues isolated. PM records are defensible. |

Quick self-assessment: answer these five questions.

  1. Are baselines formally locked at project start? Y / N
  2. Is there a system-enforced workflow connecting approved change requests to baseline updates? Y / N
  3. Can your PMO distinguish approved variance from true execution overrun in portfolio reports? Y / N
  4. Are baseline changes traceable with approver, timestamp, rationale, and financial impact? Y / N
  5. Do PM performance reviews reference performance against approved baselines, not original estimates? Y / N

Three or more “No” answers mean baseline governance is a live risk exposure, not a backlog item.
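The scoring rule above is mechanical enough to sketch. The "three or more No answers" threshold comes from the article; the middle tier label is an illustrative assumption:

```python
QUESTIONS = [
    "Baselines formally locked at project start?",
    "System-enforced workflow linking approved CRs to baseline updates?",
    "Approved variance distinguishable from true execution overrun?",
    "Baseline changes traceable (approver, timestamp, rationale, impact)?",
    "PM reviews reference approved baselines, not original estimates?",
]

def assess(answers: list) -> str:
    """answers[i] is True for a 'Yes' to QUESTIONS[i]."""
    no_count = sum(1 for a in answers if not a)
    if no_count >= 3:
        return "Live risk exposure"          # the article's threshold
    if no_count > 0:
        return "Partial governance"          # illustrative middle tier
    return "Governed dynamic baseline"

print(assess([True, False, False, True, False]))  # Live risk exposure
```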

What Good Actually Looks Like

Return to Maya. In an organization with governed baseline management, her quarterly review tells a different story entirely.

“Project Atlas, performance against approved baseline: FULLY ON TRACK. Approved scope additions: 2. Total approved budget increase: $200,000 (authorization refs: CR-2024-041 and CR-2024-055, both signed by Sponsor and PMO Director). True execution variance: +$32,000 (2.7% of approved budget). Forecast within approved parameters.”

The PM is accountable for the 2.7%. The sponsor and PMO Director are accountable for the $200K, which is exactly where that accountability belongs, since they made the decision.

Baseline management is not an administrative detail. It is the foundational layer on which all meaningful performance measurement, EVM calculation, audit readiness, and PM accountability depend.

PMI research consistently finds that organizations practicing formal scope change management complete 22% more projects within original budget and 15% more within original schedule (PMI Pulse of the Profession, 2020). The PMOs that make this shift find that the first people to benefit are the project managers who no longer have to be blamed for decisions they did not make.

Ready to Stop Blaming the Wrong Person?

Profit.co’s baseline management capability includes the 4-date model (Baseline, Planned, Forecast, Actual), governed rebaselining workflows, role-based approval routing, and portfolio-level rollups. It is built for enterprise PMOs that need to separate approved variance from execution variance at scale.

Frequently Asked Questions

What is the difference between a project plan and a project baseline?

A project plan is a living document updated continuously as conditions change. A project baseline is a locked, governed snapshot of approved scope, schedule, and budget used as a fixed reference point for measuring variance. Treating them as the same thing makes variance reporting unreliable.
