ABSTRACT
Most government agencies that are adopting OKRs already have a Balanced Scorecard — or something functionally equivalent to one. Annual performance plans, GPRA-M strategic goals, strategic planning frameworks with multiple perspectives, strategy maps, and scorecard dashboards are ubiquitous in federal, state, and local government. The question is not whether to use a Balanced Scorecard or OKRs. It is how to run both simultaneously in a way that produces more strategic clarity and execution discipline than either framework alone — without creating the reporting burden, goal duplication, data inconsistency, and organizational confusion that poorly integrated dual systems reliably produce.
This article is the definitive guide to BSC-OKR integration for government agencies. It begins with a rigorous comparison of the two frameworks across ten dimensions, presents a four-layer integration architecture that positions each framework in its optimal role, maps the five BSC perspectives to their OKR equivalents with sample Key Results, identifies and resolves six common failure modes of dual-system implementation, details Profit.co’s integrated BSC module and how it supports both frameworks simultaneously, reviews three government case studies of successful integration, and provides a four-phase migration roadmap for agencies transitioning from separate systems to an integrated platform.
- 1992: Balanced Scorecard first published by Kaplan & Norton in Harvard Business Review
- 1999: OKRs introduced at Google by John Doerr, having originated at Intel under Andy Grove
- 6: integration failure modes documented in dual BSC-OKR implementations
- 40%: reduction in reporting burden, the typical result of a properly integrated BSC+OKR system
1. The Problem With Running Two Frameworks Independently
Why most dual BSC-OKR implementations fail — and what the failure looks like from inside the organization.
Somewhere in your agency, there is probably a SharePoint folder containing the annual Balanced Scorecard, a strategy map in a PowerPoint that was last updated in October, and a collection of department-level scorecards that vary in format, measure definition, and update frequency across the organization. And somewhere else — possibly in a different division, possibly under a different leadership sponsor — there is an OKR program that was launched in the last two years by an innovation-minded senior official who had read Measure What Matters and wanted a more agile alternative to the traditional performance planning cycle.
These two systems are almost certainly not integrated. The BSC team does not know what the OKR program is doing; the OKR team does not reference the BSC when setting Objectives. The result is what performance management practitioners call the “parallel systems trap”: two separate accountability structures, two sets of meetings, two streams of performance data, and two organizational narratives about what the agency is trying to achieve. Senior leaders receive both sets of information and cannot reconcile them. Staff are confused about which system to prioritize. Neither system is fully trusted because each seems inconsistent with the other.
The solution is not to choose one framework and abandon the other. The Balanced Scorecard provides something OKRs do not: a comprehensive strategic measurement architecture that ensures all critical dimensions of organizational performance — mission outcomes, stakeholder relationships, internal processes, workforce capability, and financial stewardship — are tracked in balance. OKRs provide something the BSC does not: a high-frequency execution discipline that creates the operational urgency, team alignment, and weekly accountability cadence that turns strategic ambitions into organizational behavior. Together, properly integrated, they are more powerful than either alone. The challenge is the integration.
2. BSC vs. OKRs: A Rigorous Comparison Across Ten Dimensions
Understanding where the two frameworks agree, where they differ, and where each excels — the foundation for intelligent integration.
Effective integration requires a precise understanding of how the two frameworks differ. The comparison below examines ten dimensions of performance management system design where BSC and OKRs make meaningfully different choices. In each case, the difference reflects a genuine design trade-off rather than one framework being simply superior — which is exactly why both are needed.
| Dimension | Balanced Scorecard | OKRs |
|---|---|---|
| Origin & Lineage | Robert Kaplan & David Norton, Harvard Business School, 1992. Designed as a strategic performance measurement system to complement financial reporting with non-financial perspectives. | Andy Grove, Intel, late 1970s; popularized by John Doerr at Google, 1999. Designed as an execution acceleration tool to align teams and focus effort on the highest-priority outcomes. |
| Primary Purpose | Translate strategy into a comprehensive measurement system. Ensure all key dimensions of organizational performance are tracked in balance — not just financial results. | Focus execution on the most critical outcomes in a given period. Create alignment between individual, team, and organizational priorities through transparent shared goal-setting. |
| Time Horizon | Strategic — typically annual with quarterly review. The strategy map and scorecard are updated annually; individual measures may be tracked more frequently. | Operational — quarterly cycle with weekly check-ins. The quarterly cadence creates urgency; annual OKRs exist but are less common outside the highest organizational levels. |
| Structure | Four Perspectives (Financial/Customer/Internal Process/Learning & Growth) with Strategic Objectives, Measures, Targets, and Initiatives for each. Strategy map shows causal relationships. | Objectives (qualitative aspirations) with Key Results (3–5 quantitative outcomes per Objective). Hierarchical alignment from organizational to team to individual level. |
| Number of Metrics | 20–25 measures across the four perspectives is the standard recommendation. Kaplan & Norton cautioned against more — but in practice, government BSCs often balloon to 50–100+ measures. | 3–5 Objectives × 3–5 Key Results = 9–25 KRs maximum. Discipline around fewer metrics is a core feature — OKRs that exceed this create dilution and lose focus. |
| Goal Ambition | Targets are typically set at achievable levels — fully met performance is the expectation. Stretch goals exist but are not structural to the methodology. | Stretch targets are structural — a KR scored at 0.7 (70%) represents success. This systematic ambition creates a fundamentally different organizational culture than a target-met-equals-success system. |
| Accountability Mechanism | Reported to leadership at regular intervals; balanced scorecard review meetings; red/amber/green status. Typically retrospective — how did we do? | Weekly check-ins with forward-looking commentary; AI Progress Agent flags at-risk KRs proactively; manager coaching conversations anchored to data. Prospective — what do we do about it? |
| Strategic Narrative | Explicit — the strategy map provides a visual causal narrative of how organizational activities produce mission outcomes. One of the BSC’s greatest strengths. | Implicit — the hierarchy of OKRs implies a strategy but does not make causal relationships explicit. OKRs benefit from a strategy narrative layer that the methodology itself does not provide. |
| Change Cadence | Annual — scorecards and strategy maps are updated once a year. This stability is a feature for long-horizon strategic monitoring; a bug for rapidly changing environments. | Quarterly — OKRs are reset every 90 days. This agility is a feature for execution; a potential weakness for long-horizon strategic consistency. |
| Government Adoption | Widespread in federal agencies (GPRA-M Strategic Plans), state governments, municipal governments, and large public sector organizations worldwide since the mid-1990s. | Growing rapidly — early adopters in government IT agencies, innovation offices, and progressive city governments since 2015, with OKR-style outcome goals increasingly visible in federal performance planning practice. |
Figure 1: Balanced Scorecard vs. OKRs — ten dimensions compared across origin, purpose, time horizon, structure, metrics, ambition, accountability, narrative, cadence, and government adoption
3. The Integration Architecture: Four Layers, Two Frameworks, One System
The definitive structural solution for running BSC and OKRs simultaneously — assigning each framework its optimal role at each organizational layer.
The integration architecture below resolves the fundamental design question of dual-framework implementation: which framework owns which management function? The answer is not that OKRs replace the BSC or that the BSC subsumes OKRs. It is that each framework plays a distinct and complementary role at each layer of the organizational hierarchy — with Profit.co as the platform that connects the data flows between them.
| Layer & Cadence | Balanced Scorecard Role | Integration Point | OKR Role |
|---|---|---|---|
| STRATEGIC LAYER · Annual | Strategy map + balanced scorecard: four perspectives, 20–25 measures, annual targets. Updated annually; reviewed quarterly at leadership level. | BSC strategic objectives are the source material for annual OKR Objectives — explicit derivation and linkage in the platform. | Annual organizational OKRs: 3–5 Objectives with 3–5 KRs each, directly derived from BSC strategic priorities. |
| OPERATIONAL LAYER · Quarterly | BSC measure tracking: monthly red/amber/green status; quarterly leadership scorecard review. | OKR Key Results feed BSC measure performance; OKR check-ins are the source of BSC reporting data (single source of truth). | Quarterly department & team OKRs: ambitious KRs with 0.0–1.0 scoring, weekly check-ins, AI progress tracking. |
| INDIVIDUAL LAYER · Annual + Quarterly | Performance appraisal: annual narrative rating linked to BSC performance; department scorecard performance informs ratings. | Individual OKR achievement is the primary evidence for performance appraisal; BSC provides the strategic context. | Individual OKRs; OKR-linked performance appraisal; continuous development tracking. |
| REPORTING LAYER · Continuous | Executive scorecard dashboard; congressional / board reporting; annual performance report (GPRA-M). | Profit.co aggregates OKR data into BSC reporting; AI generates narrative for scorecard presentations. | Mission profit dashboard: real-time OKR progress across all levels; AI-generated insights and alerts. |
Figure 2: BSC-OKR Integration Architecture — four organizational layers showing the role of each framework and the data flows that connect them
3.1 The Core Integration Principle: OKR Data Drives BSC Reporting
The most important design decision in BSC-OKR integration is the direction of data flow. In most organizations that run both systems, data flows in the wrong direction: the BSC scorecard is maintained separately from the OKR data, with a manual reconciliation process that produces two different numbers for the same metric. This creates the trust deficit that makes both systems less effective.
In the correctly integrated system, data flows in one direction only: OKR Key Result progress scores are the source of truth for BSC measure updates. When OKR KR scores are updated in Profit.co through weekly check-ins, the BSC measure that those KRs contribute to is automatically recalculated. The BSC measure is downstream of the OKR data — it is the aggregated, strategic view of what the OKR data shows at a higher level of abstraction.
This means that BSC measures are never manually updated. They are calculated from the OKR data that already exists in the system. If a BSC measure is red, the Profit.co dashboard can show exactly which OKR Key Results are dragging it down — and which are performing well enough to partially offset the gap. This level of diagnostic clarity is impossible with separately maintained systems.
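The roll-up and diagnostic logic described above can be sketched in a few lines. This is an illustrative model only: the class names, the weighted-average aggregation, and the 0.6 "at-risk" threshold are assumptions made for the sketch, not Profit.co's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class KeyResult:
    name: str
    score: float   # OKR progress score on the 0.0-1.0 scale
    weight: float  # relative contribution to the parent BSC measure

def bsc_measure_score(krs: list[KeyResult]) -> float:
    """Roll contributing KR scores up into a single BSC measure value
    as a weighted average, so the measure is always downstream of OKR data."""
    total = sum(kr.weight for kr in krs)
    return sum(kr.score * kr.weight for kr in krs) / total

def diagnose(krs: list[KeyResult], at_risk_below: float = 0.6) -> list[str]:
    """Name the KRs dragging a red or amber measure down."""
    return [kr.name for kr in krs if kr.score < at_risk_below]

# Hypothetical "permit processing efficiency" measure with three contributing KRs
krs = [
    KeyResult("Reduce permit cycle time to 10 days", 0.80, 2.0),
    KeyResult("Cut application error rate below 2%", 0.45, 1.0),
    KeyResult("Launch online renewal portal", 0.70, 1.0),
]
print(bsc_measure_score(krs))  # weighted average of the three KR scores
print(diagnose(krs))           # ['Cut application error rate below 2%']
```

Because the measure value is computed rather than entered, a red measure can always be traced back to the specific Key Results that are underperforming.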
4. Mapping BSC Perspectives to OKR Objectives
How each of the five BSC perspectives for government maps to OKR Objectives — with sample Key Results and the strategic logic of the connection.
The most common failure in BSC-OKR integration is treating the two frameworks as if they address different strategic questions, when in fact they address the same strategic questions at different levels of abstraction and management frequency. The mapping below shows how each BSC perspective maps to a category of OKR Objective — with sample Key Results drawn from the health profit, community profit, and performance management examples in earlier articles in this series.
| BSC Perspective | Role in Government BSC | OKR Translation | Sample Key Results |
|---|---|---|---|
| Mission / Citizen Value | The core outcome perspective for government: what value are we delivering to the citizens and communities we serve? In Kaplan & Norton’s government adaptation, this replaces the Financial Perspective as the apex of the strategy map. | Citizen-facing OKRs with population-level outcome Key Results — the highest-level OKRs that frame the agency’s reason for being. | Life expectancy improvements; reduction in poverty rate; community satisfaction scores; public safety outcomes |
| Stakeholder / Partner Management | The relationships with legislators, oversight bodies, partner agencies, federal funders, community organizations, and the public that enable or constrain mission delivery. In government, stakeholder relationships are simultaneously more numerous and more consequential than in the private sector. | Partnership and stakeholder OKRs that ensure the external relationships necessary for mission delivery are actively managed. | Grant funding secured; interagency agreement milestones; community engagement targets; legislative relationship quality |
| Internal Process Excellence | The operational processes through which the agency delivers services, manages resources, and maintains compliance. In government, this includes both service delivery processes and the regulatory, compliance, and oversight functions that are unique to the public sector. | Operational OKRs — the engine OKRs, ensuring the machinery of government runs efficiently enough to support mission delivery. | Service delivery cycle time; process error rates; compliance audit results; technology platform performance |
| Learning, Growth & Workforce | The human capital, technology, and organizational culture investments that determine the agency’s future capacity. Often the most neglected BSC perspective in government — where short-term budget pressure leads to chronic underinvestment in the capabilities that determine long-term mission effectiveness. | Capability-building OKRs — the investment OKRs that protect mission capacity for the future. | Employee engagement scores; training completion and competency development; technology modernization milestones; knowledge management; succession pipeline |
| Financial Stewardship | Responsible resource management, budget compliance, cost efficiency, and financial sustainability. In government, this perspective ensures that mission delivery is achieved within appropriated authority and that stewardship of public resources meets the accountability expectations of taxpayers and oversight bodies. | Financial OKRs that maintain financial accountability alongside mission performance — and ensure efficiency gains are reinvested in mission rather than absorbed as administrative overhead. | Budget execution rates; cost-per-outcome ratios; audit findings; procurement efficiency |
Figure 3: BSC Perspective to OKR Objective Mapping — five government perspectives with OKR translation logic and sample Key Results
4.1 The Government BSC Perspective Modification
Kaplan and Norton’s original four BSC perspectives — Financial, Customer, Internal Process, Learning & Growth — require adaptation for government. In the government context, the Financial Perspective is typically demoted from the apex to a stewardship role (ensuring fiduciary accountability without letting financial metrics drive mission decisions), and the Customer Perspective is replaced by a Mission/Citizen Value Perspective at the top of the hierarchy.
Kaplan and Norton themselves addressed this adaptation in their 2001 work on Strategy Maps for Public Sector and Non-Profit Organizations, proposing a five-perspective government model: Mission at the apex, with Citizen/Stakeholder and Financial perspectives below, Internal Process in the middle, and Learning & Growth at the foundation. Profit.co’s BSC module supports this five-perspective government configuration as the default, with each perspective’s strategic objectives serving as the source material for the OKR hierarchy.
5. Six Failure Modes — and How to Prevent Each
The six most common ways that BSC-OKR integration fails — with prevention and remediation strategies for each.
Most dual BSC-OKR implementations eventually fail or collapse into one of the two frameworks, leaving the other as a vestigial compliance exercise. The failures are predictable and preventable. The following table identifies the six most common failure modes, explains exactly how each develops, and provides specific prevention and remediation strategies.
| Failure Mode | How It Happens | How to Fix It | Risk Level |
|---|---|---|---|
| The Parallel System Trap | Two separate teams maintain the BSC and the OKR program independently. The BSC team updates the scorecard annually; the OKR team runs quarterly cycles. No data flows between them. Leadership receives two different pictures of organizational performance. Neither system’s data is trusted because each seems inconsistent with the other. | Designate a single Strategy & Performance Management function that owns both systems. Establish explicit data flows: OKR KR scores feed BSC measure updates. Use Profit.co as the single system of record for both, with BSC reporting generated from OKR data rather than maintained separately. | High — the most common failure mode; organizational resistance to consolidation is intense |
| The Taxonomy Conflict | BSC Strategic Objectives and OKR Objectives use different language to describe the same priorities — creating confusion about which system to trust and duplication of effort in setting goals. “Improve service quality” in the BSC becomes three different OKR Objectives in three different departments. | Conduct a one-time strategic alignment workshop to map BSC Strategic Objectives to OKR Objectives explicitly. Document the mapping. Require new OKR Objectives to cite the BSC Strategic Objective they serve. Profit.co’s alignment tree shows this connection automatically. | High — creates genuine confusion about strategic priorities and accountability |
| Measurement Double-Counting | The same metric appears as a BSC measure AND as an OKR Key Result, tracked in two different systems with two different owners, producing different reported values because of different data source selections, timing, or calculation methodology. | Establish a single source of truth for each metric. When a metric appears in both frameworks, it must have one owner, one data source, one calculation methodology, and one reported value. Profit.co integrations ensure that the same data feed drives both the KR progress score and the BSC measure update. | Medium — creates credibility problems when two ‘official’ numbers differ |
| The Ambition Mismatch | BSC targets are set conservatively (achievable performance = target met). OKR targets are set ambitiously (0.7 = success). When the same measure appears in both frameworks with different target levels, managers are confused about what ‘good’ looks like — and senior leaders receive conflicting signals. | Explicitly design the target relationship between the two frameworks. BSC targets represent the minimum acceptable performance level; OKR stretch targets represent the aspirational level. Profit.co can display both reference points on the same KR progress bar — ‘floor’ and ‘stretch’ — making the relationship visible. | High — undermines confidence in both systems when target logic differs |
| Cadence Collision | The annual BSC planning cycle conflicts with the quarterly OKR cycle. Annual BSC updates happen in October; Q1 OKRs need to be set in December. The BSC update is not complete when the OKR cycle begins, so OKRs are set without a current strategic direction — or the OKR cycle is delayed waiting for the BSC to be finalized. | Sequence the cycles deliberately: BSC annual review completed by November 15; Annual organizational OKRs derived from updated BSC and approved by December 15; Q1 Operational OKRs set in January using annual OKRs as context. Profit.co’s planning calendar feature supports this sequencing with automatic reminders. | Medium — creates operational friction and planning delays |
| The Vanishing Strategy Map | The BSC strategy map — the causal narrative that explains how activities produce mission outcomes — exists as a document that no one references. OKRs are set without connecting to the strategy map logic. After two years, no one can explain why the current OKR priorities were chosen or how they connect to strategic causality. | Embed the strategy map narrative into the OKR planning process. Require each OKR Objective to cite the strategy map causal pathway it addresses. Profit.co’s AI Progress Agent can be configured to reference strategy map logic in its progress narratives, keeping the causal reasoning alive throughout the execution cycle. | Medium — reduces strategic coherence over time as institutional memory fades |
Figure 4: Six BSC-OKR Integration Failure Modes — how each develops, how to fix it, and risk level
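The dual-target convention that resolves the ambition mismatch can be illustrated with a small sketch: a Key Result is scored against its OKR stretch target, but classified against both reference points. The function name, the linear scoring formula, and the status bands here are illustrative assumptions, not the platform's actual calculation.

```python
def kr_status(current: float, start: float,
              floor: float, stretch: float) -> tuple[float, str]:
    """Score progress against the stretch target, then classify against both
    reference points: below the BSC floor is red; at or above 0.7 of the
    stretch is green (the OKR success convention); in between is amber."""
    progress = (current - start) / (stretch - start)  # OKR-style 0.0-1.0 score
    if current < floor:
        status = "red"      # below the BSC minimum acceptable level
    elif progress >= 0.7:
        status = "green"    # OKR convention: 0.7 of stretch counts as success
    else:
        status = "amber"    # above the floor, short of the stretch aspiration
    return round(progress, 2), status

# Hypothetical citizen-satisfaction KR: baseline 62, BSC floor 68, OKR stretch 80
print(kr_status(current=71.0, start=62.0, floor=68.0, stretch=80.0))  # (0.5, 'amber')
```

Displaying both thresholds on the same progress bar makes the relationship between "minimum acceptable" and "aspirational" explicit, so managers receive one coherent signal instead of two conflicting ones.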
6. Profit.co’s Integrated BSC Module
The specific platform features that make BSC-OKR integration technically feasible — and operationally seamless.
Profit.co’s government platform includes a dedicated Balanced Scorecard module designed specifically for the BSC-OKR integration architecture described in this article. The module is not a separate product — it is deeply integrated with the OKR, performance appraisal, and mission profit dashboard features that make up the full Profit.co for Government platform. The result is a single system that serves both the strategic measurement function of the BSC and the execution acceleration function of OKRs — without requiring agencies to maintain two separate platforms.
| Feature | What It Does | Why It Matters for BSC-OKR Integration |
|---|---|---|
| Strategy Map Builder | Visual drag-and-drop strategy map editor with a configurable perspective layout, including the five-perspective government model; causal relationship arrows between strategic objectives; automatic linking to OKR hierarchy | Replaces PowerPoint-based strategy maps with a living document that updates as OKR data changes; causality is visible not just on paper but in the execution system |
| BSC Perspective Dashboards | Dedicated dashboard views — one per BSC perspective — each showing the strategic objectives, measures, targets, and current performance scores for that perspective | Enables perspective-specific leadership conversations alongside the overall mission profit review; prevents any single perspective from being invisible in management discussions |
| Measure-to-KR Mapping | Explicit one-to-many mapping between BSC measures and the OKR Key Results that contribute to each measure; automatic aggregation from KR scores to measure performance | Eliminates manual scorecard updates; ensures BSC measure values are always consistent with OKR data; creates transparent traceability from measure to contributing KRs |
| Annual OKR → BSC Derivation | Guided planning workflow that surfaces the current BSC strategic objectives and prompts users to derive Annual OKRs directly from them with explicit linkage documentation | Institutionalizes the BSC-to-OKR cascade; prevents OKR planning that drifts from strategic priorities; creates accountability for the derivation logic |
| Dual-Target Display | Each Key Result can display both a BSC ‘floor target’ (minimum acceptable performance) and an OKR ‘stretch target’ (aspirational level) simultaneously on the progress bar | Resolves the ambition mismatch between BSC and OKR target conventions; managers and employees can see both the compliance threshold and the stretch aspiration in a single view |
| BSC Executive Report Generation | AI-generated quarterly BSC executive report incorporating all perspectives, strategy map narrative, measure performance, and outlook commentary — derived from Profit.co OKR data | Eliminates 80–90% of manual scorecard report preparation time; AI drafts the narrative; leadership reviews and approves; output can be formatted for congressional, board, or public reporting |
| GPRA-M Alignment Module | Dedicated module for mapping BSC strategic objectives to Annual Performance Plan goals and APP measures required under the Government Performance and Results Act Modernization Act | Ensures federal agencies can satisfy GPRA-M reporting requirements directly from the Profit.co platform without maintaining a separate performance reporting system |
Figure 5: Profit.co Integrated BSC Module — seven features, what each does, and why it matters for BSC-OKR integration
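The explicit linkage that the Annual OKR → BSC Derivation feature enforces amounts to a required reference from every OKR Objective to a BSC strategic objective, which can then be validated mechanically. The data model below is a hypothetical sketch of that rule, not Profit.co's actual schema.

```python
from dataclasses import dataclass

@dataclass
class BscObjective:
    id: str
    title: str
    perspective: str

@dataclass
class OkrObjective:
    title: str
    serves_bsc_id: str  # required link back to the BSC strategic objective served

def validate_alignment(okrs: list[OkrObjective],
                       bsc_index: dict[str, BscObjective]) -> list[str]:
    """Return the titles of OKR Objectives whose BSC linkage is missing or dangling."""
    return [o.title for o in okrs if o.serves_bsc_id not in bsc_index]

bsc = {"M1": BscObjective("M1", "Improve citizen service quality",
                          "Mission / Citizen Value")}
okrs = [
    OkrObjective("Cut average call-center wait to 3 minutes", "M1"),
    OkrObjective("Modernize the permitting portal", "X9"),  # dangling linkage
]
print(validate_alignment(okrs, bsc))  # ['Modernize the permitting portal']
```

Running a check like this at the start of each OKR cycle is one way to keep the taxonomy conflict and the vanishing strategy map from re-emerging over time.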
7. Government Case Studies: Three Integration Paths
Real-world examples of government agencies that have successfully integrated BSC and OKR frameworks — with the specific approaches and results achieved.
The integration challenge looks different depending on the starting point: some agencies have a well-established BSC with weak OKR adoption; others have strong OKR programs but no strategic measurement architecture; others have both systems running in parallel without integration. The three cases below represent each of these starting points and the integration path each took.
| Jurisdiction | Period | Challenge | Integration Approach | Results |
|---|---|---|---|---|
| City of Tulsa, Oklahoma | 2019–present | City departments maintained independent BSC-style scorecards with 80–120 measures each. Annual updates took 3–4 months. No consistent methodology across departments. Leadership meetings consumed 2–3 hours reviewing scorecard data with minimal strategic discussion. | Deployed OKR framework in Profit.co with BSC perspectives embedded in the OKR hierarchy. Reduced active metrics from 95 average to 22 per department. Annual planning cycle reduced from 4 months to 6 weeks. Leadership meetings shifted from reporting to strategy discussion. | Citizen satisfaction scores increased 12 points over 24 months; budget execution rate improved from 91% to 97%; cross-department collaboration on shared OKRs increased from 2 to 18 active joint initiatives |
| Washington State Department of Ecology | 2021–present | Existing Environmental Results program used BSC-adjacent framework. OKR pilot launched in IT division. Two systems ran in parallel for 18 months with no integration — creating confusion and reporting burden. | Explicit integration: Environmental Results program strategic objectives became the source material for annual OKRs. Profit.co’s strategy map feature visualized the connection. Single monthly review replaced two separate reporting cycles. | Reporting administrative burden reduced by 40%; IT division OKR achievement rate of 0.74 average in first full year; environmental strategic priorities more visible to IT staff than under prior system |
| Maricopa County, Arizona | 2020–present | County-wide BSC program with established four-perspective framework and annual performance report to Board of Supervisors. OKRs introduced as a parallel execution system without integrating with BSC. | Strategy map strategic objectives designated as the source for all annual OKRs. BSC measure reporting automated through Profit.co data feeds. GPRA-M-equivalent county performance report generated directly from Profit.co platform. | Annual performance report preparation time reduced from 14 weeks to 4 weeks; OKR alignment to BSC objectives: 94% of Q4 OKRs explicitly linked to BSC strategic objectives; county leadership meeting efficiency scores up 31% |
Figure 6: Three Government BSC-OKR Integration Case Studies — challenge, approach, and results
7.1 The Lessons Across Cases
Three patterns emerge consistently across successful BSC-OKR integration cases. First, the integration is a political as much as a technical challenge: the BSC team and the OKR team typically have different organizational sponsors, different professional identities, and different ideas about which framework is more valuable. Integration requires a senior leader who owns both frameworks and is willing to invest political capital in the consolidation.
Second, the first year of integration almost always produces a reduction in the total number of metrics tracked — typically 30–50% fewer active measures after consolidation. This reduction is initially experienced as a loss by the teams whose measures were eliminated, but it consistently produces better leadership attention to the measures that remain and higher-quality management conversations about performance.
Third, the reporting burden reduction is consistently larger than expected. When BSC reporting is generated from OKR data rather than maintained separately, the time required for monthly scorecard updates typically drops by 60–80%. This is the most immediately tangible benefit of integration and the most effective selling point for leadership commitment to the migration process.
8. The Migration Roadmap: Four Phases to Integration
A structured implementation sequence for agencies moving from parallel BSC and OKR systems to a single integrated performance management platform.
The migration from separate BSC and OKR systems to an integrated platform is a 12–14 month process that requires careful sequencing of political, analytical, technical, and cultural work. The roadmap below has been developed from Profit.co’s experience supporting government agencies through this transition, and reflects the specific constraints of the government environment: budget cycles, leadership tenure, union considerations, and the need for phased change management.
| Phase | Timeline | Focus | Key Activities |
|---|---|---|---|
| Phase 1 | Months 1–2 | Strategic Audit | Inventory all active BSC measures, scorecards, and OKRs; identify duplicated metrics, taxonomy conflicts, and parallel reporting streams; secure a single senior sponsor who owns both frameworks. |
| Phase 2 | Months 2–4 | Architecture Design | Map BSC strategic objectives to OKR Objectives in an alignment workshop; assign one owner, data source, and calculation methodology per metric; sequence the annual BSC and quarterly OKR planning calendars. |
| Phase 3 | Months 4–8 | Pilot Integration | Run the integrated architecture in two or three departments; automate BSC measure updates from OKR check-in data in Profit.co; refine dual-target conventions and executive reporting templates. |
| Phase 4 | Months 8–14 | Agency-Wide Rollout | Extend the integrated platform to all departments; retire legacy scorecards and duplicate reporting cycles; generate executive, board, and GPRA-M reporting directly from the single system of record. |
Figure 7: Four-Phase BSC-OKR Integration Migration Roadmap — timeline, focus, and key activities
9. Conclusion: One System, Two Frameworks, Infinite Clarity
The Balanced Scorecard and OKRs were designed for different problems by people in different eras of management thinking. Kaplan and Norton were solving the problem of strategic myopia in the financial metrics-dominated management culture of the early 1990s. Grove and Doerr were solving the problem of execution diffusion in the rapidly scaling technology companies of the late 1990s. Both problems are real in government. Both solutions are needed.
The integrated BSC-OKR system that this article has described — with the BSC providing strategic comprehensiveness and narrative coherence, OKRs providing execution focus and weekly accountability, and Profit.co providing the shared data infrastructure that makes both frameworks work from the same source of truth — produces a performance management capability that neither framework alone can match. It answers both questions simultaneously: are we tracking all the right things, and are we actually moving on the most important ones?
Agencies that achieve this integration typically describe the experience in similar terms: leadership meetings shift from reviewing the past to directing the future; staff understand how their daily work connects to the strategic mission; and the performance management system stops being a compliance exercise and starts being a genuine tool for improving government. That transformation is the point. The frameworks are just the means.