In markets defined by rapid AI adoption, shifting regulation and persistent shocks, executives need a different lens for performance.
McKinsey’s 2025 HR Monitor reports that 26% of employees received no feedback in the prior year, that development is fragmented, with “as few as six days” of annual training for some roles, and that “only about one third” of critical roles are backed by succession plans, figures far below what any framework built on metrics of success would recommend.
Adaptation matters because companies that build dynamic capabilities outperform in turbulent contexts, a point established in strategy research on adaptive strategy. Resilience matters because crisis performance explains a significant share of long-run value creation.
BCG finds roughly 30% of a company’s long-run relative total shareholder return is driven by how it performs during crises, with the interquartile TSR gap widening from 75 to 105 percentage points during the COVID shock period.
In uncertainty, organizations win by learning faster, adapting sooner and absorbing shocks better.
Why the Classic Dashboard Underperforms in Volatile Contexts
Classical dashboards emphasize lagging outputs such as revenue, cost, utilization and on-time delivery. Those are necessary, but they do not tell leaders whether the organization is getting better at navigating ambiguity.
The current operating context exposes that gap. Gartner reports that 47% of digital workers struggle to find the information they need to perform their jobs. Application sprawl compounds the problem: knowledge workers now juggle an average of 11 work applications, up from a smaller set pre-pandemic, and a sizable minority use 26 or more. Friction of this kind slows learning and decision cycles.
At the same time, the technology landscape itself is accelerating. Gartner’s 2024 Digital Workplace Hype Cycle projected “everyday AI” and digital employee experience to reach mainstream adoption in less than two years, pushing leaders to track how quickly the workforce develops digital dexterity and removes digital friction.
McKinsey’s 2025 research adds that 92% of companies plan to increase AI investment over the next three years, while only 1% of leaders describe their current AI deployment as mature. This gap implies a sustained need to measure learning and adaptation, not simply technology spend.
If organizations are adding applications and investing in AI yet still struggling with basic information flow and maturity, then output metrics alone are insufficient as metrics of success. Leaders need direct measures of the system’s ability to learn, adapt and endure shocks.
Learning Velocity as a Management Variable
Learning velocity is the speed and regularity with which an organization turns information into improved action. Two kinds of data support this construct. First, evidence about learning cultures and performance has accumulated for more than a decade.
Meta-analytic work links learning organization characteristics to positive performance and innovation outcomes, indicating that cultures which reinforce knowledge creation and sharing tend to outperform on a range of organizational metrics. Second, workforce data shows persistent shortfalls in the basic mechanics of learning.
Learning velocity earns its place among the metrics of success because it can be operationalized without resorting to speculative proxies. Leaders can track the cadence and coverage of feedback, the time from signal to skill, and the reduction of digital friction that impedes knowledge flow.
The McKinsey HR Monitor data provides a defensible baseline for feedback cadence and training time. On the friction side, Gartner’s finding that nearly half of digital workers struggle to find information, combined with the application-sprawl figures (11 tools on average, with 40% using more than that), gives a measurable improvement target for digital employee experience programs and knowledge architecture.
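As a minimal sketch of that operationalization, assuming an HRIS export with per-employee feedback dates and skill-gap records (all field names and data below are hypothetical), two of these indicators reduce to simple calculations:

```python
from datetime import date, timedelta
from statistics import median

# Hypothetical HRIS export: per employee, the dates of feedback events and
# (signal_date, closed_date) pairs for identified skill gaps.
employees = [
    {"id": "E1", "feedback": [date(2024, 3, 1), date(2024, 9, 12)],
     "gaps": [(date(2024, 2, 1), date(2024, 6, 15))]},
    {"id": "E2", "feedback": [],
     "gaps": [(date(2024, 5, 10), date(2025, 1, 20))]},
]

def feedback_coverage(employees, window_days=365, today=date(2025, 1, 31)):
    """Share of employees with at least one feedback event in the window."""
    cutoff = today - timedelta(days=window_days)
    covered = sum(1 for e in employees if any(d >= cutoff for d in e["feedback"]))
    return covered / len(employees)

def median_signal_to_skill_days(employees):
    """Median days from a skill-gap signal to its closure."""
    durations = [(closed - signal).days
                 for e in employees for signal, closed in e["gaps"]]
    return median(durations)

print(f"Feedback coverage: {feedback_coverage(employees):.0%}")
print(f"Median signal-to-skill: {median_signal_to_skill_days(employees)} days")
```

Run quarterly, the first number can be compared directly against the 26% no-feedback baseline above, and the second gives a trend line for how quickly signals become skills.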
If teams do not receive timely feedback, if they cannot find what they need and if skill development lags investment, then the organization’s capacity to respond will erode. Measuring and improving these new metrics of success is becoming a standard leadership responsibility.
Adaptation Before Anything Else
Adaptation is the competence to reconfigure resources and processes as conditions change. In strategy research, this is captured by “dynamic capabilities,” defined as the ability to integrate, build and reconfigure competences to address rapidly changing environments. The literature shows that firms actively developing such capabilities alter operating processes through stable, dedicated activities, which in turn improve performance under turbulence.
Consulting literature has long argued that advantage in uncertainty is adaptive rather than static.
BCG describes adaptive strategy as a mode that emphasizes continuous experimentation and real-time adjustment in non-predictable environments, prioritizing temporary advantage over fixed positions.
Resilience as a Driver of Value
Resilience is often treated as a defensive trait, but recent evidence makes a stronger claim for integrating it into the standard metrics of success. In BCG’s analysis of performance before and after the COVID shock, about 30% of a company’s long-run relative TSR is explained by crisis-period performance.
Resilience reflects anticipation, cushioning, adaptation and shaping, and this bundle is measurable: financial buffers can be tracked, operational modularity scored, recovery intervals timed, and the degree to which portfolios shift in response to durable signals recorded. Deloitte’s 2025 board and C-suite study complements this by showing what leadership teams actually do when they take resilience seriously: 66% of respondents said open, transparent communication between board and C-suite was the top leadership factor influencing resilience, while near-term priorities clustered around geopolitical and economic volatility (55%), security (50%) and rapid technological change (42%), with talent close behind at 41%.
Putting the Metrics to Work
Leaders can compose a modern scorecard by asking three questions that map to evidence.
First, are we learning at the rate the environment demands?
If more than a quarter of employees receive no feedback in a year, if training time is minimal and if critical roles lack succession coverage, then learning velocity is below threshold. Reducing digital friction is part of the same problem. If nearly half of digital workers cannot find what they need and the application landscape keeps expanding, then digital employee experience programs are not optional.
Second, are we adapting through deliberate mechanisms rather than ad-hoc reactions?
The dynamic capabilities literature encourages leaders to formalize reconfiguration routines, not simply empower local improvisation. Governance data from Deloitte indicates that boards and executives are elevating scenario planning and strategic risk oversight, with 71% prioritizing these activities and 73% reporting more time spent together on them in the past year.
Third, are we building resilience in a way that shows up in value creation?
BCG’s finding that about 30% of long-run relative TSR depends on crisis performance reframes resilience as a driver of advantage rather than an insurance policy. A resilience scorecard would include time to recover core services after incidents, liquidity and buffer ratios, concentration risk exposure, and the speed and extent of portfolio rebalancing in response to durable signals.
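As one hedged illustration of such a scorecard, assuming incident, treasury and portfolio data feeds already exist (the field names and thresholds below are assumptions for illustration, not benchmarks):

```python
from dataclasses import dataclass

@dataclass
class ResilienceScorecard:
    """Illustrative resilience indicators; thresholds are assumptions, not benchmarks."""
    mean_hours_to_recover: float       # time to restore core services after incidents
    liquidity_buffer_ratio: float      # liquid assets / short-term obligations
    top_supplier_revenue_share: float  # concentration risk exposure
    portfolio_shift_days: float        # days from durable signal to rebalancing action

    def flags(self):
        """Return the indicators that breach their illustrative thresholds."""
        checks = {
            "recovery too slow": self.mean_hours_to_recover > 24,
            "thin liquidity buffer": self.liquidity_buffer_ratio < 1.2,
            "supplier concentration": self.top_supplier_revenue_share > 0.30,
            "slow portfolio response": self.portfolio_shift_days > 90,
        }
        return [name for name, breached in checks.items() if breached]

card = ResilienceScorecard(
    mean_hours_to_recover=36.0,
    liquidity_buffer_ratio=1.5,
    top_supplier_revenue_share=0.45,
    portfolio_shift_days=60.0,
)
print(card.flags())  # ['recovery too slow', 'supplier concentration']
```

The point is not the specific thresholds but the discipline: each resilience dimension gets a number, a tolerance and a review cadence.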
Across the board, output measures still matter for accountability. The claim is that they are incomplete indicators of future fitness unless paired with measurements of how quickly the organization learns, how deliberately it adapts and how predictably it absorbs shocks.
Implications for AI-Enabled Work
As AI tools spread through everyday work, the performance question becomes whether they accelerate learning velocity and adaptation, or whether they add friction and sprawl. Measurement is the difference. If leaders track feedback cadence, time from issue detection to fix, information-finding success rates and decision cycle times, they can see whether technology is amplifying human capability or simply adding noise.
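A minimal sketch of one such measure, assuming an issue-tracker export with detection and fix timestamps (the data here is hypothetical), shows how decision cycle time could be tracked:

```python
from datetime import datetime
from statistics import mean

# Hypothetical issue-tracker export: (detected_at, fixed_at) per resolved issue.
issues = [
    (datetime(2024, 10, 2, 9, 0), datetime(2024, 10, 4, 17, 0)),
    (datetime(2025, 1, 8, 11, 0), datetime(2025, 1, 9, 10, 0)),
]

def mean_detection_to_fix_hours(issues):
    """Average hours from issue detection to fix, a simple decision-cycle proxy."""
    return mean((fixed - detected).total_seconds() / 3600
                for detected, fixed in issues)

print(f"Mean detection-to-fix: {mean_detection_to_fix_hours(issues):.1f} hours")
```

Plotted over rolling quarters, a falling detection-to-fix interval is direct evidence that AI tooling is shortening cycles rather than lengthening them.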
Transparency also counters a practical obstacle to learning velocity. If workers struggle to locate information, if applications proliferate and if feedback is sporadic, then clarity from leadership about priorities and standards becomes a performance intervention, not a communication flourish. The Gartner and McKinsey figures cited earlier provide a defensible basis for setting targets and measuring progress at board level.
Conclusion
In stable times, output metrics were enough to measure success. In an environment where information is hard to find for a large share of workers, where many companies plan to invest in AI but few feel mature, and where crisis performance explains a material share of long-term returns, it is rational to elevate learning velocity, adaptation and resilience to the same level as revenue and cost.
If you’re looking to elevate your standards without having to increase internal headcount, drop us a line.