[Many of you know that my husband, Ron Wallace, was a mission manager at NASA for international Search and Rescue Programs (official program name COSPAS/SARSAT) until he took early retirement in 1999 so that we could relocate to our present home in Fort Myers, FL. We were both fascinated by all things spacey, and continue to be, so I’ve been using the mission control analogy in my work with HRM analytics since I began that work in 1987. I should also mention that this post was inspired by a presentation I gave on 5-9-2013 to the Human Resource Policy Institute, of which I’m a Fellow, which is hosted at Boston University’s School of Management, of which I’m coincidentally an MBA graduate, class of ’72.]
Given our goal of driving positive business and/or mission outcomes via effective HRM, we are faced with three primary issues in developing our analytics program. In the order in which we must figure them out, they are:
- By what metrics will senior leadership know how HRM is impacting those results? How will we prove the hypothesized line of sight between specific HRM policies/programs/practices/plans and specific organizational outcomes?
- What analytics should be embedded in what HRM processes to bring about what improvements in what decisions by what organizational roles to drive those organizational outcomes? And what analytics should be used by which HR partners and specialists to shape those HRM policies, practices, plans, processes etc. toward that end?
- How do we avoid drowning in more analytics than we can absorb and upon which we can act? How can we organize the important analytics to ensure that they are presented to the right people and used effectively to drive results?
All too often, efforts to infuse HRM decision-making with analytics and to shape the design of HRM programs/policies/practices/etc. begin with the obvious and easily measured rather than with the important, necessary-to-measure aspects of our business. Yes, it’s helpful to know how many “paychecks” were “cut” in a cycle (forgive me for using the lingo of your youth — in mine it was still pay envelopes filled with cash), but it’s far more important to driving business outcomes to know whether our scarce compensation dollars are eliciting the desired behavioral results.
Figuring out what those more valuable analytics may be and then delivering them, just in time, to the right decision makers goes to the heart of effective analytics programs. If we’re not willing or able to do the heavy lifting here, we might as well go back to counting performance reviews completed on time (that’s always assuming that we haven’t nuked the traditional review in favor of a much more ongoing and social performance process) rather than trying to determine how to hire, grow and retain great performers. But assuming that we’re all serious about the strategic use of HRM analytics, let’s start tackling those issues.
Issue #1 — By what metrics will senior leadership know how HRM is impacting those results?
To figure this out, we must follow the Yellow Brick Road of strategic HRM planning. I’ve included links below to my four-part blog post on my strategic HRM and HR technology planning methodology, and there is also a Webinar and whitepaper on Workday’s site www.workday.com if you’d like more information. But to summarize, the thought process is:
- What is the organization’s raison d’etre?
- What is the organization’s value proposition to its customers, shareholders and employees?
- What does the organization want to do and be?
- Why should shareholders, customers, employees, etc. invest their resources in the organization?
- What are the organization’s drivers of business value?
- By what measures and target values over what time period would we recognize achievement of this vision and the creation/achievement of business value?
- What must the organization do well to achieve its vision? To meet stakeholder expectations? By what strategies will the organization drive customer satisfaction, shareholder value and workforce effectiveness?
- How will the organization recognize, via objective measures and target values within a defined time frame, to what extent its organizational business strategies and needed results are being achieved?
- What must the HRM business (led by the HR function) do well to enable the organization to achieve its vision? By what HRM business strategies will the HRM business enable the organizational business strategies, i.e. help drive customer satisfaction, shareholder value, and workforce effectiveness?
- How will the organization recognize, via objective measures within a defined time frame, to what extent its HRM business strategies are achieving the intended results?
- By what measures and target values within a defined time frame will the organization’s progress toward its HRM business strategies and results be validated and/or problem areas revealed?
From the drivers of business and/or mission outcomes, to the HRM strategies that impact those drivers, to the metrics that measure the effectiveness of those HRM strategies, this groundwork must be laid during strategic planning if we’re going to develop an effective program in strategic analytics. One great, almost universal (at least in the for-profit world) example of such a strategic metric is the growth (or retrenchment) in revenue and/or profitability (or similar) per workforce FTE (a small calculation sketch follows this list), where you:
- include in workforce FTEs both employees and contingent workers as appropriate; and
- use as the numerator the best/most appropriate measure in your industry of business and/or mission outcomes (with the workforce FTEs as the denominator).
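As a back-of-the-envelope illustration of that metric (not anyone’s official formula), here’s a minimal Python sketch; all figures and field names are hypothetical.

```python
# Minimal sketch: revenue (or profit, or another outcome measure) per workforce FTE,
# where workforce FTEs cover both employees and contingent workers.

def revenue_per_fte(revenue, employee_ftes, contingent_ftes):
    """Outcome measure per workforce FTE."""
    return revenue / (employee_ftes + contingent_ftes)

# Hypothetical year-over-year comparison to show growth (or retrenchment).
last_year = revenue_per_fte(revenue=48_000_000, employee_ftes=310.0, contingent_ftes=42.5)
this_year = revenue_per_fte(revenue=52_000_000, employee_ftes=305.0, contingent_ftes=40.0)
growth = (this_year - last_year) / last_year
print(f"Revenue per workforce FTE grew {growth:.1%}")  # ~10.7% with these made-up numbers
```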
Other good possibilities include measures of improvement in specific drivers of business and/or mission outcomes (a brief sketch follows this list), e.g.:
- time-to-market and/or cost-to-market for new products, perhaps divided by the investment in specific HRM strategies intended to effect that improvement; paired with the
- utilization and outcomes of an incentive compensation plan focused on improving the effectiveness of the people and teams in the relevant roles.
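And here’s an equally minimal, hypothetical sketch of a driver-improvement measure divided by the investment in the HRM strategy intended to produce it; the numbers are invented purely for illustration.

```python
# Minimal sketch: improvement in a business driver (here, time-to-market)
# relative to the investment in the HRM strategy intended to drive it.

def improvement_per_dollar(baseline_days, current_days, hrm_investment):
    """Days of time-to-market improvement per dollar invested in the HRM strategy."""
    return (baseline_days - current_days) / hrm_investment

# e.g. a team incentive plan costing $250,000 alongside a 30-day improvement
ratio = improvement_per_dollar(baseline_days=210, current_days=180, hrm_investment=250_000)
print(f"{ratio * 1_000_000:.0f} days of improvement per $1M invested")  # 120 with these made-up numbers
```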
Issue #2 — What analytics should be embedded in what HRM processes to effect what improvements in what decisions by what roles to drive results?
To address this issue, and to build the metrics “starter kit” that has been part of my no-longer-licensed HRM object model/architectural “starter kit,” I developed Naomi’s hierarchy of metrics. Like Maslow’s much more famous (and deservedly so) hierarchy, mine starts with the more easily accomplished metrics (which, unlike Maslow’s foundational needs, are also much less valuable strategically) and moves upward toward the really important, one might say critical, but very difficult-to-achieve metrics. And while I urge clients to avoid analysis paralysis in determining where a particular metric fits, I did develop definitions for what goes where. However, to keep this post from becoming a tome, I’ve just used a couple of examples to give you a flavor of each level, from lowest to highest, in the hierarchy; a small data sketch follows these examples.
HRM process activity metrics:
- How many expressions of interest came through our corporate career site?
- How many performance reviews were conducted?
HRM process outcome metrics:
- How many expressions of interest that came through our corporate career site passed our automated filters?
- How many performance reviews were conducted that were mechanically complete and completed on time?
HRM process activity pattern recognition metrics:
- How many expressions of interest which came through our corporate career site and passed our automated filters resulted from employee referrals?
- How many performance reviews that were mechanically complete and completed on time were conducted by managers with more than three years’ experience and an excellent rating as managers?
HRM process outcome pattern recognition metrics:
- How many expressions of interest that came through our corporate career site, passed our automated filters, and resulted from employee referrals then passed our initial interview screenings?
- How many performance reviews were conducted that were mechanically complete and completed on time?
HRM process activity and outcome prediction metrics:
- What are the characteristics of those who express interest in employment via our career Web site which are predictors of successful employment? Of long term success in our organization?
- What are the characteristics of managers who coach/lead/extract better performance, on average, from their teams? How could we recognize those characteristics pre-hire and/or cultivate them post-hire?
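To make the hierarchy a bit more concrete, here’s a minimal sketch that reads one tiny, invented applicant data set at successive levels; the field names and records are hypothetical, and real pattern recognition and prediction would of course require far richer data and proper statistical work.

```python
# Hypothetical sketch: one small applicant data set read at successive levels
# of the hierarchy. All field names and records are made up.

applicants = [
    {"career_site": True,  "passed_filters": True,  "referral": True,  "passed_interview": True},
    {"career_site": True,  "passed_filters": False, "referral": False, "passed_interview": False},
    {"career_site": True,  "passed_filters": True,  "referral": False, "passed_interview": False},
    {"career_site": False, "passed_filters": True,  "referral": True,  "passed_interview": True},
]

# Activity: how many expressions of interest came through the career site?
activity = sum(a["career_site"] for a in applicants)

# Outcome: how many of those passed the automated filters?
outcome = sum(a["career_site"] and a["passed_filters"] for a in applicants)

# Activity pattern recognition: of those, how many resulted from employee referrals?
activity_pattern = sum(
    a["career_site"] and a["passed_filters"] and a["referral"] for a in applicants
)

# Outcome pattern recognition: of those, how many then passed the initial interview screening?
outcome_pattern = sum(
    a["career_site"] and a["passed_filters"] and a["referral"] and a["passed_interview"]
    for a in applicants
)

print(activity, outcome, activity_pattern, outcome_pattern)  # 3 2 1 1
```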
Issue #3 — How do we avoid drowning in more analytics than we can absorb/act upon?
Every metric or analytic, no matter how obvious and well-established, must be (subject to effective-dated, role-based security — but of course you knew that):
- surrounded with context, explanation and guidance;
- presented with the capability to drill down and drill around;
- actionable and configurable; and
- organized via what I call dashboards, cockpits and mission control.
Context, explanation and guidance should include at least:
- how is this metric derived? what does the value in this metric mean?
- what types of responses, corrective or other actions are expected, and with what downstream implications?
- what other information should I look at, including other metrics, to dig deeper into this matter? and
- who in the organization has greater experience in this matter upon which I may want to draw?
Drill down and drill around should include at least:
- show me the algorithm and values used to drive this metric;
- show me more about the people/positions/plans/etc. which are at the heart of or affected by this metric;
- show me how this metric applied to these people/positions/plans/etc. compares to the same metric applied to others; and
- show me what actions on my part have impacted this metric or could impact it in the future.
Making metrics/analytics actionable means (a sketch pulling these requirements together follows this list):
- now that I know what happened and why, let me make decisions, change business rules, and undertake specific actions on the people, positions, plans, etc. that are intended to improve the outcomes; and
- now that I see the impact of previous actions and decisions on current results, let me propose future actions and decisions intended to improve the outcomes, and show me their likely impacts provisionally so that I can then decide whether or not to take them.
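To suggest how the requirements above might hang together, here’s a minimal, hypothetical sketch of a single metric definition that carries its derivation, guidance, drill paths, permitted actions and role visibility alongside the number itself; the class and field names are my own invention, not any product’s data model, and effective dating of the security is omitted for brevity.

```python
# Hypothetical sketch of a metric definition that travels with its context,
# explanation, guidance, drill paths and permitted actions.
from dataclasses import dataclass, field

@dataclass
class MetricDefinition:
    name: str
    derivation: str                       # how is this metric derived?
    guidance: str                         # what responses or corrective actions are expected?
    related_metrics: list[str] = field(default_factory=list)    # what else should I look at?
    drill_paths: list[str] = field(default_factory=list)        # drill down / drill around targets
    permitted_actions: list[str] = field(default_factory=list)  # what the viewer may actually do
    visible_to_roles: list[str] = field(default_factory=list)   # role-based security (effective dating omitted)

time_to_fill = MetricDefinition(
    name="average days-to-fill",
    derivation="mean of (offer-accepted date minus requisition-opened date) for filled requisitions",
    guidance="investigate sourcing strategy and approval delays when this trends upward",
    related_metrics=["cost-to-fill", "quality of hire"],
    drill_paths=["by requisition", "by sourcing strategy", "by hiring manager"],
    permitted_actions=["adjust sourcing mix", "escalate stalled approvals"],
    visible_to_roles=["hiring manager", "head of sourcing"],
)
```

However you implement it, the point of the structure is that the number never travels without its context, its drill paths, and the bounded set of actions the viewer is permitted to take.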
Dashboards, Cockpits, And “Mission Control”
Dashboards are for everyone — managers, employees, applicants, contingent workers, everyone. They contain what each person, depending on their role (again, subject to effective-dated, role-based security), needs to know, delivered “point of sale,” including both overall organizational and local progress as well as progress toward that role’s needed outcomes. Dashboards don’t require any real training beyond knowing the basic navigation features of the software; they take full advantage of all that context, explanation, guidance, drill down, drill around, etc. Dashboards are also limited to those types of actions and configurations whose direct consequences are limited in scope in case of an error (but with lots of embedded intelligence to guide the user to correct transactions, decisions and responses). Finally, only obvious actions are expected of dashboard users, and they’re reminded until they take them or, in critical cases, prevented from moving on until they do.
If you rent cars at all, you know that you expect to jump in and drive off without agonizing over where the fuel gauge or speedometer might be. But you also expect that, every now and then, the automobile industry will go through a major technology and dashboard redesign, and then you do have to get with the program and learn how the new dashboard works, even if change is painful (as it certainly is to me). I personally fancy one that talks to me about its innermost feelings (i.e. the status of its fuel, tires, fluids, whatever) before those things get out of whack.
So what do strategic HRM dashboards contain? Here are just a few examples of what might be important for all employees (but not necessarily for contingent workers or applicants):
- business and/or mission outcomes — targets and progress toward them;
- growth in revenue and/or profitability or similar per workforce FTE;
- improvement in specific drivers of business and/or mission outcomes, e.g. time-to-market/cost-to-market of new products, divided by the investment in specific HRM strategies intended to effect that improvement;
- utilization and outcomes of an incentive compensation plan and/or workforce development program focused on improving the effectiveness of the people and teams in the relevant roles.
Here’s another example, focused on specific processes and the related decisions, of what might be included in a manager’s dashboard (a small calculation sketch follows this list):
- how did I allocate scarce compensation budget across my team and across the different components of compensation compared to other teams in terms of business/mission/team results?
- how quickly and well am I filling key roles in my organization compared to other parts of the organization and with what impact on the business/mission/team results?
- how many applicants/interviews/offers/etc. does it take, on average, to fill my xxx roles? how many days-to-fill, on average? what cost-to-fill, on average?
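Here’s a minimal, hypothetical sketch of how those funnel averages might be computed from a manager’s filled requisitions; all records are invented.

```python
# Hypothetical sketch: funnel averages across one manager's filled requisitions.
from statistics import mean

filled_requisitions = [
    {"applicants": 42, "interviews": 6, "offers": 2, "days_to_fill": 38, "cost_to_fill": 7200},
    {"applicants": 55, "interviews": 9, "offers": 1, "days_to_fill": 51, "cost_to_fill": 9800},
    {"applicants": 31, "interviews": 5, "offers": 1, "days_to_fill": 29, "cost_to_fill": 5400},
]

avg_applicants_per_fill = mean(r["applicants"] for r in filled_requisitions)
avg_days_to_fill = mean(r["days_to_fill"] for r in filled_requisitions)
avg_cost_to_fill = mean(r["cost_to_fill"] for r in filled_requisitions)

print(f"{avg_applicants_per_fill:.0f} applicants, {avg_days_to_fill:.0f} days "
      f"and ${avg_cost_to_fill:,.0f} per fill, on average")
```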
Cockpits are for skilled HR leaders and specialists. Just like a pilot’s cockpit in an airplane, they take serious training to understand, and to act quickly upon, what’s displayed and the directions given. Cockpit analytics, if acted upon incorrectly, have serious and broad consequences, but here too there should be lots of embedded intelligence to guide the user to a correct interpretation, recommended actions, and the resulting transactions, decisions and responses. Many, perhaps most, HRM decisions that directly affect improvements in the business and/or mission results originate with these analytics, from allocating scarce compensation budgets across an organization (with individual compensation decisions made by managers via their dashboards) to making effective sourcing decisions (with final hire decisions made by managers via their dashboards).
For the head of sourcing, the cockpit might include production ratios and the cost/time/quality of hire by sourcing strategy and, within strategy, by actual source. For the head of total compensation, perhaps budgeted versus actual health care insurance costs and how they are impacted by specific wellness initiatives. And for the head of succession and talent mobility, perhaps the planned and actual percentage of covered positions filled internally with ready replacements.
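A minimal, hypothetical sketch of the kind of roll-up a head of sourcing’s cockpit might present, grouping cost and time of hire by sourcing strategy; the strategies, sources and figures are all invented.

```python
# Hypothetical sketch: cost and time of hire rolled up by sourcing strategy.
from collections import defaultdict
from statistics import mean

hires = [
    {"strategy": "employee referral", "source": "referral program", "cost": 3500,  "days": 27},
    {"strategy": "employee referral", "source": "alumni network",   "cost": 4200,  "days": 33},
    {"strategy": "agency",            "source": "Agency A",         "cost": 18500, "days": 49},
    {"strategy": "job boards",        "source": "Board X",          "cost": 6100,  "days": 41},
]

by_strategy = defaultdict(list)
for h in hires:
    by_strategy[h["strategy"]].append(h)

for strategy, records in by_strategy.items():
    print(f"{strategy}: avg cost ${mean(r['cost'] for r in records):,.0f}, "
          f"avg days {mean(r['days'] for r in records):.0f}")
```

The same grouping, taken one level deeper, gives the within-strategy view by actual source.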
Mission control is for the operational experts, for those responsible, day to day, for the operations of the HRM delivery system, ensuring operational efficiency and robustness, strategic enablement, and high-quality/cost-effective/rapid time-to-market HRM service delivery. Mission control is where you identify the strengths and weaknesses in the design of HRM policies, practices, plans and processes as the basis for assessing their results and fine-tuning their designs. It’s used by highly trained analysts, especially data scientists and statisticians, to look for insights, patterns, predictions, etc. and then to act on them. Just as for mission managers at NASA, lifelong training is required across several disciplines, and the many complex actions taken here have systemic but not always immediately visible consequences when errors are made, so there must be lots of built-in checks and balances.
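As a deliberately oversimplified, hypothetical sketch of the kind of pattern an analyst might surface here (real mission control work would involve proper statistical testing, far richer features and much larger samples):

```python
# Hypothetical sketch: do referred hires succeed at a higher rate after one year?
past_hires = [
    {"referral": True,  "successful_after_1yr": True},
    {"referral": True,  "successful_after_1yr": True},
    {"referral": True,  "successful_after_1yr": False},
    {"referral": False, "successful_after_1yr": True},
    {"referral": False, "successful_after_1yr": False},
    {"referral": False, "successful_after_1yr": False},
]

def success_rate(records):
    return sum(r["successful_after_1yr"] for r in records) / len(records)

referred = [h for h in past_hires if h["referral"]]
others = [h for h in past_hires if not h["referral"]]
print(f"referred: {success_rate(referred):.0%}, others: {success_rate(others):.0%}")  # 67% vs. 33%
```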
The Bottom Line
Analytics programs are a journey, and there’s no one right answer for every organization. You have to really know your business, know what makes it tick, to determine by what analytics you should run the HRM aspects of it. And you need deep knowledge of HRM to know by what analytics you’ll know if your HRM policies, practices, plans and processes are working properly to drive improvements in business and/or mission outcomes. For those of you wanting a little more on this topic, I’ve also included links below to a series I did several years ago, at the very start of my blog, that may be of interest. As with Twitter feeds, the earliest posts in a series are at the bottom.