HR Curator Blogs & Articles
How Talent Acquisition Teams Can Show Their Value - Webinar with @HRCurator
Five essentials for HR analytics success
Find one source of truth
Remember data is more important than tech
Build strategic internal alliances
Make analytics part of the HR skillset
Be passionate about HR analytics
The Geeks Arrive In HR: People Analytics Is Here
Today, for the first time in the fifteen years I've been an analyst, human resources departments are getting serious about analytics. And I mean serious.
I was in a meeting several weeks ago in San Francisco and we had eight PhD statisticians, engineers, and computer scientists together, all working on people analytics for their companies. These are serious mathematicians and data scientists trying to apply data science to the people side of their businesses.
This last week I had another similar meeting and we had three of the world's leading insurance companies, two large retailers, three health care companies, and two manufacturing companies with serious mathematicians and scientists assigned to HR.
What's going on?
As I recently discussed in the article "How People Management is Replacing Talent Management?" there is a major shift taking place in the market for people analytics. After years of talking about the opportunity to apply data to people decisions, companies are now stepping up and making the investment. And more exciting than that, the serious math and data people are flocking to HR.
A little history is in order.
The area of HR analytics, talent analytics, or as it is now called "people analytics" has been around for a long time. As an analyst (and former analytics product manager) I've been talking with companies about how to measure learning and HR for a decade. Back in 2005, after several frustrating years trying to figure out how to measure training, I wrote a book called The Training Measurement Book, which sets the stage for L&D teams to move beyond the traditional Kirkpatrick measurement model. Today learning organizations continue to try to analyze the impact and effectiveness of training, but it no longer stands alone.
If you look back ten years, companies tried to build "HR Analytics" systems (typically called HR data warehouses) to look at simple metrics like "total headcount," "time to hire," and "retention rate," and to clean up their messy, often inaccurate people data. Quite a few companies built these databases, but they were used primarily as a single system of record across the many HR platforms in place.
In the 1990s vendors like PeopleSoft, Oracle, and NCR/Teradata built analytics products to support this market. They didn't sell very well, primarily because companies had such complex HR systems they didn't have the budget or IT support to build the HR data warehouse. (Some companies did do this, and they have been benefiting from this for many years.)
About five years ago the book Moneyball came out, and a global marketplace called "Big Data" emerged. Hadoop, R, and other parallel data management tools became productized, and industries like marketing, advertising, and finance started to analyze massive amounts of data. Much of this started at Facebook, Google, LinkedIn, and other internet companies that simply had to analyze enormous amounts of data to run their businesses.
Along the way the term "Data Science" was invented, and today there are hundreds of jobs for "Data Scientists" (typically defined as people who understand information management, Big Data tools, statistics, and modeling - a rare breed).
During the last ten years we watched the discussion with HR stay very tactical, focused on operational reporting and simply fixing the mess of incompatible HR systems we have. There were many HR and learning analytics presentations and a few conferences, but most of the focus was helping technical practitioners improve their reporting systems. The idea of predictive analytics was little more than ROI studies to look at whether a training program worked.
(Full disclosure, I was the head of product management for two companies that built advanced learning analytics solutions in the early 2000s.)
Suddenly around 2011, with the focus on Big Data, we sensed a shift in the market. To understand how well predictive analytics was taking hold, we started our early research on "Big Data in HR" and developed a maturity model (it was published in the Fall of 2012). We discovered a world of "Haves" and "Have Nots." A small number of companies were investing heavily in predictive people analytics, but most were barely getting started.
The whole idea of our focus on "Big Data in HR" was to help HR organizations realize that they, too, could ride the wave of interest in Moneyball and Big Data. HR is not as interesting a topic as homeland security or cyberwarfare, but it's a big area of spending (more than $4 trillion is spent on payroll around the world), so there's a lot of opportunity in this huge data set. And the world of "People Analytics" was born.
There is a deep history of data analysis in the HR profession, starting with Frederick Taylor in the late 1800s. The article "The Datafication of HR" describes this evolution, and I think everyone in this space should read this article and get to know the history. Today we are standing on the shoulders of some giants and very innovative thinkers - they just didn't have computers to help.
Today, while the topic is hot, HR teams are just starting to get good at analytics. The problem has not been the concept, but rather the focus. We spent far too much time trying to measure HR and L&D spending and figure out which HR programs were adding value. While that may seem interesting to HR managers, business people typically just don't care. What they want is information that helps them run the company better: "Get me the right people into the job, make them productive and happy, and get them to help us attract more customers and drive more revenue. I don't care if your L&D program has a 200% ROI or not."
We now see this as a huge trend, so we launched a focused research area on this topic. With the help of my partner Karen O'Leonard and others on our team, we launched a series of industry studies on what we called "Talent Analytics." Our biggest report, entitled High-Impact Talent Analytics, established the first-ever research-based maturity model for analytics. It showed that there were a small set of companies (less than 5% of the market) that were way ahead of the curve. These advanced companies were looking at people-related data in a very strategic way, and they were making far better decisions about who to hire, who to promote, how much to pay people, and much more.
Since then, interest in this market has exploded. And I mean like an atomic bomb. Everyone is now talking about it, and the whole concept has changed.
A few weeks ago I had a meeting with five major Silicon Valley and New York companies focused on this area, and the room was filled with statistics PhDs, engineers (like me), and I/O psychologist PhDs. Thus the title of this article:
The geeks have arrived, and we're all happier for it.
At this point, entering 2015, I believe "The Geeks have Arrived." Statisticians, mathematicians, and engineers have entered the people analytics space.
In this meeting I recently attended, the practitioners, who are among the leaders in this space, were all experienced in bringing together data, cleaning it up, and doing all types of analysis. Of course their companies have various issues with data quality, systems, and infrastructure - but they, as a group "get it." They understand the potential, they understand the problem, and they have the skills to get work done. And they are not just analyzing HR issues, they are analyzing the business.
Today this new business function is called "People Analytics." And over time, I believe it doesn't even belong within HR. While it may reside in HR to begin with, over time this team takes responsibility for analysis of sales productivity, turnover, retention, accidents, fraud, and even the people issues that drive customer retention and customer satisfaction.
· High tech companies now know why top engineers quit and how to build compensation and work environments to get people to stay.
· Financial services companies are now analyzing why certain people commit fraud and what environmental or hiring issues might contribute to such violations.
· Product companies are now analyzing the demographic, educational, and experiential factors that correlate with high performing sales people and why top sales people quit.
· Health care companies are looking at why certain hospitals or departments have higher infection rates and what people issues are behind these problems.
· Manufacturers and product companies are looking at the patterns of email traffic and communications to understand how high performing managers behave and what work styles result in the highest levels of performance.
These are all real-world business problems, not HR problems. The data which helps support these decisions includes experience, demographics, age, family status, as well as training, personality, intelligence, and dozens of other factors. More and more this will include data on email communications, employee sentiment, and ad-hoc feedback.
Many of the factors which contribute to fraud or turnover have nothing to do with the people - they are environmental. Where is the manager physically located? Who else is hiring in this location? So People Analytics requires a look at external data, not just internal data.
This is why this function eventually belongs outside of HR: it is really part of a company's bigger "business analytics" team.
Just for grins I did a Google Trends search on the terms HR Analytics, Talent Analytics, and People Analytics, and look at what I found. "People Analytics" is winning.
As we talk about in our research, this is a huge market opportunity for business - one that is just beginning. Vendors of all shapes and sizes are starting to grow, and most of the large platform providers now include predictive analytics tools embedded in their core HR software. (Flight risk indicators are a good example - not necessarily accurate yet, but the right idea.)
And exciting new companies are joining the marketplace. (Read People Analytics Heats Up for more on all the vendor activity.) Not only are the large ERP players involved, but serious software entrepreneurs are joining the market. Last week I met with two senior software executives (both from large search engine companies and other companies they had sold) now entering the market for HR engagement analytics and measurement systems. I won't mention the company yet (it's not yet launched), but this is a company that is likely to bring serious software engineering to the people analytics market.
While most HR organizations are still struggling to clean up their data and build their teams, the momentum is coming on strong. And technical talent has now figured out that the old-fashioned backwater HR department may be one of the most exciting places to work.
We'll be doing a lot more research on this topic over the coming years, but let me simply state clearly "The Geeks have Arrived: People Analytics is Here."
Organizational Development, Design and Learning
Performance management – the “soul-sucking monster” of HR
The way we currently do performance management, to me, is one of the most destructive things HR has ever created. [It’s a way] to reduce employee engagement and really piss off all your managers.
Nick Holley, co-director of the Centre for HR Excellence at the UK’s Henley Business School, certainly doesn’t sit on the fence when it comes to performance management.
But it’s an opinion he has arrived at after extensive research, talking to a series of major global companies.
Despite the invective he levels at performance management, Holley is not advocating scrapping it.
Nor is he hawking some new prescriptive performance management tool that will magically solve the problem:
What I’ve learned from talking to a huge number of organizations is that there actually isn’t a right way of doing it. It’s not about whether you do use ratings or you don’t use ratings.
Holley adds that following best practice is “irrelevant” without context: it’s about organizations individually coming up with a process that works for them, not blindly following the latest management group-think.
During his research, however, Holley did begin to discern similar approaches and attitudes among companies with successful performance management approaches. He has distilled their experiences into eight key pieces of advice:
All of the organizations with successful performance management approaches were fascinated by the latest research coming out of neuroscience and its theories about what motivates people.
In particular, there were three experts that people kept mentioning. The first of these was Daniel Pink and his book Drive: The Surprising Truth About What Motivates Us, which shows how the traditional carrot-and-stick style approach to motivating staff doesn’t work.
The second major influence was David Rock’s SCARF model, which pinpoints five areas – Status, Certainty, Autonomy, Relatedness and Fairness – that trigger the reward or threat circuitry in our brains.
The third is the idea of encouraging the growth mindset, outlined in the work of Carol Dweck, Lewis and Virginia Eaton Professor of Psychology at Stanford University.
How that is being translated into the workplace, says Holley, is the idea that performance management should move from being a historical process, looking at past achievements, to a forward-looking one concerned with how things can be done better. It also means changing it from a negative process of judging what people have done well (or failed at) to a positive one about how people can improve in the future.
What’s also important is that performance management moves away from its obsession with individual goals and targets, when what matters is the overall team or organization performing better. As Holley notes:
We are the most collaborative species in the world and yet what this does is force us to stop collaborating and to compete.
Own practice, not best practice
Instead of researching best practice, these successful organizations researched their own internal practice. Holley expands:
What they wanted to find out was: what is the impact on our organization? Not measuring process completion but measuring the unintended consequences of the process.
One person gave me this great quote: the problem with performance management is that it’s become the Swiss Army knife of talent. What he meant by that was the core concept was good, but all the different people in HR have all added their own little bits.
As a result, organizations end up with, what Holley colorfully describes as:
A soul-sucking monster that’s attached to the business, which sucks the life out of people.
Ratings systems are a clear example of “soul sucking” at work. According to Holley, anyone who receives a ‘3’ rating, no matter how it is dressed up with positive language, will hear the word ‘average’.
It’s no better for the high performers. Those with high ratings won’t be motivated to work harder – why should they when they are already scoring highly?
Rating systems only generate resentment and short-termism. Worse still, as a rating is given a number, it’s seen as an objective fact. This is simply “rubbish” contends Holley, and it says more about the person doing the rating than the person rated.
So organizations need to be honest about how much time they are wasting internally on performance management procedures that don’t contribute to the business and simultaneously wreck motivation of staff.
Think business, not HR
When these organizations design their performance management systems, this is not an HR-driven process. According to Holley:
The CEOs I interviewed all said this is the single most critical tool in delivering our strategy.
If they focus too narrowly on HR concerns, then there’s a danger the “Swiss Army knife of HR” will rear its ugly head.
Instead, what the successful organizations do is identify the area of performance that is most going to make a difference to their business and that fits with their company culture. Microsoft, for example, identified collaboration as the key quality driving business success, so it set up a performance management approach that values and encourages collaboration.
The organizations focused on one or two key things rather than try and take on too much. The emphasis was on the quality of the conversation between manager and employee, not the process.
This could involve scrapping ratings. No matter how they are dressed up, ratings are basically used as a tool to decide compensation, but there is actually a far simpler solution: managers can decide for themselves. One EMEA executive summed up the absurdity of the system, pointing out to Holley:
I have the power to make multi-billion dollar Capex decisions, but I couldn’t decide percentage pay increases for my immediate reports.
Far from creating mayhem and massive pay increases, in practice managers will simply talk to their compensation and benefits expert about the current market situation and make their decision accordingly. And they will take that decision far more seriously, because they cannot hide behind a rating.
According to Holley, the successful organizations involved key stakeholders in the design:
Didn’t go away into some darkened bat cave in the middle of HR and create something and then think how they’ll sell this to the business.
Instead, they worked with the business from the get-go, taking longer on the process if necessary, to ensure they had the full backing of all relevant people.
This could also involve talking to people outside the organization. Companies with European Works Councils or those closely monitored by regulators will need to talk to them and include them in the process.
Most of the organizations piloted their new system rather than designing it in one go, because, says Holley, no matter how much a design is modified, unexpected things can happen:
Have the self-confidence to admit when you’re wrong. Whatever you do, it won’t work the way you expect, and if you try to defend that you’ll undermine your credibility.
These successful organizations built in an expectation that they may get it wrong at first. This flexibility extended not just to the design process but how it was rolled out to employees.
The organizations were all rigorous about what they measured. But measurement does not equate to ratings and doesn’t mean putting a specific number to success. Holley explains:
Measure but never forget what is meaningful is rarely measurable and what is measurable is rarely meaningful. It’s not just about producing numbers. How do you measure if they are the right goals? You have conversations with people and it’s subjective. You can measure that, but you just don’t put a number against it.
Quality of dialogue
All of the firms recognized that the real issue was not the performance management process itself, but the quality of the dialogue. So they focused on providing training, not just for the managers, but the employees being appraised as well. Holley adds:
They also made training mandatory, because otherwise, guess what, the people who’d go on the training were the ones who’d be good at it anyway.
To emphasize how important performance management was to managers, it was made quite clear that there would be consequences for people who did not go on the training.
According to Holley, these eight stages:
Are actually how we should think about HR in general. To me, good HR people are curious; we are fascinated by the art and science and heart of HR and by the way, we are actually fascinated by our business, by understanding what a particle accelerator actually does or whatever business you’re in.
Holley acknowledges that changing entrenched approaches to performance management involves a lot of risk and takes an “awful lot of courage” to step into the unknown.
But then, if the alternative is to carry on living with a “soul sucking monster”, this doesn’t sound such a terrifying step.
Holley’s take on what has become the whipping boy for all HR’s ills is enticing, but it is by no means an easy fix. It’s not simply a question of following the latest methodology and shoehorning it into your organization.
What Holley’s proposing is a more individual approach than a one-size-fits-all. And one of the key messages, writ loud and clear, is that ratings systems in their current form do not work.
Changing HR Operating Models - 'You can't put in what God left out': not everyone can be a strategic HR business partner
In recent years the Ulrich three-box HR model (shared services, centres of excellence and HR business partners) has become the standard delivery model for HR. The model is fundamentally a sound model and has taken HR forward, but in our research we have found a big gap between intention and reality, especially in the role of HR business partners. Why?
Historically a lot of HR work has been about delivering processes to the business, administering payroll, keeping out of tribunals, writing terms and conditions, and so on, so HR has attracted people with the requisite skills and mindset. The HR business partner role is very different. It's about delivering innovative ways of developing organisational and people capability, building on deep data-driven insights into the strategic and commercial direction of the business. This requires a different level of thinking, as the complexity and degree of ambiguity inherent in the role, and in the environment in which organisations operate, have increased exponentially.
In some cases the issue has been that no one has actually articulated to the newly rebadged business partners how the role is different or the new level it is operating at. In others, no one has helped those with whom they are partnering understand what is on offer and how it differs from the past. In many cases, however, there has been a failure to understand the business partner role and how it differs from the old HR model and then match this to existing HR capability. The simple fact is that the 'ask' has risen faster than the capability of many people in HR to deliver it. As a result, many HR business partners have been unable to deliver what is required in the role or have dumbed down the role to a level they are comfortable with but which doesn't deliver what is required by the business.
One of the causal factors has been that as organisational structures become leaner and ever more matrixed, partner roles become the knot in the bow tie, where they are pivotal in ensuring the whole model functions effectively. Nowhere has this been more prevalent than in HR. This means that it becomes vital that you have a 'big enough' person in the role, which often isn't the case because they are the same person as before the organisational change.
Elliott Jaques1, one of the gurus of organisational psychology, identified the challenge that lies behind this problem. In his research he identified seven levels of work complexity, each defined by increasing ambiguity, longer timeframes for decision-making success and greater delivery breadth. He also identified that people can only engage with complexity up to a level related to their intellectual capability to understand it. As Sam Mussabini said to Harold Abrahams in the film Chariots of Fire, 'You can't put in what God left out.' The essential problem with HR business partnering is that in many cases we are asking level 3 capability people to do work at level 4. The issue isn't about developing them; the issue is that they are simply incapable of operating at the right level, either at that time or potentially at all during the span of their natural careers. In our most recent research we asked what CEOs look for from their HR directors (HRDs) and one of the questions we asked was why they had sacked their HRDs. Three issues came out. One was a lack of integrity, which was the most consistent and most important insight from the whole research. The second was great talk but no delivery. The third was that they either weren't up to the role or had outgrown it:
- 'When we started we employed an HR admin lady who made sure the payroll worked, but we outgrew her.'
- 'It was a function of the agenda. The individual didn't have the capability to step up again.'
- 'We had taken the game up a notch. We had someone who was successful in the old agenda but not in the new. I would give them a reference. They weren't a failure; it depended [on] what we wanted from them.'
- 'Intellect was the key. They didn't have the ability to make sure my thinking on strategy was matched to their deep knowledge of the capability to deliver it.'
- 'We are dealing with more complexity on a broader scale. Once we got six variables to think about versus four, they didn't have the capability to think at that level on a broader scale.'
In each case they didn't blame the person. They were good at what they were good at, but the role required them to be good at a different level. In these quotes lies the answer to the conundrum. We shouldn't ask people to operate at a level they simply can't operate at. We need to help people be the best they can be, not try to get everyone to be something they can't be. This has several implications.
Fit the role to the person
Not all HR business partner roles need to operate at a strategic board level. Not all HR business partner roles are the same, so match your level 4 people to level 4 roles and level 3 to level 3. If you have too many roles at the highest levels compared with people who can operate there, match the best people to the roles that have the biggest impact on the bottom line or on patient service or whatever the key value driver is.
A simple test is to list on the left-hand side of the page the business units and how critical and material they are to creating value. On the right, list your HR business partners by their capability. Does the left-hand list match the right? Do your best business partners face off to the most critical business areas? One final point here is don't build the list only on current returns but also on future growth opportunities. It may be you want to match your best HR business partner to the smaller but higher potential and therefore more strategically critical growth opportunities rather than a larger cash cow.
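The left/right matching test described above can be sketched in a few lines of code. This is a hypothetical illustration only: the unit names, partner names, and 1-5 scores are invented for the example, and real criticality and capability assessments are of course far richer than a single number.

```python
# Hypothetical data: names and 1-5 scores are invented for illustration.
business_units = [
    ("Back-office services", 2),
    ("Cash-cow division", 3),        # large current returns, low growth
    ("High-growth market unit", 5),  # smaller, but strategically critical
]
hr_business_partners = [
    ("Partner A", 4),
    ("Partner B", 5),
    ("Partner C", 3),
]

def match_partners(units, partners):
    """Pair the most critical business areas with the most capable partners."""
    ranked_units = sorted(units, key=lambda u: u[1], reverse=True)
    ranked_partners = sorted(partners, key=lambda p: p[1], reverse=True)
    return [(u[0], p[0]) for u, p in zip(ranked_units, ranked_partners)]

for unit, partner in match_partners(business_units, hr_business_partners):
    print(f"{unit} -> {partner}")
# High-growth market unit -> Partner B
# Cash-cow division -> Partner A
# Back-office services -> Partner C
```

Note that weighting the unit scores toward future growth opportunities rather than current returns, as suggested above, changes the ranking and therefore the pairings.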
There is a strong organisational design driver here because level 4 is the point at which you have the biggest mismatch between roles featuring work at that complexity level, and the natural incidence of people in the population with the ability to work at that level. This is not an isolated issue within HR, but is true of many roles in many functions. HR just sees it more frequently because I would argue that the ratio of role complexity increase to individual development has been higher than other functions in recent years.
Be clear what we are recruiting for
This isn't just about a competency framework; it's about being realistic about the level we are asking people to operate at. It's become unfashionable to use tests of verbal and numeric reasoning skills, but perhaps we should look at more sophisticated and rigorous ways of assessing what level a person can operate at. We are letting our people and the business down if we recruit people to do a job they simply can't do. Levels of work suggest that by far the best predictor of success in higher complexity roles is judgement – but this is rarely assessed.
Match your development spend to what can actually be developed
It is very difficult to send someone on a programme that develops their intellectual capability or their systemic thinking ability. But these capabilities can be more swiftly developed through a broader career-pathing approach which tries to develop perspective (for example across different functions) and hence judgement. But this takes time and our research shows that this kind of development is the least often used by HR.
Equally there are some key hard skills that can be developed: understanding the business strategy and where value is created, data-driven insight development, and so on. We should focus our HR development spend in these areas. What is disturbing is when HR people tend to focus their development on HR-related rather than business-related areas: 'And here's one more slice of telling SHRM data: When HR professionals were asked about the worth of various academic courses toward a "successful career in HR," 83% said that classes in interpersonal communications skills had "extremely high value." Employment law and business ethics followed, at 71% and 66%, respectively. Where was change management? At 35%. Strategic management? 32%. Finance? Um, that was just 2%.'2
It also might be that you don't develop all these skills in every business partner or even within HR. As an example, not everyone needs to be a data scientist, but everyone needs to be comfortable with data. It might be that you access the deep data analytical skills from elsewhere in the business or from contractors who work closely with your HR business partners, but your HR business partners must recognise the value that issue-driven data analytics will bring to HR.
Be willing to throttle back the promise
In a desire to be seen to be responsive and relevant, there is a danger we overpromise and under-deliver. Perhaps we need to be willing to promise a bit less and deliver a little bit more or deliver where it is most critical versus trying to do it everywhere. Many people will say, 'but that will impact our short-term credibility'. Isn't it better to be rigorous about assessing the real capability of the HR function and our HR business partners and match what we promise to the business to what we can actually deliver? Perhaps a dash of realism and humility might serve us better in the long term. As a previous boss once said to me, 'the longest route is often the quickest way to get somewhere.'
1. JAQUES, E. (1997) Requisite organisation: total system for effective managerial organisation and managerial leadership for the 21st century. London: Gower.
2. HAMMONDS, K. H. (2005) Why we hate HR. New York: Fast Company.
The Talent Returns on an HR Analytics Investment
If you’re reading this article, you’ve already bought into the critical need for talent analytics as an HR capability. Analytics is a prime focus for HR professionals for good reason. It’s a time-intensive, perilous path full of obstacles and plenty of chances for missteps. Few companies—even multinational corporations pouring millions of dollars into Big Data about their people—use high-value analytics effectively, or in many cases, much at all. For HR, analytics are a clear path to stronger executive influence as an “anticipator” rather than a “reactor” or “partner,” and “future-oriented” is the most notable characteristic of HR data seen by senior business partners as valuable and ideal (even more so than “relevant” or “frequent”).
So talent analytics are well-recognized as an input to HR success—but what’s the output? If analytics are a means to an end, what are the near-term outcomes and long-term returns from getting talent analytics right, and how does a high-caliber analytics program translate to how companies actually manage their leadership talent?
In our Global Leadership Forecast 2014|2015, we researched 1,500 organizations ranging across the spectrum of sophistication for analytics, and gathered information on how many of these seven types of leader-focused analytics they did well:
- Gathering efficiency/reactions metrics about leadership programs—30% did it effectively
- Benchmarking leaders internally—27% did it effectively
- Gathering results metrics about leadership programs—24% did it effectively
- Using data to forecast future leadership talent needs—23% did it effectively
- Using data to design/optimize leadership talent programs—22% did it effectively
- Gathering business impact metrics about leadership programs—21% did it effectively
- Benchmarking leaders externally—13% did it effectively
Across this set of analytics techniques, only 5 percent of organizations had mastered all of them, while 47 percent were either not doing any of them well or weren’t doing them at all. Alongside the questions on analytics, we also asked HR professionals in these organizations to tell us about their talent practices and outcomes. We found five near-term talent outcomes that best differentiate the 1 in 20 companies executing all types of analytics well (Analytics Masters) from the 47 percent doing none effectively (Analytics Laggards):
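As a minimal illustrative sketch (not part of the DDI study itself), the Masters/Laggards segmentation described above amounts to counting how many of the seven practices an organization executes effectively. The practice labels and the `classify` function below are hypothetical names chosen for illustration:

```python
# Hypothetical sketch of the Masters/Laggards segmentation described above.
# Practice labels are shorthand for the seven leader-focused analytics types;
# they are illustrative, not identifiers from the DDI dataset.

PRACTICES = {
    "efficiency_metrics",       # efficiency/reactions metrics about programs
    "internal_benchmarking",    # benchmarking leaders internally
    "results_metrics",          # results metrics about leadership programs
    "forecasting_needs",        # forecasting future leadership talent needs
    "program_optimization",     # designing/optimizing talent programs with data
    "business_impact_metrics",  # business impact metrics about programs
    "external_benchmarking",    # benchmarking leaders externally
}

def classify(effective: set) -> str:
    """Bucket an organization by how many of the seven practices it does well."""
    done_well = len(effective & PRACTICES)
    if done_well == len(PRACTICES):
        return "Analytics Master"   # all seven done effectively (~5% of orgs)
    if done_well == 0:
        return "Analytics Laggard"  # none done effectively (~47% of orgs)
    return "In Between"

print(classify(PRACTICES))            # Analytics Master
print(classify(set()))                # Analytics Laggard
print(classify({"results_metrics"}))  # In Between
```

Under this segmentation, roughly half of surveyed organizations fell into the Laggard bucket, a small elite into the Master bucket, and the rest somewhere in between.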
How Analytics Masters Differ from Analytics Laggards: Near-Term Outcomes
- Leaders have high-quality development plans, which they review regularly with their managers—Talent-focused analytics drive current know-how of what leaders need to succeed, making actionable development stickier and more targeted.
- Organizations measure the effectiveness of high-potential programs—High-potentials often get (and justifiably so) an outsized time/expense investment; Analytics Masters gauge whether these programs are worth it and how to course-correct if they’re off-track.
- Organizations know the up-to-date status of leader talent—Knowing who you have and where they’re strong and weak is heavily fueled by analytics, particularly workforce planning.
- Organizations use a systematic process to identify the quantity and quality of leadership needed to drive business success—Analytics Masters gather high-quality information about what it takes for leaders to succeed, and how many leaders are needed.
- Organizations use formal programs to ensure smooth leadership transitions at all levels—Leadership transitions are extremely risky; analytics collect information about the diagnostic assessment programs used to identify and prepare leaders for new roles—are leaders reacting to these programs positively, and is data in place to prove their impact?
The practices above are leading indicators of analytics success—looking further out, at talent outcomes whose effects take longer to observe, which of these do Analytics Masters achieve? In our research, four outcomes rose above the rest as key long-term differentiators:
How Analytics Masters Differ from Analytics Laggards: Long-Term Outcomes
- Stronger bench strength for next three years—Analytics Masters use analytics to understand and reduce talent risk, and when looking three years forward, have a much stronger set of future leaders in place—on average they can fill 19 percent more critical roles immediately with internal candidates compared to Laggards.
- Higher current quality at all levels, front line to senior—Though Analytics Masters don’t outperform Laggards in current leader quality as much as in bench strength for the upcoming generation of leaders, they do have higher-quality leaders from the front line to the C-suite.
- Higher success rates for high-potential and expatriate leaders—Analytics Masters are more disciplined in their management of costly high-potential and expatriate talent, and their success rates for these roles are 15 percent higher than Analytics Laggards’.
- “Stickier” leadership development—Analytics Master companies build better personal development plans for their leaders and know more precisely which characteristics drive success—helping leaders apply learning back on the job at a much higher rate.
For the few organizations able to reach Analytics Mastery, deep proficiency in these methods comes at a high price, but generates a healthy payoff. These benefits are rooted in accurate talent audits; well-aligned leader development plans and programs; ongoing, systematic measurement of program effects and impact; and ultimately, a stronger current and future roster of leadership talent. In contrast, Analytics Laggards’ shortfalls leave them not only lacking in these same areas, but also in a lengthy state of data-blind ignorance about just how far behind they are.
For more information about the Global Leadership Forecast 2014|2015 research, including 25 highly actionable findings about the current state of leadership, an evidence-based roadmap for leadership development, a scoreboard of 20 common talent management practices, and global benchmarks for 11 metrics about leadership talent, see http://www.ddiworld.com/glf2014.