Assessing and Planning Learning Programs with Analytics

It’s 2019: the war for talent is real, and the economic outlook shows slower growth and more uncertainty. Meanwhile, according to Training Magazine’s Training Impact Report 2018, there’s an uptick in learning spend for mid-sized to large organizations in the US.

Since an organization’s most expensive line item is its workforce–and L&D has the responsibility to lift the capabilities of the entire organization–it has never been more important for learning leaders to show the impact their programs have on business goals.

Unfortunately, many learning leaders are still struggling to tell their ROI story. According to the Association for Talent Development (ATD), in 2016 only 35% of the learning and development community believed their efforts were helping their businesses meet goals. Moreover, only 36% of learning professionals evaluate the business results of learning; the other 64% make no effort to measure how learning programs affect overall business goals.

I had the privilege of building and launching the first-ever leadership development program for a successful retailer. Approval to build the program came in direct response to company-wide engagement survey results showing that leadership development was in demand and employees were hungry for it.

The program had high stakes and visibility; there were hopes it would move the needle on some significant desired cultural shifts.

However, apart from the annual employee engagement survey, which included a few questions directly related to leadership performance, I was surprised to find that key stakeholders were satisfied to gauge success and impact by the “buzz” created around the office after the program launched.

It seemed evident that more data points, and an easy way to interpret and analyze the data, were needed to prove the learning program’s impact and value.

Data visualization showing breakdown of employee learning by program type

Forget about Kirkpatrick’s Model for Learning Program Evaluation

A common way of assessing program effectiveness and value is Kirkpatrick’s pervasive four levels of training evaluation. The model, created more than 50 years ago, outlines four levels of evaluation:

  • Level 1: Participant reaction
  • Level 2: Testing for new learning
  • Level 3: Assessing for changes in behavior
  • Level 4: Impact the training has had on the business

With the exception of level four, all of these are lagging indicators, meaning they capture what’s happened after the training has occurred.

Levels 1-3 are easy to measure but, because they are captured after the fact, difficult to act on.

Level 4, on the other hand, functions as a leading indicator: it is hard to measure, but the results show whether the learning program achieved its intended objectives or whether those objectives need to be adjusted. The learning team can then course-correct and modify the program’s objectives to meet the needs of the business.

Unfortunately, it’s difficult to go beyond level one: participant reaction.

This is because L&D professionals need to combine data from their learning management system (LMS), their HRIS, engagement scores, learning assessments, and feedback surveys. Once this data has been pulled from each system–a process that can take weeks, if not months, of manual work–there’s still the question of how directly learning activities correlate with company-wide goals, since other people initiatives beyond learning also need to be considered.

Pulling all this data together for analysis can seem daunting, but there are some sophisticated learning analytics solutions that seamlessly bring all the data in these systems together, enabling you to focus on answering your most critical L&D questions.
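To make the data-joining step above concrete, here is a minimal sketch in Python using pandas. Everything in it is hypothetical: the column names, the employee IDs, and the values are assumptions standing in for real LMS, HRIS, and survey exports, which would look quite different in practice (and which an analytics platform would integrate for you).

```python
import pandas as pd

# Hypothetical LMS export: learning activity per employee.
lms = pd.DataFrame({
    "employee_id": [1, 2, 3, 4],
    "courses_completed": [5, 0, 3, 8],
})

# Hypothetical HRIS export: org data and an attrition outcome.
hris = pd.DataFrame({
    "employee_id": [1, 2, 3, 4],
    "department": ["Sales", "Sales", "Ops", "Ops"],
    "resigned": [False, True, False, False],
})

# Hypothetical engagement survey results.
survey = pd.DataFrame({
    "employee_id": [1, 2, 3, 4],
    "engagement_score": [8, 4, 7, 9],
})

# Join the three sources on a shared employee ID so learning activity,
# HR outcomes, and engagement can be analyzed side by side.
combined = lms.merge(hris, on="employee_id").merge(survey, on="employee_id")

print(combined)
```

The hard part in real life is not the merge itself but getting a clean, shared employee identifier out of each system, which is exactly the weeks-of-manual-work problem described above.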

Visier Learning data visualization showing how learning and development impacts resignation rates

Assess ROI by Going Beyond Learning

In order for the learning function to be a trusted strategic partner, we must show broader employee development and its correlation to business impact over time. To do this, answer questions that reach just beyond L&D, such as:

  1. Does learning impact retention?
  2. What are our future certification requirements? Can we forecast training needs? Do we have compliance risks?
  3. When will new hires be fully ramped?
  4. Does learning impact promotions?
  5. Do training programs drive productivity?

These are the leading indicators that help draw a direct correlation to how learning is impacting the business going forward–not looking back at evaluations on training events that have already happened.

Answering these questions goes beyond Kirkpatrick’s model, providing a broader picture of employee development over time, as well as valuable insight for forecasting and planning to ensure targets are met and the business is not at risk.
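A first-pass analysis for question 1 ("Does learning impact retention?") can be sketched in a few lines of Python with pandas. The data below is purely illustrative, the column names are assumptions, and a real analysis would need to control for confounders such as tenure, department, and other people initiatives.

```python
import pandas as pd

# Illustrative data only: whether each employee completed any training in
# the last year, and whether they later resigned.
df = pd.DataFrame({
    "completed_training": [True, True, True, False, False, False, True, False],
    "resigned":           [False, False, True, True, True, False, False, True],
})

# Resignation rate by training participation: averaging the boolean
# "resigned" column within each group gives the share who resigned.
rates = df.groupby("completed_training")["resigned"].mean()
print(rates)
```

In this toy data, 25% of trained employees resigned versus 75% of untrained ones. A gap like that suggests, but does not prove, that learning supports retention; correlation alone cannot establish the causal story.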

The Way Forward

Is your desire to be a trusted strategic partner with a mission to help people be the best they can be–while aligning to business goals?

If you’re relying on the “buzz” around the office, or a stack of completed feedback forms, to show how impactful your initiatives are, history–and decades of your peers’ experience–proves that the status quo is not going to lift your role to a strategic level.

As we continue to slide into an economic downturn and L&D spend remains significant, demonstrating learning programs’ impact to business leaders will not be optional–it will be the cornerstone of a true L&D strategy.
