How to Develop Your Learning Analytics Maturity

Measurement, data, and benchmarking naturally go hand in hand. Many in L&D will instantly see how this statement applies to learner or program performance. But take a step back and you’ll realize the same principles apply when assessing your organization’s learning analytics capability.

This blog post explores the results of recent survey questions on learning analytics maturity and capability. It also suggests strategies for developing learning analytics maturity in your organization.

The state of learning analytics: Our approach & methodology

This blog series explores the State of Learning Analytics, touches on a range of topics—including levels of maturity—and offers recommendations for advancing your own learning analytics journey.

We’ve combined the results from two recent surveys, which provide insights from more than 1,000 L&D professionals on applying learning analytics in their organizations.

  1. Measuring the Business Impact of Learning. Since 2016, Watershed has partnered with LEO Learning to conduct the annual Measuring the Business Impact of Learning (MBIL) survey.
  2. Adopting Learning Analytics: Closing the C-Suite/L&D Language Gap. Watershed teamed up with Chief Learning Officer magazine to survey its readers on topics relating to L&D and learning analytics.

What are the levels of learning analytics maturity?

Before we dive into our findings, it’s important to understand the different levels of learning analytics. We’ve defined our own learning analytics maturity model, which sets out four sequential levels.

  1. Basic Measurement. Simply collecting data and tracking metrics. No questions are asked of the data at this level. Example: 4,000 people completed the course.
  2. Data Evaluation. Beginning to evaluate the data and understand what’s happening. This level considers whether what’s happened is good or bad. Example: More people completed this course than expected.
  3. Advanced Evaluation. Looking for relationships within the data and investigating why things happened. This level examines the reasons behind the data. Example: More people completed this course because it was featured on the front page.
  4. Predictive & Prescriptive Analytics. Using data to predict and prescribe what will happen. This level builds on the previous three levels and uses an understanding of how positive outcomes have previously been achieved to attain positive results in the future. Example: If we promote important courses on our front page, we will see at least four times as many completions as if we don’t.

Note: While the examples above used course completions for simplicity, these analytics maturity levels can and should be applied to metrics around performance improvement and business impact tailored to your organization. Explore more on this topic with our webinar recording, “The State of Learning Analytics: Views from 1000 L&D Professionals.”

Why should I understand my organization’s placement on the learning analytics maturity scale?

Simply put, understanding where you are helps you benchmark your current state against where you want to be.

Placing yourself on the scale helps you:

  • Set goals to progress your maturity.
  • Provide the benchmarks to define what success looks like in your organization.
  • Have data-backed conversations to expand your learning analytics budget and goals.
  • Identify existing strengths and shortfalls in your current setup.
  • Progress along the maturity scale, which ultimately helps you measure the business impact of learning.

Most organizations are still developing their learning analytics maturity

When CLO survey respondents were asked to place their organizations on this scale, 81% put themselves in the bottom half of the scale:

  • none (8%)
  • basic measurement (37%)
  • data evaluation (36%)

Only 19% put themselves in the top half of the scale:

  • advanced evaluation (12%)
  • predictive and prescriptive analytics (7%)

CLO respondents were also asked to select which types of learning analytics their organizations use:

  • 59% use learning experience analytics.
  • 55% use learning program analytics.
  • 44% use learner analytics.
  • 15% use none of the above.

With most organizations only looking at what’s happening or whether what’s happening is good or bad, how can L&D teams progress their analytics to ask why things are happening and start using data to make decisions and influence what happens?

How to mature your learning analytics capability

First, L&D teams need a clear learning analytics strategy that starts during the design of the learning program rather than being left until after the program launch.

The learning program’s design needs to address a particular business problem. It should map out the chain of evidence from completion of the learning program through performance improvement to business results.

Tracking metrics relating to the links in that chain helps you understand what is and isn’t happening, and clearly defined targets for those metrics tell you whether that’s good or bad.

Comparing the metrics for each link in the chain reveals whether or not the program has been successful and where issues occurred. These insights then inform your next steps to address those issues and achieve the business goal.
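
To make this concrete, here’s a minimal sketch of how you might compare each link in the chain against its target to see where a program breaks down. All metric names, observed values, and targets are hypothetical placeholders, not figures from the surveys.

```python
# Compare chain-of-evidence metrics against their targets to locate
# where a learning program's results break down.
# All metric names and numbers are hypothetical placeholders.

chain_of_evidence = [
    # (link in the chain, observed value, target)
    ("course_completion_rate",  0.82, 0.80),
    ("assessment_pass_rate",    0.71, 0.75),
    ("performance_improvement", 0.04, 0.10),
    ("business_kpi_change",     0.01, 0.05),
]

for metric, observed, target in chain_of_evidence:
    status = "on track" if observed >= target else "investigate"
    print(f"{metric}: {observed:.0%} vs. target {target:.0%} -> {status}")
```

In this hypothetical run, completions hit their target but every downstream link falls short, which points you at the transfer from learning to performance rather than at the course itself.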

Three key ingredients underpin your strategy and work together to deliver it:

  1. People. It’s vital to have a learning analytics team to implement the strategy. This might be a dedicated team, or people in other L&D roles who are passionate about learning analytics and are given time for analytics work.
  2. Technology. Technology supports your learning analytics strategy by enabling and automating the collection and analysis of data. In particular, a learning analytics platform can aggregate all your learning data and drive dashboards and reports with the latest data—which can save a lot of time compared to crunching the data manually in a spreadsheet. And you can use this saved time for more advanced analytics. (See the sketch after this list.)
  3. Budget. Investing in people and technology for learning analytics requires a budget. This means developing a business case to convince budget holders of the value of learning analytics. (Check out our Building a Business Case for Learning Analytics series for how to get started.)
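
To illustrate the technology point above, here’s a rough sketch of the kind of aggregation a learning analytics platform automates. The inline sample data and column names are assumptions made for the example; in practice, the records would come from your LMS, assessment tools, or xAPI statements.

```python
# Sketch: pull learning records from separate systems into one table
# and compute headline figures for a dashboard tile.
# Sample data is inline and hypothetical; real records would come from
# LMS exports, assessment tools, or xAPI statements.
import pandas as pd

lms_records = pd.DataFrame({
    "learner":   ["ana", "ben", "caro"],
    "course":    ["onboarding"] * 3,
    "completed": [True, True, False],
})
assessment_records = pd.DataFrame({
    "learner": ["ana", "ben"],
    "score":   [0.90, 0.72],
})

# Join the sources so completions and scores sit in one view.
merged = lms_records.merge(assessment_records, on="learner", how="left")
print({
    "completion_rate": merged["completed"].mean(),
    "average_score":   merged["score"].mean(),  # pandas skips missing scores
})
```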

How do organizations measure learning program effectiveness?

When asked how they measured the effectiveness of their learning programs:

  • 30% of respondents said they used the Kirkpatrick learning evaluation model.
  • 17% said they used happy sheets (simple post-course satisfaction surveys).
  • 14% said they used net promoter score.
  • 19% selected “Other” (most respondents said they used a mix of approaches).
  • 18% said they do not measure at all.

Combine the organizations using happy sheets (17%) with those using net promoter score (14%), and 31% of organizations rely on learner satisfaction metrics, a similar share to the 30% using the Kirkpatrick model.

How to align learning evaluation metrics to your goals and strategy

When choosing evaluation metrics, it is essential to align them with your goals and strategy. 

Let's put it another way. You need to understand the purpose of the learning programs you're measuring and ensure you have metrics in place to measure your success in fulfilling that purpose—plus any steps in the chain of evidence toward achieving that purpose.

For example, if a learning program aims to improve employee retention by providing development opportunities people enjoy, a combination of learner satisfaction and retention metrics would be appropriate.

Alternatively, let's say a learning program is meant to drive improvements in employee performance and business outcomes. Your metrics should relate to the desired business outcomes and performance goals, in addition to metrics around the following (see the sketch after this list):

  • the completion of the program, which is designed to improve performance, and
  • assessment results measuring the competence of learners following the training.
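
One lightweight way to keep that alignment explicit is to record the goal-to-metric mapping somewhere inspectable, as in this sketch. The goals and metric names are illustrative examples, not a prescribed taxonomy.

```python
# Illustrative goal-to-metric mapping so each program is evaluated
# against its own purpose. Names are hypothetical examples.

metrics_by_goal = {
    "improve_retention": [
        "learner_satisfaction",     # do people enjoy the opportunities?
        "employee_retention_rate",  # are they staying as a result?
    ],
    "improve_performance": [
        "course_completion_rate",   # did people finish the program?
        "assessment_pass_rate",     # are they competent after training?
        "performance_kpi",          # did on-the-job performance improve?
        "business_outcome_kpi",     # did the business result follow?
    ],
}

def metrics_for(goal: str) -> list[str]:
    """Look up the evaluation metrics tied to a program's stated goal."""
    return metrics_by_goal.get(goal, [])

print(metrics_for("improve_performance"))
```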

The business uses data—so why doesn’t L&D?

Most MBIL survey respondents (79%) agreed that data significantly impacts their organization—including 43% of respondents who strongly agreed. Only 8% disagreed, including only 3% who disagreed strongly.

The strength of agreement is even greater among organizations that set aside a budget for learning analytics: 93% of those organizations agreed—including 61% who agreed strongly—and only 2% disagreed, none of them strongly.

Organizations recognize the impact of data, as it informs business decisions across many departments and operational units. Yet too often, L&D lags behind.

Chief Learning Officer’s report “Closing the C-Suite/L&D Language Gap” describes a communications gap between upper management and HR/L&D due to a lack of hard learning metrics.

The report proposes adopting quantitative metrics and taking advantage of the current shift to remote working (and the increased importance of both learning and data during that transition) to win the C-Suite’s support for learning analytics.

That’s not to say that everybody outside of L&D is now data literate.

In fact, many of our clients say that a critical part of their role in learning analytics is helping stakeholders understand and use learning data. This task requires more than simply presenting charts and figures to stakeholders.

Instead, it requires engaging with stakeholders to understand their business challenges and questions, deploying data and analytics to answer their questions, and then coaching them to use the results of that analysis to inform their decisions and make improvements.

Up Next: Do you need a learning analytics team?

People are key in implementing your learning analytics strategy, but do you have the right team in place? Our next post examines survey results around learning analytics teams.

We'll ask whether organizations have dedicated teams and which skill sets are recommended. We'll also explore whether organizations plan to change their staffing in the near future. (We'll cover related technology and budget in subsequent posts.)

