Many organizations have vast, evolving libraries of learning content. But how do you know if it’s working effectively? Without proper systems in place, learners are often left to sift through content, hoping to find what they need. At the same time, content creators must often guess about the content’s relevance and impact. And that’s where learning content analytics comes into play.
This post explores how to use L&D data to improve training content, outlines the business case for using analytics to improve your offerings, and informs instructional designers about what works (and what doesn’t).
If you’re new to our series on Building a Business Case for Learning Analytics, be sure to read the introduction—which provides an overview and recommendations for making the most of this series.
What is learning content analytics?
Learning content analytics focuses on the learning content itself. Specifically, it:
- explores content usage and impact, and
- compares content demand with content provided.
The purpose of content analytics is to make the content better. This is achieved by:
- Informing improvements to existing content. Learning content analytics gives you insight into what is and is not working with existing content.
- Identifying content to remove. Learning content analytics gives you insight into what content is not being used or is not having a positive impact.
- Informing design, creation, and curation of new content. Learning content analytics helps inform instructional designers, content curators, and others about what good content looks like—enabling them to produce better content.
- Identifying content gaps. Knowing what content you have and how it is used can also help you identify content gaps. An understanding of your desired learner outcomes, combined with content search analysis (see image below) can help spot content that’s missing.
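The content-gap idea above can be sketched in a few lines: compare what learners search for against what the catalog actually covers, and flag high-demand topics with no match. This is a minimal illustration with made-up sample data; the query log fields, topic names, and the demand threshold are all assumptions, not Watershed's actual data model.

```python
from collections import Counter

# Hypothetical sample data: search queries logged by the learning platform
# and the topics covered by the current content catalog.
search_queries = [
    "gdpr basics", "gdpr basics", "negotiation skills",
    "excel pivot tables", "negotiation skills", "gdpr basics",
]
catalog_topics = {"excel pivot tables", "time management"}

# Count demand per topic, then flag repeatedly searched topics
# with no matching content -- candidate content gaps.
demand = Counter(search_queries)
gaps = {topic: count for topic, count in demand.items()
        if topic not in catalog_topics and count >= 2}

print(gaps)  # topics people search for but cannot find
```

In practice you would normalize queries (stemming, synonyms) before matching them to catalog topics, but the core comparison of demand against supply is the same.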
Learning content analytics is primarily geared toward those who create and curate content, including internal content teams and other reporting stakeholders. It can also inform the feedback you give to vendors and let you support that feedback with hard data.
These example Watershed reports show how to use learning content analytics to see what people are searching for and the content they are accessing. Another form of training content analytics is monitoring specific courses, like the Leadership Training course, as shown in the right-hand report.
How can learning analytics help improve training content?
Travelers Insurance has a team of instructional designers that creates elearning content for their staff—including insurance policy knowledge training for salespeople. For example, underwriters must know which exposures are insured by a given policy because this can impact risk, customer satisfaction, and financial results.
Using content analytics in Watershed, the Travelers team identified that while 90% of learners passed the test for a particular policy training, most of the learners incorrectly answered a question about the coverage of a specific exposure. This insight meant the team could then address this knowledge gap with updated training content.
This example question visualization from Watershed’s Activity report shows that while 28% of learners got this question correct, 53% incorrectly thought no action was needed, and 14% incorrectly guessed that a coverage B endorsement was required. This level of granular reporting helps you understand the specifics of learner misunderstandings so you can address them.
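The kind of question-level breakdown described above boils down to tallying how learners answered a single question. Here is a minimal sketch with invented response data; the answer labels and the idea of exporting responses from a Learning Record Store are assumptions for illustration, not the Travelers data itself.

```python
from collections import Counter

# Hypothetical responses to one quiz question, e.g. exported from an
# xAPI Learning Record Store (answer labels are illustrative).
responses = [
    "endorsement_a", "no_action", "no_action", "endorsement_b",
    "no_action", "endorsement_a", "no_action", "endorsement_a",
]
correct_answer = "endorsement_a"

# Tally each distinct answer and report its share of all responses,
# marking which option was correct.
counts = Counter(responses)
total = len(responses)
for answer, n in counts.most_common():
    marker = "correct" if answer == correct_answer else "incorrect"
    print(f"{answer}: {n / total:.0%} ({marker})")
```

A distribution like this makes misconception patterns obvious: if one wrong answer dominates, the content likely teaches (or fails to correct) that specific misunderstanding.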
Alfonso Riley, L&D Strategist at Caterpillar, used content analytics to identify popular training videos on CAT’s Kaltura video platform. Comparing several months of data, he discovered that about five or six videos had significantly more views than other videos among the hundreds of videos on the platform.
By looking at these videos in more detail, he explored the reasons for their popularity. Then, he used these insights to inform his strategy for releasing and promoting new video content.
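Finding the handful of videos with outsized viewership, as in the Caterpillar example, is essentially an outlier check on view counts. This sketch uses invented video IDs and an arbitrary threshold (more than twice the median); it is one simple way to surface the pattern, not how Caterpillar or Watershed actually computes it.

```python
import statistics

# Hypothetical view counts per video (IDs are illustrative).
views = {
    "v101": 4820, "v102": 3975, "v103": 210, "v104": 185,
    "v105": 3410, "v106": 95, "v107": 60, "v108": 2890,
}

# Flag videos whose view count is well above the median -- here,
# arbitrarily, more than twice the median -- and rank them.
median = statistics.median(views.values())
popular = sorted(
    ((vid, n) for vid, n in views.items() if n > 2 * median),
    key=lambda pair: pair[1], reverse=True,
)
print(popular)
```

Once the outliers are identified, the qualitative work begins: watching those videos and asking what they share (topic, length, presenter, promotion) that the long tail lacks.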
This demo Watershed report illustrates how a handful of learning content pieces (in this case, Degreed pathways) can experience significantly more activity than other items. This pattern is common in real data, too.
How does Watershed support learning content analytics?
Watershed offers easy-to-use dashboards that instructional designers and other learning team members can use to track how learners interact with their content. But that’s not all!
Watershed’s Report Builder feature empowers your team to dig deeper into the data to discover the reasons behind some of the insights highlighted on the main dashboard.
What questions does learning content analytics help you answer?
- Which content is being used, and which isn’t?
- What content are learners struggling with?
- How are learners interacting with the content? Are they making use of a particular feature?
- What content medium is most popular (e.g. video vs. elearning, short-form vs. long-form, etc.)?
- What topics are most popular?
The xAPI Cohort is a free project-based experience designed to teach people about xAPI. This Watershed dashboard is the output of one of their Fall 2020 projects and shows reporting on the usage of various videos by cohort members.
Making the case: Why learning content analytics is key to the business
A surprising amount of learning content is created or curated for learners without considering its usefulness or effectiveness. Even content that was once useful can fall out of date, become less relevant, or age in presentation style.
And there’s often just too much information in training content libraries to vet all of it. As a result, your organization probably has a lot of scrap learning, but you may not have any way to identify it.
At the same time, those creating and curating learning resources may not have adequate means of gathering feedback on their efforts. Without content analytics, your people are flying blind, and your learning offering won’t be as valuable as it could be. As Monica Griggs, training and development leader, puts it:
“The numbers help tell the story of whether something is useful or not. We don’t always have success the first time and learn constantly about what works with different audiences and what doesn’t.”
Watershed can help you address both of these issues and more. For example, identifying scrap learning can reduce hosting and licensing costs and save learners time by cutting the scrap they have to dig through to find the most helpful content. And empowering your content team with analytics helps them improve both their own work and the learning experience for everyone else.
How can you convince stakeholders of the value?
We've all experienced ineffective training that wasted time, and chances are your stakeholders have experienced this, too. So remind them of that wasted time, and reflect on how many hours people in your organization spend completing substandard learning or struggling to find the valuable content amongst the scrap learning.
Be courageously honest about the need for analytics in understanding what approaches to learning work best in your organization. It’s easy to present yourself as the expert who just mysteriously knows what works and what doesn’t. But if we’re honest, without good analytics and evaluation, we’re just guessing.
Real experts know when they don't have the answers, but they also know what they need to do to get them. As L&D professionals, we believe that people can improve in their work—and for us L&D professionals to improve in our own work, we need to understand when we're successful and when we're not.
Why not run a pilot to see the benefits of content analytics for yourself? Choose a single elearning course that's launching soon or is currently in design, and work with us to develop a learning analytics and evaluation strategy for that course.
Select a course that's at least an hour's worth of content to run a comparative analysis of different parts of the course. Be sure to include detailed xAPI tracking as part of the course development so you can capture the data needed.
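The "detailed xAPI tracking" mentioned above comes down to emitting xAPI statements as learners interact with the course. Here is a minimal sketch of a single "answered" statement built in Python; the learner email, activity URLs, and response value are invented for illustration. A real implementation would POST this JSON to your LRS's statements endpoint with authentication, per the xAPI specification.

```python
import json

# A minimal xAPI "answered" statement for question-level tracking.
# Identifiers below are illustrative placeholders, not real endpoints.
statement = {
    "actor": {"mbox": "mailto:learner@example.com", "name": "Example Learner"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/answered",
        "display": {"en-US": "answered"},
    },
    "object": {
        "id": "https://example.com/courses/policy-101/questions/exposure-coverage",
        "definition": {
            "type": "http://adlnet.gov/expapi/activities/cmi.interaction"
        },
    },
    "result": {"success": False, "response": "no_action"},
}

print(json.dumps(statement, indent=2))
```

Capturing statements at this granularity (per question, per interaction) is what makes the comparative, question-level reports described earlier possible.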
A pilot will help you see the kinds of insights learning content analytics can offer and provide a powerful story to sell a broader learning analytics implementation to your stakeholders.
This example shows Watershed reports on points scored, questions answered, and time to answer for a demo quiz game on the Scrimmage mobile learning platform.
Understand your stakeholders and how they will benefit from learning content analytics
Meet Your Stakeholders
Stakeholders | Pain Points | Benefits |
---|---|---|
C-Suite (CLO, CEO, CFO) | Limited understanding of the value generated by investments in learning content. | Insights into the usage and impact of learning content. |
Human Resources | Limited understanding of the value generated by investments in learning content. | Insights into the usage and impact of learning content. |
Learning Leaders | Limited understanding of what makes effective learning content. | Insights into what kinds of content are most effective. |
Instructional Designers | Getting feedback on content usage. | Feedback on how learners use their content and what makes it effective content. |
Compliance | Understanding the use and impact of compliance content. | Insights into the usage and impact of compliance content. |
Learners | Time wasted in ineffective learning content. | Good learning content analytics should lead to improvements in content. |
Up next: The business case for scrap learning analytics
This post has made a case for learning content analytics, but you might prefer to focus on other specific types of learning content analytics. These include identifying scrap learning, vendor management, enhanced platform reporting, and learning impact measurement.
Over the next few posts, we'll dig into those more specific business cases in more detail, starting with scrap learning.
About the author
As a co-author of xAPI, Andrew has been instrumental in revolutionizing the way we approach data-driven learning design. With his extensive background in instructional design and development, he’s an expert in crafting engaging learning experiences and a master at building robust learning platforms in both corporate and academic environments. Andrew’s journey began with a simple belief: learning should be meaningful, measurable, and, most importantly, enjoyable. This belief has led him to work with some of the industry’s most innovative organizations and thought leaders, helping them unlock the true potential of their learning strategies. Andrew has also shared his insights at conferences and workshops across the globe, empowering others to harness the power of data in their own learning initiatives.