Learning analytics can seem like a bit of a dark art, a mystery. You’ve read the theory, but how do they actually work in practice? It’s this question that inspired us to conduct our Learning Analytics Research Study so we could better understand how our clients are implementing these analytics in practice. This blog post recaps our findings and shares some actionable insights to help you get started.
How Mature Is Your Learning Analytics Program?
We aimed our research efforts at placing these client organizations on the learning analytics triangle as a measure of maturity and at creating an overall picture of how real people are doing real learning analytics. As a result, this blog series is meant for everyone—whether you are beginning your learning analytics strategy, are well underway, or are somewhere in between.
Table of Contents
We've listed each post chronologically to help guide you through the series.
- Corporate Learning Analytics in Practice [Introduction]
- Learning Analytics Analytics
- Learning Analytics Categories
- Learning Analytics Dimensions: Learner Analysis
- Learning Analytics Dimensions: Learning Program Analysis
- Learning Analytics Dimensions: Learning Experience Analysis
- Complexities & Analysis Types
- Data Evaluation Complexity
- Advanced Evaluation Complexity
- Predictive & Prescriptive Complexity
- Experience Types
- Report Types
- Recommended Resources
Actionable Insights & Key Takeaways
Along the way, we discovered some key findings about how clients use these analytics not only to report on and share information about their learning programs and initiatives, but also to show L&D’s impact across the organization.
So, whether you’re just getting started or have a more advanced learning analytics strategy in place, here are a few important things to keep in mind.
- You have options. There are more than 100 ways to slice learning analytics, so find at least one new approach that can add value to your organization. (Tip: Don’t forget to take inspiration from others.)
- Build the foundation first. Start with the basics that people expect from reporting (e.g., completions), but don’t stop there. Even within data evaluation, there’s a lot you can do beyond completion and utilization, yet few organizations are doing so.
In fact, most organizations are still at the data evaluation level of complexity, though some are starting to trailblaze into advanced evaluation. So, consider how you can gather additional data about engagement and other areas of analysis for deeper insights. (Tip: Check out this worksheet as a starting point.)
- Use the right reports and visualizations. For example, reporting on data about learning programs lends itself to specialized learning program reports (e.g., Watershed’s program report) rather than more generic visualizations (e.g., pie or bar charts).
- One size does not fit all. There’s a distinction between how the L&D team uses reports and how managers use them. In other words, know your audience and design L&D reports and dashboards for different stakeholders.
For instance, learner reports tend to be used less than the other categories, so they may not be the best place to start your learning analytics journey. Start with learning program reports for managers, or learning experience reports for the L&D team and content creators.
- Location, location, location. Try using reports that compare learners across multiple locations. These reports may get more traffic than you think, as they can help viewers see how well learners in certain locations are performing or how active learners are by location.
- Don’t forget about seasonal patterns. Report usage levels vary depending on the time of the year—such as key dates, deadlines, or year end—so expect numbers to fluctuate.
- Don’t let reports go stale. Keep them up to date to ensure stakeholders remain engaged.
- Learning happens everywhere. Chances are, there’s untracked learning happening somewhere in your organization. If you don’t think you can track it, think again. Remember, we’ve seen organizations tracking all kinds of learning experiences—so, if they can track them, you can too! (See the sketch after this list for what that tracking can look like.)
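To make that last point concrete, here’s a minimal sketch of how an informal learning experience can be recorded as an xAPI statement sent to a learning record store (LRS). The endpoint, credentials, verb choice, and activity details below are placeholders for illustration, not a reference to any specific system.

```python
# Minimal sketch: recording an informal learning experience as an xAPI statement.
# The LRS endpoint, credentials, and activity details are hypothetical placeholders.
import requests

LRS_ENDPOINT = "https://lrs.example.com/xapi/statements"  # hypothetical LRS URL
LRS_AUTH = ("lrs_username", "lrs_password")               # hypothetical credentials

statement = {
    "actor": {
        "name": "Sample Learner",
        "mbox": "mailto:sample.learner@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/experienced",
        "display": {"en-US": "experienced"},
    },
    "object": {
        "id": "https://example.com/activities/mentoring-session",
        "definition": {"name": {"en-US": "Monthly mentoring session"}},
    },
}

response = requests.post(
    LRS_ENDPOINT,
    json=statement,
    auth=LRS_AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},  # version header required by the xAPI spec
)
response.raise_for_status()  # the LRS returns the stored statement ID(s) on success
```

Because an xAPI statement is just actor–verb–object JSON, the same pattern applies whether you’re capturing mentoring sessions, videos, on-the-job tasks, or any other experience you want to bring into your learning analytics.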
About the author
As a co-author of xAPI, Andrew has been instrumental in revolutionizing the way we approach data-driven learning design. With his extensive background in instructional design and development, he’s an expert in crafting engaging learning experiences and a master at building robust learning platforms in both corporate and academic environments. Andrew’s journey began with a simple belief: learning should be meaningful, measurable, and, most importantly, enjoyable. This belief has led him to work with some of the industry’s most innovative organizations and thought leaders, helping them unlock the true potential of their learning strategies. Andrew has also shared his insights at conferences and workshops across the globe, empowering others to harness the power of data in their own learning initiatives.