Our data requirements blog series explores the data you might collect from the various tools in your learning ecosystem, how you might get that data out, and what questions it might answer. In this post, we’ll look at one of the most critical platforms in your learning ecosystem—the learning management system (LMS).
Whether you’re looking to extract data from an existing LMS, or need help choosing a new one that’s easy to integrate with your learning analytics platform (LAP), this post will help you ask the right questions of yourself and of your vendors to get the best result.
What’s an LMS and how does it fit in a learning ecosystem?
As the name suggests, an LMS (or Learning Management System) is a system that manages all kinds of learning. LMSs vary in feature set, but most will provide hosting, launching, and tracking of online training as well as management and assignment of both online and classroom training.
Typically, the LMS is the first platform in an organization’s learning ecosystem, and most LMS platforms can do a little of everything: hosting and launching content, playing videos, and managing in-person training, to name a few.
The LMS is normally particularly focused on formal and compliance training, and may be augmented with a learning experience platform (LXP) that serves informal and self-directed learning.
Other platforms that add improved or additional capabilities—such as a video platform, survey tool, or learning analytics platform—may also be added and integrated depending on the organization’s needs.
Organizations will typically add a learning analytics platform to their ecosystem to provide best-of-breed L&D reporting and analysis of data not available in LMS reports, alongside data from other learning platforms beyond the LMS.
LMS data requirements and what to ask your vendor
What data should I look to pull from my LMS that it doesn’t already provide via reporting?

The kinds of data organizations typically pull from an LMS are:
- Learning data generated by xAPI eLearning courses
- Learning data generated by non-xAPI (typically SCORM based) eLearning courses
- LMS course assignments, completions, and results
- In-person training registration and attendance
- Search terms, if the LMS has search functionality
- Interactions with other LMS features, such as discussion forums or recommendations
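To make the first two categories concrete, here is a minimal sketch of what a single xAPI completion record pulled from an LMS might look like. All names, email addresses, and URLs below are hypothetical examples, not real LMS output; only the verb URI is a standard ADL xAPI verb.

```python
import json

# A minimal, illustrative xAPI statement for a course completion.
# Actor, object, and score values are hypothetical.
statement = {
    "actor": {
        "mbox": "mailto:learner@example.com",  # common learner identifier
        "name": "Example Learner",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",  # standard ADL verb
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://lms.example.com/courses/fire-safety-101",
        "definition": {"name": {"en-US": "Fire Safety 101"}},
    },
    "result": {"success": True, "score": {"scaled": 0.85}},
}

print(json.dumps(statement, indent=2))
```

Statements like this one are sent to a learning record store (LRS), where the learning analytics platform can query and report on them.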
Some organizations may also pull non-LMS data from the LMS, where that data is easier to get from the LMS than the source system due to technical or organizational challenges.
For example, people and organizational hierarchy data will normally originate from an HRIS, but if there are challenges in getting the data from the HRIS, you may then choose to pull that data indirectly via the LMS.
What does the data tell you, or what kind of things can you learn from it?

Because LMS data can be varied, it can be used to support almost any kind of learning analytics for e-learning. Here are examples from each category of learning analytics:
- Learner Analytics: Displaying a complete learner transcript across LMS and non-LMS platforms to empower learners and their managers to build a comprehensive picture of all their learning.
- Learning Experience Analytics: Identifying scrap or problem learning content in the LMS by looking for content that is not used, has a low success rate, or has an unexpectedly high or low average time taken. This learning content can then either be removed if it is scrap learning, or improved if there are specific, identifiable issues. (And sometimes the insights you gain from learning experience analytics can be surprising.)
- Learning Program Analytics: Monitoring completion of a program of learning in the LMS—or including elements within and outside the LMS (e.g., for compliance tracking).
When purchasing an LMS to integrate with a learning analytics platform, there are a few questions to ask for each piece of data you want to include:
- Does the data exist in the LMS? You need to have data if you want to extract it. So if the LMS is not capturing the data in the first place, or not saving that data once it’s captured, then it will be impossible to extract any data.
- Red Flag: For many data points, some LMSs capture data about current statuses and, as a result, may overwrite historical data with new data. For example, if a learner fails a course and then passes on a second attempt, the LMS may overwrite the first attempt with the second, so you have no record of the first attempt. If you want to track all attempts, this will be an issue.
- Is it possible to get the data out? It’s no use having data that you can’t access. You need to be able to get the data out of the LMS and into your learning analytics platform. The gold standard is having the LMS send the data to the LAP via xAPI, but if data is available as a CSV export, conversion tools—such as Watershed’s Data Conversion Engine—can convert that data to xAPI instead.
- Red Flag: For anything other than a pilot project, you’ll want to automate the export of data from the LMS into the LAP. If the only way to get data out of the LMS is a manual file download, that’s an extra admin task for you—which is going to get real tedious, real fast.
- Is the data accurate? Inaccurate data is not only misleading, but also can undermine confidence in your analytics. It’s therefore important to establish that the source data is accurate. This can be hard to establish during the procurement process, but talking to existing clients or testing with a pilot can help assure that the LMS is generating reliable data.
- Red Flag: Ensure you and the vendor have the same understanding of particular data points. Specifically, the definition of “completion” can mean anything from simply opening a course to passing the assessment. Make sure the completion data available from the LMS matches your expectations.
- Red Flag: Just because an LMS implements xAPI tracking does not mean that the xAPI data is good data. It’s good practice to check a sample of xAPI data generated by the LMS and test it with your LAP’s reports. (See our xAPI governance guide for more details.)
- Is the data complete? It’s important to be clear with the LMS vendor on exactly what data is available and if there are any exceptions. You don’t want to get as far as reporting on the data before you realize that some important data points are missing.
- Red Flag: For e-learning course data exported via xAPI, make sure that you’re getting data from non-xAPI courses, too. Some LMSs (including those that integrate Rustici Engine by Rustici Software) can automatically convert SCORM data to xAPI data. Of course, the LMS can only pass on data that the e-learning course generates, so some data limitations are beyond the LMS’s control.
Recommended Resource: Confused by the differences between SCORM and xAPI? Check out "Between Two eLearning Standards: SCORM vs. xAPI," and watch as our friends from Rustici Software discuss the benefits of these two standards.
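To illustrate the CSV-to-xAPI route mentioned above, here is a rough sketch of the kind of conversion such a tool performs. This is not Watershed’s actual Data Conversion Engine; the export columns, verb choices, and URLs are all assumed for the example.

```python
import csv
import io

# A hypothetical LMS CSV export; real exports will have different columns.
CSV_EXPORT = """email,course_id,course_name,status,score
pat@example.com,COMP-101,Compliance Basics,passed,92
sam@example.com,COMP-101,Compliance Basics,failed,55
"""

def row_to_statement(row):
    """Convert one LMS export row into a minimal xAPI statement (sketch)."""
    verb = "passed" if row["status"] == "passed" else "failed"
    return {
        "actor": {"mbox": f"mailto:{row['email']}"},
        "verb": {
            "id": f"http://adlnet.gov/expapi/verbs/{verb}",
            "display": {"en-US": verb},
        },
        "object": {
            "id": f"https://lms.example.com/courses/{row['course_id']}",
            "definition": {"name": {"en-US": row["course_name"]}},
        },
        "result": {"score": {"raw": float(row["score"])}},
    }

statements = [row_to_statement(r) for r in csv.DictReader(io.StringIO(CSV_EXPORT))]
```

In practice, a scheduled job would fetch the export, run a conversion like this, and post the resulting statements to the LRS—which is why automating the export matters.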
If you already have an LMS, the questions to ask are the same. You’re limited to extracting data that exists and can be extracted, and it’s important to identify any issues in terms of accuracy and completeness early on.
Where data is inaccurate or incomplete, it may be possible to pull the data from elsewhere instead. For example, if the LMS links out to a video platform, data about video views and interactions may be available from that platform in lieu of the LMS.
A good xAPI implementation is still ideal, but if one isn’t available, being able to extract a CSV via API is a good plan B.
The LMS and your learning ecosystem
How do I map LMS data with the other systems in my ecosystem to the LRS?

Mapping data from multiple systems requires a common learner identifier (e.g., email address or employee ID number) used across systems. Ideally, the LMS will use that same common learner identifier in its learner data.
However, if the LMS uses an internal identifier, a mapping of learner identifiers in the learning analytics platform is required so records can be linked to the respective learners.
You can use Watershed’s people and groups import functionality to provide this mapping so learner data across multiple systems can be linked—even if different systems use different identifiers.
This requires the LMS, or another system, to provide a data export mapping the various learner IDs together.
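The mapping step can be sketched as follows. This is an illustration of the general idea, not Watershed’s import functionality; the IDs, record fields, and systems are hypothetical.

```python
# Hypothetical export mapping internal LMS IDs to the common identifier (email).
ID_MAP = {
    "lms-0042": "pat@example.com",
    "lms-0043": "sam@example.com",
}

# Records from two systems that use different learner identifiers.
lms_records = [
    {"learner": "lms-0042", "course": "Compliance Basics", "status": "completed"},
]
lxp_records = [
    {"learner": "pat@example.com", "resource": "Sales techniques video", "viewed": True},
]

def normalize(record, id_field="learner"):
    """Replace an internal ID with the common identifier, if a mapping exists."""
    raw = record[id_field]
    return {**record, id_field: ID_MAP.get(raw, raw)}

# After normalization, both records refer to the same learner and can be linked.
unified = [normalize(r) for r in lms_records] + lxp_records
```

Once every record carries the same identifier, the learning analytics platform can group activity per learner across systems.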
If you have multiple LMSs, there can also be benefits of comparing the data sets from both. For example, one Watershed client uses one LMS for mandatory training and another LMS for optional training.
By exploring these two data sets, they found that salespeople who were assigned more mandatory training in one LMS tended to spend less time completing the optional training in the other LMS.
This makes sense; if salespeople spend their time completing mandatory training, they have less time available for optional training. What’s interesting, though, is that they also found that those salespeople who completed more of the optional training actually had better sales figures.
The optional training was directly beneficial to their job performance, while the mandatory training, by reducing the learners’ time available for the optional training, was actually indirectly harmful to the salespeople's performance.
Typically, outside of small and focused pilots, the LMS is the first data source organizations want to integrate with their learning analytics platform because it contains so much useful learning data—being the platform where the majority of the organization’s learning takes place. This data is brought into a learning analytics platform for:
- more flexible and detailed reports than the LMS can provide internally, and
- reporting that combines LMS data with data from other platforms.
Combining LMS data with data from other systems gives you a complete view of the learner’s activity. For instance, LMS data about course views and completions can be compared with search data from the LXP to identify when people are searching for a topic, but not finding LMS content for that topic.
This enables you to rapidly identify and address emerging learning needs as people start to search for content and resources.
Additionally, LXP and LMS data relating to completions can be compared to ensure that when somebody ticks the box in the LXP to say they completed a course, the LMS data confirms this is the case.
What about future changes to the ecosystem—how will adding, replacing, or removing certain tools affect LMS data, or will it?

Typically, adding a new platform to the ecosystem should result in more-detailed data being brought into the LAP. This is because while the LMS is a general platform that does a little of everything, other platforms tend to be more specialized and should, therefore, have more-detailed data about their particular areas of speciality.
For example, an LMS might track when videos are played, but a specialized video platform is more likely to track in-depth details, such as where videos are paused or skipped.
Bringing LMS data into your learning analytics platform can also make replacing your LMS a less painful experience. For instance, Visa’s learning ecosystem presents learners with an LXP (on the front end), and uses Watershed for reporting, with content served by the LMS, on the back end.
This has enabled Visa to switch out their LMS without either their learners or their report users being affected.
The role of the LMS in skills and compliance
Is an LMS a good system of record for compliance reporting? If not, what is?

In some cases, an LMS that specializes in compliance training is absolutely the best system of record for compliance reporting. There are, however, three scenarios in which an LAP may be preferable:
- You’re already using an LAP for other reporting and want all your reporting in one place. In this case, the LAP can also be used for compliance reporting.
- Your compliance training includes elements of training outside of the LMS. For example, in a medical context, compliance training might include real-world simulations—the details of which are captured using a checklist application, such as Xapify. In this case, the LAP can bring together all of your compliance data in one place.
- You’re required to show that the training had an impact, and not just that it was completed. Increasingly, compliance regulators are asking to see evidence of training effectiveness, not just completion. An LAP can help to bring data from business sources together with learning data to measure the effectiveness of the training.
LMSs sometimes contain data relating to a person’s skill set. This data may include some or all of:
- Learning completed, thus indicating they may have the skills the learning was intended to provide.
- Self-declared skills (e.g., in a personal profile). In some cases, these may be verified by others.
- Formal qualifications, such as a university degree or professional membership, suggesting a certain level of skill in a domain.
- Badges or other digital credentials awarded for completing specific learning or proving competency in a particular area.
An LAP can aggregate this data together alongside skills data from other platforms in your ecosystem. This means you can report on which skills a person has or which people have certain skills, based on data from across your learning ecosystem.
For example, when looking to fill a vacancy requiring certain skills, the LAP could be used to produce a report of potentially suitable internal candidates.
The LAP can also be used to provide high-level insights into the prevalence (or absence) of particular key skills in your organization, informing your decisions about which skills require additional training provision.
Up Next: Learning Experience Platform Data Requirements
The LMS is a central platform in most learning ecosystems and often one of the first that organizations report on. But often LMS data does not paint the full picture, and data from other platforms is required to join the dots. In the next post, we’ll look at the learning experience platform (LXP)—which contains valuable data that’s not only about what learners are learning, but also about what they want to learn.
About the author
As a co-author of xAPI, Andrew has been instrumental in revolutionizing the way we approach data-driven learning design. With his extensive background in instructional design and development, he’s an expert in crafting engaging learning experiences and a master at building robust learning platforms in both corporate and academic environments. Andrew’s journey began with a simple belief: learning should be meaningful, measurable, and, most importantly, enjoyable. This belief has led him to work with some of the industry’s most innovative organizations and thought leaders, helping them unlock the true potential of their learning strategies. Andrew has also shared his insights at conferences and workshops across the globe, empowering others to harness the power of data in their own learning initiatives.