Surveys are a vital tool in the learning analytics toolkit, capturing the learners’ ratings and feedback on training and more. This data is most valuable when reported alongside data about the learning itself, but how do you get that data out of the survey tool in the first place?
This post explores the L&D data requirements you should look for when extracting data from a survey tool into a learning analytics platform.
What’s a survey tool, and how does it typically fit into a learning ecosystem?
A survey tool is a platform used to create and distribute surveys, and then view and analyze the results of those surveys.
There are several dominant players (e.g., Momentive/SurveyMonkey, Alchemer, Google Forms) in the survey market, and it’s likely that somebody in your organization already has access to at least one of them.
It’s also worth noting that dedicated survey tools are not the only way to run surveys. For example, many L&D teams use their existing authoring tools to create surveys and then report on the data in their learning analytics platform (LAP). Authoring tools may or may not have specifically designed survey functionality, but they all include quizzes that can be repurposed as surveys.
Generally, the marketing team owns survey tools to conduct customer and industry surveys, which means these tools aren’t integrated into the learning ecosystem. Instead, the survey might be delivered via email after the training is completed, or linked to from the system that hosts the training content.
Survey software data requirements and what to ask your vendor
What data should I look to pull from my survey software?
Clearly, the primary data that you need from your survey tools is the responses to the questions—but how that data is structured and exactly what it includes matters, too.
The ideal data structure for a CSV to be translated into xAPI contains a row for every response to every question, and each row includes the question text and the respondent’s name and identifier.
This structure means that you can translate each row into a single, complete xAPI statement. More commonly, however, survey data CSVs will include one row per complete response with each question represented as a column in that row.
This is still possible to translate into xAPI data, but requires a new template for each survey created because the questions need to be included within the template.
A CSV extracted from a popular survey tool typically takes this wide format, with the survey questions used as the column headings.
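Here is a minimal sketch of translating such a wide-format export into per-question xAPI "answered" statements. The column names, sample data, and activity IRIs (example.com) are all illustrative:

```python
import csv
import io
from urllib.parse import quote

# Hypothetical wide-format export: one row per respondent, one column per
# question. Column names and values are illustrative.
SAMPLE_CSV = """\
Respondent ID,Email,How satisfied were you with this course?,Would you recommend it?
r-001,ana@example.com,Very satisfied,Yes
r-002,ben@example.com,Neutral,No
"""

ID_COLUMNS = {"Respondent ID", "Email"}  # every other column is a question

def rows_to_statements(csv_text, survey_id):
    """Turn each (respondent, question, answer) triple into one xAPI statement."""
    statements = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        for question, answer in row.items():
            if question in ID_COLUMNS or not answer:
                continue
            statements.append({
                "actor": {"mbox": f"mailto:{row['Email']}"},
                "verb": {"id": "http://adlnet.gov/expapi/verbs/answered"},
                "object": {
                    # Activity IRIs here are illustrative (example.com).
                    "id": f"https://example.com/surveys/{survey_id}/questions/{quote(question)}",
                    "definition": {"description": {"en": question}},
                },
                "result": {"response": answer},
            })
    return statements

for stmt in rows_to_statements(SAMPLE_CSV, "course-feedback"):
    print(stmt["actor"]["mbox"], "->", stmt["result"]["response"])
```

With the ideal one-row-per-answer structure described above, the inner loop disappears and each CSV row maps straight onto a single statement.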
Primarily, survey data provides insight into people’s knowledge or opinions. For example, we use survey data in Watershed to analyze the results of our annual Measuring the Business Impact of Learning survey.
L&D teams commonly use surveys to measure learner satisfaction regarding a piece of training content. But you can use surveys beyond that.
For example, you can identify skills and skills gaps to assess the extent to which learning has been applied to work. Or you can use surveys to facilitate research into the kinds of learning content learners want.
You can even use survey data to measure the utilization of surveys themselves. For instance, you can compare the completion numbers of an e-learning course with the completion numbers of the related NPS survey to measure the survey response rate.
You also can look into drop-off rates to determine the best survey length to balance getting enough data while ensuring enough learners complete the entire survey.
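As a rough sketch of both metrics, with illustrative numbers:

```python
# Illustrative numbers; in practice you'd export completion counts from the
# LMS and per-question answer counts from the survey tool.
course_completions = 480   # learners who completed the e-learning course
survey_completions = 132   # learners who completed the related NPS survey

print(f"Survey response rate: {survey_completions / course_completions:.0%}")

# Drop-off: how many respondents answered each question, in survey order.
answers_per_question = [132, 130, 121, 98, 74, 61]
for i, count in enumerate(answers_per_question, start=1):
    retained = count / answers_per_question[0]
    print(f"Q{i}: {count} answers ({retained:.0%} retained)")
```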
Surveys can include questions about the respondent to get an idea of who is completing the survey, or to enable you to segment the data. Our annual survey, for example, asks about company size and sector.
What questions should I ask my vendor about getting the data out?
Survey tools are used for a range of purposes, and L&D is rarely the primary one. As such, these tools do not usually implement xAPI.
Where they do, however, the xAPI implementation may only be available on certain price plans. And if the tool offers a wide range of question types, xAPI tracking may only be available for certain types of questions.
Take care that the price plan you select includes the tracking functionality you need. And always, always test your survey’s tracking before you send it out to participants!
Even survey software that doesn’t support xAPI almost always has functionality to extract the data via CSV. And survey data tends to be relatively easy to translate to xAPI data using something like Watershed’s data conversion engine.
But remember: unless the CSV data follows the ideal structure described above, each survey you create with different questions is likely to generate a CSV file with a different structure, and each structure will require its own CSV import template. If you go down this route, you will need somebody on your team, or an allowance of managed service time, to create these templates.
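To make that concrete, here is what per-survey templates might look like. The field names and shape are invented for this sketch, not Watershed’s actual configuration format:

```python
# Illustrative per-survey import templates: because the question columns
# must be known in advance, each survey needs its own entry.
TEMPLATES = {
    "course-feedback-2024": {
        "identifier_column": "Email",
        "question_columns": [
            "How satisfied were you with this course?",
            "Would you recommend it?",
        ],
    },
    # A survey with different questions needs its own template:
    "sales-skills-survey": {
        "identifier_column": "Email",
        "question_columns": ["How confident are you running a discovery call?"],
    },
}
```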
The other good news is that there are loads of learning tools that you already have—and when combined with a learning analytics platform, actually make great survey tools! Think about it: a survey is effectively a quiz with no right answers.
So rather than using a specialist survey tool, many L&D teams simply use their authoring tool and LMS to create a survey as a “quiz.” This survey data can then be brought into the learning analytics platform in the same way as the rest of your learning data.
What if I already have a survey tool?
Most organizations will already have at least one survey tool within the company (most likely owned by marketing), plus other tools within the L&D team that you can use to conduct surveys.
When thinking about connecting your survey tool to your learning analytics platform, consider whether you can use any of your existing software for L&D surveys. In particular, if your survey tool does not have xAPI support, an authoring tool that does support xAPI will make for a simpler integration.
If you decide to use your survey tool and import the data via CSV, discuss with your LAP vendor how changes you make to the survey will affect the data structure.
For example, some CSV exports from surveys use the question text as the CSV column headings, so any changes to a question may affect the data structure of future imports. Where possible, try to avoid changes to the survey and ask the same questions each time. This method has the added benefit of making the data more comparable than if different questions are asked for different courses.
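One way to protect yourself, sketched here with illustrative names, is to validate each export’s headers against the import template before loading it:

```python
import csv
import io

# Illustrative template; see the sketch in the previous section.
TEMPLATE = {
    "identifier_column": "Email",
    "question_columns": ["How satisfied were you with this course?"],
}

def check_headers(csv_text, template):
    """Fail fast if a reworded question no longer matches the import template."""
    header = set(next(csv.reader(io.StringIO(csv_text))))
    expected = {template["identifier_column"], *template["question_columns"]}
    missing = expected - header
    if missing:
        raise ValueError(f"CSV no longer matches the template; missing: {missing}")

# A reworded question column would raise here, before a silent bad import.
check_headers("Email,How satisfied were you with this course?\n", TEMPLATE)
```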
Survey Tools and Your Learning Ecosystem
How do I map survey tool data with the other systems in my ecosystem to the LRS?
Because the survey tool is likely not a learning system, it is important to think carefully about how survey data is linked to the rest of your learning data—both in terms of learner identifiers and linking surveys to their related courses.
For the learner identifier, avoid processes where the learner has to enter their own identifier into the survey. That’s because some learners are likely to make typos or enter a personal email address that prevents linking their survey data to their respective learning data.
Similarly, when surveying learners about a particular course, consider how you can avoid having the learner themselves select, or type in, the identifier of the course.
Some survey tools can receive data, such as course and learner identifiers, passed into the survey from the sending system (for example, the LMS or an email link). The details vary by vendor, so discuss these options with your survey vendor to see what’s possible.
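The general pattern is usually a survey link that carries the identifiers as query parameters. A minimal sketch, with hypothetical parameter names (check which parameters your tool actually accepts):

```python
from urllib.parse import urlencode

def survey_link(base_url, learner_id, course_id):
    """Append identifiers as query parameters so respondents never type them."""
    return f"{base_url}?{urlencode({'learner': learner_id, 'course': course_id})}"

print(survey_link(
    "https://surveys.example.com/course-feedback",
    learner_id="mailto:jo.bloggs@example.com",
    course_id="https://lms.example.com/courses/fire-safety",
))
```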
What am I missing out on by not aggregating all this data?
L&D normally conducts surveys related to learner satisfaction with, or the impact of, particular pieces of learning content. While this data can be viewed on its own, combining it with data about the learners’ interactions within the content can offer additional insight into why learners gave a particular answer.
Taking NPS surveys as an example, questions you might ask of the combined data include the following (a worked sketch follows the list):
- Did learners reporting a high NPS for this video watch more of the content than those reporting a low NPS?
- What is the relationship between quiz score and learner satisfaction?
- Do learners who report lower NPS go on to undertake less learning?
- How does NPS vary between instructors delivering the same content?
- Is NPS affected by factors such as location, time, or day of the week when people complete training?
- Are there people who have answered the survey even though they didn't complete the learning?
- What are the differences between courses with high NPS and those with low?
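Here is the sketch mentioned above: answering the first question by joining the two data sets, assuming both can be exported as flat tables. The column names and values are illustrative:

```python
import pandas as pd

# Illustrative data: NPS responses and video progress joined on a shared
# learner identifier (the linking problem discussed earlier).
nps = pd.DataFrame({
    "learner": ["ana@example.com", "ben@example.com", "cai@example.com"],
    "nps_score": [9, 3, 10],
})
video = pd.DataFrame({
    "learner": ["ana@example.com", "ben@example.com", "cai@example.com"],
    "percent_watched": [96, 41, 88],
})

combined = nps.merge(video, on="learner")
combined["promoter"] = combined["nps_score"] >= 9

# Did promoters watch more of the video than everyone else?
print(combined.groupby("promoter")["percent_watched"].mean())
```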
L&D NPS surveys are designed to capture data about learner satisfaction with training programs. You can use this valuable data to identify what leads to high satisfaction so you can replicate successful programs. This also means you can identify and improve/remove content with low levels of learner satisfaction.
It’s worth noting that while learner satisfaction is an important metric by which many L&D departments are measured (as we know from our annual survey), it is not necessarily the best measure of effective training.
Some of the best learning happens when we are stretched outside our comfort zones, which can be unpleasant; but on the flip side, enjoying a learning experience does not necessarily mean someone has improved as a result.
Acquisition of skills, application of those skills, and impact on the business are arguably much more important metrics. You can use surveys to collect some of this information (e.g., surveying managers about whether they have observed the skills being applied on the job).
Other data will need to be collected in other ways and from other systems.
Real-World Example: One Watershed client launched a new learning platform, which experienced periods of downtime that adversely affected learners. They imported survey data into Watershed to explore how NPS scores relating to that learning platform were affected around the periods when the platform was unavailable.
The Role of Survey Tools in Skills and Compliance
Is a survey tool a good system of record for compliance reporting?
You can use surveys to collect data about the application of skills in the workplace by surveying the learner’s manager, colleagues, and even clients.
This data is useful for skills reporting, as it enables you to identify which learners have actually applied the skills in practice. It is also useful for compliance reporting, as it enables you to evidence the effectiveness of the training in leading to application of the learning.
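For illustration, a manager’s survey response about an observed skill might end up in the LRS looking something like this. The verb and extension IRIs are assumptions to agree with your LAP vendor:

```python
# Both the verb and extension IRIs below are illustrative; agree real ones
# with your vendor before relying on them for compliance evidence.
observation = {
    "actor": {"mbox": "mailto:manager@example.com"},  # the observer
    "verb": {
        "id": "https://example.com/xapi/verbs/observed",
        "display": {"en": "observed"},
    },
    "object": {
        "id": "https://example.com/skills/negotiation",
        "definition": {"name": {"en": "Negotiation"}},
    },
    "context": {
        "extensions": {
            # Who the observation is about (illustrative extension IRI).
            "https://example.com/xapi/extensions/learner": "mailto:jo.bloggs@example.com",
        }
    },
}
```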
Up Next: Badging & Credentialing Data Requirements
Beyond collecting learners’ NPS metrics, you can use surveys to capture data from learners and their managers about how learning has been applied to work tasks and projects. This data is useful for reporting, and can also be used to inform the achievement of digital badges and credentials—which also generate useful data.
In the next post, we’ll look at the data requirements for digital credentialing services that award these credentials to learners, and consider how that data might be extracted and reported on in a learning analytics platform.
About the author
As a co-author of xAPI, Andrew has been instrumental in revolutionizing the way we approach data-driven learning design. With his extensive background in instructional design and development, he’s an expert in crafting engaging learning experiences and a master at building robust learning platforms in both corporate and academic environments. Andrew’s journey began with a simple belief: learning should be meaningful, measurable, and, most importantly, enjoyable. This belief has led him to work with some of the industry’s most innovative organizations and thought leaders, helping them unlock the true potential of their learning strategies. Andrew has also shared his insights at conferences and workshops across the globe, empowering others to harness the power of data in their own learning initiatives.