Since we started our What's Learning Analytics series, there has been an explosion of dialogue about this topic. We’re particularly excited to see so many “Predictions for 2017” articles include data and analytics as a trend that will blossom in the coming year.
In particular, the following two articles help define this term and the best-practice processes for getting started.
- What Is Learning Analytics? by John R. Mattox, II, Ph.D. (In addition to this online article, you can take a deeper dive by reading John’s book.)
- Learning Analytics: A Practical Pathway to Success by Sharon Vipond and A.D. Detrick (This is an online summary; you can read the full article with an eLearning Guild membership.)
In What Is Learning Analytics?, John defines it as:
the science and art of gathering, processing, interpreting, and reporting data related to the efficiency, effectiveness, and business impact of development programs designed to improve individual and organizational performance and inform stakeholders
And in this blog series, we've defined it as:
the measurement, collection, analysis, and reporting of data about learners, learning experiences, and learning programs, for purposes of understanding and optimizing learning and its impact on an organization’s performance
I’m encouraged to see a lot of consistency between these two definitions. I like that John explicitly includes “efficiency” as a primary goal of learning analytics, where our definition only implies it through the word “optimizing.” That emphasis on efficiency encapsulates our concept of learning operations analytics nicely.
Sharon and A.D. define the levels of complexity of analytics according to the traditional Gartner model.
- Descriptive
- Diagnostic
- Predictive
- Prescriptive
At Watershed, we define levels of complexity as:
- Measurement
- Data Evaluation
- Advanced Evaluation
- Predictive and Prescriptive
The Gartner levels are perfectly good and widely used, but, as an industry, we’re not sophisticated users of analytics yet. So, Watershed broke up the traditional lower levels to better reflect the earlier parts of the journey and make learning analytics more accessible. We feel making the first levels achievable and worth celebrating is essential.
As expressed, the Gartner model omits any form of evaluation. We think it's crucial to add explicit language in our Data Evaluation step around determining what happened and whether that represents a positive or a negative outcome.
While it might not yet be possible at this level to demonstrate statistically significant results, it is an excellent time to establish baselines and targets. These conversations are a great way to ensure organizational alignment and inspire a team to shoot for lofty goals.
Sharon and A.D. define a process consisting of six steps:
- Hypotheses/Assumptions
- Capture and Clean Data
- Analyze and Report
- Use the Findings
- Refine Offerings
- Build Supporting Content
Watershed defines our five-step process to getting started as:
- Plan and Gather Data
- Review and Clean Data
- Operationalize
- Explore and Analyze
- Build and Refine
These processes are relatively similar, but Sharon and A.D. make one assertion I can't entirely agree with. They encourage a deliberate capture of a predefined data set intended to validate or refute a hypothesis.
This practice is a great way to start an Advanced Evaluation of a learning program. For that type of analysis, it's essential to have a known set of clean data to build an experiment that will drive organizational change.
However, Sharon and A.D. also assert that:
While data mining is effective in specific situations, most L&D departments do not have the volume and depth of data on their own to mine for valuable insight. Instead, learning analytics is a process that benefits from advanced planning.
I don't see it this way. While most L&D departments don’t have enough high-quality data to do the sophisticated data mining common in “big data,” they certainly have enough data to find some unexpected insights just by looking around at what they have.
At the “small data” level, some of the most immediately actionable insights come from simply starting to visualize whatever data you have access to. Let’s call this “data digging”: looking at the data you have and getting to know it.
Data digging doesn’t often lead to robust, statistically significant proof of causation, and it certainly won’t predict future outcomes. But it is quite useful for spotting outliers, abnormalities, and unusual trends worth investigating.
In other words, data digging is a great way to spot things that aren’t working as intended. Those issues are often immediately actionable and lend themselves to quick fixes. Quick wins are incredibly important to organizations getting started with learning analytics and looking for ways to demonstrate its worth.
For example, one Watershed client was data digging and noticed that “no shows” for a particular ILT seemed to spike every Thursday. A bit more digging attributed these absenteeism spikes to a particular group and job function with an important deadline to meet every Friday. A simple reconfiguration shifted these learners’ schedules away from the days when they face tight deadlines.
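If you want to try this kind of data digging yourself, here’s a minimal sketch of what it can look like in practice. It assumes a hypothetical CSV export of ILT attendance records with learner_id, session_date, and attended columns (your own data source and field names will differ) and simply tallies no-shows by weekday so a spike like the one above stands out.

```python
# Minimal data-digging sketch: tally ILT no-shows by weekday.
# Assumes a hypothetical CSV export with columns:
#   learner_id, session_date (YYYY-MM-DD), attended (True/False)
import pandas as pd
import matplotlib.pyplot as plt

records = pd.read_csv("ilt_attendance.csv", parse_dates=["session_date"])

# Keep only the no-shows and group them by day of the week.
no_shows = records[~records["attended"]]
by_weekday = (
    no_shows["session_date"].dt.day_name()
    .value_counts()
    .reindex(["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"])
    .fillna(0)
)

# A simple bar chart is often enough to make an outlier (e.g., Thursday) obvious.
by_weekday.plot(kind="bar", title="ILT no-shows by weekday")
plt.ylabel("Number of no-shows")
plt.tight_layout()
plt.show()
```

From there, filtering the same records by team or job function is usually all it takes to turn an anomaly like this into an actionable fix.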
Let's Keep It going!
As we wrap up this blog series, I’d like to thank everybody who participated. We’re truly appreciative of all the reviewers and guest contributors who helped make our learning analytics series a success. It's also really exciting for us to see this conversation starting, and we're looking forward to more shared ideas and debate in the coming months. Let’s keep the discussion going!
About the author
As an innovative software developer turned entrepreneur, Mike Rustici has been defining the eLearning industry for nearly 20 years. After co-founding Rustici Software in 2002, Mike helped guide the first draft of xAPI and invented the concept of a Learning Record Store (LRS). In 2013, he delivered on the promise of xAPI with the creation of Watershed, the flagship LRS that bridges the gap between training and performance.