Our spotlight on learning and development practitioners continues this week with Monica Griggs, as she shares her professional insights from working with Bridgestone Americas and her recommendations for using learning analytics to create a world-class L&D program.
Sometimes data from your learning management system (LMS) can feel like a black hole. How do you select the data you want to analyze?
Our goal is to show how many learning experiences happened as a result of our efforts to help the business surge. When you have blended learning methodologies and reach your audiences in multiple ways, you have to be able to correlate data from many different places. Then, layer on the different types of deliveries, for example field versus training centers, and make sure they're consistently tracked. It can be very challenging.
If your reporting isn't relevant to the business you serve, it won't convey your value to the organization.
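To make that concrete, here is a minimal sketch of what consolidating learning-experience records from multiple sources into one consistent tally by delivery type might look like. This is not Bridgestone's actual pipeline; the file names, column names, and delivery labels are illustrative assumptions.

```python
# A minimal sketch (hypothetical files and columns) of combining learning
# records from an LMS export and field/training-center session logs into
# one count of learning experiences per delivery type per month.
import pandas as pd

lms = pd.read_csv("lms_export.csv")        # assumed columns: learner_id, course_id, completed_at
field = pd.read_csv("field_sessions.csv")  # assumed columns: learner_id, course_id, completed_at

lms["delivery"] = "elearning"
field["delivery"] = "field_or_training_center"

# Combine the sources, drop duplicate records of the same learning experience,
# and count experiences by month and delivery type.
experiences = (
    pd.concat([lms, field], ignore_index=True)
      .drop_duplicates(subset=["learner_id", "course_id", "completed_at"])
)
experiences["month"] = pd.to_datetime(experiences["completed_at"]).dt.to_period("M")
summary = experiences.groupby(["month", "delivery"]).size().rename("experiences")
print(summary)
```

The point of a consolidated table like this is simply that every delivery channel is counted once and counted the same way, so the totals you report can be trusted.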
How do you find baselines for learning data?
We consider what is most important to the business and manage and track it closely.
You have to pick your battles and understand what is critical to know and report and what isn’t relevant to moving the needle.
What are your best practices for defining and monitoring KPIs?
Overall reach in learning experiences by business or function is very important, but what's more compelling is showing how people learned something and applied it for behavioral change that yielded results in performance. Our target is to show effectiveness, but that can be very challenging: it takes painstaking program management and monitoring to gather testimonials and run 30/60/90+ day program reviews and analytics.
Knowing exactly what data to look at and its meaning can be challenging. What are some recommendations or examples for learning benchmarks and KPIs?
There are some basics that will continue to matter in telling your story to stakeholders. Survey data on what people intend to apply on the job, and their initial reactions to the usefulness of the learning, is still very powerful.
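As a rough illustration of how that survey data can be summarized (the responses below are hypothetical, not figures from the interview), two common readouts are the share of respondents who intend to apply the learning and an average usefulness rating:

```python
# Illustrative survey summary with made-up responses: intent-to-apply rate
# and average usefulness on a 1-5 scale.
responses = [
    {"will_apply": True,  "usefulness": 5},
    {"will_apply": True,  "usefulness": 4},
    {"will_apply": False, "usefulness": 3},
    {"will_apply": True,  "usefulness": 4},
]

apply_rate = sum(r["will_apply"] for r in responses) / len(responses)
avg_usefulness = sum(r["usefulness"] for r in responses) / len(responses)

print(f"Intend-to-apply rate: {apply_rate:.0%}")        # 75%
print(f"Average usefulness:   {avg_usefulness:.1f}/5")  # 4.0/5
```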
What’s been one of the biggest surprises you’ve found in your data? What did it tell you and how did it affect your program?
One of the biggest surprises we got was going from 2,000 learning experiences a year to 106,000 experiences in six months by deploying a different learning method. The numbers help tell the story of whether something is useful or not. We don't always succeed the first time, and we constantly learn what works with different audiences and what doesn't.
What are a few small ways people can improve their learning programs every week (or on a regular basis)?
In designing effective learning, we mix science and art to manage behavioral changes, and that’s a tall order. Experiments sometimes work and sometimes they don’t, but we constantly measure and refine according to what we learn.
What are the basics of designing a good experiment for a new learning program?
Learning professionals can get caught up in the neatest new thing in learning instead of focusing on business outcomes. Ask what will get you from your current state to a future state, and whether that is different knowledge, skills, or abilities. If so, start with the end in mind. Experiments for new learning programs should be micro-chunks of the design, with backup plans.
Data can tell a story if it’s used correctly. How do you design programs that provide the data to answer questions and tell your story?
Trying new designs in a safe, economical way and piloting them is where the best successes and failures are hatched, and you always learn something from them. A full program built from modules, where you can run experimental designs on small pieces and parts of it, is a great environment for pilots.
The most powerful metric is change in performance, and we know that consistent practice and coaching over time in safe environments is the way forward in performance transformation. It is very difficult to capture that in data, but the success cases tell the story of effectiveness.
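As a loose sketch of the kind of pilot readout described here, assuming made-up pre/post assessment scores rather than any real program data, one might compare the average score gain of a pilot cohort on the new design against a comparison cohort on the existing design:

```python
# Hedged sketch of a simple pilot readout with hypothetical scores:
# average pre-to-post gain for a pilot cohort (new micro-module) versus
# a comparison cohort (existing design).
def mean_gain(scores):
    """Average post-minus-pre change for a list of (pre, post) score pairs."""
    return sum(post - pre for pre, post in scores) / len(scores)

pilot_scores      = [(62, 81), (70, 88), (55, 74), (68, 80)]   # new design
comparison_scores = [(60, 71), (66, 74), (58, 63), (72, 79)]   # existing design

print(f"Pilot mean gain:      {mean_gain(pilot_scores):.1f}")
print(f"Comparison mean gain: {mean_gain(comparison_scores):.1f}")
print(f"Difference:           {mean_gain(pilot_scores) - mean_gain(comparison_scores):.1f}")
```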
What are some examples of learning analytics done well within Bridgestone? Where do you go to see examples of innovation in the learning space outside your organization?
Our results in traction and usage of digital assets have been one of the biggest wins because you can expand learning experiences exponentially. eLearning is waning in popularity; people are busier and want micro-chunks of what they need just in time, and that has evolved the way we bring learning to our participants.
There will always be an appetite and need for instructor-led training, but it’s only a slice of the design elements that will result in true transfer of knowledge and the ability to perform at a higher level.
Up Next: L&D Spotlight on Visa
Next week, we'll continue our real-world learning analytics spotlight and talk with Gordon Trujillo, who is the senior director of global talent at Visa. Be sure to sign up for our blog to have the next post sent straight to your inbox.
About the author
Monica Griggs is an experienced, high-energy talent leader who has served many different industries, partnering with business executives to develop strategies and solutions linked to growth, performance improvement, and business goals.