By Alan Todd
Proving the value of learning and development programs is important, but it’s no easy feat. According to data from LinkedIn’s 2017 Workplace Learning Report, 32 percent of learning executives believe demonstrating ROI is a top challenge for L&D, and 80 percent agree that quantifying employee development is top-of-mind for their CEO and leadership team.
As a result, companies often embrace a “virtuous evaluation cycle” when building learning solutions for employees: identify business needs, develop a learning solution, collect the results, determine impact, and then iterate on what works. It’s a proven process across many dimensions of a business, yet it routinely falls short on the last step. Most companies are adept at identifying their business needs and deploying solutions; far fewer are good at quantifying the returns on their learning investments.
As it turns out, the trusty evaluation cycle omits an ingredient necessary for success: learning analytics. And I’m not talking about analytics sourced from subjective employee satisfaction surveys, which barely scratch the surface of engagement and desired outcomes. Such summative measures rarely reflect what we know to be true about learning science. Right after we absorb content, we rarely remember it, let alone articulate how and when we learned it. Rather, learning manifests over time, through reflection, observation and practice.
For example, when a manager shows an employee a new way to solve a problem, the employee can describe how the approach works. But true retention happens when employees keep discussing the pros and cons of different methods, how to execute them, and their short- and long-term impact on different stakeholders. Learning science shows us that the key to generating valuable learning analytics, and to determining learning ROI, lies in listening to how employees talk about what they’re learning.
Until recently, this wasn’t possible at scale. Now most modern learning solutions track discussion, collaboration and peer-to-peer exchanges, and as computing power has grown and costs have fallen, organizations can map that structured dialogue over time. Natural language processing can then identify and flag the key themes that emerge in this online discourse.
For the first time, employers can actually “hear” whether key learning concepts are borne out in the language their employees are using and “listen” to understand whether new concepts are becoming a part of the corporate lexicon.
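In practice, this kind of theme-flagging can start very simply. The sketch below is a rough illustration, not any vendor’s actual pipeline: it counts how often a hypothetical set of course concepts surfaces in discussion posts from week to week, a crude signal of whether new vocabulary is taking hold. Production systems would layer on stemming, synonym handling and topic modeling.

```python
import re
from collections import Counter

# Hypothetical key concepts from a leadership course; in a real system
# these would come from the curriculum or be discovered by a topic model.
KEY_CONCEPTS = {"feedback", "stakeholder", "iteration", "retention"}

def flag_themes(posts):
    """Count how often course concepts appear in a batch of discussion posts.

    Returns a Counter mapping each key concept to its frequency -- a crude
    measure of whether the concepts are entering the team's vocabulary.
    """
    counts = Counter()
    for post in posts:
        for word in re.findall(r"[a-z]+", post.lower()):
            if word in KEY_CONCEPTS:
                counts[word] += 1
    return counts

# Made-up discussion posts from two points in a program.
week1 = ["We tried the new process.", "Not sure it helps yet."]
week2 = ["The feedback loop caught issues early.",
         "Each iteration improved stakeholder buy-in.",
         "Fast feedback beats long reviews."]

print(flag_themes(week1))  # no course concepts yet
print(flag_themes(week2))  # concepts appearing in everyday language
```

Comparing the counts across weeks is the quantitative version of “listening”: a rising curve for a concept suggests it is moving from course material into the corporate lexicon.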
Learning analytics that identify the key themes in employee dialogue give businesses a clear view into how participants really feel about what they’re learning, and into the impact the learning solution is having on their growth as learners.