Chasing the Analytics Dream

Vendors are rolling out dozens of analytics tools to measure the business impact of learning, but learning organizations are still struggling to adapt.

For decades, learning experts have pondered how, or even whether it is possible, to measure learning impact. Surveys and spreadsheets used to track usage rates have always been cumbersome, and they rarely delivered more than inklings of insight into the effect learning programs have on an organization.

But new analytics tools and dashboards from vendors across the learning and HR technology sector are bringing the promise of measurement closer to a reality. “It’s a great time to be in this space,” said Brian Kelly, president of Vestrics, a human capital and learning analytics services company based in Carrboro, North Carolina.

Kelly said he has seen an increase in chief learning officers talking about the need for metrics and asking how they can apply big data analytics to quantify the business impact of learning. “They see it as a way to demonstrate a link between business strategies and learning investment.”

According to Deloitte’s 2015 Global Human Capital Trends report, analytics is on the agenda at almost every organization surveyed, with 3 in 4 respondents rating it as “important” or “very important” to their talent development strategies.

Ask for What You Want
Tech vendors have been paying attention. Learning management and human resource management system providers, as well as enterprise software vendors, now incorporate metrics and analytics functionality into their platforms. That includes visualization tools that make metrics more user-friendly via customizable dashboards that populate and evaluate data in real time. These systems also use application programming interfaces, or APIs, to bring in data from outside the closed system, providing reports and analysis from a range of data sources beyond the LMS or talent management suite.
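To make that concrete, here is a minimal sketch, in Python, of what pulling records out of an LMS through a REST API might look like. The endpoint, credential, and response shape are hypothetical; every vendor’s API is different.

```python
# Minimal sketch: pull course-completion records from an LMS API into
# a DataFrame for analysis outside the closed system. The endpoint,
# token, and field names are hypothetical placeholders.
import requests
import pandas as pd

API_URL = "https://lms.example.com/api/v1/completions"  # hypothetical endpoint
TOKEN = "..."  # credential issued by the LMS

resp = requests.get(
    API_URL,
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"since": "2015-01-01"},
)
resp.raise_for_status()

# Flatten the JSON payload into one row per learner/course completion,
# ready to join with HR, sales, or performance data.
completions = pd.json_normalize(resp.json()["results"])
print(completions[["employee_id", "course_id", "completed_at", "score"]].head())
```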

These new data-driven solutions give learning leaders the tools to make more future-focused development decisions, including what courses to offer and when, based on past learning and individual career performance. They also give learning leaders the ability to track the results of those investments through a variety of measures, such as whether people completed a course, whether they found it useful, and whether it drove the behavior change necessary to improve business performance.

Perhaps more important, these systems promise to give CLOs quantified results they can report back to the leadership team to demonstrate the business benefits of their programs, said Kevin Oakes, CEO of the Institute for Corporate Productivity, or i4cp.

Integrating analytics into these measurement tools is an exciting step for learning leaders who have been feeling increasing pressure to demonstrate the business benefits of their investments. Tracking who took training and whether they liked it is an important first step, Oakes said. But the real value will come when they can apply predictive analytics on both sides of the learning equation. “On the front end, you want to know whether training is needed to solve a problem, and once it is implemented, you want to know if it had an impact,” he said.

Unfortunately, learning leaders have struggled to go beyond those early phase one measures. ATD and i4cp’s 2014 “Value of Learning” report showed only 18 percent of respondents said their learning organization did a good job of evaluating overall learning effectiveness. That figure represents a 5 percentage-point slip since 2012. “It’s clear that measurement competency continues to elude the learning profession,” Oakes said.

It’s Not All About the Numbers
It’s not an easy problem to solve. For analytics tools to effectively measure learning impact, they have to do more than just rearrange learning data into colorful charts and reports, said Josh Bersin, principal and founder of human resources advisory and research firm Bersin by Deloitte.

Learning leaders have to be able to compare information from databases across business units, including sales data, performance reviews, management history and past learning track records, to get a clear picture of business change related to learning. That requires learning leaders to take a bigger-picture view of how they can measure results, Bersin said.

Again, it’s not easy, but it is possible. “Every day we are more focused on tracking operational metrics around where the business is going and how training is consumed to support business goals,” said Juliana Camargo, head of strategic programs and operations for education services at EMC Corp., an IT data storage and networking company.

For example, EMC compares data points, such as changes in staff behavior based on manager reviews, with the learning those staff consumed. The company also reviews what learning high performers consumed in the past to identify links between success and learning and to define better development program paths.
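A toy version of that comparison might look like the following sketch, which joins invented learning-consumption and manager-review tables and checks whether the two move together. It illustrates the approach, not EMC’s actual pipeline.

```python
# Toy sketch: join learning-consumption records with manager-review
# score changes and test whether they move together. File and column
# names are invented for the example.
import pandas as pd

learning = pd.read_csv("learning_hours.csv")  # employee_id, hours_consumed
reviews = pd.read_csv("manager_reviews.csv")  # employee_id, score_change

merged = learning.merge(reviews, on="employee_id")

# Correlation between training consumed and review-score change.
# A positive value hints at a link; it is not proof of causation.
print(merged["hours_consumed"].corr(merged["score_change"]))

# What did high performers consume? Useful for shaping future paths.
high = merged[merged["score_change"] >= merged["score_change"].quantile(0.9)]
print(high["hours_consumed"].describe())
```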

Camargo’s team uses Tableau, an interactive data visualization and business intelligence platform, to support many of these tasks, and they’re currently looking into upgrading EMC’s LMS with a solution to provide deeper analytics capabilities.

Learning leaders across industries are clamoring for better measurement and reporting tools, and vendors are stepping up. That’s the future of analytics for learning leaders, said Stephen Bruce, senior vice president for PeopleFluent.

In March, the company announced its latest iteration, Talent Insight Cloud, which combines data from across the talent management life cycle to create customizable reports and actionable insights. “It goes deeper than just historical reporting,” he said. “It allows users to go into the effectiveness of programs to help drive decision-making.”

PeopleFluent is not alone. In response to the need for greater insight into learning needs and impact, software vendors have been building out their analytics capabilities and rolling out new tools in rapid succession over the past 18 months.

In late 2014, for example, Cornerstone OnDemand acquired predictive analytics company Evolv and has since incorporated its machine learning algorithms into its reporting and analytics tools. “It enables clients to predict the needs of employees based on past learning and performance data,” said Jason Corsello, vice president of platform strategy for Cornerstone.

Saba’s new TIM, The Intelligent Mentor, also uses machine learning algorithms and historical user data to generate recommendations on which courses and learning content employees should use to close skill gaps and achieve compliance. And Lambda Solutions recently announced the launch of Analytika, a cloud-based reporting addition to Totara LMS that lets users visualize LMS data in dashboards and reports to analyze the effect of training investments.
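For the curious, the general idea behind such recommendations can be sketched in a few lines: score the courses an employee hasn’t taken by what similar employees have completed. This toy example, with invented data, is not Saba’s actual algorithm.

```python
# Toy recommendation sketch: suggest courses an employee hasn't taken
# that employees with similar course histories have completed.
# The matrix and course names are invented.
import numpy as np

# Rows = employees, columns = courses; 1 = completed.
history = np.array([
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 1, 1],
])
courses = ["Negotiation", "Product 101", "Compliance", "Demos"]

def recommend(emp, history, k=2):
    # Cosine similarity between this employee and every other one.
    norms = np.linalg.norm(history, axis=1) * np.linalg.norm(history[emp])
    sims = history @ history[emp] / np.where(norms == 0, 1, norms)
    sims[emp] = 0  # ignore self-similarity
    # Score each course by similarity-weighted popularity.
    scores = sims @ history
    scores[history[emp] == 1] = -1  # exclude courses already taken
    top = np.argsort(scores)[::-1][:k]
    return [courses[i] for i in top if scores[i] > 0]

print(recommend(0, history))  # e.g. ['Compliance', 'Demos']
```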

Customers, CLOs Drive Innovation
CLOs are helping to drive this rush to develop tools that provide better learning impact data. But knowing that customers want better measurement and reporting capabilities and giving them what they want are two different things.

While most vendors offer out-of-the-box metrics and reporting tools, they generally fall short of expectations, said Stewart Rogers, director of product management at Lambda. “The biggest complaint from LMS users is that reporting tools don’t go deep enough, and if a customer has never used an LMS before, better reporting is their first request.”

Fortunately, most learning management software is cloud-based, which means developers can roll out a series of iterations every few months to flesh out their offerings and bolster analytics capabilities. This allows vendors to quickly bring measurement and visualization tools to users, and give them a road map outlining the deeper capabilities to come, based on users’ priorities.

For example, Lambda developed its analytics road map after inviting a group of learning leaders to a workshop to discuss which analytics questions they would like answered. Rogers’ team expected the conversation to narrow to two or three key measures they could then focus on. But “we had 30 people in the room and we got 25 different answers,” he said.

So instead of building a single solution that could answer every question, Lambda began developing a series of customizable reports, each addressing an individual question, such as whether employees are up to date on their assigned skills or how student progress correlates with organizational goals and objectives. “Every six weeks, we add more reports, and we work with customers on how to use them to delve deeper into their analysis,” Rogers said.
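One such report might look something like this sketch, which flags whether employees are current on their assigned skills. The file names, columns, and 12-month currency rule are assumptions for illustration, not Lambda’s actual report logic.

```python
# Sketch of a skill-currency report: which employees are current on
# their assigned skills? Files, columns, and the 12-month rule are
# assumptions for the example.
import pandas as pd

assigned = pd.read_csv("assigned_skills.csv")  # employee_id, skill
certs = pd.read_csv("certifications.csv")      # employee_id, skill, completed_at

certs["completed_at"] = pd.to_datetime(certs["completed_at"])
cutoff = pd.Timestamp.now() - pd.DateOffset(months=12)

# A skill counts as current if certified within the last 12 months.
current = certs[certs["completed_at"] >= cutoff][["employee_id", "skill"]]
report = assigned.merge(current, on=["employee_id", "skill"],
                        how="left", indicator=True)
report["up_to_date"] = report["_merge"] == "both"

# True only for employees current on every assigned skill.
print(report.groupby("employee_id")["up_to_date"].all())
```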

For learning leaders like Misha McPherson, measurement and analytics tools have become a vital component in shaping their learning programs. As senior director of sales enablement for Mixpanel Inc., a small analytics startup with 180 employees, McPherson is focused on ramping up the company’s fast-growing sales team as quickly as possible. Being able to track progress and identify changes that increase or decrease learning impact is vital to her efforts. “It helps us scale quickly and understand what works,” she said.

Mixpanel uses Mindflash, an online learning platform that features learning analytics tools to track learning effectiveness. McPherson has established several key performance metrics to measure success for her sales training courses, including time to hit performance goals, level of product knowledge and probability of hitting quotas.

Because the company is growing so quickly, it hires groups of sales staff every month, and puts them through three weeks of training as a team. “From the beginning we have benchmark expectations that we use to measure whether they are meeting targets,” she said.

Then her team regularly assesses each class against the benchmark and previous classes to determine whether individuals are meeting targets, and how changes are driving results. “With every new hire class we make tweaks based on past performance,” McPherson said. “The analytics allow us to go back and see if the tweaks worked.”
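In practice, that class-over-class check can be as simple as the following sketch; the data file, column names, and benchmark value are invented for illustration.

```python
# Sketch of a cohort benchmark comparison: each new hire class's ramp
# metric is checked against a target and against the previous class.
# Data file, columns, and the benchmark are invented.
import pandas as pd

df = pd.read_csv("sales_ramp.csv")  # hire_class, rep_id, weeks_to_quota

BENCHMARK_WEEKS = 12  # assumed target ramp time

by_class = df.groupby("hire_class")["weeks_to_quota"].agg(["mean", "count"])
by_class["meets_benchmark"] = by_class["mean"] <= BENCHMARK_WEEKS

# Did the latest tweak move the needle versus the previous class?
by_class["delta_vs_prior"] = by_class["mean"].diff()
print(by_class)
```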

For example, her team wanted to determine the best time to teach sales staff how to demonstrate products to customers. The team moved the lesson from the first week to the second, then the third, comparing changes in on-the-job performance among the different classes.

“We eventually found the best results came when we taught it all three weeks, but had different people do the presentations each time, so they could learn different techniques,” she said. “We start with good guesses, then we test to see if it works.”

The Mindflash analytics tools help her manage data and reporting, but McPherson said it’s up to her team to figure out what to measure. She relies on an analytics expert on the sales operations team for the most in-depth reporting. “You have to understand why you are measuring something to be sure it will deliver meaningful results; otherwise it is a waste of time.”

This is a common challenge for companies and vendors as they try to deliver on the promise of analytics. “Many organizations don’t yet have the detailed grasp of big data needed to make predictive analytics work,” said PeopleFluent’s Bruce.

What to Ask
Without some level of expertise in knowing what questions to ask and what data to mine for answers, learning leaders may find themselves going where the algorithms take them rather than using the analytics tools to get the right answers. This runs the risk of making decisions based on measures that aren’t relevant to the business.

“Analytics shouldn’t drive strategy; it should help support and guide it,” Vestrics’ Kelly said. “That is a key distinction CLOs need to understand.”

As with all new software, there is the risk that learning leaders will adopt a new tool based on empty promises. But it’s not always the technology that’s at fault. Unless learning leaders understand how to apply analytics, it won’t have an effect on performance, Kelly said.

Learning organizations still have a long way to go. According to the Deloitte study, only 8 percent of talent management organizations believe they have a strong analytics team in place, which is virtually unchanged from the prior year, despite the deluge of new tools that hit the market in the past two years. “It suggests teams are still not fully enabled, trained or organized to succeed,” the report stated.

Many industry experts agree. i4cp’s Oakes said these tools make it easier for organizations to evaluate learning data, but many organizations still lack the skill set to use them. Many learning groups are attempting to close this skill gap by hiring analytics experts or by partnering with experts from finance or marketing who already have these skills.

At the same time, these groups are cleaning up their data and making sure disparate databases across the organization can be linked so that information can be shared and analyzed in meaningful ways, he said.

While data-driven companies like EMC and Mixpanel have made great strides in measuring learning outcomes and predicting future needs, they are the outliers. Most companies are still stuck in the basic metrics stage, tracking usage and completion rates.

In the meantime, Kelly urges companies to carefully review the analytics capabilities of the tools they invest in, and to understand before they buy what these tools can really deliver and what expertise they will need in house to harness that benefit.

“Reach out to others who are already doing analytics for learning,” said Mixpanel’s McPherson. “The more we can share ideas, the better.”