Learning leaders may find themselves devising elaborate measurement reports to prove the efficacy of specific programs. But, at the end of the day, business relevance should take precedence over dollar signs.
“For 50-odd years, the learning profession has said: training happens in a classroom; we evaluate that and we find out what the ROI was of that training event. I think the whole model is backwards,” said Dan Pontefract, head of learning and collaboration at TELUS, a telecommunications company.
“What we should be doing is articulating what type of performance change we’re looking for in a person, in a team, in an organization, their particular goals and how the new learning and collaboration aspect that is formal, informal, social learning all align to that,” he said.
Determining Relevance: Tangible vs. Intangible
Starting with a business goal or problem to be solved around corporate culture, knowledge management or even systematic training can eliminate a narrow measurement focus. Otherwise, as Jay Cross, CEO of the Internet Time Alliance, a knowledge exchange organization, explained it, leaders get hung up “on doing that part right rather than asking again and again: ‘Is this improving the business?’ ‘Is this helping us attain our current objectives?’ ‘Is this delighting our customers?’ And if it’s not, they shouldn’t be doing it,” he said.
Some learning leaders — perhaps fearful for their budgets and status as business partners — may be wary of seemingly unquantifiable learning initiatives such as social learning, but hard numbers aren’t always the best indicator of success. A focus on formal school or executive education-type learning involving tests, for example, may not provide valuable metrics anyway, Cross said, because grades or test results in school are unrelated to anything outside of school. They are essentially the wrong measures.
Cross said there is actually a significant amount of learning taking place in formal situations that fails to translate to behavior change on the job. To increase the likelihood of behavioral change, gathering immediate metrics — smile sheets for example — might not be as beneficial as waiting to ascertain whether learning stuck and is being applied on the job.
“When I measure the effectiveness of a learning initiative, I want to go in and talk to people six months later after they’ve had a chance to forget it or not,” he said. “I’m going to go after a period of time and I’m going to talk to people about: ‘What are you able to do now that you couldn’t do before?’ ‘How did you learn it?’ Then generalize from talking to the sample of people to the whole organization.”
Executives are accustomed to CLOs showing up with their investment reports, but ensuring learning effectiveness isn’t always about being able to prove money was spent wisely. To ensure workforce development offerings contribute to the business, learning leaders should evaluate more intangible metrics such as: How are we driving culture change around formal/informal/social learning, customer satisfaction or employee engagement?
“That is the return on the investment,” Pontefract said. “It’s not the return on investment of the learning in and of itself isolated and boxed out from everything else. It’s what are we spending and what’s the return on what really matters to us — keeping our employees, attracting the right ones, creating that culture of collaboration, doing it all for the customer.”
Measuring the business value of informal or social learning can and should be done, though gathering metrics is not as straightforward as measuring the effectiveness of e-learning or classroom-based interventions.
“You can have some quantitative measurements. For example: How many folks do you have on job rotations in your organization? How many folks are part of a mentor or coach[ing] situation?” Pontefract said. “But more importantly, what’s happening as a result of that? Does that person feel as though they’re more capable of doing their job? Do they feel more capable of finding a new role for them? Are they more engaged in terms of recommending their company as a best place to work? Do they believe in their teammates and their boss?”
These intangible facets often indicate whether employees are engaged and productive, and that ultimately impacts customer satisfaction and profitability.
Work With Business Leaders
Rita Smith, vice president of enterprise learning at Ingersoll Rand — a $10 billion global, diversified industrial company — said the company never creates learning solutions in isolation; it co-designs them in tandem with business leaders to create a system of engagement, high ownership and business relevancy.
In day-and-a-half-long sessions during which business leaders are present physically or virtually, participants navigate a high-impact learning map, an idea that originated with author and learning effectiveness and evaluation expert Robert Brinkerhoff.
Smith said the process works backward from the business goal to identify what knowledge and behaviors are needed to achieve desired results post-learning. Also present at the sessions are potential vendors who have been vetted, or internal trainers or designers who are expected to soak in and understand what learning solutions are needed and how they should be delivered. Following the session, each cohort comes back at allotted times and presents its suggestions to the business leaders.
“The business leaders are actually choosing the vendor; they’ve co-designed the course. There’s nothing that’s ever done without a business sponsor who asks for the funding with us and articulates the value to the company,” Smith said. “We’re never up there alone going: ‘This is good; we should do it.’”
Having sponsorship is one of the keys to ensuring learning programs have business relevance.
“Without a senior executive that stands up and says: ‘This is important, and this is why I think it’s worth our investment,’ we don’t touch it,” Smith said. “At the end of the day it should not be learning pushing learning out; it should be a pull from the business.”
Focus on Strategic Metrics
There are two types of metrics: operational and strategic, said Teresa Roche, vice president and chief learning officer for Agilent Technologies, a manufacturer of electronic and bio-analytical equipment for measurement and evaluation.
Operational metrics are often quantitative — dollars per participant, dollars spent, hours per employee. Roche said though she has never been asked for operational metrics, she gathers them to ensure she is spending company and peer time and resources with integrity. The company tends to focus more on strategic measurement, which isn’t always quantifiable: Is capability being strengthened? Is the relevant impact of any program occurring?
These metrics also come from Brinkerhoff’s impact map, which was developed to reflect Agilent’s needs for every program.
“We first start with a column called organizational impact — things we know matter to Agilent, to our business or our culture. Then we look at in each program: What do we need our participants to know? What do they need to do? What do they need to achieve? And what do they need to believe to cause that organizational impact along the way?” she said. “That know-do-achieve-believe — it’s hard to quantify, but if you were to see our impact maps, you’d see there are some things that are non-quantifiable, and there are some things that are quantifiable, but they’re not ROI.”
Forge Partnerships, Avoid Pushback
Learning leaders, or anyone in organizational leadership, should first determine what matters most to the CEO and board with regard to measurement, and this can vary from organization to organization. Learning leaders who face resistance from CEOs may have engaged them too late; it’s important to partner with the CEO from the start.
“What’s keeping the CEO up at night? What is it that he or she feels is missing in the company in the leadership bench — and speaking to that,” Roche said. “I would first find out what’s their agenda because then you’re not going to have to prove something — you’re working in partnership with them. If you’re in an argument with your CEO about the value of what you’re doing, I have to ask: Did you even leave the starting gate in the right way?”
As Agilent President and CEO Bill Sullivan travels the world, Roche said he looks for employees who explain how a program they took helped them change for the better.
“Instead of assuming that [CEOs] want a certain metric or that they need to have a certain impact report, find out [how] he or she is going to know whether you’re making a difference and what does that difference look like to them?” she said. “With my CEO, it’s about stories, him having conversations with people and just listening to what they’re saying.”
Ultimately, the learning function exists to solve business needs, so CLOs should ask themselves exactly that: How are they helping to solve a business problem?
“Where CLOs make a mistake is by not talking in advance about what capabilities the organization needs and delivering on that and reporting back on that,” Cross said. “Instead, the CLO [should get] together a governance body where they [have] people in charge of the organization who say: ‘What do we want the people to be able to do that they can’t do, and what part of that can learning address?’”
This will ensure learning doesn’t occur in isolation, but is executed with the consent and advice of the business leaders based on what they need to succeed.
Deanna Hartley is an associate editor at Chief Learning Officer magazine. She can be reached at dhartley@clomedia.com.