Highlighting The Crucial Data Every Company Should Consider
This article is part of a series on tracking the right metrics using the right methodology to visually display and quantify the investment in learning and prove ROI. What you are about to read is a fable. The company, AshCom, is fictional, but the learning challenges faced by Kathryn, AshCom’s CLO, and her team are real and commonly shared by learning teams in large organizations. It is our hope that you will be able to connect with the characters, their challenges, and the solutions they discover. We also invite you to read the first eBook in the series.
eBook Release
The Learning Scorecard: How To Measure L&D Impact And Prove ROI
This eBook introduces a system to track the right metrics using the right methodology to visually display and quantify the investment in learning and prove ROI.
A Strained Atmosphere
The hallways at AshCom were hushed. Team members no longer lingered by the coffee stations but returned quickly to their workstations. The once cheerful and relaxed faces of staff members were replaced with concerned, sober expressions. Things were noticeably strained.
AshCom’s financial struggles were something new to the company of 7,000 employees. As a manufacturer located in Minneapolis, Minnesota, it had been steady for decades in both growth and profitability. Some months were better than others, but it had never lost money for an entire quarter. Even more distressing, the losses were continuing. The company was now in its fifth consecutive month of losses, and no one on the finance team was confident things would turn around soon.
AshCom’s leadership team was aware of some of the causes. Their competitors were not only becoming more numerous, but they were also embracing new technology that increased their efficiency and lowered the prices they charged their customers. AshCom had lost several long-time clients to cheaper options. That set off alarm bells throughout the organization.
The CFO of AshCom, Kurtis, could see other challenges beyond the increased competition. He knew that rising wages were putting pressure on profits. Attracting and retaining a skilled workforce was essential to their success, but doing so meant higher pay scales. The cost of raw materials was increasing too. It seemed like everywhere he looked, things were getting more expensive.
Complacency Caused By Success
Kurtis also wondered if something else was in play. The company’s long history of growth and profitability may have been at the root of its current situation. He suspected that success had created some level of complacency. Departmental budgets tended to carry over from year to year with small incremental increases. As long as a department was spending its budget and meeting its objectives, it tended not to examine that budget for any waste that might have crept in. Kurtis suspected this spending creep was a significant contributor to the financial losses.
This suspicion led Kurtis and the CEO to institute what they were calling “Defend the Spend.” They would be taking a microscope to every department’s budget to look for ways they could eliminate waste. They wanted department leaders to defend everything they spent, no matter how small the amount. Their motto was “Shavings make a pile.”
The “Defend the Spend” initiative caused anxiety throughout AshCom, but nowhere more so than on the learning team, led by Chief Learning Officer Kathryn. She and Kurtis were not only colleagues. They had become friends with a deep level of respect and trust for one another.
Kurtis called a meeting with Kathryn and explained the situation. He could see her anxiety rising as he walked through what would happen. Flashing through her mind was the specter of staff cuts. She knew that in challenging financial times, the first cuts often happened on the learning team. She had seen several friends lose their positions in the recession of 2009.
Kurtis told her that her budget and her staff, for now, would not be cut. He also reminded Kathryn that the learning team had received the largest percentage budget increases for several years in a row.
Defending The Spend
Both Kathryn and Kurtis knew something else. Of all the departments in AshCom, the learning team was the least well-positioned to defend what it was spending. Human resources, sales, research and development, and operations could all show the value of what they did. They could demonstrate Return On Investment (ROI) fairly easily. And most of them had multiple dashboards that made the impact of their work visible to people like Kurtis and others in the C-Suite.
Kathryn and her team had nothing like this. She felt her team was exposed and vulnerable.
Kurtis knew her well enough to know what she was thinking. He reassured her by telling her that the operations team had discovered a major problem with their preventative maintenance program. Because of the operations team’s failure to maintain their machines, production rates had dropped, scrap rates had risen, and machine lifecycles were much shorter than they should have been. It was costing AshCom millions.
Kathryn’s team was being tasked with creating a new set of learning experiences to address the problem. Kurtis added another dimension related to “Defend the Spend”: Kathryn and her team would need to build a system that would precisely track the Return On Investment for the preventative maintenance training. For now, the system would focus only on this topic, but the expectation was that other learning assets would eventually be brought into the model, which would provide a precise ROI for each.
Kathryn was rattled at first. She set a meeting with Amy, a consultant to AshCom’s learning team who had worked with Kathryn for several years. Amy was a trusted advisor and served as a consultant to dozens of large organizations.
ROI For Learning
Amy was familiar with the challenge of proving the Return On Investment for learning. She had been through the same situation with multiple companies in her 20 years of consulting.
In the first meeting between Kathryn and Amy, Kathryn laid out the challenge. Amy did not do much to relieve the stress Kathryn was feeling. She said this was a common problem, that it needed to be addressed, and that Kathryn owned it. Amy did offer some hope by reassuring Kathryn that she would walk through this with her, and they would come to a good solution.
In their second meeting, Amy laid out a systematic approach to tackling the problem. They would follow a three-step process.
Must-Haves
1. What to Measure
2. How to Measure
3. How to Make It Visible
Kathryn agreed to the process. Together they reviewed Kirkpatrick’s four-level model for evaluating learning and added a fifth level that would take into consideration the perspective of the learning team that actually built the learning.
Amy then created a list of what they should measure at each level, focused only on preventative maintenance. This would address the question of what to measure.
Level One – Learning Team: Key Metrics To Gather
• Learning science (clarity of organizational/learner objectives)
• Use of technology
• Level of creativity (engaging visuals, User Experience)
• Confidence to meet objectives
Level Two – Learner Reaction: Key Metrics To Gather
• Engagement (hold your attention)
• Relevance (meaningful to your role, good use of time)
• Utility (meet an immediate need, likelihood to change behavior)
• Recommend to Others (Net Promoter Score)
Level Three – Learning Outcomes: Key Metrics To Gather
• Knowledge (percentage completed, knowledge checks passed)
• Skills (percentage able to demonstrate correct procedures)
• Application (knowing when to apply what they’ve learned)
Level Four – Learner Behavior: Key Metrics To Gather
• Compliance (schedule available and visible, replacement parts on hand)
• Actions (lubrication, cleaning, scheduled adjustments, part replacement)
Level Five – Organization Results/ROI: Key Metrics To Gather
• Scheduled uptime/downtime
• Production target per hour
• Defect rate per 1000 parts
• Machine lifecycle in months
• Energy use
• Safety record
• Cost of training (ROI formula: ROI = net income ÷ cost of investment × 100)
“This is all that we need to measure?” asked Kathryn at the start of their third meeting.
“I’m not sure I would say that,” replied Amy. “There may be other metrics you or the operational team want to consider. I wasn’t trying to be exhaustive and don’t want us to get stuck on that right now.”
How Do You Measure?
“Got it,” said Kathryn. “Sounds like we are ready to move on to your next area of focus. How are we going to measure these things?”
“It’s like you read my mind,” said Amy. “Or at least you read what I wrote on your whiteboard. We have identified our metrics. Now we have these two questions to ask: How are we going to measure these things? How are we going to weigh what we learn for each one of them?”
“Weigh?” asked Kathryn.
“Think about it,” said Amy. “Not all metrics are equally important. As we gather data, some will be weighted more than others. Some will matter more than others. For instance, looking at Level Two, Learner Reaction, you might care a lot more about whether someone recommends this learning experience than how much it kept their attention.”
“Makes sense,” said Kathryn, pausing for a moment. “That isn’t actually the hard part of this, is it?”
“It can be,” replied Amy. “Especially as you expand the number of people who get to decide which metrics are more important and which might be less important.”
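To make the weighting idea concrete, here is a minimal sketch of how a level score might combine weighted metrics. The metric names, scores, and weights are illustrative assumptions, not AshCom’s actual values:

```python
# Minimal sketch: a weighted score for one level of the scorecard.
# All metric names, scores, and weights are illustrative placeholders.

# Each metric maps to (score on a 0-100 scale, weight reflecting importance).
level_two_metrics = {
    "engagement": (78, 0.15),
    "relevance": (85, 0.25),
    "utility": (82, 0.25),
    "recommend_to_others": (90, 0.35),  # weighted highest, per Amy's example
}

def weighted_score(metrics: dict) -> float:
    """Return the weight-normalized average score for a level."""
    total_weight = sum(weight for _, weight in metrics.values())
    return sum(score * weight for score, weight in metrics.values()) / total_weight

print(f"Level Two score: {weighted_score(level_two_metrics):.1f}")  # roughly 85
```

The same pattern extends to the other four levels; only the metrics and the weights change.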
“Sorry,” said Kathryn, “that’s not what I meant. What causes me a little heartburn is how we are going to collect the data and where it will come from. I don’t have visibility into some of the metrics we will need from other departments, and I wonder how willing they will be to share that data with me.”
ROI Scores For Learning Experiences
“That has been a problem in a few of the companies I’ve helped,” said Amy. “The solution is usually to show the finance team that you want to report a clear and concise ROI score for each learning experience, but you don’t have the data you need to provide it. That usually opens all sorts of doors.”
“Our CFO Kurtis already told me he would help in any way possible,” said Kathryn. “I’m sure he can get this done for us.”
“Ok,” said Amy. “That’s good to know. Let’s dive into the question of how we will collect the data for the metrics we have determined we need.”
“Looking at this,” said Kathryn, “it seems like getting data for Levels Two and Three will be relatively easy. That can come straight from our Learning Management System. Our LMS does a good job of reporting Learner Reaction and Learning Outcomes.”
“All LMSs do that,” said Amy. “The strange part is that learning leaders usually spend very little time actually examining it to see the trends and where they might improve.”
“I am guilty of that,” said Kathryn. “Our team usually moves on to the next thing without looking carefully at what we could be learning.”
“You skipped over Level One,” said Amy. “We need to gather data on what your learning team thinks of the learning they are creating. That can be done with a simple survey tool. I have several I’ve used in the past that work well. You probably have options too.”
Learner Behavior
“We do,” replied Kathryn. “But now we get into Level Four, Learner Behavior. This is where things get much more difficult.”
“Let’s talk about that,” said Amy, “but let’s also remember that we are focusing only on the preventative maintenance program for now. We are going to want to establish the current state of learner behavior before we give learners the training. We need to work with operations to determine the current state of things like maintenance schedules being posted and visible and replacement parts being on hand. They can gather this information simply by walking around and tracking it at each machine.”
“Our head of operations will be able to do this,” said Kathryn. “He is highly motivated to improve his team’s results. He will be able to tell us how often machines are lubricated, cleaned, and adjusted, and whether parts are being replaced as they should be. When our CFO Kurtis met with me, he told me they had discovered a lot of shortcomings in exactly these areas.”
“So, they must already know the current state of affairs,” said Amy. “It won’t always be this easy, but this is a good place to begin. Once we have the baseline, we will have to make some choices about isolating the impact of learning.”
“Do you mean a control group?” asked Kathryn.
“That’s one way to do it,” replied Amy, “but there are several options. We can put everyone through the training and then track the metrics to see their level of improvement. Or we can put some people through it but not others and compare how the two groups perform. The control group is the most scientific approach, but some companies don’t want to wait. They want everyone to have the same learning experience as fast as possible.”
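To see what the control-group comparison looks like in practice, here is a rough sketch. All of the figures are hypothetical; the real baseline and follow-up numbers would come from operations data gathered before and after the training:

```python
# Sketch: isolating the impact of training by comparing a trained group
# against an untrained control group. All figures are hypothetical.

# Example metric: % of machines maintained on schedule, before and after.
trained = {"before": 62.0, "after": 80.0}
untrained = {"before": 61.0, "after": 66.0}  # control group, no training

trained_gain = trained["after"] - trained["before"]        # 18 points
untrained_gain = untrained["after"] - untrained["before"]  # 5 points

# Improvement attributable to the learning experience itself:
isolated_impact = trained_gain - untrained_gain            # 13 points
print(f"Impact attributable to training: {isolated_impact:.0f} points")
```

Subtracting the control group’s gain filters out improvement that would have happened anyway.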
“I can review these options with Kurtis,” said Kathryn.
Exploring Other Options
“There are other options too,” said Amy. “Sometimes you can introduce the learning, see improvement, and then ask the learners to estimate how much the training contributed to that improvement. I’ve seen some companies simply ask workers who have been through training two questions: Are you improving? How much did the learning contribute to your improvement?”
“That seems like it relies a little too much on self-reporting,” said Kathryn.
“That’s true,” said Amy, “but it is a data point when there is no other way to measure. Thankfully, that isn’t the case here. You should have all the data you need.”
“How about Level Five, Organization Results?” asked Kathryn. “This is really the payoff for determining ROI.”
“It is,” replied Amy. “You already have the support of your CFO, who recognizes how important this is. You also have data from operations on things they are already tracking: scheduled uptime and downtime, production targets per hour, and defect rates. They are also tracking machine lifecycle and energy use per machine. Again, you have almost everything you need to get started.”
“We do,” said Kathryn. “I’m sure Kurtis will help us with what happens to profitability when our machine uptime and production improve. He can also help us get to the numbers for the financial benefits of lowering our defect rates.”
“Some math will be required,” said Amy, “but don’t hesitate to ask for the help you need. For instance, if you know how much it costs for each defective part made, you can quickly determine how much money AshCom saves when defects are reduced by 10% after workers go through the learning experience.”
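A back-of-the-envelope version of Amy’s example might look like the sketch below. The cost per defect, production volume, and baseline defect rate are made-up numbers for illustration:

```python
# Hypothetical back-of-the-envelope savings from a 10% defect reduction.
# All inputs are illustrative, not AshCom's actual figures.
cost_per_defect = 45.00          # $ per defective part
annual_parts_made = 2_000_000
baseline_defect_rate = 8 / 1000  # 8 defects per 1,000 parts
reduction = 0.10                 # 10% fewer defects after training

defects_avoided = annual_parts_made * baseline_defect_rate * reduction
annual_savings = defects_avoided * cost_per_defect
print(f"Defects avoided: {defects_avoided:,.0f}")  # 1,600
print(f"Annual savings: ${annual_savings:,.0f}")   # $72,000
```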
“With all of this,” asked Kathryn, “how do we calculate the Return On Investment?”
“The hard part,” said Amy, “is gathering all the data. Once you have that, the formula for determining ROI is pretty simple.”
“So, if we provide these educational experiences at a cost of $100,000,” said Kathryn, jotting in her notebook, “and we can prove that we save the company $150,000, our net income is $50,000, and it looks like this:
($150,000 − $100,000)/$100,000 × 100 = 50% ROI
“If you can do that and those are real results,” said Amy, “everyone in the C-Suite will be thrilled with you.”
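In code, Amy’s formula takes only a few lines. This is a minimal sketch using the figures from Kathryn’s example, not production reporting code:

```python
# ROI = net income / cost of investment x 100, per Amy's formula.
def roi_percent(total_savings: float, training_cost: float) -> float:
    """Return ROI as a percentage of the training investment."""
    net_income = total_savings - training_cost
    return net_income / training_cost * 100

print(f"{roi_percent(150_000, 100_000):.0f}% ROI")  # 50% ROI
```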
Capturing The Data
Kathryn thought for a moment. “I have a question,” she said. “Something we haven’t talked about yet. Where does all this data get captured? I have all these numbers and information. We have several metrics to track for each of the five levels. And then each one of them is weighted by its importance. I’m not sure I could track all of this just for the preventative maintenance program, to say nothing of doing it for all the other learning experiences we provide. Do I have to hire a full-time statistician? Tracking all of this might be the biggest challenge.”
“That’s true,” said Amy. “You’ve already seen MindSpring’s Learning Scorecard and the various levels of what to measure. The software can track all of this for you, whether the data comes from your Learning Management System, from surveys, or from your operations and finance teams. Without a system to manage it, you will be buried in data, and it will all be meaningless.”
“Just what I needed to hear,” said Kathryn. “I’m sure I can’t get approval to hire additional people to provide what Kurtis is asking me to show him.”
“I have a model from my time with other clients that pulls this all together,” said Amy. “You will need to customize it to fit the needs of AshCom, but it will give you a sense of how all this can fit together to give you a clear picture of the Return On Investment. I will send you a link that will be helpful.”
“Speaking of clear pictures,” said Kathryn, “I still need to be able to show this to Kurtis and the finance team. It can’t be a series of spreadsheets. I need to make this visual and clear for them.”
“This is a good place to stop for today,” replied Amy. “That is the topic for next time: how to make this all visible and simple. We are going to talk about creating a dashboard so that you can see performance and ROI in real time. This will be a game changer for you and for AshCom.”
“I’m looking forward to that conversation,” said Kathryn. “Let me look at my calendar, and I’ll shoot you some options for dates and times. And thanks again for your wisdom and insight. I’m starting to feel like this is very doable.”
“It is,” said Amy. “It takes work, thought, and planning, but we will get there. I will see you next time.”
Conclusion
Download the eBook The Learning Scorecard: How To Measure L&D Impact And Prove ROI to delve into the data and discover which key metrics your L&D team should consider. You can also join the webinar to discover a completely new approach to measuring ROI.
Dear Reader, if you would like to see a demo of MindSpring’s Learning Scorecard, please click here to schedule a time convenient for you, and the learning experts at MindSpring will be happy to walk you through it.