Whenever I travel I like to spend some time hanging out with blog readers to answer questions. At a recent session someone asked, “How do I measure the success of my elearning?”
That’s kind of a tricky question. While we may all use words like “elearning” we don’t always mean the same thing. On top of that, just because it’s built with an elearning tool doesn’t mean that the output is really an elearning course.
Generally we think elearning is built to change behaviors or improve performance, but that’s not always the case. Many organizations use the rapid elearning tools just to share information. There are also quite a few people who use the rapid elearning tools because they allow them to quickly build multimedia content.
Since there are different reasons why people build “elearning” courses, there are different ways to measure success. In a previous post, we looked at measuring return on investment. Today we’ll review a few reasons why some are building courses and look at how they can be evaluated.
Building elearning courses is not usually the organization’s business goal. Elearning is merely a solution that helps meet a business goal. Understanding that is important. I’ve worked on plenty of projects where producing the course was considered the success, but we never tracked whether the course itself produced any meaningful results.
The trick is trying to figure out the real goal and how the elearning course helps meet it. In a simple sense, the client is at Point A and they want to get to Point B. So the expectation is that your course is going to help them get there. Thus the measure of success isn’t that you have a course. Instead the measurement is how close you are to Point B.
Let’s assume that they’ve done all of the analysis and an elearning course is the correct solution (a big assumption). They are using some metric that identifies where they currently are. And they’ll use that metric to determine if they’ve met their goals. You’ll use that same metric to determine your success.
If the client says we need to increase sales by 10%, your metric is increased sales. However, this is where it can get a bit tricky. Most times you can’t control a 10% sales increase because there are so many other factors involved. In that case you can set different metrics.
For example, in the past only 50% of the people took the training. Getting more exposure to the correct information is critical. A metric could be that 100% get the training. It’s not going to guarantee that you hit 10% sales, but it does guarantee that you’ve delivered the training, which is one piece of the puzzle.
Another metric might be to do some sort of pre/post assessment to determine what they can do currently and then how well they can do it after the course. Again, you may not be able to guarantee a 10% sales increase, but you can state that after the training they were able to make the types of decisions and perform in a way that demonstrated the level of understanding they needed.
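If you want to see what that pre/post comparison looks like in practice, here’s a rough sketch in Python. The scores and group size are made up for illustration; the point is simply that you compare the group’s average before and after the course:

```python
# Hypothetical assessment scores (0-100) for a small pilot group.
pre_scores = [55, 62, 48, 70, 58]   # before the course
post_scores = [78, 85, 66, 90, 74]  # after the course

def average(scores):
    """Mean score for the group."""
    return sum(scores) / len(scores)

pre_avg = average(pre_scores)
post_avg = average(post_scores)
gain = post_avg - pre_avg

print(f"Pre-course average:  {pre_avg:.1f}")
print(f"Post-course average: {post_avg:.1f}")
print(f"Average gain:        {gain:.1f} points")
```

The gain doesn’t prove a sales increase, but it’s a number you can report that shows the training moved people in the right direction.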
This is why building courses that are connected to performance expectations is critical. It’s less about giving them information and more about how they use the information to make decisions. This post on pull versus push training gives you some insight into what that entails.
While we may not want to admit it, there’s a lot of elearning that’s kind of pointless from an instructional perspective. But it exists because the organization wants to show they delivered the training or met some regulatory and compliance guidelines.
I used to joke that instead of creating courses on how to be ethical, they should be teaching unethical people how to not get caught. It seems we don’t care much about ethics until it makes news. So the goal is really keeping them out of the news. It’s not like the folks at News Corp are saying things would be different if only they had that anti-wiretapping elearning course.
I once met with our chief legal counsel who basically said he didn’t care what the course did as long as he could show that we offered the training. In that case a performance metric didn’t meet his needs.
If the compliance training isn’t tied to measurable business goals, you’ll need to find some other metrics such as how many completed the course. You could also do some sort of pre/post assessment to measure their understanding of the topics.
Sometimes it’s easier to measure your efficiency. For example, last year all of the compliance training was delivered in a classroom. This year it was delivered via elearning. Compare the time spent in class to the cost of online delivery. It’s not going to tell you if any behavior has changed, but it will tell the organization that you’ve cut costs and become more efficient. And that counts for something.
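Here’s a back-of-the-envelope version of that efficiency comparison. All of the numbers below are hypothetical placeholders; swap in your own headcount and costs:

```python
# Hypothetical figures -- replace with your organization's actual numbers.
learners = 200
classroom_cost_per_learner = 350   # instructor time, travel, facilities
elearning_dev_cost = 15000         # one-time course development cost
elearning_cost_per_learner = 10    # hosting / LMS delivery per person

classroom_total = learners * classroom_cost_per_learner
elearning_total = elearning_dev_cost + learners * elearning_cost_per_learner
savings = classroom_total - elearning_total

print(f"Classroom delivery: ${classroom_total:,}")
print(f"Elearning delivery: ${elearning_total:,}")
print(f"Savings:            ${savings:,}")
```

Note that the development cost is a one-time expense, so the savings grow each time the course is reused or the audience expands.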
There are purists who will rant and rave about how this isn’t real elearning and we shouldn’t build courses like this. But it is what it is. I’d rather be an employed elearning compromiser than an unemployed elearning purist.
Outside of performance improvement and compliance training, the most frequent use of rapid elearning software is to share information. While the organizations may label it elearning, to me it’s really more like a newspaper or website sharing news. The information is good to know and plays a role in things, it’s just that there’s no real performance expectation tied to it.
Some would say that in those cases they should just use a web page or create some sort of simple job aid or document. But what they miss is that people want to leverage the multimedia capabilities of the software. And besides, the rapid elearning tools are so easy to use there’s not much of a difference in production time between creating a job aid and converting a PowerPoint file to Flash.
If sharing information is the goal, then the key metric would be to see how many people actually viewed the information. If you have some expectation for them to do something like download more info or visit a link, you can track hits to the link or the number of downloads. If that is your goal, then you may also want to employ some of the landing page strategies that are used to entice traffic to web sites. Those strategies could help meet your goal of getting exposure.
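Those tracking numbers reduce to a couple of simple rates. A quick sketch, again with made-up numbers standing in for whatever your LMS or web analytics reports:

```python
# Hypothetical tracking numbers pulled from an LMS or web analytics tool.
invited = 500      # people sent the course link
viewed = 410       # people who opened the course
downloads = 130    # people who grabbed the follow-up resource

view_rate = viewed / invited * 100
download_rate = downloads / viewed * 100

print(f"View rate:     {view_rate:.1f}% of those invited")
print(f"Download rate: {download_rate:.1f}% of those who viewed")
```

Trending these rates over time tells you whether your landing-page tweaks are actually pulling more people in.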
So there you have it, three simple ideas to help you get started measuring the success of your elearning courses whether you’re seeking to change behaviors or just share information.
The key in all of this is knowing how to contribute to the organization’s success. Sometimes elearning isn’t the right solution and sometimes it is. I’ve recommended the Performance Consulting book in the past. It’s a good one to help you think more about focusing on the right goals. If you want to think a bit more strategically about where you fit in the organization and how you can make a meaningful impact, Running Training Like a Business is a good read.
What types of courses do you build? And what are you doing to measure their success? Share your thoughts by clicking on the comments link.
Upcoming E-Learning Events
- June 29-30 (Toronto, ON). Connect with your peers in the Toronto area and learn all sorts of tips & tricks in this fun community-based workshop. SOLD OUT.
- July 26 & 27 (Boston, MA). Connect with your peers in the northeast and learn all sorts of tips & tricks in this fun community-based workshop. Register here.
- August 3 & 4 (Seattle, WA). Connect with your peers in the Pacific Northwest and learn all sorts of tips & tricks in this fun community-based workshop. Register here.
- September 20-21 (Dallas, TX). Connect with your peers in Texas and learn all sorts of tips & tricks in this fun community-based workshop. Register here.
- October 12-13 (Vancouver, BC). Connect with your peers in British Columbia and learn all sorts of tips & tricks in this fun community-based workshop. Register here.
Free E-Learning Resources
Want to learn more? Check out these articles and free resources in the community.
- Here’s a great job board for elearning, instructional design, and training jobs.
- Participate in the weekly elearning challenges to sharpen your skills.
- Lots of cool elearning examples to check out and find inspiration.