
Why Evaluation is the Missing Link in Learning Design

Are you stopping at the “Smile Sheet”? Discover how to use Kirkpatrick’s Four Levels of Evaluation to measure real impact, prove ROI, and transform your L&D team into a strategic business partner.



You’ve analysed the needs. You’ve designed the storyboard. You’ve developed the assets. You’ve launched the course.

Now, you celebrate. Right?

Wrong.

In the rush to meet deadlines, the final step of the ADDIE process, Evaluation, often gets reduced to a generic survey asking, “Did you enjoy this course?”

This is a missed opportunity of massive proportions. Without rigorous evaluation, you are flying blind. You don’t know if your training changed behaviour, you don’t know if it improved the business, and you certainly can’t prove your Return on Investment (ROI) to leadership.

Evaluation isn’t just about grading the learner; it’s about grading the design. It is the only way to know if your solution actually worked.

To do this right, we look to the gold standard: The Kirkpatrick Model.

Decoding Kirkpatrick: It’s More Than Just a Quiz

Created by Donald Kirkpatrick in the 1950s, this model breaks evaluation down into four distinct levels. Most organisations stop at Level 1. To be an award-winning designer, you need to climb the pyramid.

Level 1: Reaction (The “Smile Sheet”)

The Question: Did they like it? This is the immediate feedback survey at the end of a course.

  • What it measures: Learner satisfaction, engagement, and relevance.
  • The Trap: A learner can love a course because the instructor was funny or the coffee was good, yet learn absolutely nothing.
  • How to do it better: Don’t ask “Did you have fun?” Ask “Do you feel confident applying this skill tomorrow?”

Level 2: Learning (The Knowledge Check)

The Question: Did they learn it? This is where we measure the increase in knowledge or capability.

  • What it measures: Whether the learning objectives were met.
  • The Tools: Pre- and post-tests, simulations, skill demonstrations, or quizzes (a simple way to quantify the pre/post gain is sketched after this list).
  • The Reality: Passing a quiz proves short-term memory, not long-term competence. It is necessary, but still not enough.
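
One simple way to quantify that knowledge increase, not named in the post but widely used, is the normalised gain: how much of the available improvement a learner actually captured between the pre-test and the post-test. A minimal sketch, assuming scores out of 100:

```python
def normalised_gain(pre: float, post: float) -> float:
    """Fraction of the available improvement a learner captured (0.0 to 1.0)."""
    if pre >= 100:
        return 0.0  # already at ceiling; no headroom left to improve
    return (post - pre) / (100 - pre)

# A learner moving from 60% to 90% captured 75% of the available gain:
print(normalised_gain(60, 90))  # 0.75
```

A learner who jumps from 20% to 50% and one who jumps from 60% to 90% both gain 30 points, but the second captured far more of the improvement available to them. That makes cohorts with different starting points comparable.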

Level 3: Behaviour (The Transfer)

The Question: Did they use it? This is the chasm where most training fails. This level measures whether the learner applied the training back on the job.

  • What it measures: Behaviour change and transfer of learning.
  • The Timing: You cannot measure this on the last day of the course. You measure this 3 to 6 months later.
  • The Tools: Observation checklists, 360-degree feedback, or interviews with managers (“Is Sarah using the new sales script?”).

Level 4: Results (The ROI)

The Question: Did it impact the business? This is the Holy Grail. It connects the training directly to organisational goals.

  • What it measures: The tangible return on the training investment.
  • The Metrics: Increased sales figures, reduced safety incidents, higher customer satisfaction scores, or faster production times.
  • The Power: When you can show a stakeholder that your training reduced error rates by 20%, you stop being a cost centre and start being a strategic partner (the ROI arithmetic itself is sketched after this list).
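
Putting a number on Level 4 is simple arithmetic once you have the figures: net benefit divided by the cost of the programme. A minimal sketch, using hypothetical numbers rather than anything from a real programme:

```python
def roi_percent(benefits: float, cost: float) -> float:
    """Net benefit of the training, expressed as a percentage of its cost."""
    return (benefits - cost) / cost * 100

# Hypothetical figures: the programme cost $50,000, and the 20% drop in
# error rates saved the business $80,000 in rework.
print(roi_percent(80_000, 50_000))  # 60.0 -> every $1 spent returned $1.60
```

The hard part is never the formula; it is isolating the monetary benefit that the training, and not some other change, actually produced.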

Why You Can’t Afford to Skip This

If you don’t evaluate, you can’t iterate.

Imagine a marketing team launching a million-dollar ad campaign and never checking to see if it sold any products. They would be fired. Learning & Development should be no different.

Evaluation provides the data you need to:

  1. Defend your budget: Prove your worth with numbers.
  2. Fix what’s broken: If Level 2 scores are high but Level 3 is low, the content landed but something back on the job, often management support, is blocking transfer.
  3. Improve continuously: Turn “good” courses into “great” ones.

Evaluating Without Access

A common objection: “I’m a freelancer/consultant. Once I hand over the SCORM file, I never see the learners again. How can I possibly measure Level 3 or 4?”

This is the “Black Box” problem. You build it, ship it, and lose all visibility. But lack of access is not an excuse for lack of data. You just have to be smarter about how you gather it.

If you know you won’t be there in three months, use these “Proxy Metrics” instead:

1. Measure Confidence (The Predictor)

If you can’t be there to watch them do the job, measure how ready they feel to do the job.

  • The Tactic: Meaningful confidence scoring.
  • The Question: Instead of “Did you like this?”, ask: “On a scale of 1-10, how confident are you that you can handle an objection using the ABC method tomorrow?”
  • The Science: Research shows a strong correlation between high self-efficacy (confidence) immediately after training and actual behaviour change later.

2. The “Trojan Horse” Resource

Embed a digital “hook” inside the course that calls home to a server you control (or can at least monitor) whenever a learner uses it.

  • The Tactic: Offer a high-value Job Aid (e.g., a digital calculator, a checklist, a template) via a link or QR code inside the module.
  • The Data: Track the click-through rate. If 500 people took the course but only 2 downloaded the “Essential Checklist”, your training failed to prove the value of the tool. If people are still downloading it six months later, you have proof of Level 3 utility (a minimal tracking endpoint is sketched after this list).
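
One lightweight way to get that click data, assuming you can host the job aid behind a URL you control, is a redirect that logs each request before passing the learner through. A sketch using Flask; the route name and destination URL are placeholders, not anything prescribed above:

```python
from datetime import datetime, timezone

from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical destination: wherever the real job aid actually lives.
JOB_AID_URL = "https://example.com/essential-checklist.pdf"

@app.route("/checklist")
def track_and_redirect():
    # Record one timestamp per download request, then send the
    # learner on to the actual resource.
    with open("clicks.log", "a") as log:
        log.write(datetime.now(timezone.utc).isoformat() + "\n")
    return redirect(JOB_AID_URL)

if __name__ == "__main__":
    app.run()
```

The link or QR code in the module points at /checklist; counting the lines in clicks.log month by month gives you the six-month usage curve described above.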

3. The Automated “Boomerang”

If you are handing off to a client’s LMS, negotiate one simple automation before you leave.

  • The Tactic: Ask the LMS administrator to schedule a single automated email to go out 30 days after course completion.
  • The Content: Two questions only. “Have you used the skill? If not, what stopped you?”
  • The Benefit: Even a 5% response rate gives you qualitative data you can use for your next case study (a sketch of the 30-day filter follows this list).
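
If the LMS can export completions but can’t schedule the email itself, the same effect is easy to script. A sketch, assuming a hypothetical CSV export with email and completed_on columns (neither is specified in the post):

```python
import csv
from datetime import date, timedelta

FOLLOW_UP = "Have you used the skill? If not, what stopped you?"

def due_for_follow_up(export_path: str, days: int = 30) -> list[str]:
    """Emails of learners who completed the course exactly `days` ago."""
    target = date.today() - timedelta(days=days)
    due = []
    with open(export_path, newline="") as f:
        for row in csv.DictReader(f):
            if date.fromisoformat(row["completed_on"]) == target:
                due.append(row["email"])
    return due

# Run daily: everyone returned here gets the two-question email today.
print(due_for_follow_up("completions.csv"))
```

Run it on a daily schedule and each learner surfaces exactly once, 30 days after they finish.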

The Bottom Line

Stop treating evaluation as an administrative afterthought. It is the most critical part of the design cycle.

If you aren’t measuring the result, you aren’t managing the learning.

Don’t guess whether your training worked. Know for sure. Book a coaching session with me and start evaluating for impact today.
