The Kirkpatrick model of evaluation is a four-level model for evaluating the effectiveness of an instructional design program. Its creator, Donald L. Kirkpatrick, PhD, introduced the model in the 1950s. While some are skeptical of its effectiveness, it remains the most widely used training evaluation model, largely because of its simplicity.
Level 1 - Reaction
This level measures participants' satisfaction, engagement, and the perceived relevance of the training. While a positive reaction doesn't ensure that learning occurred, a negative reaction almost always reduces the possibility that it did. In other words, how learners react to training is an important part of whether or not they are able to learn (a simple way to summarize reaction-survey responses is sketched after the list below).
Types of questions one might ask to measure effectiveness of this level:
- How did participants react to the training program?
- Did it engage them?
- Did they feel the training applied to their jobs?
- Were the materials relevant?
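As a minimal sketch of how Level 1 data might be summarized, assuming reactions are collected as 1-5 Likert ratings, the snippet below averages responses to questions like those above; the question labels and scores are illustrative only:

```python
# Minimal sketch: summarizing a hypothetical Level 1 reaction survey.
# Responses are assumed to be 1-5 Likert ratings collected per question.
from statistics import mean

reaction_responses = {
    "Overall reaction to the program": [4, 5, 3, 4, 5],
    "Engagement during the sessions": [4, 4, 4, 5, 3],
    "Applicability to my job": [3, 4, 2, 4, 3],
    "Relevance of the materials": [4, 4, 3, 5, 4],
}

for question, ratings in reaction_responses.items():
    avg = mean(ratings)
    favorable = sum(r >= 4 for r in ratings) / len(ratings)
    print(f"{question}: average {avg:.1f}/5, {favorable:.0%} rated 4 or higher")
```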
Level 2 - Learning
This level measures learning: specifically, how much participants have gained in skills, knowledge, or attitude. Pre- and post-tests help evaluate learning at this level, though they are not always seen as useful in a workplace setting, and various other summative evaluations may be used; a simple pre/post comparison is sketched after the list below. Levels 1 and 2 are the most frequently evaluated levels, and many organizations never reach level 3 of evaluation. If level 3 is evaluated, it is imperative that proper analysis of levels 1 and 2 has occurred first.
Types of questions one might ask to measure effectiveness of this level:
- Did participants acquire the intended skills, knowledge, or attitudes?
- How do post-test results compare with pre-test results?
- What do other summative assessments show about what was learned?
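A minimal sketch of the pre/post comparison mentioned above, assuming each participant's scores are recorded as simple percentages; the names, scores, and 10-point follow-up threshold are illustrative assumptions:

```python
# Minimal sketch: comparing hypothetical pre- and post-test scores for Level 2.
# Scores are assumed to be percentages keyed by participant; data are illustrative.
from statistics import mean

pre_scores = {"Ana": 55, "Ben": 60, "Cho": 70, "Dev": 45}
post_scores = {"Ana": 80, "Ben": 75, "Cho": 85, "Dev": 65}

gains = {name: post_scores[name] - pre_scores[name] for name in pre_scores}
print(f"Average pre-test:  {mean(pre_scores.values()):.1f}%")
print(f"Average post-test: {mean(post_scores.values()):.1f}%")
print(f"Average gain:      {mean(gains.values()):.1f} points")

# Flag participants whose gain falls below an (arbitrary) 10-point threshold
# so instruction can be revisited for them.
for name, gain in gains.items():
    if gain < 10:
        print(f"Follow up with {name}: gain of only {gain} points")
```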
Level 3 - Behavior
This level measures the transfer that has occurred in a learner's behavior as a result of the training program. For many, this measures the true effectiveness of a learning program. Some argue that this level is difficult to assess because of factors beyond the instructional designer's role. However, the purpose of learning is to improve on-the-job performance, so analysis at this level is important for knowing what changes to the instructional program would help maximize return on investment (ROI); an illustrative ROI calculation follows the list below. Factors that affect level 3 results include the degree of managerial support the client receives, organizational processes that may limit a client's ability to transfer learning, and a lack of follow-up by the training organization after the training has concluded.
Types of questions/actions one might ask to measure effectiveness of this level:
- Are the new skills, knowledge, or attitudes being used in the learner's everyday environment?
- Are you (or the trainee) using what you (they) learned? Why or why not?
- If you needed to _____ (insert a taught skill based on a learning objective), how would you do this?
- If you aren't ______ (using a new skill based on a learning objective), what would you need in order to begin doing so?
- At the conclusion of training, have students write a letter (or, if the course is virtual, an email) to themselves reflecting on what they learned and how they will apply it in the field. Months after training, mail the letter or follow up on the email and ask them how they think they are doing at applying the skills they set out to apply. The idea is that improving a learner's self-efficacy (the belief that they can apply the learned skills) increases the likelihood that they actually will. This exercise lets the learner authentically evaluate self-efficacy at the end of training while also providing a means to follow up on that evaluation.
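To make the ROI reference above concrete, the following sketch applies the standard ROI formula (benefit minus cost, divided by cost) to assumed figures; the cost and benefit amounts are hypothetical and not drawn from the sources below:

```python
# Minimal sketch: the standard ROI formula applied to hypothetical training figures.
# All amounts are illustrative assumptions, not data from the cited sources.
program_cost = 40_000.00       # assumed total cost of the training program
estimated_benefit = 90_000.00  # assumed monetary value of performance improvements

roi = (estimated_benefit - program_cost) / program_cost
print(f"ROI: {roi:.0%}")  # (90,000 - 40,000) / 40,000 = 125%
```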
Level 4 - Results
This level measures the overall impact of the training, which may include increased productivity, reduced costs, increased sales, and higher profits.
Types of questions/actions one might ask to measure effectiveness of this level:
- Have support calls on the topics covered in training decreased?
- Is there increased customer satisfaction?
- Is waste reduced?
- Has morale improved?
Sources:
http://www.kirkpatrickpartners.com/Our-Philosophy/The-Kirkpatrick-Model
https://www.mindtools.com/pages/article/kirkpatrick.htm
https://maketrainingstick.wordpress.com/
https://maketrainingstick.wordpress.com/2014/06/30/letter-to-self-easy-closing-activity-that-makes-training-stick/
http://maketrainingstick.com/pdfs/written-self-guidance.pdf
Bates, R. A. (2004). A critical analysis of evaluation practice: the Kirkpatrick model and the principle of beneficence. Evaluation and Program Planning, 27(3), 341-347. doi:10.1016/j.evalprogplan.2004.04.011