Netflix has just announced that it is abandoning its star-rating system in favor of a “thumbs” rating system to make it easier for viewers to provide feedback. Apparently, having to choose among five levels of satisfaction was too confusing or time-consuming for some customers. According to Cameron Johnson, director of product innovation, testing over the past year showed that viewers were three times more likely to respond to the thumbs-rating system than to stars.
Does this mean that it is time to abandon your Likert scale evaluation system for your training classes? Probably not. But perhaps it is time to reflect on the types of questions you are asking, your formatting, and the time you are providing for people to respond.

Thomas Guskey, in his book Evaluating Professional Development, makes a strong case for the limited value of post-training surveys. Although they procure immediate feedback and the results are simple to tally, the data usually are not helpful. Participants frequently circle the same number for each question, just to finish up quickly and hit the road. Survey data doesn’t usually lead to improved instruction. Neither will a thumbs up or down.

In an article I wrote for TD Magazine, “How’s My Training?” (January 2015), I suggest that if your goal is to ensure what I call “caffeinated learning”—the kind of learning experience where participants are alert and engaged, and real learning occurs—then your surveys must be tailored to elicit better feedback on your instructional design and delivery skills. You want participants to provide actionable insights into your unique areas for improvement.

Instead of rating scales, I find more value in open-ended questions. True, the answers to open-ended questions are harder to quantify, but they are much more effective in improving program design and delivery. A thoughtful presenter will be able to mine these answers for nuggets of information and use them to improve their instructional practices. Just be sure that you end your session a few minutes early to provide the extra time it takes to complete this type of evaluation.

In addition, custom-designed questions for each session will provide more value than a generic form. Barbara Boone, responsible for training and development in a California firm, decided to add one custom-designed question to each survey, based on the professional growth goals of the trainer.
Boone found, “By custom-designing some of the evaluation questions, we are able to support our employees to grow as professional developers. They feel that the feedback is more meaningful and valuable to them as individuals.”

One last tip comes from a conversation I had with Ken Phillips, founder and CEO of Phillips Associates. If your evaluations are not meant to be anonymous, Ken recommends putting the name line at the bottom of the evaluation form, rather than at the top. Most people will assume that the evaluation is anonymous and answer more honestly. When they get to the bottom and see the request for a name, they are unlikely to go back and change their answers.

Have some tips for designing evaluation forms? We’d love to hear from you! Looking for more ideas about training? Check out my book Caffeinated Learning.
Author: Anne Beninghof is passionate about teaching and learning.

May 2020