Member Questions

Course Evaluations

Written by LERN | May 12, 2018 6:16:25 PM

Q: We are in the process of reviewing the evaluation forms that we use here at the City of Tempe Kiwanis Recreation Center. One is for some of our classes, e.g., CPR, Babysitting, First Aid, SportBall, Lil Hoopsters. The other is for our aquatic classes.

What is driving this process are the following factors:

1. Are we using the best format and asking the right questions?

2. We want to gain more demographic and psychographic information on our users, yet still get the forms filled out and returned. Where is the happy medium?

3. We just got back a city-wide evaluation from a third-party organization in Olathe, Kansas, which surveyed the city and our citizens regarding their satisfaction with city services. Adult and youth recreation services came in as unsatisfactory. The city won’t show us the results, the areas surveyed, or the specific questions until they are formally presented to City Council, so we know we are going to be taken to task, but we do not understand why they will not reveal the criteria or data. So we are in a reactive mode, looking at past evaluation forms to see whether anything was missed, or whether our own data show that our customers are satisfied with our services and the city-wide survey may be off. We don’t know.

4. So we are attempting to go back over the last 6-9 months of evaluations, and also to develop a somewhat standardized instrument that could be used across the city to gain demographic information and measure satisfaction with services and offerings, while still allowing for questions specific to a particular program, like swim lessons, swim team, or even a girls’ softball program. Make sense?

Can you send me some ideas or samples of “award-winning” program/class evaluations that LERN champions? I would imagine there is a level of consistency to them, a synchronized method of laying out the form, and a defined strategy to elicit response and feedback.

Any questions, please let me know.

Thank you very much!


A: We do not have “award-winning survey” examples, but here are my comments based on LERN’s recommendations for effective evaluations.

In terms of format, you will get the best response from surveys and evaluation forms if you keep them short – 5 questions maximum. You have to decide whether the information gathered from each question is worth the time to collect it. Generally, the more questions asked, the fewer the responses, and the more incomplete the responses. Two pages of questions will immediately look daunting to customers.

Demographic and psychographic information should be solicited separately from course evaluations.  One good way to obtain this type of information is to have your instructors hand out a brief survey in the first class session, letting class participants know that the information will help improve the quality of future programming.  Asking too much from your customers at once will simply reduce the number of responses you receive.

For your course evaluations, determine what is most important to find out from the questions – how you want the information to help you improve your program. Ideally, you should evaluate the course and instructor separately from questions that address things like classroom environment/facilities, equipment, your registration process, etc. Questions on course content should precede marketing questions – questions like “How did you hear about this class?” and questions about your registration process should follow questions about the specific class, or should even be asked separately to help keep the evaluation form brief.

· Ask only questions that you can act upon. Don’t ask questions whose answers give you information you can do nothing about, or that are too vague to be actionable. For example, what is the difference between “Facilities & Accommodations” and “Safety & Cleanliness” on your forms? If you received a negative response on one and a positive response on the other, how would you know what that means? For something like “Facilities & Accommodations,” are you asking about convenient locations, adequate room sizes, heating and cooling comfort, etc.? You need to know how the answers will shape your follow-up actions – which leads us to …

· Be specific. Ask questions that will give you information you can use to make adjustments or changes – for example, “Did the facility offer convenient parking?” or “Did you feel safe and comfortable finding and using the classroom?”

· Give options for the answers (multiple choice or a Likert scale, which you have done for most of your questions) rather than asking open-ended questions.

· Give respondents the opportunity to suggest ways to improve your program and ideas for additional classes by offering a “Comments” option at the end – which you have also done.

Think about exactly what it is you want to do with the information from the survey, and determine which questions take priority to help you get the direction you need in order to take action.

The best time to administer the course/instructor evaluation is actually twice: in a course that meets more than once, give it out in the middle and again at the next-to-last class. The first round lets you identify any problems and correct them with the instructor before the class ends. Attrition increases with the length of the class, so if you wait until the last class meeting, you may get fewer responses.


Convert to online evaluations

This will eventually be done by every program. Online evaluations take almost no staff time, and since they are online, they can be stored, converted, put into records, etc. Most programs currently report that fewer people respond to online evaluations than to written ones, but if you get enough back, that’s all you need. There are two ways to encourage students to complete online evaluations. First, have the teacher distribute the URL for the online evaluation on your website. Second, email students the day before or the day after the class ends and provide a link in the email; this second method should garner a good response. With an online evaluation, your web software can even tabulate the results automatically, and staff can copy, paste, and forward the results to teachers quickly.
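If your evaluation software can export responses as a spreadsheet or CSV file, even the tabulation step can be scripted. Here is a minimal Python sketch under that assumption; the file name, question columns, and 1-5 rating scale are hypothetical placeholders, not the export format of any particular product.

```python
# Minimal sketch: tabulate online evaluation responses from a CSV export,
# assuming one row per respondent and Likert ratings (1-5) in columns
# named after each question. Column and file names below are hypothetical.
import csv
from statistics import mean

QUESTION_COLUMNS = [          # hypothetical question columns in the export
    "Instructor knowledge",
    "Course content",
    "Convenient parking",
]

def tabulate(csv_path):
    """Return (response count, average 1-5 rating) for each question."""
    ratings = {q: [] for q in QUESTION_COLUMNS}
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            for q in QUESTION_COLUMNS:
                value = row.get(q, "").strip()
                if value:                      # skip unanswered questions
                    ratings[q].append(int(value))
    return {q: (len(r), round(mean(r), 2)) for q, r in ratings.items() if r}

# Print a summary staff could paste into the email that goes to the teacher.
if __name__ == "__main__":
    for question, (count, avg) in tabulate("swim-lessons-eval.csv").items():
        print(f"{question}: {avg} average from {count} responses")
```

Run once per class, this produces the kind of per-question summary that can be copied and forwarded to the teacher in minutes.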


With regard to your concerns about the city-wide survey, you need to wait until you know more about who was surveyed and how before you “panic” about the results. “Unsatisfied” about what? Customer service? Types of classes offered? Quality, cleanliness, or convenience of facilities? You are being prudent by reviewing past evaluations and surveys so that you can present positive information based on recent responses from known customers. If your evaluations have revealed weaknesses or problems in your programming, customer service, or other areas, be prepared to let the City Council know that you are already aware of them and are in the process of addressing and correcting them (and by what means). You also need to know what questions the third-party organization asked and in what format, as well as the numbers involved – the sample size and how the results were extrapolated. You cannot make comparisons or have valid reactions without information on which to base them, and you need to understand how the data and responses were analyzed.

I have attached a few evaluation forms that are reader- and user-friendly with a clean, professional look for your reference. Please keep me posted on what you find out, and let us know if we can help in any way as your program moves toward continuous improvement.

http://media.lern.org/webinars/City-of-Cambridge.pdf
http://media.lern.org/webinars/Tempe-Sample-2.pdf
http://media.lern.org/webinars/Tempe-Sample-Eval.pdf
http://media.lern.org/webinars/Tennis-Eval.pdf