Feedback on Your Instruction (FYI) is a web-based system for generating a survey to collect student feedback about teaching effectiveness and the quality of specific aspects of a course. The FYI surveys are intended as a convenience to you and are for your use only. They may be modified in multiple ways to best meet your goals for collecting feedback on your teaching in general or on your specific courses.
How to Construct a Course Feedback Survey Using FYI
When you want to use a survey to gather student feedback, the FYI menu lists topics on the left side of your screen. Selecting a topic displays the corresponding questions you can ask your students.
Select a question by clicking within the box around the question’s text. Once you have selected your question, the box will be highlighted in blue and the question will be added to the preview of your course survey, which appears on the right of your screen. You can deselect a question by clicking the highlighted box to remove it from your survey. Once you have selected your questions, you can click and drag them to reorder them in the preview area. If you would like to see the complete layout with the response area included, click the box next to “Show response areas”; the strongly disagree to strongly agree (SD, D, N, A, SA) scale and blank lines for open-ended questions will appear.
Number of Questions
Fewer than 20 questions is desirable. The extensive number of options in the FYI system is meant to accommodate the wide variety of possible course characteristics and teaching goals. Limit your choices to those you most want to know about, based on your course context. If you include too many items on a student feedback survey, students are less likely to complete it thoughtfully.
Types of Questions
There are three types of question formats.
The first format – scaled item – asks students to rate their agreement with a given statement using the ratings of SD, D, N, A, SA, respectively standing for strongly disagree, disagree, neutral, agree, and strongly agree. These have been adapted from the former Student Evaluation of Teaching (SET) instrument that was used at Ohio State for more than twenty years.
Exam questions were clear.
Course helped develop my problem-solving ability.
Writing and drawing at board were legible.
The second format – open-ended – allows students to comment rather than select a specific rating.
How did the way in which the course was designed help or hinder your learning?
What outcomes did you experience as a result of course field experiences?
Which course texts would you recommend adding or eliminating and why?
The third format – tailored items, either scaled or open-ended – allows you to tailor a statement to be specific to your course. Instead of asking students to rate or comment on a general item, you can name the specific items.
Please comment on the following reading(s), being as specific as possible about its effect on your learning:
Glasswell's Notes on Metaphysics
The following field experience was helpful to my learning:
Wexner Center for the Arts
You may choose whatever mix of these formats you desire in constructing your survey.
Note for Team Teaching Situations
When a survey is being constructed for a team teaching situation, it is best to decide how to do this based on the team arrangement. If each instructor teaches separate components fairly independently of the others, separate surveys can be used for each. If each instructor participates fully in the course design and is present and active during the majority of course sessions, the course should be evaluated as one, with all instructors sharing the rating. Only the team members can decide what is most appropriate.
The Survey Creator walks you through three easy steps to create, format, and generate your survey. The three buttons on the left of your screen will guide you through this process. In “Step One,” you will select the questions for your survey by selecting a specific topic on the left of your screen, and then clicking on the desired questions within each topic. We suggest no more than 20 questions total for an effective survey. After you have selected your questions, you can rearrange their order by clicking and dragging the questions in the preview screen on the right. In “Step Two,” you have the opportunity to format a heading for your survey, including course information, and make adjustments to your response spaces for open-ended questions. Lastly, in “Step Three,” you will generate your survey. We provide three standard options: PDF, Word document, or a CSV file that can be used to create a Carmen Survey. If you have any questions about FYI or the results of your survey, please contact the University Center for the Advancement of Teaching by email at firstname.lastname@example.org or by phone at (614) 292-3644.
Collecting Feedback: When and How
Course feedback can be collected at any time you think appropriate. Early feedback (at midterm or before) is most helpful in suggesting changes that can be made during the offering of the course and is thus likely to positively influence course success and final student ratings. The use of a questionnaire can also be supplemented by other methods during the term, such as asking students at the end of a given class to provide brief written comments on the perceived effectiveness of that class or appointing a course committee to provide feedback at various intervals.
Unlike a summative instrument, such as the SEI, which is intended for personnel purposes, the FYI diagnostic questionnaire is for your purposes alone. For this reason, security is not an issue and you do not need to have a person other than yourself distribute and collect the questionnaires. You will want students to answer honestly and to know that what they say will not influence your attitude toward them, so you will probably want replies to be anonymous. If you think handwriting would interfere with anonymity, you might ask the students to respond to the questionnaire between classes using a computer for the open-ended items. If you think your presence would also inhibit responses, the out-of-class option would be helpful. You can also arrange for forms to be returned to a third party who will hold them until after the course grades are turned in, if you think this will increase student honesty.
Be sure to give students enough time to answer thoughtfully, particularly if open-ended items are used. Some instructors may want to use a take-home option, asking that the form be completed between class sessions.
Since this information is for your own use, you can compile the results yourself and obtain quick feedback. If you have a very large class and are using scaled items only, you can have the students use scannable response sheets, which can be purchased from Stores. These will be scanned free by the Office of Testing at the University Registrar, which will provide results to you.
If you are hand-scoring the completed forms, tabulate the scaled items to compute the number of responses in each category (SA, A, N, D, SD). If computing a mean will help you to understand the results, multiply the number of responses in each category by that category’s assigned point value (SA=5, A=4, N=3, D=2, and SD=1), add the weighted values, and divide by the total number of responses. An example is below:
For a class of 25 students, the responses to the question “Exam questions were clear” are: two students chose “Strongly Disagree,” three chose “Disagree,” none chose “Neutral,” fourteen chose “Agree,” and six chose “Strongly Agree.” Fill the appropriate numbers into the following formula to generate the average:

((# of SD ratings * 1) + (# of D ratings * 2) + (# of N ratings * 3) + (# of A ratings * 4) + (# of SA ratings * 5)) / total # of responses = average rating

((2*1) + (3*2) + (0*3) + (14*4) + (6*5)) / 25 = 94 / 25 = 3.76
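If you tally responses in a spreadsheet or by hand, the same arithmetic can be sketched in a few lines of Python. The counts below reproduce the worked example; substitute your own tallies.

```python
# Counts from the worked example: how many students chose each rating.
counts = {"SD": 2, "D": 3, "N": 0, "A": 14, "SA": 6}
# Point values from the document's scale: SD=1 through SA=5.
points = {"SD": 1, "D": 2, "N": 3, "A": 4, "SA": 5}

total_responses = sum(counts.values())
weighted_sum = sum(counts[r] * points[r] for r in counts)
average = weighted_sum / total_responses

print(average)  # 3.76
```

The same pattern works for any scaled item: only the `counts` dictionary changes from question to question.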
Interpreting the Results
When scores fall below “Agree,” the course or instructor characteristic is not getting the desired student response. If the reasons for this response are not clear from open-ended responses or other information available to the instructor, dialogue with the students is important. A consultant from the University Center for the Advancement of Teaching (UCAT) may be called on to help elicit reasons for low ratings on certain items or to suggest strategies for addressing them.
In the case of open-ended items, it is useful to group responses by item and look for patterns. If, for example, more than half of the responses to a question on the value of course readings identify a certain text as being poorly written or difficult to understand, this is important information for course improvement. If only one student mentions this factor, the opinion of other students on this specific text should be solicited, or reasons for mentioning this text should be sought in an in-class conversation following the feedback. Once again, a consultant from the University Center for the Advancement of Teaching (UCAT) may be asked to help summarize and interpret results.
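For a larger class, the grouping described above can be done with a simple tally of how often each reading is mentioned across the open-ended responses. This is a minimal sketch; the response strings and text titles are invented examples, and real responses would need more careful matching (synonyms, abbreviations, misspellings).

```python
from collections import Counter

# Hypothetical open-ended responses to a question about course readings.
responses = [
    "The primary textbook was hard to follow.",
    "Drop the primary textbook; keep the case studies.",
    "The case studies were the most useful readings.",
]

# Texts you want to track mentions of (invented titles).
texts = ["primary textbook", "case studies"]

mentions = Counter()
for response in responses:
    lowered = response.lower()
    for text in texts:
        if text in lowered:
            mentions[text] += 1

# mentions now holds how many responses name each text,
# which you can compare against the total number of responses.
print(mentions)
```

A text mentioned by more than half the class is a clear pattern; a single mention is a prompt for follow-up conversation, as the paragraph above suggests.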
Using the Results
Even when there are no ambiguous results, it is helpful to talk with students about questionnaire results so that you can respond to them about what changes you can make, which ones you cannot, and reiterate your goals for the course. Such conversations demonstrate your commitment to the course and help clarify your intentions for the course. They also communicate the message that students will be heard and that they are active participants in the success of the course. Asking students to reflect on their own efforts (amount of study time, class attendance and participation, and the like) when a course is being evaluated underlines this notion of joint responsibility.
Your analysis of student feedback is for your own purposes and should not be included in portfolios or materials intended for personnel purposes unless you want to show a pattern of improvement or continuing efforts to learn about the effects of your teaching. When this is the case, use a descriptive statement rather than raw results, since the methods for collecting the information, the reliability and comparability of different items, and the method of analysis were not controlled. If you want to make use of information collected for improvement purposes at a time when a personnel decision (merit pay, promotion) is being made, you might also ask a peer to look over the raw results and write a summary of the direction of the findings.