Online Student Ratings: Increasing Response Rates
Abstract
Most institutions offering distance education can identify with the problem of low response rates for online course evaluations, but few have systematically investigated the issue. The purpose of this two-phase, sequential mixed-methods study was first to explore and generate themes about students' motivation and practices in responding to online evaluations, using interviews conducted via email. Based on these themes, Phase 2 used a Web-based cross-sectional survey of undergraduate and graduate online nursing students to identify preferred strategies to maximize response rates. Perceived value emerged as the key theme from the qualitative narrative. Faculty members tend to value student-completed online course evaluations and use the feedback in their ongoing course revisions. Students want evidence that faculty and the institution value their feedback; they expect to receive feedback from their institution regarding course changes and improvements. Survey results confirm and extend findings in the literature. Respondents identified rewards, risk, and trust as general means to increase response rates. In particular, participants rated the relative effectiveness of administrative factors (i.e., reminders, motivators, best time for completion, and best location for posting results) and the face validity of course evaluations in measuring important aspects of instruction. Online nursing student respondents rated the effectiveness of the following reminders to complete the online course evaluation form: email message, faculty facilitator, course schedule, WebCT course calendar, Campus Pipeline homepage, and welcome letter. With a mean response level of 1.29, the email message was the most effective reminder, with nearly three-fourths of students ranking it as very effective. The faculty facilitator and course schedule reminders also received very strong positive responses. Students also responded positively to factors motivating them to complete the course evaluation form: a bonus mark for evaluation, a draw for a prize, improvement or change resulting from feedback, faculty facilitator encouragement, a requirement to receive a grade, and comparison with other student ratings. The bonus mark for each course evaluated, with a mean response level of 1.48, was an extremely effective motivator, with approximately 70% of students ranking it as very effective. The draw for a prize and notice of course improvement or change resulting from feedback also received very strong positive responses. Respondents clearly indicated the best time to complete course evaluations, the preferred location for posting results, and whether the course evaluation addresses the important aspects of instruction. The best time to complete the online course evaluation forms was the end of the course, with a mean response level of 1.29 and 78% of students ranking it as very effective. Most respondents (88.8%) indicated that results should be posted on Campus Pipeline (intranet) rather than on the Saskatchewan Institute of Applied Science and Technology (SIAST) Web site (Internet). In their open-ended responses, students asserted the need for SIAST to post course evaluation results to demonstrate that student feedback is valued and to support institutional accountability and quality assurance. Finally, most students (83.6% of respondents) confirmed that the course evaluation addresses the most important aspects of instruction. This study has implications for educational institutions striving to maximize student response rates.
First, it is fundamental that institutions demonstrate that they value student feedback by reporting evaluation results, including course changes and improvements. Second, institutions should incorporate motivators and reminders appropriate to their organizational culture to maximize response rates to online course evaluations. Finally, by employing strategies identified in the literature to maximize response rates, the researcher achieved a 70% response rate to the Web-based survey.