Improving Response Rates for Online EvaSys Surveys

How can response rates be increased for online EvaSys surveys?

Since the start of the COVID-19 pandemic and the associated move to online learning, all course evaluation surveys have been distributed online. As a result, Schools and Research Institutes that had previously carried out their surveys on paper in class have had to adjust their practices. In some areas the move to online surveys has led to lower response rates, and concerns have been raised about how useful or representative course evaluation surveys are. The list of strategies below, which is based on good practice developed by a number of Schools and Research Institutes, is intended to help colleagues improve their survey response rates. Please note that this list is not definitive, and some strategies will work better in some areas than in others:

  1. Areas with the highest response rates usually gave students the opportunity to complete their course evaluation surveys on their mobile devices or laptops in class, or during online lectures and tutorials. This strategy was most effective when lecturers set aside 5 or 10 minutes at the beginning of the final or penultimate class. For pre-recorded lectures, a reminder to complete the course survey can be added to the start of the recording. It is worth noting, however, that not all teaching rooms have Wi-Fi access, and many students will choose to watch a recorded lecture at a later point, after the survey has closed.
  2. Surveys can be sent out to students a week or so before the last in-person or online lecture of a course (some areas also send surveys out mid-semester). At this point students are still engaged with the course and are more likely to provide feedback. Surveys sent out after teaching has finished are far less likely to receive high response rates, as students are preoccupied with exams or have returned home.
  3. In some Schools and Research Institutes concerns have been raised about the issue of survey fatigue. There are a few strategies that can be adopted to tackle this. First, Learning & Teaching Committees within individual Schools and Research Institutes can coordinate the timing of surveys to ensure that not all courses send their surveys out on the same day. Second, if students are encouraged or given time to complete surveys during online or in-person classes, this reduces the pressure on them to complete surveys in their own time, when they are trying to finish assessments or revise for exams. Third, questionnaire length can be kept to a minimum. Many areas use only the five CORE Questions, and students are informed in advance that the survey is short and won't take long to complete.
  4. In areas with the highest response rates, teaching staff were well informed about when their course surveys would be sent out. Staff also made a personal appeal, both in online/in-person classes and by email, for students to complete their course evaluation surveys. Staff explained to students that their feedback would be taken seriously and addressed in a ‘Summary and Response Document’, which would be made available to them shortly after teaching had finished.
  5. Students are often reluctant to complete surveys if they don't think that the issues they raise will be responded to or taken seriously. To maximise student engagement, all courses must produce a ‘Summary and Response Document’ (SARD), which summarises the feedback that students have provided and highlights any actions that will be taken to address the issues or concerns raised. If feedback is overwhelmingly positive, the document can simply state that students were satisfied with the course and that no actions were required. Details about the requirements for SARDs can be found on page 2 of the Course Evaluation Policy, and an exemplar SARD is provided in Appendix 2 on page 5 of the Policy: https://www.gla.ac.uk/myglasgow/senateoffice/qea/courseevaluation/#thepolicy
  6. Administrators in many Schools and Research Institutes made use of the reminder function in EvaSys (please note that this only works if all of the student email addresses for each course are uploaded into the system). When this function is activated, reminders are sent only to students who haven't yet completed the survey, which EvaSys determines from their email addresses. Reminders can be set at various intervals, but most Schools and Research Institutes send them once a week.
  7. Surveys can be left open for a couple of weeks after the course has finished, giving students the opportunity to complete their survey at a more convenient time. If the reminder function (see point 6) has been activated, it will act as an additional prompt for students.
  8. Administrators can edit the standard survey invite email that is sent via EvaSys to make it more specific to students on the course. In particular, the ‘Sender (name)’ can be changed from ‘EvaSys Admin’ to something more relatable, such as the lecturer’s name. The ‘Reference’ in the invite email can also be personalised, as can the text of the email itself. These small changes clarify the purpose of the email and encourage students to click on the survey link.

Closing note: There is no benchmark ‘ideal’ response rate. While we might like to get, for example, over 80% return, a lower response rate should not in any way devalue the responses that are received. Indeed, dismissing a 25% response rate as ‘not representative of the class’ may overlook the fact that this 25% is likely representative of those students who actually have something they want to say; the others may not have responded simply because they had no particular input or comments to give.

For any queries about EvaSys response rates, please contact Dr Richard Lowdon in the Senate Office: Richard.Lowdon@glasgow.ac.uk.

February 2022