What really happens to Unit of Study Surveys?

Where your feedback goes …
Nearly every unit taught at the University is assessed by students at the end of each semester, generating more than 100,000 survey responses. But how does student feedback actually improve teaching?

Remember the pen-and-paper days, when sheets were passed around the lecture theatre for students to hastily scribble their thoughts, one eye on the clock as it ticked closer to freedom? The old method certainly lacked today’s online sophistication, but what happens to the suggestion box when it goes digital?

Data-centric companies like Uber, Airbnb and Amazon understand the power that user ratings wield to drive innovation and continually improve their services. Quality education at the University is no different: Unit of Study Survey (USS) data is a foundation for making the student experience better with every session.

Quality quant

The driving force behind education data at the University is a small team called Evaluation and Analytics (EA), within the Education Portfolio, which processes mountains of ratings and feedback and makes them available to faculties while protecting student privacy.

“We moved unit of study surveys online in 2015, with all units being automatically surveyed (with a few administrative exceptions). Before that it was paper-based,” says Dr Kathryn Bartimote-Aufflick, Head of Quality and Analytics (Academic Lead).

“We now have 93% coverage across all the units on offer at the University. When it was paper-based, the coverage was only 30-40% of units on offer.”

“It’s not just about me thinking of the surveys as something that just happens – it’s about me thinking, 'how can I use them to deliver a better learning outcome for the students?'”
Associate Professor Matthew Beck

More data brings better insights. Associate Professor Corinne Cauillard, Associate Dean (Education) in the Faculty of Medicine and Health, says her team finds the data incredibly constructive.

“Data is important. Course directors get an annual report that they share with me – I then talk with them about how to improve it.”

“We use Tableau to visualise the survey results and see the trends – it gives people an idea of what the feedback is like for the whole faculty,” she says.

“We value the fact that people engage with the process. Depending on what comes out of the feedback – it could be pedagogy, assessment design, online materials or activities – we then find the resources to help us make the changes that have been identified.”

The USS journey

1. Students complete surveys online in the final weeks of semester.

2. Survey data goes to Evaluation and Analytics to be processed and anonymised (all identifying information is removed, so teachers can’t know who the responses are from). If a unit receives fewer than five responses, the feedback is not released.

3. Final marks are released to students.

4. After student results are finalised, the USS results are released to faculties. They are seen by the Unit of Study Coordinator, the Dean, the Associate Dean (Education) and any other nominees approved by the Dean.

5. Coordinators add their comments to the feedback – known as ‘closing the loop’.

6. EA uploads the results, along with the coordinator’s comments, to the survey portal for students to see.

7. Survey results are discussed in unit planning sessions by schools and departments, and at individual coordinator level, ahead of the next semester.

It’s the little things – how your feedback helps academics make changes

Associate Professor Jennifer Smith-Merry says feedback for her Health Ethics and the Law unit helped her provide greater freedom for her students.

“I opened up the assessment to say that they could work however they liked – in groups, in pairs or on their own, the choice was theirs.”

Professor Smith-Merry also dropped the written-only requirement. “I also removed the requirement for it to be written, because the students told us that they wanted the option to be more creative. So I said they could do it as a multimedia teaching resource, as long as it communicated the key messages about ethics.”

“Some of them have been quite fantastic. I remember one student who stuck superhero heads on rulers and filmed them having a debate.”

Survey feedback also challenges coordinators to innovate and not let their classes become stagnant.

“It’s not just about me thinking of the surveys as something that just happens – it’s about me thinking, 'how can I use them to deliver a better learning outcome for the students?'” says Associate Professor Matthew Beck from the Business School.

“For instance, when I used the feedback loop I discovered students liked to have opportunities to do research, but when the activity was assessed, it put a different level of pressure on the students, which detracted from the sheer intellectual enjoyment of the discovery process.”

“So I introduced regular podcasts. I provided them with a short summary of a concept – for example, optimism bias and why it’s problematic in infrastructure decision-making – and then asked the students to go out and find three other examples of where it exists. The students found the examples, uploaded their research to Canvas, and then we talked about it in the next class.”

The Unit of Study Surveys are open until 14 June. As Associate Professor Beck tells his students: “be specific in your feedback – give me feedback that I can do something with, to make it better for the next cohort of students”.

Last updated: 1 June 2021

29 October 2018
