We’ve all spent time on the phone with an automated system that repeatedly fails to understand our request, trapping us in a loop of escalating frustration. What if, in addition to becoming ever more accurate, such technologies could also be made simply less frustrating to engage with?
Dr Zhanna Sarsenbayeva, a researcher and lecturer at the Faculty of Engineering’s School of Computer Science, believes they can, and dedicates her research to better understanding the real-life dynamics – including the emotional dynamics – between humans and technology.
Hey, voice assistant, you’re driving me crazy!
Voice assistants such as Siri and Alexa are an excellent example. While their accuracy continues to increase, there will inevitably be times when they fail to understand a request. How might they be modified to reduce the negative emotions such failures can induce?
To explore this question, Zhanna and her colleagues created a series of experimental voice assistants (VAs), each with varying characteristics, and recruited participants to interact with them before measuring their resulting emotional experiences.
For consistency, all the VAs had female-sounding voices, some with a perceived age of around 35 and some that sounded significantly older, at 65+. The researchers varied each VA’s response style upon failing to correctly understand a user’s request, with some responding apologetically (“I’m sorry, I cannot create that calendar event at the moment”), some humorously (“Looks like my music player is having a moment of silence!”) and some simply continuing without acknowledging their failure at all.
Participants were asked to make a series of specified requests of the VAs – which, unbeknown to them, had been primed to fail to correctly understand these requests on three out of four occasions.
Following these interactions, participants completed a questionnaire asking them the degree to which they experienced certain emotions, including empathy for the VA’s efforts to assist them, trust in its abilities, and satisfaction when it did ultimately resolve their query after initially failing.
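To make the setup concrete, here is a minimal sketch, in Python, of how such experimental conditions could be wired together: a toy assistant primed to fail on three of every four requests and to respond in one of the three styles described above. Only the response-style categories and the two quoted failure lines come from the study; the class name, the fixed failure schedule and the success wording are illustrative assumptions, not the researchers’ actual implementation.

```python
# A toy sketch of the experimental conditions, not the study's real system.
FAILURE_RESPONSES = {
    "apologetic": "I'm sorry, I cannot create that calendar event at the moment.",
    "humorous": "Looks like my music player is having a moment of silence!",
    "silent": None,  # continue without acknowledging the failure at all
}


class PrimedVoiceAssistant:
    """Toy voice assistant primed to fail on three out of every four requests."""

    def __init__(self, response_style: str, voice_age: int):
        self.response_style = response_style  # "apologetic", "humorous" or "silent"
        self.voice_age = voice_age            # e.g. 35 or 65+, as in the study's conditions
        self._request_count = 0

    def handle(self, request: str) -> str:
        self._request_count += 1
        primed_to_fail = self._request_count % 4 != 0  # assumed schedule: fail on 3 of 4
        if primed_to_fail:
            styled_reply = FAILURE_RESPONSES[self.response_style]
            # The 'silent' condition simply moves on without any acknowledgement.
            return styled_reply if styled_reply is not None else ""
        return f"Okay, I've handled your request: {request}."


# Example session for one experimental condition.
va = PrimedVoiceAssistant(response_style="humorous", voice_age=65)
for request in ["create a calendar event for Friday", "play my workout playlist",
                "set a timer for ten minutes", "add milk to my shopping list"]:
    print(va.handle(request) or "(no acknowledgement)")
```

In the study itself these conditions were of course delivered through spoken interaction rather than text; the sketch only captures the condition logic.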
The results were striking.1 “Perceived older age positively contributed to inducing user empathy, trust, and service recovery satisfaction,” Zhanna reveals. “Humorous and apologetic responses also positively impacted empathy, promoting better failure recovery.”
The implications are clear, she adds: “By fostering empathy through voice assistant personality traits, we can effectively mitigate failures.”
Putting the 'human' in human–computer interactions
One of the reasons even the most promising of interactive technologies can sometimes miss the mark, Zhanna explains, is that humans are largely left out of the development process.
“We currently do most of our research in this area in the lab, where the environment is ‘perfect’,” she says. “But our research shows that when we introduce even small ‘situational impairments’ – such as walking while we’re using a device, or carrying shopping bags at the same time – our interactions with that device are affected. So we need to account for these effects so that they inform design and make our technology more robust in the real world, where we actually use it.”
To this end, another of her studies set out to systematically quantify the effects of common situational impairments on human–device interactions, producing models that allow these to be factored into development.
As it turns out, many of these impairments can significantly affect performance.2
“As just one example,” Zhanna says, “if we can model exactly how much the weight of a controller will affect our use of it, and over what period of time, then designers can be informed beforehand and can design something of optimal weight, rather than designing something too heavy and then having to come back and redesign it.”
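As a hedged illustration of what such a model might look like in practice, the sketch below estimates how a pointing task’s completion time could grow with controller weight and time-on-task – the kind of prediction a designer could consult before committing to hardware. The functional form, variable names and coefficients are placeholder assumptions for illustration only, not the models fitted in the paper.

```python
# Illustrative only: a toy linear model of performance degradation under encumbrance.
# The coefficients and functional form are placeholder assumptions, not the paper's results.

def predict_selection_time(controller_weight_g: float, minutes_of_use: float,
                           baseline_s: float = 2.0,
                           weight_penalty_s_per_100g: float = 0.15,
                           fatigue_s_per_min: float = 0.05) -> float:
    """Estimated target-selection time in seconds, given controller weight and
    how long the user has already been holding it."""
    return (baseline_s
            + weight_penalty_s_per_100g * (controller_weight_g / 100)
            + fatigue_s_per_min * minutes_of_use)


# A designer could sweep candidate weights before building anything:
for weight in (300, 450, 600):
    estimate = predict_selection_time(weight, minutes_of_use=15)
    print(f"{weight} g -> {estimate:.2f} s after 15 minutes of use")
```

With a model of this kind in hand, the trade-off between a heavier, feature-rich controller and faster, less fatiguing interaction can be explored on paper rather than through repeated prototyping.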
As with all of Zhanna’s work, this contribution places humans at the forefront of technological development, ultimately resulting in devices that work better for us, and expediting design processes for industry along the way.
The best decision I’ve ever made
Having joined the University of Sydney just a few years ago, Zhanna reflects that the support and opportunities she’s received since then have been “amazing”.
“Having only joined three years ago, it’s amazing to be where I am now – what I’m publishing, the students I’m supervising – all because of the support of the faculty and the school. Since I’ve been here I’ve gained independence in my research, I’ve found my own voice, and I’ve felt completely supported, which really matters to me.”
While emphasising that this support has come from all colleagues and staff, when asked specifically about her experience as a woman in the traditionally male-dominated field of engineering, she describes her female colleagues as “outstanding”. “They are brilliant, accomplished, and incredibly supportive,” she says. “I am genuinely inspired by their achievements and the way they carry themselves, and they continue to motivate me in my own journey.
“Representation matters, and I believe it is essential to keep encouraging and supporting women in engineering so that the field becomes more balanced and inclusive for future generations. Diversity, including gender diversity, enriches teamwork and problem solving. Different lived experiences and approaches often lead to more creative solutions and a more balanced way of thinking about challenges.”
Again, she credits the environment she works in with fostering such experiences.
“Looking back, I think it was the best decision I’ve ever made, to join Sydney.”
1 S. Jayasiriwardene, B. Tag, A. Withana & Z. Sarsenbayeva (2025). ‘More Than Words: The Impact of Voice Assistant Personality Traits on Failure Mitigation’, Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT, UbiComp ’25), vol. 9, issue 3, Article 91, pp. 1–33.
2 T. Li, E. Velloso, A. Withana & Z. Sarsenbayeva (2025). ‘Estimating the Effects of Encumbrance and Walking on Mixed Reality Interaction’, CHI ’25, pp. 1–24.