Researchers at the University of Sydney have published a new report on online safety issues concerning young people. Developments in big data, algorithmic technologies, and artificial intelligence require measured legal and regulatory responses to safeguard children and teenagers online. Large-scale Australian research on teenagers and their social media use is limited, and the new study by researchers in Media and Communications canvasses their main concerns and their expectations of policymakers and platform providers. The ‘Emerging Online Safety Issues: Co-creating social media education with young people’ project uncovered the proactive steps teenagers take to avoid harm and how parents and carers negotiate digital media issues with their children. The findings informed an evidence-based social media campaign that gives young people a voice and educates their peers and parents about online safety.
With a grant from the eSafety Commissioner, Chief Investigators Dr Justine Humphry, Dr Olga Boichak, and Dr Jonathon Hutchinson embarked on a research project to identify the issues of most concern to young people in their use of social media and online games.
For the study, they partnered with Youth Action, the peak body representing young people and the services that support them in NSW, and Student Edge/Youth Insight, one of the largest member-based Australian student organisations specialising in research among young people.
The team conducted seven focus groups and three co-design workshops, and surveyed 1200 young people and parents or carers. To achieve a representative snapshot of Australian society, they engaged Culturally and Linguistically Diverse and Aboriginal and Torres Strait Islander participants across metropolitan and rural areas. Key findings show that while young people take charge of their online safety, use platforms with sophistication, and possess a repertoire of digital safety skills, they also want a seat at the table in policy consultations.
One of the most interesting things I discovered during the research is that young people are incredibly capable when it comes to digital technologies and have developed a range of tactics that help them stay safe and navigate their digital lives.
Young people use a wide array of social apps, messengers, and online games, especially TikTok, Instagram, YouTube, Snapchat, Minecraft, and Fortnite. Over 75 percent of young people indicated that they have used YouTube or Instagram, and nearly 70 percent have used TikTok or Snapchat. Typically, they first joined Instagram and Snapchat in late primary school, with or without their parents’ permission. These are also among the top platforms on which young people chat to people they have not met in person (31 percent). Messaging friends and watching videos are the key drivers for downloading apps, and 31 percent stated they upload their own content at least weekly. 17 percent of young people play games online, and of these, most (68 percent) usually play with people they have met in real life.
To stay safe, young people exercise self-regulation by avoiding scams and suspicious links (72 percent), blocking abusive users (68 percent), declining follow requests from strangers (67 percent), disabling location services, or deleting their profiles or apps (37 percent).
What young people need help with is managing their data and removing content quickly – that’s where parents and carers can really help.
Having material removed and accessing their personal data are key concerns: young people described these processes as cumbersome because platform providers often make them so. Common negative experiences include wasting time (54 percent), seeing unwanted ads or content (51 percent), app overuse (37 percent), sleep deprivation (27 percent), and cyberbullying (17 percent). Images or videos targeting groups or individuals based on gender, race, or sexual identity, and violent or abhorrent materials, are seen as particularly problematic.
Usually, I can laugh it off but if it’s more harmful than that, the feeling of uncomfortableness can linger on for some time.
Remarkably, the majority of young people understand factors that undermine their online experience and safety, such as harmful or illegal content (77 percent), trolling (66 percent), and cyberbullying (65 percent). However, many (53 percent) are not across data privacy principles, and only 27 percent are confident that platforms protect their data adequately.
While parents commonly insist that rules about social media are in place, young people admit that they often do what they want online and that their parents lack control. 89 percent of parents believe it is important to monitor their child’s digital engagement, yet only 33 percent feel their children are safe online. The research found that parents generally feel ill-equipped to develop their children’s online safety skills. 72 percent of parents of children aged 12–14 enforce household rules on digital media use. These rules are often relaxed as children grow older and gain safety skills and confidence, which is also a frequent source of tension as young people exercise more independence. Twitter, Facebook, and TikTok are among the most ‘banned’ apps.
Parents and young people live in quite separate digital worlds. Talking at the point of signing up to a new app or when creating a profile can help to find common ground and learn from one another.
Two-thirds of parents discuss their children’s online engagement at least weekly. Discussions involve permissions around accessing a device (44 percent), making online or in-app purchases (41 percent), or using a specific app (38 percent).
We expected to find age differences in young people’s digital engagement patterns, but we were struck by the extent to which they differ by gender. Girls report lower levels of online safety, while mothers are significantly more likely to encourage safety behaviours among their children.
Insights from the participatory research provided the foundation for the co-created Youth Online Safety campaign launched in late June. Featuring evidence-based educational materials, including peer-to-peer videos and fact sheets, the campaign generated 2500 views on TikTok and Instagram over two months.
This project was funded through the eSafety Commissioner’s Online Safety Grants Program.