Conspiracy theories thrive on YouTube, new study - The University of Sydney

5 October 2022
YouTube urged to do more to prevent misinformation
A new study by social media researchers at the University of Sydney and QUT has found that conspiracy theories are thriving on YouTube despite the platform's efforts to tighten its posting rules and guidelines.

The study, published in the Harvard Kennedy School Misinformation Review, a leading outlet for misinformation research, examined YouTube comments on Covid-19 news videos featuring American business magnate and philanthropist Bill Gates and found that conspiracy theories dominated.

The comments covered topics such as Bill Gates’ hidden agenda, his role in vaccine development and distribution, his body language, his connection to convicted sex offender Jeffrey Epstein, 5G network harms, and ideas around Gates controlling people through human microchipping and the ‘mark of the beast’.

The results suggest that during the Covid-19 pandemic, YouTube’s comments feature, much like the anonymous message boards 4chan and 8kun, may have played an underrated role in the growth and circulation of conspiracy theories.

The findings support previous studies that argue misinformation is a collective, socially produced phenomenon.


Dr Joanne Gray, digital cultures

Dr Joanne Gray, a University of Sydney researcher on digital platform policy and governance, said: “We found that the process of developing a conspiracy theory is quite social. People come together and socially ‘join the dots’ or share new pieces of information that they use to build conspiratorial narratives. The social media platforms’ current approaches to content moderation (which are often automated) are not good at detecting this kind of social conspiracy theorising.”

Co-authors of the study include Lan Ha and Dr Timothy Graham from Queensland University of Technology.

Conspiracy theories on YouTube


Conversational strategies used in sample comments on YouTube.


During the Covid-19 pandemic, YouTube introduced new policies and guidelines aimed at limiting the spread of medical misinformation about the virus on the platform.

But the study found that the comments feature remains relatively unmoderated and has low barriers to entry for posting publicly, with many posts violating the platform’s rules, such as comments proposing that vaccines are used for mass sterilisation or to implant microchips in recipients.

The researchers studied a dataset of 38,564 YouTube comments drawn from three Covid-19-related videos posted by the news media organisations Fox News, Vox, and China Global Television Network. Each video featured Bill Gates and, at the time of data extraction, had between 13,000 and 14,500 comments, posted between April 5, 2020, and March 2, 2021.

Through topic modelling and qualitative content analysis, the study found that the comments on each video were heavily dominated by conspiratorial statements.


22 topics and percentage of affiliated comments. 


Some comments were considered “borderline content,” which YouTube defines as content that “brushes up against” but does not cross the lines set by its rules.

Examples of borderline content include comments that raise doubts about Bill Gates’s motives in vaccine development and distribution and the suggestion that he seeks to take control in a “new world order.” These comments implied or linked to theories about using vaccines to control or track large populations of people.

YouTube comment moderation

The researchers said the platform should consider design and policy changes that respond to the conversational strategies conspiracy theorists use, to prevent similar outcomes in future high-stakes public interest matters.

Three common conversational strategies are: strengthening a conspiracy theory (“joining the dots” of disparate information), discrediting an authority (“casting doubt”), and defending a conspiracy theory. Such comments can be amplified when readers ‘like’ them.

“YouTube almost completely lacks the community-led or human moderation features that are needed to detect these kinds of strategies,” said Dr Gray.
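Dr Gray's point can be illustrated with a toy example: a term-matching filter of the kind automated moderation often relies on catches explicitly rule-breaking comments, but not comments that merely invite readers to join the dots (the blocked-term list and both comments below are invented for illustration):

```python
# Toy illustration (invented term list and comments): term matching catches
# explicit claims but misses social "joining the dots" comments.
BLOCKED_TERMS = {"microchip", "sterilisation", "mark of the beast"}

def flag(comment: str) -> bool:
    """Flag a comment if it contains any blocked term."""
    text = comment.lower()
    return any(term in text for term in BLOCKED_TERMS)

explicit = "The vaccine inserts a microchip into recipients."
implicit = "He funded the labs AND holds the patents. Join the dots yourselves."

print(flag(explicit))  # True: a blocked term appears verbatim
print(flag(implicit))  # False: no blocked term, yet it invites conspiracy theorising
```

The second comment carries the conspiratorial work socially, between commenter and reader, which is exactly what keyword- and classifier-based systems struggle to detect.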

The researchers said that for YouTube to address this problem adequately, it must both attend to the conversational strategies that evade automated detection systems and redesign the space to give users the tools they need to self-moderate effectively.

News publishers and YouTube


The average numbers of likes and replies within each topic.


The study urges YouTube to develop best-practice content moderation guidelines for news publishers that outline the strategies used by conspiracy theorists that are invisible to automated moderation. In addition, news publishers could turn off comments on high-stakes public interest videos to ensure they do not exacerbate the circulation of conspiracy theories.

“A major implication of our study is that YouTube needs to redesign the space to provide social moderation infrastructure,” said Dr Gray. “Otherwise, the discursive strategies of conspiracy theorists will continue to evade detection systems, pose insurmountable challenges for content creators, and play into the hands of content producers who benefit from and/or encourage such activity.”

Declaration: The authors declare no potential conflicts of interest with respect to the research, authorship, and/or publication of this article. YouTube data provided courtesy of Google’s YouTube Data API. Dr Timothy Graham is the recipient of an Australian Research Council DECRA Fellowship (project number DE220101435). Top Photo: Adobe Stock Images
