It’s been said that a lie can travel halfway around the world while the truth is putting its shoes on. Social media, politics and 'fake news' are currently demonstrating the reality of that idea, and its consequences.
As the forces of the French Revolution began to swirl in the late 1700s, a new element had become part of the political discourse: pamphlets. The invention of the printing press made it relatively cheap and easy to print pamphlets that could be quickly circulated around the country.
Some of the greatest political thinkers of the time wrote pamphlets, as did gifted slanderers and outright liars. Marie Antoinette was viciously smeared by the pamphleteers, adding to the hunger for her execution. The French called their pamphlets libelles, a word that survives in the English legal term libel.
The similarities between those revolutionary pamphlets and the current post-truth environment have not been lost on commentators. In fact, it’s easy to point to any number of events and situations, historic and more recent, where obvious falsehoods have gained more traction in national conversations than verifiable facts – Iraq and the non-existent ‘weapons of mass destruction’ being an obvious one, with continuing ramifications.
If there is a difference between what’s happening now and past events, it’s that the internet and social media have turbocharged the effect – as the printing press once did.
For Professor of Linguistics Nick Enfield, the forces working against truthful reporting and honest political discourse could have even more profound consequences.
“We live in a time when we must act quickly and make serious decisions around things like climate change,” he says. “So it’s important that those conversations are based on facts.”
Post-truth might feel like a new term, but it was actually coined in 1992 to describe the Iran-Contra scandal and the circumstances around the Gulf War. Then, from 2015, its usage spiked 2000 percent, driven by Brexit talk in the UK and Donald Trump’s presidential nomination in the US. In 2016, Oxford Dictionaries weighed in by making “post-truth” its international word of the year.
The significance of post-truth isn’t politicians telling lies – that’s hardly new territory. The significance is that large parts of the population are willing to believe the lies despite all evidence to the contrary.
There are lots of theories on how this evolved. The complexity of the world has caused people to embrace gut feeling rather than facts. The 2008 global financial crisis destroyed faith, not just in banks, but in news media and educational institutions. One commentator even pointed to advertising and its focus on brand rather than the substance of products.
To confront the situation where people buy into emotions or prejudice rather than facts, Enfield has assembled a group of cross-disciplinary University academics to create the Post Truth Initiative. The group includes representatives from the fields of physics, philosophy, media and communications, linguistics, and government and international relations.
They meet regularly to arm-wrestle concepts like ‘truth’ and get up to speed with new thinking on the post-truth universe. These ideas are put into circulation through well-attended public forums on subjects such as the problems created by scientific fraud and psychological insights into why people are convinced by stories but not by facts.
The Post Truth Initiative is also building a Bullshit Detector. The name suggests something that might be able to determine whether a statement is bullshit. But that’s not how it works.
“It’s about making it easier for people in the community to look at politicians and know what they really represent,” Enfield says. “The extent to which they’re a bullshit artist or not.”
Enfield had the idea after reading about the advances in voice recognition technology and how it’s now possible for a computer to ‘listen’ to a recording and tell you, for example, how many times China is mentioned. If it’s possible to pick out words, maybe it would be possible to pick out ideas, he thought.
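The word-spotting step Enfield describes is the easy part: once speech has been transcribed, counting mentions of a term is a simple pattern match. A minimal sketch (the function name and sample transcript are illustrative, not from the project):

```python
import re

def count_mentions(transcript: str, term: str) -> int:
    """Count case-insensitive, whole-word mentions of a term in a transcript."""
    pattern = re.compile(r"\b" + re.escape(term) + r"\b", re.IGNORECASE)
    return len(pattern.findall(transcript))

# A made-up fragment standing in for a transcribed recording.
transcript = (
    "The member asked about trade with China. "
    "China remains our largest trading partner, "
    "and trade with china grew again this year."
)
print(count_mentions(transcript, "China"))  # → 3
```

The `\b` word boundaries keep the match from firing inside longer words, and the case-insensitive flag catches transcription inconsistencies. Picking out *ideas* rather than words is the genuinely hard step that follows.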
That’s when Dr Joel Nothman (BSc(Adv, Uni Medal) ’09 BA ’09 PhD ’14), from the Sydney Informatics Hub, came into the picture. Nothman is a software engineer and data scientist with expertise in linguistics. An unusual combination of skills, you might think. But no.
“Language and computer science are both about patterns,” he says. “A lot of people who do computing and maths also have an interest in the structure of language.”
With artificial intelligence, it’s already possible for machines not only to ‘read’ legal documents but to make recommendations. Nothman and Enfield are now working together to teach a similar set of skills to the Bullshit Detector. This means grappling with some tricky ideas: what does language look like when someone talks in favour of something? Or talks against something? And how do you teach a computer to even recognise that something’s on topic?
Nothman adds: “I’m also interested in when people don’t speak. It’s easier for a machine to identify those silences than for people to do it.”
When all these and other subtle modes of language and communication are modelled and assimilated into the Bullshit Detector, it will be able to quickly scan vast amounts of information looking for people, topics, attitudes and contradictions, and display them visually on a timeline.
Consider that searching Hansard – the verbatim record of all the proceedings of the parliament and its committees – for the term “Great Barrier Reef” currently returns more than 10 pages of references. The Bullshit Detector promises to be an invaluable tool for researchers and people interested in what is happening around issues of public concern.
In fact, the hope is that one day the Bullshit Detector could encompass not only Hansard, but all the information generated by the broader news media. It will give a complete picture of how people in the public eye deal with the issues of the day.
All this is still some time off. For now, we’re in a war of ideas where people, and indeed democracy, are trying to see a path through the post-truth cultural shift. Enfield himself takes a longer view.
“Truth will always matter, so it will have the last laugh,” he says, before adding a caveat. “They say ‘the truth wants to be free’ ... well, bullshit wants to be free too, and today’s communications revolution makes all assertions equally free as a bird.”
The Post Truth Initiative is supported by the University’s Strategic Research Excellence Initiative 2020, which helps University researchers test new ideas, push disciplinary boundaries and find ways to scale up their research.
Written by Elliott Richardson
Photography by Louise Cooper