Earlier this year news broke that Cambridge Analytica, a data-mining firm with links to the Brexit referendum and the Trump presidential campaign, accessed the profiles of almost 100 million Facebook users in an effort to influence the outcome of the US election. In the ensuing investigation, it was reported that the kind of psychometric analysis used by firms like Cambridge Analytica can correctly predict skin colour (with 95 per cent accuracy), sexual orientation (88 per cent accuracy), and political affiliation (85 per cent accuracy) on the basis of as few as 68 Facebook likes. The more data available, the more a subject is ‘known’, allowing fine targeting of political messaging designed to exploit psychological vulnerabilities.
In an environment where everything we do leaves digital traces – every purchase, every Google search, every movement we make when our mobile phone is in our pocket, every ‘like’ – policy makers are now asking whether we have a right to psychological privacy. A right not to be known by our digital data – or at least to choose who knows us and how they can use that knowledge.
This issue raises questions about whether psychometric analysis of big data poses any new or special risks, and if so, what those risks are and how we might guard against them without sacrificing the benefits of being known. This panel of experts will discuss these questions and some options for the future of data privacy in Australia.
This event was held on Tuesday 21 August at the University of Sydney. There is no podcast for this event.
- Mia Garlick, Director of Policy, Australia and New Zealand. Mia manages policy for Facebook in Australia and New Zealand and works with government, child safety and other stakeholders to promote greater awareness about Facebook’s policies and products. Prior to joining Facebook, Mia was the Assistant Secretary for Digital Economy and Convergence Strategy at the Department of Broadband, Communications and the Digital Economy, during which time she served on the Government 2.0 Taskforce, which advised the Government on how best to engage on social media and adopt a more open data policy. Mia joined the Department after working in Silicon Valley as the Product Counsel for YouTube and, prior to this, the General Counsel for the non-profit Creative Commons. She has a Bachelor of Arts and Law from the University of New South Wales and a Master of Laws from Stanford University.
- Sophie Farthing, Senior Adviser (a/g) to the Australian Human Rights Commissioner and leader of the AHRC’s Human Rights and Technology Project.
- Peter Leonard is a Sydney-based data and technology business lawyer. He has worked as a legal and commercial adviser to global and Asia Pacific data, communications and technology businesses for more than 30 years. Peter is contributing editor of Communications Law and Policy in Australia, the leading Australian looseleaf service in communications and media law. He chairs the Australian IoT (Internet of Things) Alliance’s Data Access and Use workstream, the Law Society of New South Wales Privacy and Data Law Committee and the Australian Computer Society’s Artificial Intelligence and Ethics Technical Committee. He also serves on the Law Council of Australia’s Information Integrity Taskforce.
- Moderated by: Sascha Callaghan, a lawyer and lecturer in neurolaw at the University of Sydney Law School. She has published widely in the areas of health care decision-making, mental health and cognitive impairment. She also has a research interest in law and technology and is currently a lead researcher in the Sydney Neuroscience Network on intersections between neuroscience, law and ethics.