Opinion

Is FaceApp hoarding our data?

6 August 2019
In recent weeks there have been privacy concerns and questions around how FaceApp processes facial images, and whether the app is amassing a database of user-submitted images for political purposes, under the guise of a light-hearted game.

FaceApp could track individual user behaviour 

Dr Vera Chung, an AI and deep learning expert from the University of Sydney’s School of Computer Science, says the main concern with FaceApp is that there is no telling what happens to a person's photograph once they submit it to the app.

“Big data is a valuable commodity. The issue with FaceApp is that we do not know whether user-submitted face images will be collected and sold to third parties. Once a user submits their own photo, they relinquish all control," said Dr Chung.

“In the worst-case scenario, the user-uploaded images could fall into the hands of a malicious regime or politically motivated organisation. Hypothetically, if the company behind FaceApp had ties to such a regime, then they could also have access to other metadata FaceApp collects, such as user location. FaceApp could very well be a useful tool for an organisation to build their mass surveillance system to monitor and track the behaviour of individual users," she said.

FaceApp a privacy "flag"

"FaceApp should be a flag. Firstly, this selfie time machine shows how self-absorbed we can be. But it also underscores that we can pay for such services with our money, our time or our privacy," said Mike Seymour, an Associate Lecturer from the University of Sydney Business School who specialises in computer graphics and virtual reality.

"In that light this shows we don’t value our own privacy highly. While ads demand payment in forced attention, this app demands a slice of your privacy and hence, your identity."

FaceApp one of many apps collecting personal data

“When it comes to cybersecurity and personal data, it’s much easier to focus on one incident and forget the broader issues at play. The FaceApp trend is alarming, yet it is no more alarming than the implications of us sharing photos over any other social media platform, many of which likely have similar terms and conditions," said Dr Suranga Seneviratne, a cybersecurity expert from the University of Sydney’s School of Computer Science.

“We should learn from incidents like this and anticipate upcoming AI-driven threats to cybersecurity. AI tools are becoming pervasive. With minimal effort and cost, a person can build a high-quality face recognition system in a matter of minutes.”
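To illustrate how low the barrier has become, the sketch below matches two photos using the open-source face_recognition Python library. It is an illustration of Dr Seneviratne's point only, not a description of FaceApp's technology, and the file names are hypothetical placeholders.

```python
# Minimal face-matching sketch using the open-source face_recognition library.
# Assumes each photo contains exactly one clearly visible face.
import face_recognition

# Load a reference photo and a candidate photo (hypothetical file names).
known_image = face_recognition.load_image_file("known_person.jpg")
unknown_image = face_recognition.load_image_file("unknown_person.jpg")

# Compute 128-dimensional face encodings for each image.
known_encoding = face_recognition.face_encodings(known_image)[0]
unknown_encoding = face_recognition.face_encodings(unknown_image)[0]

# Compare encodings; True suggests the faces belong to the same person.
match = face_recognition.compare_faces([known_encoding], unknown_encoding)[0]
print("Same person?", match)
```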

Image: a person taking a selfie on a smartphone. Does the novelty of FaceApp mask malicious intentions, or is its use of data no different to apps like Facebook?

“It’s time we start a discussion about the ethical and responsible use of machine learning as well as building solutions to counter such threats. In the coming years, we can expect to see threat actors using AI for highly effective social engineering, evading malware detection systems, and bypassing authentication methods.”

Dr Fiona Andreallo, a Media and Communications lecturer who specialises in digital cultures, also says FaceApp is not the only platform amassing personal data.

“At first people may be alarmed by the recent reports about FaceApp and data mining. However, if we are serious about these issues then we need to think beyond a particular app, and indeed the technology itself.  

“Technology is simply a medium through which human relationships become apparent. This is not the only app with terms and conditions that require reviewing, because this is just one way personal images and data are shared through social media.”

FaceApp amassing huge online image database

Computer Science PhD candidate Henry Yeung believes the app's sophistication means images are likely stored and processed remotely, raising privacy concerns.

“Apps that use a mobile phone’s own computation power to perform changes do not require data to be stored remotely. If FaceApp worked this way, every bit of data supplied by the user would remain on their phone.”

“However, it seems far more likely that the technology used by FaceApp is deep-learning based. Deep learning is highly demanding in terms of computation power, so it’s very unlikely the images could be processed on a mobile device. Far more likely, FaceApp is sending the images to a powerful, external cloud server for processing."

“This remote data storage is what poses serious privacy issues. An image could be housed on a server long after a person has finished using the app. They would have no means of checking whether the image has been deleted, meaning the app's creators could do what they like with it.”
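The sketch below shows what the kind of client-side upload Yeung describes might look like in practice: the photo leaves the phone, and nothing on the device can confirm when, or whether, the server copy is deleted. The endpoint URL, field names and filter name are hypothetical; this is not FaceApp's actual API.

```python
# Simplified sketch of a client-side image upload to a remote processing server.
# All names below are illustrative assumptions, not FaceApp's real endpoints.
import requests

UPLOAD_URL = "https://api.example-photo-filter.com/v1/process"  # hypothetical

# Send the original selfie to the server along with the requested filter.
with open("selfie.jpg", "rb") as photo:
    response = requests.post(
        UPLOAD_URL,
        files={"image": ("selfie.jpg", photo, "image/jpeg")},
        data={"filter": "old-age"},
        timeout=30,
    )
response.raise_for_status()

# The edited image comes back to the phone, but the uploaded original now
# lives on the server; the client has no way to verify its retention or deletion.
with open("selfie_aged.jpg", "wb") as result:
    result.write(response.content)
```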

Luisa Low

Media and PR Adviser (Engineering & IT)

Loren Smith

Assistant Media Adviser (Humanities)
