Is FaceApp a circling bird of prey ready to swoop on our data and identities, or is it merely a pit canary warning us of danger to come? Chris Lake takes a closer look at the latest social media trend.
GLOBAL MEDIA ATTENTION on the FaceApp application has focused on the dubious wording of its privacy agreement, with plenty of spine-chilling reporting about the potential for users to lose the rights to their own faces forever.
We’ve been here before, though. Two years ago, FaceApp’s viral reach was driven by the enormous popularity of its gender- and face-swapping functions, with a seemingly endless tide of social and traditional media sharing of admittedly hilarious images of men as women, women as men and both as other people. The coverage only really turned nasty when it emerged that ethnicity-swapping functions allowed users to engage in “digital blackface”, as reported in TechCrunch way back in 2017.
Back then – though somewhat less hysterically – exactly the same privacy concerns were being raised, before fading again into the frenetic maw that is the tech news cycle. So why have these concerns flared up again? And why are they so much more intense this time around?
Meet the face behind FaceApp: https://t.co/sqPjCyXf9Y pic.twitter.com/K5ynVLSDlz
— Forbes (@Forbes) July 24, 2019
It all seems to have started with the sharing of the User Privacy Agreement, a chilling, matter-of-fact document which appears to grant the company sweeping, perpetual rights over each and every user’s data.
According to cybersecurity expert Edward Farrell, CEO of Australian security firm Mercury ISS, there’s not necessarily anything particularly sinister in this:
“If you write heavily prescriptive guidelines, you then have to abide by them.”
Mr Farrell explained that a combination of factors likely led to the agreement’s sweeping nature, chief amongst them the very real disconnect that often exists between lawyers and technologists. Broad, general statements, he said, minimise policy implementation requirements and potentially cover gaps in the legal drafters’ understanding.
For the most part, nobody actually reads these agreements, but most of them grant rights very similar to those claimed by FaceApp.
Facebook’s terms of service, for example, state:
‘...when you share, post or upload content that is covered by intellectual property rights (e.g. photos or videos) …you grant us a non-exclusive, transferable, sub-licensable and worldwide licence to host, use, distribute, modify, run, copy, publicly perform or display, translate and create derivative works of your content.’
FaceApp, a 2-year-old+ app created by a Russia-based developer, has seen recent spike in use due to celebrities and influencers taking part in the "FaceApp Challenge."
Here's a look at what you should know about how the app might use your photos and data. https://t.co/n8KGD89DkO
— NBC News (@NBCNews) July 17, 2019
FaceApp’s privacy agreement, which has had a near-identical clause removed since the spotlight settled on the company, still contains similarly chilling blanket declarations.
That still leaves the mystery of why this particular application has come in for so much attention. Some commentators, such as those at The Atlantic, have argued that it comes down to fear of the “Russian bogeyman”, which seems to make sense, especially in the aftermath of the Mueller Report and its explosive findings on Russian cyber-interference in the 2016 U.S. Presidential Election.
Coupled with this is the “Artificial Intelligence” scare factor, two words chosen by developers to bring thrills and chills to their customer base. In this case, however, we’re really only talking about fairly simple machine learning processes performing relatively mundane image transformations.
According to Mr Farrell, however, there’s something deeper at play:
“With this, it’s personal. You start talking about people’s faces and people start worrying about having their identities stolen by the camera. This is reflective of a lot of the human expectation and discussion around privacy today — basically, we still have Victorian era views of privacy.”
FaceApp isn't taking photos of your face and sending them back to Russia for some nefarious project. At least that's what current evidence suggests https://t.co/LbQuGE31lM pic.twitter.com/FpOIiV6Tsc
— Forbes (@Forbes) July 17, 2019
The characterisation of our interaction with the digital sphere as antiquated is apt. Our collective fear of apps like FaceApp is rooted primarily in a sense that we are riding the tiger, just barely clinging on as innovation and adaptation in the technology sector accelerate. This fear has become especially acute when it comes to surveillance capitalism and the bone-chilling implications it carries for our personal safety and privacy, and for our society in general.
FaceApp itself, as far as anyone can tell, is just a silly digital frippery. As reported in Forbes and multiple other outlets, Wireless Lab – the company behind the app – has taken pains to distance itself from the Russian State, has a tiny staff and earns an annual income utterly dwarfed by that of tech giants like Facebook. In short, if you’re a FaceApp user, you can probably relax — Big Brother hasn’t quite got you in his clutches just yet.
There is, however, a lesson to learn from this whole controversy. We – all of us as individuals and as a nation – need to change our assumptions around big data and big tech. We urgently need to change our behaviours to adapt to current and future realities. Above all, we need to educate ourselves so that we can make informed decisions not just as consumers, but as citizens.
It might just be a silly scare about a silly app today, but what it has exposed is the shocking lack of understanding and depth of naivety at all levels of Australian society when it comes to dealing with technology.
JUST IN: Schumer calls on FBI and FTC to investigate FaceApp created in Russia https://t.co/51CbVXdlpJ pic.twitter.com/J3SbUiOReY
— The Hill (@thehill) July 18, 2019
Chris Lake is a freelance writer based in Sydney. He focuses on security, international relations and technology.

