A new day, a new social media challenge with potentially dire consequences. If you're one of the millions of people who recently downloaded FaceApp to take part in the #FaceAppChallenge and show the world what you'll look like when you're old and gray, bad news: You may have inadvertently granted malevolent actors access to your likeness, free to do whatever they want with it.
What is FaceApp?
FaceApp blew up in 2019 thanks to a viral challenge in which users upload a selfie and let the app's AI filters show what they might look like decades from now.
Seems like harmless fun, right? But once you upload your selfie to the app, you hand over your face, and the data attached to it, to shadowy actors who might use it for nefarious purposes.
However, there is an additional, potentially troubling wrinkle: FaceApp happens to be based in Russia.
Who is behind FaceApp?
Wireless Lab is headquartered in St. Petersburg, Russia, and is headed by Yaroslav Goncharov, a former Yandex employee. Understandably, given the confirmed role that Russia and Russian companies played in the 2016 US elections and the ongoing propaganda war, the security and privacy communities are concerned about the access rights you grant when using FaceApp. While there is no direct, explicit connection to the Russian government, what if there were one? And what effects could that have?
Why should I care about trusting a Russian company with my picture?
"There is a real possibility that such applications are simply honeypots that allow you to divulge information about yourself," says Marc Boudria, VP of Technology at AI Company Hypergiant .
"They just sent her up close, well-lit images of her face," he continues. "Now they know your name and important details and can create a commented image data set of you as a human. The next model would have no problem triangulating, verifying, and adding more data from other sources, such as LinkedIn, beyond your education, work experience, and boundaries. "
Current debates about facial recognition software and deepfakes highlight the dangers posed by individual companies holding large amounts of data, especially large collections of human faces that can power facial recognition technology.
A recent white paper by Moscow-based researchers describes a machine-learning model that can create deepfakes from just a handful of images, or even a single one. A recent New York Times article reported that documents released last Sunday revealed that Immigration and Customs Enforcement officials used facial recognition technology to scan photos of motorists and identify undocumented immigrants. And according to a Government Accountability Office report from last month, the FBI has spent more than a decade comparing driver's license and visa photos to the faces of suspected criminals. Now that yet another company has access to your data, including a robust dataset containing your likeness, that information can be weaponized by any actor interested in doing damage through a cyberattack or propaganda campaign. With the 2020 elections approaching, and our daily lives ever more interconnected, this is a very real and pressing concern.
"There are obvious political concerns over the exchange of personal identification information, especially given how Russia has armed weapons in democracies, often in manipulated and false form, and subjected its local business to its will," New America Senior Fellow and LikeWa r author Peter Singer. "But there are also some important privacy issues that go beyond the Russian aspect: like so many social media, most users only think about the funny aspects, not about how they can be monetized and armed with weapons."