
Is FaceApp safe? | The truth about the #FaceAppChallenge



A new day, a new social media challenge with potentially dire consequences. If you're one of the millions of people who recently downloaded FaceApp to take part in the #FaceAppChallenge and show the world what you'll look like when you're old and gray, there's bad news: you may have inadvertently granted malevolent actors access to your likeness to do whatever they want with it.

What is FaceApp?

FaceApp blew up in 2017, when it was downloaded 80 million times, and is now experiencing a new wave of virality thanks to the challenge. The app uses neural networks to simulate what you'll look like in old age – think added wrinkles and discolored teeth – and the challenge is the company's marketing campaign, which encourages you to share the resulting image with others.

Seems like a fun game, right? But once you upload your selfie to the app, you hand your face and your data over to shadowy figures who could use them for nefarious purposes.

Wireless Lab, the company behind FaceApp, has very far-reaching terms of service, which is raising a growing number of privacy concerns. Section 5 of the Terms of Use grants FaceApp "a perpetual, irrevocable, non-exclusive, royalty-free, worldwide, fully paid, transferable, sublicensable license to use, reproduce, modify, adapt, publish, translate, create derivative works from, distribute, publicly perform and display your User Content and any name, username or likeness provided in connection with your User Content, in all media formats and channels now known or later developed, without compensation to you." Granted, this type of content ownership is pretty standard for app services. However, FaceApp's terms of use are particularly vague.

FaceApp's privacy policy allows it to collect information sent from your device, including the websites you visit, add-ons, and other data that helps the app improve its service. That means FaceApp has broad access to your device, your photos, and more, though the company told TechCrunch that it does not intend to misuse your information or data.

There is, however, an additional and potentially more problematic access issue: FaceApp happens to be based in Russia.

Who is behind FaceApp?

Wireless Lab is headquartered in St. Petersburg, Russia, and is headed by Yaroslav Goncharov, a former Yandex employee. Given the confirmed role that Russia and Russian companies played in the 2016 US elections and the ongoing propaganda war, the security and privacy communities are understandably concerned about the access you grant when using FaceApp. While there is no direct, explicit connection to the Russian government, what if there were one? And what effects could that have?

Why should I care about entrusting a Russian company with my picture?

"There is a real possibility that such applications are simply honeypots that allow you to divulge information about yourself," says Marc Boudria, VP of Technology at AI Company Hypergiant .

"They just sent her up close, well-lit images of her face," he continues. "Now they know your name and important details and can create a commented image data set of you as a human. The next model would have no problem triangulating, verifying, and adding more data from other sources, such as LinkedIn, beyond your education, work experience, and boundaries. "

Current debates about facial recognition software and deepfakes highlight the dangers posed when individual companies hold large amounts of data – especially stores of human faces that can power facial recognition technology.

A recent white paper by Moscow-based researchers describes a machine learning model that needs only a few images – or even a single one – to create deepfakes of a person. A recent New York Times article reported that, according to documents released last Sunday, Immigration and Customs Enforcement officials have used facial recognition technology to scan photos of motorists and identify undocumented immigrants, and the FBI has spent more than a decade comparing driver's license and visa photos to the faces of suspected criminals, according to a Government Accountability Office report from last month. Now that yet another company has access to your data and holds a robust record of your likeness, that information can be weaponized by any actor interested in doing damage through a cyberattack or propaganda campaign. With the 2020 elections approaching – and the ever-increasing interconnectedness of our daily lives – this is a very real and present concern.

"There are obvious political concerns over the exchange of personal identification information, especially given how Russia has armed weapons in democracies, often in manipulated and false form, and subjected its local business to its will," New America Senior Fellow and LikeWa r author Peter Singer. "But there are also some important privacy issues that go beyond the Russian aspect: like so many social media, most users only think about the funny aspects, not about how they can be monetized and armed with weapons."

What can I do to protect myself?

While it's tempting to take an apathetic approach to personal security – we know it's easy to click past a privacy policy – the sooner you ask questions and pay attention, the sooner you can protect your privacy.

"Consumers should use the latest versions of iOS and Android to control these risks," says Dan Guido, CEO of Trail of Bits . "IOS 13, coming out this fall, warns users, when apps collect their location data or enable Bluetooth in the background. "

Apps don't have to be this unsafe, says Guido. "The iOS Photos app uses processing that runs on the device to recognize faces and places," he says. "Building software this way is harder. Apple and Google should publish software APIs that make on-device processing easier, and then ask app developers for an explanation when those APIs aren't used."
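To make Guido's point concrete: on-device face detection of the kind he describes is already possible with Apple's Vision framework, where the photo is analyzed locally and never has to leave the phone. The sketch below is a minimal, hypothetical example – the function name and image handling are illustrative placeholders, not FaceApp's or Apple Photos' actual implementation.

import UIKit
import Vision

// Minimal sketch: on-device face detection with Apple's Vision framework.
// Nothing here touches the network; the image stays on the phone.
func detectFaces(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    // The completion handler receives face observations once the request finishes.
    let request = VNDetectFaceRectanglesRequest { request, error in
        guard error == nil,
              let faces = request.results as? [VNFaceObservation] else { return }
        for face in faces {
            // boundingBox is normalized (0...1) relative to the image dimensions.
            print("Face found at \(face.boundingBox)")
        }
    }

    // Perform the request locally against the supplied image.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}

Because the request runs locally, the selfie never needs to be uploaded to anyone's servers – which is exactly the trade-off Guido argues platform APIs should make easier for developers to choose.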

When platform providers take a strong position on protecting user security, it takes some of the burden off individuals to protect themselves. Focusing on security at every level – from Google and Apple down to individual users – makes for a more resilient and secure community for everyone, rather than leaving each person to fend for themselves against all the bad actors.

