Country singer Carrie Underwood has done it, as have rapper Drake and NBA star Steph Curry. Celebrity chef Gordon Ramsay, the Jonas Brothers, and comedian Kevin Hart, too.
Through the popular photo-editing application FaceApp, these celebrities and millions of others have used the power of technology to digitally travel through time to see what they look like decades older.
When you take a trip to the Year 3000. pic.twitter.com/O9Dxpwj6ex
— Jonas Brothers (@jonasbrothers) July 16, 2019
Me hosting #MasterChef Season 50……#faceapp pic.twitter.com/uKnfxUpC1D
— Gordon Ramsay (@GordonRamsay) July 16, 2019
The app, which has more than 80 million active users and is the top free application in Apple’s App Store, uses artificial intelligence to let users change their hair or eye color and make themselves look younger or older.
FaceApp and the altered photos it has churned out have taken social media by storm over the last 24 hours. But the app has privacy experts urging users to think first before handing their photos and data over to the company for a little bit of fun.
“This is just one instance in a larger ecosystem where user data is the currency and the different application developers are looking to build their own dossiers about their users,” said Daniel Kahn Gillmor, senior staff technologist for the ACLU Speech, Privacy, and Technology Project. “You want a large user base, and you offer it to users gratis, and if they want to monetize it somehow, that’s the business model and it’s certainly problematic.”
FaceApp is owned by Wireless Lab, based in St. Petersburg, Russia. The app specifies in its terms that users “grant FaceApp consent to use the User Content, regardless of whether it includes an individual’s name, likeness, voice or persona, sufficient to indicate the individual’s identity.” But its privacy policy delves further into the data FaceApp is collecting, beyond content uploaded directly by users.
FaceApp’s terms notify consumers that it uses cookies to track how the app is used, along with third-party analytics tools that collect information on the websites they visit. It also collects metadata: FaceApp’s servers automatically record users’ web requests, IP addresses, exit pages, URLs, domain names, landing pages, pages viewed, and how they interact with links in the app.
The overnight popularity of FaceApp led to questions about its privacy policies and concerns over its Russian address, and FaceApp issued a statement further explaining the basics of the service. Its founder, Yaroslav Goncharov, told TechCrunch it uses Amazon Web Services and Google Cloud to store images uploaded by users, and most photos are deleted from FaceApp’s servers within 48 hours of the upload date. Sending the images to the cloud, however, could make them susceptible to hacks.
Users who want to have their data removed from FaceApp’s servers can make such a request through the mobile app, and while the company said its “support team is currently overloaded,” these requests are a priority.
Additionally, seeking to quell fears that the Russian government may get its hands on the content uploaded to FaceApp, the company said that while its “core R&D team” is in Russia, user data is not transferred there.
But should FaceApp be served with a search warrant or subpoena, it has a “good faith belief that the law requires us” to respond to such a legal request, according to its privacy policy.
“This may include requests from jurisdictions outside of the United States where we have a good faith belief that the response is required by law in that jurisdiction, affects users in that jurisdiction, and is consistent with internationally recognized standards,” the terms state.
The company said it may also “access, preserve and share information” when it believes it is necessary to do so.
Wary of FaceApp’s link to Russia, the Democratic National Committee on Tuesday sent a security alert to 2020 Democratic presidential campaigns urging them to avoid the app.
“This app allows users to perform different transformations on photos of people, such as aging the person in the picture,” DNC Chief Security Officer Bob Lord wrote in the alert. “Unfortunately, this novelty is not without risk: FaceApp was developed by Russians.”
Privacy experts said FaceApp’s privacy policy and terms of service are standard for apps and online services, which collect similar information and often write their policies to be vague and one-sided. But a key question remains: which third parties have access to users’ information, and how are they using it?
“That’s a fundamental problem generally, that it’s difficult for users to know who their information has been shared with and for what purposes,” said Jeremy Gillula, tech projects director at the Electronic Frontier Foundation. “Sometimes, you can get information about the types of data that has been shared and entities, but what we really need is strong laws that say what types of data and with whom.”
Gillula said a major privacy issue for the data swept up by apps and online services is the collection of location data, which “can be a pretty sensitive thing,” especially if a company were to experience a data breach.
“It’s inevitable that data gets out, and so that’s why there’s this big concern that things like location data can be very sensitive,” he said. “Everyone has private information they don’t want broadcast to the world.”
Joe Jerome, policy counsel for the Center for Democracy and Technology, urged consumers to minimize the information that’s shared to protect their data.
“The reality is a lot of these companies are collecting facial data not to do anything malicious or bad, but to better train their systems and algorithms to do face detection,” said Jerome. “That raises a whole host of technical and privacy concerns.”
Privacy experts hope to see the regulatory environment shift to require companies like FaceApp to be more transparent regarding their data practices, but also want to see the companies themselves take steps to limit the information they collect.
“These privacy issues are systematic and interdependent issues, and we don’t have a good handle on them yet,” said Gillmor, “and that’s a good reason to use cautious data stewardship when thinking about how to engage with this ecosystem.”
