On Wednesday, lawmakers and tech executives sat down on Capitol Hill to talk data privacy. As more and more personal data ends up online, and it becomes less and less clear who is using that data and for what, lawmakers and users are right to be concerned. Those concerns, however, must not prevent companies and researchers from having access to the large data sets necessary for innovation. Key to keeping data flowing is keeping users’ trust through transparency, and companies must make that a priority.
Fueling concerns about data privacy are revelations, like those in a recent research paper, that users really don’t know how their personal data is being used, or have any control over it.
That research found that Facebook has provided third parties with information shared for security features, such as two-factor authentication, and by friends who allowed their contact information to be shared with the social media company. That means that even if you are the most diligent user and work hard to minimize what information is on the web, your personal information could very well still be out there.
To figure out how data might be shared, researchers from Northeastern University and Princeton University set up a series of test accounts that shared certain contact information, created ads targeted at that information, and analyzed the resulting ad-delivery statistics to see whether the data was being used.
Their analysis found that providing a phone number for security features made that number targetable by advertisers. In an evasive response, Facebook said, “We use the information people provide to offer a more personalized experience, including showing more relevant ads.”
Additionally, the research showed that contact information shared by one friend about another with Facebook could also be used by advertisers. The person that information describes could not view or delete it, and would not know it had been shared in the first place. The researchers tested this by uploading landline phone numbers and then attempting to target ads based on those numbers; the numbers could be targeted without the people they belonged to ever knowing they had been shared.
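To make the mechanism concrete, here is a minimal sketch of the advertiser side of such a test. It is not the researchers’ actual code, and the phone numbers and normalization rules are illustrative assumptions; the key point, documented in Facebook’s custom-audience tooling, is that advertisers upload SHA-256 hashes of normalized contact details, which the platform then matches against whatever contact information it already holds, including numbers supplied by friends or for two-factor authentication.

    import hashlib

    def normalize_phone(raw, default_country_code="1"):
        # Strip formatting and prepend a country code, per typical PII-matching rules (assumed).
        digits = "".join(ch for ch in raw if ch.isdigit())
        if len(digits) == 10:  # assume a bare U.S. number is missing its country code
            digits = default_country_code + digits
        return digits

    def hash_identifier(value):
        # Custom-audience matching operates on SHA-256 hashes of normalized identifiers.
        return hashlib.sha256(value.encode("utf-8")).hexdigest()

    # Hypothetical landline numbers of the kind the researchers uploaded.
    test_numbers = ["(617) 555-0142", "609-555-0199"]

    # The hashes are what an advertiser actually sends to the ad platform; if an ad
    # later reaches the account tied to a number, the platform matched that number.
    for raw in test_numbers:
        print(raw, "->", hash_identifier(normalize_phone(raw)))

If an ad configured this way is delivered, the advertiser learns that the platform holds that number, whether or not the account owner ever provided it themselves.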
This does not instill trust or confidence in users. Instead, it adds to growing concerns about privacy and makes users reluctant to share information.
And although this research was about Facebook’s data collection process, other companies also fail to be transparent when it comes to personal information.
That is a loss for both companies and for American innovation. New technology, like artificial intelligence, relies on large data sets for development. To obtain that data, users must share it and trust those they share it with.
If companies want to innovate and have access to the necessary data, then they must be transparent and credible with their users, and prove that, when large data sets are shared, personally identifiable information remains safe.