Government fires warning shot at social media with Facebook discrimination case

Redlining was outlawed in America 50 years ago, when it still involved using maps and red markers to identify minority communities where banks didn’t want to offer mortgages.

Using high-tech social media platforms to accomplish the same thing doesn’t make it any more legal.

In fact, it may make violations easier to find and document, as illustrated in charges brought by the Department of Housing and Urban Development against Facebook and the company’s recent settlement of a lawsuit filed by the National Fair Housing Alliance.

Together, the two cases demonstrate how the reams of user data that let social media firms pinpoint even the narrowest of markets for advertisers can be exploited by businesses seeking to discriminate on legally prohibited grounds such as gender and race.

“In the shadow of this settlement and the challenges that we’ve been raising in the course of our litigation, others in the industry have taken note,” Morgan Williams, general counsel for the alliance, told the Washington Examiner. The changes Facebook agreed to make, he said, “can inform other companies about the sort of thinking they need to put into their own ad platforms.”

Twitter doesn’t allow discriminatory ads, and customers using the medium have to agree to comply with both the law and the company’s rules, a representative said Friday. The platform prohibits targeting ads on the basis of racial or ethnic origin, religion, poor financial condition, or commission of crimes. Google, which owns YouTube, didn’t immediately respond to a request for comment.

“Even as we confront new technologies, the fair housing laws enacted over half a century ago remain clear — discrimination in housing-related advertising is against the law,” Paul Compton, HUD general counsel, said after his agency brought its charges before an administrative law judge. “Just because a process to deliver advertising is opaque and complex doesn’t mean that it exempts Facebook and others from our scrutiny and the law of the land.”

According to the complaint, Facebook’s platform allowed customers to keep people such as parents, non-Christians, and immigrants, as well as those interested in topics such as Hispanic culture, from seeing their ads.

Though advertisers typically made the choices themselves based on an array of tools on Facebook’s self-service platforms, the social media giant was responsible for setting up targeting options using hundreds of thousands of traits from “hijab fashion” to “moms of grade school kids” that mimicked protected classes, the department said in its complaint.
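The concern HUD describes — interest categories acting as stand-ins for protected classes — can be illustrated with a minimal, hypothetical sketch. The mapping and function names below are illustrative only (they are not Facebook’s actual taxonomy or code); the example traits “hijab fashion” and “moms of grade school kids” come from the complaint as quoted above:

```python
# Hypothetical sketch: screening an advertiser's targeting selections against
# a blocklist of interest categories that can act as proxies for protected
# classes. Category names and mappings here are illustrative assumptions.

PROTECTED_CLASS_PROXIES = {
    "hijab fashion": "religion",
    "hispanic culture": "national origin",
    "moms of grade school kids": "familial status",
}

def flag_targeting_options(selected_options):
    """Return the selected options that proxy for a protected class,
    mapped to the class each one stands in for."""
    return {
        option: PROTECTED_CLASS_PROXIES[option.lower()]
        for option in selected_options
        if option.lower() in PROTECTED_CLASS_PROXIES
    }

flags = flag_targeting_options(["Hijab Fashion", "gardening", "Hispanic Culture"])
```

A real ad platform would need far more than exact string matching — the complaint cites hundreds of thousands of traits — but the sketch shows why the burden falls on the platform that builds the options, not just the advertisers who pick them.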

Facebook, which had been working with the agency to address its concerns, was surprised by HUD’s decision to file charges, a spokeswoman told the Washington Examiner.

“Last year, we eliminated thousands of targeting options that could potentially be misused, and just last week we reached historic agreements with the National Fair Housing Alliance, American Civil Liberties Union, and others that change the way housing, credit, and employment ads can be run on Facebook,” the spokeswoman said.

The Fair Housing agreement resolved a January 2018 lawsuit filed in U.S. District Court in Manhattan accusing Facebook of abusing a trove of user data that “made it the biggest advertising agency in the world.”

In the claim, Fair Housing and other plaintiffs described submitting dozens of housing ads through Facebook’s platform. In tests conducted in New York, Washington, D.C., Miami, and San Antonio, the platform offered users the option of excluding women, families with children, and people with disabilities, according to the complaint.

“When Facebook users grant Facebook access to their data, they do not authorize Facebook to use that data to discriminate against them on the basis of their protected characteristics,” the plaintiffs argued. “By exploiting users’ data in this manner, Facebook transforms the character of the data in a manner to which the users did not consent and could not reasonably foresee.”

It’s a claim at the heart of lawmakers’ concerns about how social media companies gather information from users, then repackage it into a product that earns billions in revenue from advertisers, some of whom misuse it.

“There’s a lot of examples where ad targeting has led to results that I think we would all disagree with or dislike,” Sen. Chris Coons, D-Del., told Facebook CEO Mark Zuckerberg during an April 2018 hearing at which he and Democratic Sen. Cory Booker of New Jersey both worried aloud about housing and credit discrimination.

“This country has a very bad history of discriminatory practices toward low-income Americans and Americans of color,” in part through redlining, said Booker, who’s now seeking the Democratic nomination to run against President Trump in 2020. “I’ve always seen technology as a promise to democratize the nation, expand access, expand opportunities, but unfortunately, we’ve also seen how platforms, technology platforms like Facebook, can actually be used to double down on discrimination and give people more sophisticated tools with which to discriminate.”

To settle the Fair Housing suit, Facebook agreed to create a separate platform for housing, employment, and credit ads that will have restricted targeting options. The company will also set up a page where users can search for and view all ads for home sales and rentals, regardless of whether the ads appeared on their own news feeds, require advertisers to certify that they’re complying with the company’s anti-discrimination policies, and train Facebook employees in fair housing and lending laws.

The agreement will “set a new standard across the industry concerning social media policies that intersect with civil rights laws,” said Williams, the Fair Housing Alliance’s general counsel.

Facebook, which has more than 2 billion users a month, said its ad platform levels the playing field for small businesses, allowing them to identify and reach markets they couldn’t access previously. But the company acknowledges a responsibility to keep such tools from being used to further illegal discrimination.

“This harmful behavior should not happen through Facebook,” Chief Operating Officer Sheryl Sandberg said in a post on the company’s website. “Getting this right is deeply important to me and all of us at Facebook because inclusivity is a core value for our company.”
