A better solution to Facebook’s controversial ad fields

Advertisers can no longer target “Jew-haters” and other bigots on Facebook, and that’s not necessarily the best solution.

On Thursday, it was revealed that Facebook allowed advertisers to target white supremacist and anti-Semitic audiences. ProPublica discovered the practice and explored the extent to which these groups could be targeted. The report described the experiment:

“Until this week, when we asked Facebook about it, the world’s largest social network enabled advertisers to direct their pitches to the news feeds of almost 2,300 people who expressed interest in the topics of ‘Jew hater,’ ‘How to burn jews,’ or, ‘History of “why jews ruin the world.”’”

It took only 15 minutes for ProPublica’s ads targeting these controversial fields to be approved.

After the ProPublica report was released, Slate also tested Facebook’s ad fields, describing how targeting fields such as “Kill Muslims Radicals” and “Ku-Klux-Klan” were approved in “just one minute.”

On Sept. 14, Facebook released a statement acknowledging the findings. It noted that only “an extremely small number of people were targeted” through these fields. “Hate speech and discriminatory advertising have no place on our platform,” the post emphasized.

The statement declared that “to help ensure that targeting is not used for discriminatory purposes, we are removing these self-reported targeting fields until we have the right processes in place to help prevent this issue.”

As crazy as it is to acknowledge, white supremacists, neo-Nazis, and racists of every stripe are rampant nowadays. They feed on online trolling and, as we saw this summer in Charlottesville, are not shy about reviving the racist language of the 1940s. News and media platforms, along with any outlet that traffics in information and advertising, cannot ignore that these people exist.

According to the ProPublica report, “Facebook’s algorithm automatically transforms people’s declared interests into advertising categories.” That is unlike “traditional media companies,” which select and market to specific audiences. These extremist groups do not need a dedicated market to be reached by those who want to spread racist, supremacist messages; they will find and spread that material in other ways.

Instead of shutting down these fields, Facebook should keep them available. It should also follow through on its proposed plan to hire 3,000 more moderators rather than relying on algorithms alone.

Facebook would not be alone in this sort of approach. This past July, YouTube announced new approaches to combat extremist content on its site. It now employs the Redirect Method so that, “when people search for certain keywords on YouTube, we will display a playlist of videos debunking violent extremist recruiting narratives.”

Facebook could treat this as an opportunity to ensure that pro-Jewish, anti-Nazi, positive material and information reaches these targeted audiences, who might otherwise avoid dissenting viewpoints.

Gabriella Munoz is a commentary desk intern with the Washington Examiner and a student at Georgetown University.