Facebook appeals court: Mark Zuckerberg wants outsiders to review deleted posts

Deciding whether a Facebook user’s posts and pictures violate company standards is often a matter of context: Is the writer using a racial slur to condemn such language or to attack someone? Is a picture that contains nudity pornographic or merely a reflection of cultural norms that differ dramatically between countries and continents?

One of the lessons that founder Mark Zuckerberg has taken from more than a year of debate over how the Menlo Park, Calif.-based social media giant regulates content on its sprawling platform, used by more than 2 billion people a month, is that Facebook shouldn’t make those judgments by itself.

One of the solutions, he told reporters Thursday, is an independent panel that will handle appeals of the company's content decisions. It would have the authority to determine whether statements, images, and articles "should stay up or come down," Zuckerberg said.

Much of the debate over content is not only about what’s acceptable, “but about who should be making those decisions in the first place,” he explained. “We shouldn’t be making so many important decisions about free expression and safety on our own.”

The panel, still being set up, is the latest in a series of steps Facebook has taken to address questions over its content decisions, which peaked after U.S. intelligence agencies confirmed that the platform and its competitors, including Twitter and YouTube, were used by foreign operatives to influence and inflame voters before President Trump's surprise election victory in 2016.

The company has also written algorithms to flag problematic content before complaints are made and established a team of about 30,000 people to evaluate posts that require more nuanced judgment.

Artificial intelligence tools may simply delete spam, for instance, while more complex cases involving hate speech or bullying might be flagged by algorithms and referred to human evaluators, Zuckerberg said. Once the panel is set up, it would likely handle high-level reviews, much in the manner of an appeals court, with Facebook itself making initial judgments and evaluating questions about its decisions, he added.

"I don't see any system in which we won't be making first-line calls about what is up and what isn't," Zuckerberg said. Making such determinations quickly, as Facebook did when it set up a war room to weed out inauthentic posts targeting voters before this year's mid-term elections, is crucial, he believes.

But there is also “a place in this system for a longer-term deliberative process that determines what precedents are being set,” Zuckerberg said. “I don’t think those things are in conflict. They’re going to work in different time scales.”


In the past 12 months alone, Facebook says, it has substantially improved in proactively enforcing its standards.

From July through September, about 52 percent of the posts removed because they contained hate speech were identified by the company’s algorithms and internal reviewers and taken down before any complaints, according to Facebook’s latest Community Standards Enforcement Report, which was published Thursday. That compares with just 24 percent from October through December 2017.

During the past three months, Facebook also took action on 15.4 million pieces of violent and graphic content: removing posts in some cases, putting a warning screen over them in others, or, if appropriate, notifying law enforcement officers, according to the report.

"This is more than 10 times the amount we took action on" in the last three months of 2017, Guy Rosen, the head of product management, wrote in a post on the company's website. "This increase was due to continued improvements in our technology that allows us to automatically apply the same action on extremely similar or identical content."

Shares of the social media giant have tumbled 18 percent this year to $143.25 as Facebook executives including Zuckerberg fielded questions from Congress not only about hate speech and bullying, but also about whether content policies enforced in liberal-leaning Silicon Valley were discriminating against conservative voices such as Trump supporters Diamond and Silk.

Zuckerberg said the social media platform’s initial blocking this spring of posts from the video-blogging commentators, whose real names are Lynette Hardaway and Rochelle Richardson, was a mistake. The sisters testified during a separate Congressional hearing that they believe they were targeted because of their support for the president, a claim Facebook has denied.

This fall, New York City joined Facebook investors, including three state governments, in pushing Zuckerberg to give up the role of chairman and calling for someone outside the company’s management team to take it. State treasurers from Illinois, Rhode Island, and Pennsylvania are also backing the measure, originally filed by Trillium Asset Management in June, which calls for a vote at Facebook’s next shareholder meeting.

“I don’t think that’s the right way to go,” Zuckerberg said Thursday, though he said he remains open to a variety of steps to make the company more transparent.

Despite the holdings of state retirement systems such as New York's in Facebook, and the possibility of support from additional investors, the proposal cannot pass without Zuckerberg's backing. His super-voting shares, each of which carries 10 votes rather than the single vote of a common share, give Zuckerberg control of 59.9 percent of investor voting power in the company, according to its April proxy filing.

“This is not the first time we’ve had to deal with big issues for the company,” he said Thursday. “I certainly don’t love that we’re in a position where we’re not delivering the quality that we want to be delivering every day, but to some degree, you have to know that you’re on the path and doing the right things and allow some time for the teams to get things working the way they need to be.”

Facebook's net income rose 9 percent to $5.14 billion, or $1.76 a share, in the three months through September, topping analysts' average per-share estimate of $1.46, according to a FactSet survey.
