Mark Zuckerberg readily concedes that Facebook has made mistakes, and he counts the social media platform’s initial blocking of posts from conservative commentators Diamond & Silk and an Easter ad from a Catholic university among them.
The incidents were among several that GOP lawmakers cited in a House Energy and Commerce Committee hearing on Wednesday to question whether Facebook displays a liberal bias in its content decisions, following claims made by Republican Sen. Ted Cruz of Texas the day before.
While the consecutive hearings were convened to examine Facebook’s privacy practices in the wake of revelations that a consultant for President Trump’s 2016 presidential campaign had improperly accessed data for some 87 million users, lawmakers took advantage of the opportunity to address a variety of other concerns.
Several focused on whether Facebook’s efforts to ensure the legitimacy of news content after accusations that Russian actors used false articles to influence the 2016 election had silenced conservative voices more aggressively than others.
“We are deeply offended by the censoring of content inappropriately by Facebook,” said Rep. Leonard Lance, a New Jersey Republican. “I would be offended if the censoring were occurring on the left as well as the right.”
Zuckerberg, who serves as CEO of the company he founded while a student at Harvard, repeatedly attempted to assure members of Congress that Facebook provides a platform for all ideas and has safeguards in place to ensure that its content oversight isn’t biased.
[Related: Diamond and Silk slam Mark Zuckerberg for dodging questions on Facebook bias]
The platform has already reached out to Diamond and Silk, the North Carolina supporters of President Trump whose real names are Lynnette Hardaway and Rochelle Richardson, to correct its mistaken flagging of their content as “unsafe to the community,” said Zuckerberg.
The original rejection of an Easter ad from Franciscan University of Steubenville, Ohio, that showed a picture of the San Damiano crucifix was likewise erroneous, the company has said, and it apologized to the school. Facebook had identified the image, which depicted bleeding nail wounds in Jesus’ hands, as “shocking, sensational or violent,” a conclusion that Rep. Cathy McMorris Rodgers, a Washington Republican, questioned.
“Given that your company has since said the content didn’t violate its terms of service, how can users know their content is being viewed and judged according to objective standards?” she asked. “This is an important issue in building trust.”
“Unfortunately, with the amount of content we have in our system and the current systems that we have in place to review, we make a relatively small percent of mistakes in content review, but that’s too many, and this is an area where we need to improve,” Zuckerberg said. “I wouldn’t extrapolate from a few examples to assuming that the overall system is biased.”
Artificial intelligence capabilities will make content-review issues progressively easier to address, Zuckerberg said, which is why Facebook is investing in them. But they also raise thorny issues involving free speech, guaranteed by the U.S. Constitution’s First Amendment.
Having algorithms that proactively analyze, and potentially block some forms of content, “creates massive questions for society about what obligations we want to require companies to fulfill,” Zuckerberg told the Senate Judiciary and Commerce committees in a joint hearing on Tuesday.
Preventing posts by terrorist groups inciting violence is easy. Deciding when passionate arguments on a particular social issue go too far, and who should make the choice, is much trickier, potentially casting tech firms in the role of Big Brother, the dictatorial entity that policed commentary in George Orwell’s dystopian novel 1984.
“That’s a question we need to struggle with as a country,” Zuckerberg said, “because other countries are, and they’re putting laws in place.”
A relatively easy way to address Facebook’s content dilemma would be writing a policy that mirrors the First Amendment, argued Rep. Jeff Duncan, a South Carolina Republican, who offered Zuckerberg a pamphlet-sized copy of the Constitution.
“We can all agree that certain content like terrorist propaganda should have no place on our network,” the CEO replied. “My understanding of the First Amendment is that that kind of speech is allowed in the world, and I just don’t think it’s the kind of thing we want to spread on the Internet.”
That stance, however, is more restrictive than the Constitution requires, so a policy like the one Duncan proposed wouldn’t work effectively.
“We make a number of mistakes in content review today that I don’t think only focus on one political persuasion,” Zuckerberg said. “It’s unfortunate that when those happen, people think that we’re focused on them.”