Facebook announced Tuesday it will add thousands of employees in an effort to eliminate what it deems "hate speech" on people's pages.
The California-based company said in a blog post Tuesday that it will hire an additional 3,000 people for its community operations team, with the aim of removing the nearly 66,000 posts deemed offensive each week.
"Our current definition of hate speech is anything that directly attacks people based on what are known as their 'protected characteristics' — race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, gender identity, or serious disability or disease," Richard Allan, Facebook's vice president of public policy for Europe, the Middle East and Africa, said in the blog post. "There is no universally accepted answer for when something crosses the line. Although a number of countries have laws against hate speech, their definitions of it vary significantly."
The company rejected the idea that it is trying to censor what its users post, and said failing to remove content that other users find offensive would not be "living up to the values in our Community Standards."
"When we remove something you posted and believe is a reasonable political view, it can feel like censorship. We know how strongly people feel when we make such mistakes, and we're constantly working to improve our processes and explain things more fully," Allan added.
Facebook has come under fire over the past year following the posting of several violent videos, including live broadcasts of multiple murders and suicides.
"These reviewers will also help us get better at removing things we don't allow on Facebook like hate speech and child exploitation. And we'll keep working with local community groups and law enforcement who are in the best position to help someone if they need it — either because they're about to harm themselves, or because they're in danger from someone else," Facebook CEO Mark Zuckerberg said in May.