Top executives from Twitter, YouTube, and Facebook told lawmakers Wednesday that they are making strides in their efforts to take down terror-related content more quickly, and that thousands of employees are now assigned to the job.

“Since June, YouTube has removed over 160,000 violent extremist videos and has terminated approximately 30,000 channels for violation of our policies against terrorist content,” said Juniper Downs, YouTube’s public policy director, in prepared remarks for the Senate Commerce Committee hearing on Wednesday. “We achieved these results through tougher policies, enhanced enforcement by machines and people, and collaboration with outside experts.”

Downs described how YouTube uses “a mix of technology and humans to remove violent content immediately.” In addition to algorithms, which flag 98 percent of the extremist content the site removes, YouTube’s parent company Google also has about 10,000 people monitoring content on the site daily.

“We have developed rigorous policies and programs to defend the use of our platforms from the spread of hate and incitement to violence,” she said.

Carlos Monje, Twitter’s head of public policy and philanthropy, said that though there is no “magic algorithm” for identifying terrorist content, the company has used its algorithms to suspend 1.1 million terrorist accounts since 2015, including nearly half a million in 2017.

Of those accounts, Monje said, 75 percent were suspended before they sent a single tweet.

“As is the case with a determined adversary, as we make it harder for terrorists to use Twitter, their behavior evolves,” Monje explained. “To stay in front of this, we continue to invest in technology to prevent new accounts being opened to replace those we suspend, while also developing further the tools that prevent the distribution of propaganda in the aftermath of attacks.”

Twitter employees have also participated in more than 100 training events on countering extremist content since 2015, he said.

Monika Bickert, Facebook’s head of product policy and counterterrorism, said that more than 99 percent of the Islamic State and al Qaeda propaganda removed from the site is content the company identifies itself through image and text matching.

Facebook also intends to double the number of people working on safety and security from 10,000 to 20,000 by the end of 2018, she said.

"We believe that a key part of combating extremism is preventing recruitment by disrupting the underlying ideologies that drive people to commit acts of violence. That's why we support a variety of counterspeech efforts," Bickert said.