Facebook to add 3,000 to team reviewing posts with hate speech, crimes, and other harmful content
A week after news broke of multiple videos of suicides posted on Facebook remaining on the site for hours, the company has announced a plan to add 3,000 more people to its operations team to screen for harmful videos and other posts and respond to them more quickly in the future. Mark Zuckerberg, Facebook's CEO, said this would be in addition to the 4,500 people already working in this capacity. What is not clear is whether these will be full-time employees or contractors. Facebook is about to post its quarterly earnings (interesting timing to release this news just ahead of those), and Zuckerberg said that the company receives "millions of reports" every week.
“If we’re going to build a safe community, we need to respond quickly,” Zuckerberg wrote in a post earlier today. “We’re working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.”

The move to add more human curation into the mix is nevertheless a step in the right direction.