In a statement on Wednesday, Facebook said it will add 3,000 people to the team charged with monitoring Facebook Live and other areas of the social media platform for offensive, inappropriate, and illegal content.
Murder and rape on Facebook Live
As noted by the Guardian, several violent videos streamed via Facebook Live have recently emerged, including footage of a man shooting Robert Godwin, 74, a random victim in Cleveland, and of a father killing his baby daughter in Thailand. Two copies of the latter video were viewed 370,000 times. There have also been incidents of gang rape streamed on the platform in recent months.
Criticism of the social media network has been rife, with many saying Facebook is not doing enough to weed out content that violates its own rules and that shocks and horrifies users who come across it unintentionally. Facebook Live is just that – live. Any number of users on the platform can “tune in” to watch horrendous crimes as they happen, whether intentionally or otherwise.
Facebook will add 3,000 people to the team monitoring Facebook Live for inappropriate, offensive or illegal content https://t.co/N1m4VfMOsz
— The New York Times (@nytimes) May 3, 2017
The problem is not limited to the live video service: a recent scandal involving military personnel sharing nude photos of female Marines without their consent made headlines, and illegal gun sales are reportedly ongoing on the platform.
Quicker response times with more monitors
The backlash from recent Facebook Live videos has pushed the company’s CEO, Mark Zuckerberg, to take action in an attempt to prevent future incidents of this nature. In a Facebook post, Zuckerberg called the recent incidents “heartbreaking” and said the additional 3,000 monitors will supplement the 4,500 people already working on the company’s community operations team.
Zuckerberg’s first step toward building a safer community is to cut response times: he said the company is working to make videos of this nature easier to report so that members of the team can act sooner. That means not only taking down an offensive post, but also being able to assist if someone needs urgent help.
Facebook announced plans for a heap of hires around the world to review and react to reports of harm and harassment. https://t.co/swSnIV1Ppw
— NPR (@NPR) May 3, 2017
Zuckerberg also said the social media network will build better tools to let users and company employees report problematic posts more quickly and to help reviewers determine faster which posts violate Facebook’s standards.
The monitoring will also cover content involving child exploitation and hate speech.
Suicide prevented by monitors on Live
It will also be made easier for monitors to contact law enforcement when somebody needs assistance. Zuckerberg mentioned a report the company received last week about someone on Live who was considering suicide. In that case, Facebook immediately contacted law enforcement officials, who were able to stop the man from hurting himself. In other cases, Zuckerberg added, they haven’t been so fortunate.
As reported by the New York Times, the announcement came just hours before Facebook was expected to report its quarterly financial results. Business has reportedly boomed, partly due to the popularity of video on the social media platform.