Facebook Combats Live Violence
Since Facebook’s creation in 2004, CEO Mark Zuckerberg has repeatedly emphasized the company’s mission: to connect the world. And it seems they have been successful; however, some problems have arisen. While features like Facebook LIVE and new Messenger options have increased users’ enthusiasm for the site, they have also had consequences.
Even in its early days, Facebook struggled to moderate controversial content. Its Statement of Rights and Responsibilities has always strictly prohibited “hate speech,” which Facebook defines as “direct and serious attacks on any protected category of people based on their race, ethnicity, national origin, religion, sex, gender, sexual orientation, disability or disease.”
While the policy seems straightforward, there have been thousands of instances where posts containing hate speech weren’t deleted. The problem didn’t become a major controversy until Facebook LIVE came into the picture. This feature, which allows users to broadcast and watch live videos from their mobile phones, is difficult to moderate. Because the videos are live, Facebook’s moderation team struggles to remove every controversial video before it gains viewers.
A live video recently sparked national concern. The video, broadcast by a man in Cleveland, shows him walking up to an elderly man on the side of the road and shooting him. It then shows the victim bleeding on the ground as the gunman walks back to his car. The broadcast was eventually removed from Facebook, but it has had a lasting effect on the nation.
Many other instances of live violence have taken place since Facebook LIVE’s release. So how can Facebook fix this?
There is no easy solution. With a feature like this, there will always be a video among millions that slips through the cracks. But Facebook has recently begun to take action against live violence. On May 3rd, Mark Zuckerberg posted a statement on his Facebook page discussing community safety on the site, highlighting the need to take down violent content more quickly and explaining how the company plans to do it.
In addition to the 4,500 people already on the community operations team, recent online violence has prompted Zuckerberg to hire 3,000 more reviewers. These workers will hopefully help keep Facebook safe and friendly. The company is also developing easier ways for users to report hateful or violent content, so that law enforcement can be contacted sooner when lives are at risk.
Zuckerberg closed the statement by writing, “No one should be in this situation in the first place, but if they are, then we should build a safe community that gets them the help they need.” Hopefully, these extra safety measures will prove useful to our community, both online and off.