Have you ever clicked ‘report’ on Facebook?
Perhaps you’ve considered it, but didn’t know what would happen if you did, or whether what you wanted to report was actually reportable. Well, BullGuard has looked into it for you, and here it is: a quick guide to what can be reported on Facebook and how those reports are handled.
What should be reported on Facebook?
Facebook asks its users to help police the site for pornography, hate speech, threats, graphic violence, bullying and spam, and has teams on hand 24/7 to deal with such issues. In fact, it has divided its reporting teams into four groups to better handle the reports, including:
- Hate and harassment
- Abusive content
How are Facebook reports handled?
After a report has been received by the appropriate team, the content in question is reviewed and addressed in one or more of the following ways:
- Removing it
- Warning the user that posted said content
- Revoking the user’s ability to share specific types of content
- Disabling particular features for the user
- Disabling the user’s Facebook account
- Reporting the issue to law enforcement
In cases where the reported content doesn’t violate any of Facebook’s rules, the site still offers help for users "to better resolve their issues beyond simply blocking or unfriending another user." Facebook has also published a handy infographic if you want more details on specific reporting situations.
It’s really quite simple. Have you ever reported something on Facebook? Share your story with us!