For the first time, Facebook's rules around what can be posted have been revealed.
The Guardian has obtained hundreds of documents outlining the social media giant's policies around moderating users' posts.
Users can choose not to view certain things or have them blocked from their feed.
In recent times, murders, shootings and assaults have been broadcast live on Facebook, leading to calls for the company to exert more control over its site.
Part of the rules revealed state that phrases like "Someone should kill Trump" should be taken down because they reference a head of state.
But telling someone to "f**k off and die" can be left up because it isn't regarded as a credible threat.
Here is a list of things that are okay to post:
- Photos of animal abuse are allowed. Extreme cases of abuse are also allowed but must be marked "disturbing."
- Videos of deaths don't always have to be deleted because they can raise awareness of issues such as mental illness.
- Some photos of physical abuse and bullying of children don't have to be deleted.
- Abortions are allowed to be shown as long as there is no nudity.
- Facebook will allow livestreams of self-harm because it doesn't want to punish people in distress.
- Anyone with more than 100,000 followers on social media becomes a designated public figure, meaning they don't get the same privacy rights as a private individual.
Moderators who spoke anonymously to The Guardian also revealed that they find themselves overwhelmed with work, reviewing more than 6.5 million reports of fake accounts a week.
They also say new challenges are creating more problems, citing as an example revenge porn, where users post nude or explicit photos of someone without their permission.