Facebook updated its 'community standards' late on Sunday night. The social network has defined the boundaries of banned content as it works to tighten its restrictions on dangerous and controversial material. The update provides specific examples of content prohibited under its general rules against direct threats, hate speech and criminal activity. Facebook CEO Mark Zuckerberg has said that these are not new restrictions per se, but rather a set of more specific guidelines.

Drawing the line between protecting free expression and denying terrorist organisations a platform is difficult, and Zuckerberg took to his own page, perhaps in anticipation of cries of censorship or backlash, to say: "Having a voice is not some absolute state. It's not the case that you either have a voice or you don't."

Facebook has long forbidden groups it deems terrorist organisations from sharing content on its site, but under the new guidelines it will also remove content that suggests support for these groups, their leaders or their actions. Separately, Facebook said on Sunday that it recorded a slight increase in government requests for account data in the second half of 2014.

Two years ago Facebook said it would use a broader set of criteria to determine when violent videos and content are permitted on the site, after a video of a masked man beheading a woman in Mexico prompted an outcry. The company has also been criticised for allowing pages that glorify violence against women. The new guidelines aim to prevent images of graphic violence and gore from being shared.

On a more personal front, Facebook also states that it will not tolerate images or content "shared in revenge or without permission," a move intended to help clamp down on online harassment.

"People rightfully want to know what content we will take down, what controversial content we'll leave up, and why," said Zuckerberg.