Facebook has tightened its rules on who can make money from advertising on its network, responding to criticism that it is too easy for providers of fake news and sensational headlines to cash in.
The world's largest social network implemented the new standards with immediate effect to make it clearer which publishers can earn money on Facebook and with what content.
The new standards coincided with an appearance by chief operating officer Sheryl Sandberg in Germany, a country that has been one of Facebook's toughest critics on hate speech and privacy safeguards.
Facebook, together with Alphabet's Google, accounts for around two-fifths of internet advertising, which is forecast by consultancy Zenith to grow by 13 percent to US$205 billion this year - overtaking television as the biggest channel for companies to pitch their wares to consumers.
Marketing executives have criticised Facebook for failing to ensure that the digital ads distributed to its more than 2 billion active users reach their intended audience.
It has also drawn criticism from major advertisers for inflating its audience figures and not adequately tracking ads, which were sometimes placed alongside content detrimental to the brands being promoted.
On Wednesday, Facebook said it would seek accreditation from the Media Ratings Council, a US non-profit organisation, for audience measurement services.
"We take very seriously our responsibility to earn and maintain the trust of people and businesses," Sandberg told dmexco, a major digital marketing gathering in Cologne.
To make money on Facebook in future, content creators and publishers will have to comply with its so-called community standards, which require that content be authentic, inoffensive and in keeping with its guidelines.
Those publishing content flagged as misinformation or false news may be ruled ineligible to profit from Facebook, as may creators of clickbait and sensationalist material, according to the rules seen by Reuters.
Facebook's guidelines for monetisation give broad definitions of content that would be disallowed - including "family entertainment characters engaged in violent, sexualised, or otherwise inappropriate behaviour".
Also covered are depictions of death, casualties and physical injuries in tragedies such as natural disasters; and content that is incendiary, inflammatory, demeaning or disparaging towards people or groups.
Facebook will also step up its monitoring of hate speech, adding 3,000 content reviewers to nearly double the size of its existing team, Senior Vice President for Global Marketing Solutions Carolyn Everson said in a blog post.