YouTube has moved to block inappropriate videos that show children's cartoon characters in violent and mature situations from reaching young users.

It comes after clips showing beloved characters such as Peppa Pig undergoing painful dental work made their way onto the video-sharing site's family-friendly app, YouTube Kids.

At the moment all content on the app - which is aimed at children aged between three and 13 - is automatically screened by software before becoming available to young users.

But some videos featuring "inappropriate use of family entertainment characters" had not been picked up by the algorithm, with uploaders using animation and keywords targeting children to circumvent it.

The new policy aims to stop this and follows earlier moves to strip such videos of advertising revenue.

"Earlier this year, we updated our policies to make content featuring inappropriate use of family entertainment characters ineligible for monetisation," Juniper Downs, YouTube's director of policy, said in a statement.

"We're in the process of implementing a new policy that age restricts this content in the YouTube main app when flagged.

"Age-restricted content is automatically not allowed in YouTube Kids.

"The YouTube team is made up of parents who are committed to improving our apps and getting this right."

The policy announcement follows a New York Times article earlier this month that highlighted how violent and lewd videos had managed to fool YouTube Kids' automated censors.

YouTube said it has been aware of the problem for the past year, and has "thousands" of staff working around the clock to remove unsuitable content from YouTube Kids.