Under the new rules, video-sharing platforms (VSPs) including TikTok, Snapchat, Vimeo and Twitch are legally required to protect under-18s from potentially harmful video content. All users must be protected from videos likely to incite violence or hatred, and from certain types of criminal content.

Past research from the regulator had found that a third of users said they had been exposed to hateful content, while a quarter said they had seen unwanted violent or disturbing content on the platforms. One in five said they had seen videos or content that encouraged racism.

Ofcom said it had already begun discussing with the VSPs what their responsibilities are and how they should comply with them. While the regulator will not monitor content itself, as it does with TV broadcasts, the rules set out measures that providers must take to protect their users.

The Internet Watch Foundation reported a 77 per cent increase in the amount of “self-generated” abuse content in...