London, Thanasis Gavos

UK watchdog Ofcom is promoting a new, stricter code of practice for the protection of minors, which social media companies will have to adhere to.

The new measures included in the code are proposed under the expanded powers granted by the new Online Safety Act.

Popular apps such as TikTok and Instagram, as well as search engines, should “rein in the aggressive algorithms” that promote dangerous content to minors.

These platforms should also put in place strict age checks to prevent children from accessing pornographic material and content that promotes suicide, self-harm and eating disorders.

Another rule will bar platforms from adding minors to group chats without their express consent.

Minors will also have increased control over blocking other accounts and disabling comments on their own posts.

Platforms will also be required to invest more generally in better control of the content they host, with a view to protecting minor users.

If companies do not follow the new rules, Ofcom will be able to impose fines of up to £18m or 10% of global annual revenue, whichever is greater.

The watchdog will also be able to block companies from providing services and initiate criminal proceedings against executives.

The government has claimed the new law makes the UK “the safest place in the world to be online”.

Ofcom’s chief executive, Dame Melanie Dawes, commented that “these measures, which go beyond existing practices in the online industry, will bring about a step change in the safety of children online in the UK”.

She also pledged to children and families that Ofcom will not hesitate to impose the penalties provided for breaches of the new code of practice.

However, according to the timetable set by Ofcom, the new measures will come into force in the second half of 2025.

Parents of children who have taken their own lives due to exposure to dangerous online content have branded the rules “inadequate” and added that any progress is being made “at a snail’s pace”.