Instagram has launched new “built-in protections” to make the platform safer for teenagers.

All users under 18 will now have privacy settings turned on by default, with younger teens aged 13 to 15 only able to adjust them at the request of a parent or guardian linked to their account.

The new “teen accounts” rolled out last week in the UK, but some critics don’t think the changes go far enough.

Children’s charity NSPCC believes Meta still needs to be more proactive in preventing malicious content from “proliferating” on Instagram.

However, the charity’s child safety policy manager, Rani Govender, said the changes are a “step in the right direction”.

Teens using Instagram will now be much less visible; they will have to actively accept new followers, and prospective followers cannot see their content until the request is approved.

There are also stricter controls on recommendations of potentially harmful content, while notifications will be turned off by default overnight.

Parents will also have more control, being able to see who their child messages – but not the content of those messages – and an overview of their topic preferences.

Meta recently updated its age verification tools and will soon leverage AI to spot children using adult accounts and switch them to the new teen settings.

It says the platform now offers a “new experience for teens, guided by parents”.

The changes should also help Meta adhere to the UK’s Online Safety Act, which was passed into law in 2023.

Regulator Ofcom has warned that substantial punishments will be imposed on any digital platform that fails to comply with the new rules.