Meta is launching Teen Accounts on Facebook and Messenger, providing an experience with built-in protections for young users. This feature will initially be available in the U.S., U.K., Australia, and Canada, with plans for future expansion.
Teen Accounts were first introduced on Instagram last September in response to criticism from U.S. lawmakers regarding the protection of teens on social media. In Tuesday’s announcement, Meta also revealed new protections for Teen Accounts on Instagram.
With the rollout to Facebook and Messenger, teens will automatically enter an environment designed to limit inappropriate content and unwanted interactions. Those under 16 will require parental permission to change any settings.
While Meta’s blog post did not specify all restrictions, the company informed TechCrunch that teens will only receive messages from accounts they follow or have previously messaged. Additionally, only friends can view and respond to their stories, and tags, mentions, and comments will be restricted to people they know.
Teens will receive reminders to log off after using the platforms for an hour each day and will be enrolled in “Quiet mode” during the night.
On Instagram, teens under 16 won’t be able to go live unless they have parental permission. They will also need consent to disable the app’s feature that blurs images with suspected nudity in direct messages.
These updates reflect Meta’s ongoing efforts to address mental health concerns related to social media use among teens, as highlighted by the U.S. Surgeon General and various states that have considered restricting teen access without parental consent.
Meta reports that it has moved 54 million teens into Teen Accounts on Instagram, with 97% of users aged 13–15 keeping the built-in protections enabled. A study by Ipsos found that 94% of parents believe Teen Accounts are beneficial, and 85% feel the accounts help facilitate positive experiences for their teens on Instagram.