
Following the introduction of its teen account system on Instagram last autumn, complete with parental supervision features and usage restrictions, Meta has now begun extending the same safeguards to Facebook and Messenger, with the aim of creating a safer digital environment for younger users.
As with Instagram's rollout, Meta will require users aged 13 to 15 to transition to teen accounts on Facebook and Messenger. Children below this age threshold remain ineligible to use the services. The company also says it will use age-detection tools to reduce the risk of users misrepresenting their age to evade the restrictions.
Originally launched on Instagram, the teen account system was designed to protect underage users from online harassment, scams, and exposure to inappropriate content. It comes with built-in parental controls that restrict interaction with strangers, limit changes to privacy settings, and allow guardians to oversee screen time and friend lists.
By expanding the system to Facebook and Messenger, Meta aims to bring the protections it credits with improving teen safety on Instagram to its other platforms, further reducing the likelihood that teens encounter harmful content or contact.
The update also strengthens Instagram's existing teen account system. Notably, teens under 16 will now need parental consent before using the livestreaming feature. In addition, a new automatic filter will detect and block direct messages containing suspected explicit imagery, further improving the platform's safety for younger users.