Meta’s New Teen Safety Rules

Meta Expands Global Rollout of Teen Accounts on Facebook and Messenger

Meta is accelerating its push to move younger users on Facebook and Messenger into specialized teen accounts. These accounts come with built-in parental controls and enhanced protections designed specifically for adolescents. The company announced that hundreds of millions of teens are already using these types of accounts across its platforms, including Instagram.

This global expansion follows the initial introduction of teen accounts on Instagram a year ago. Earlier in 2025, Meta began rolling out the feature to teens in the United States, Canada, the United Kingdom, and Australia on Facebook and Messenger. Now, the program is being extended to teens worldwide. The use of these teen accounts is mandatory for all users under the age of 18.

A key aspect of the teen accounts is the level of parental oversight. For younger teens, specifically those aged 13 to 15, parental permission is required to alter any safety and privacy settings. This gives parents a direct role in their child's online safety. The parental supervision tools allow adults to monitor their teen's screen time across Meta's apps and see a list of who their child is messaging. To ensure teens are using the correct account type, Meta employs artificial intelligence to detect potential age misrepresentation.

The teen accounts themselves have more restrictive default settings, intended to automatically provide a safer experience by limiting unwanted contact from adults the teen does not know.

In a related move for school safety, Instagram is also broadening a separate initiative in the United States. The platform is expanding its school partnership program, which allows designated officials from middle schools and high schools to report bullying and other harmful content directly to Instagram for expedited review.
The program was previously in a pilot phase with a small number of schools; Meta states it received positive feedback and is now opening the program to any US-based school that wishes to join.

These developments are part of a multi-year effort by Meta to strengthen safety features for younger users and address criticisms regarding child protection on its platforms. The company is currently facing significant legal challenges, including lawsuits from multiple US states and a major lawsuit in Brazil, all alleging that its platforms have failed to adequately protect minors and have contributed to mental health issues. The rollout of teen accounts represents Meta's latest attempt to demonstrate a proactive approach to youth safety as these legal proceedings continue.

Update, September 25, 2025, 9:03 AM ET: This article has been updated to clarify the age range for teen accounts.
