Meta continues to face criticism despite introducing Teen Accounts, with built-in safety features, on Facebook and Messenger. While the company touts parental controls and restrictions on messaging, story viewing, and comments, 41 states and the District of Columbia have sued the company over alleged harm to young users. The new accounts apply automatically to users under 16 and require parental approval for settings changes. Meta’s expansion comes amid growing regulatory pressure and advocacy for stronger youth protection legislation.

Despite Meta’s recent introduction of expanded safeguards for teenage users, the tech giant continues to face significant pressure from regulators and parents concerned about youth safety online. The company has launched Teen Accounts on Facebook and Messenger, following their initial implementation on Instagram.
You’ll find these Teen Accounts come with built-in restrictions designed to limit exposure to inappropriate content and unwanted interactions. Retention of the protections looks promising so far: 97% of teens aged 13 to 15 have kept them enabled on Instagram.
Meta’s rollout has begun in the US, UK, Australia, and Canada, with plans to expand to more regions. Teen Accounts automatically apply to users under 16, requiring parental approval for any settings changes.
The timing isn’t coincidental. Forty-one states and the District of Columbia filed lawsuits against Meta in 2023, alleging harm to young users through certain platform features. Lawmakers continue pushing for legislation like the Kids Online Safety Act to enhance protections for minors online.
These Teen Accounts restrict who can contact teens, limiting messages to accounts they follow or have previously contacted. Story viewing is restricted to friends or followed accounts, and mentions, tags, and comments on posts are limited to the same groups. The platform also applies screen-time limits as part of its broader safety approach for young users.
Parents play a vital role in this new system. Teens under 16 need parental consent to change settings, including turning off nudity filters in Instagram DMs or going live on the platform. Parents can now monitor their children’s online activities through Meta’s Family Center dashboard.
The adoption numbers behind these measures appear strong. Fifty-four million teens have moved to Teen Accounts on Instagram, and 94% of surveyed parents say the accounts benefit families.
Meta has also implemented screen time management features, with reminders for teens to log off after an hour of daily use and automatic “Quiet mode” activation at night.
Whether these measures will satisfy regulators remains uncertain. Major platforms like Meta, TikTok, and YouTube continue facing scrutiny and lawsuits regarding their impact on youth, despite these new protective measures.
Frequently Asked Questions
What Parental Controls Are Available for Teens on Facebook and Messenger?
As a parent, you can access several controls for teens on Facebook and Messenger.
You can approve who messages your teen, manage their friend list, and set messaging restrictions. The platform offers time management features that encourage breaks after an hour and activate “Quiet mode” at night.
You’ll also be able to view your teen’s contacts, privacy settings, and blocked accounts through parental supervision tools that are available globally.
How Does Meta Verify the Age of Young Users?
Meta verifies young users’ ages through multiple methods.
You can choose Yoti’s facial age estimation, which analyzes a video selfie to estimate your age.
You might also upload an ID document, which is encrypted and securely stored.
Meta offers social vouching, where adult mutual followers confirm your age.
They also use an Adult Classifier that examines account interactions to determine whether you’re a teen or an adult.
Can Teens Opt Out of Algorithmic Content Recommendations?
You currently have limited options to opt out of algorithmic content recommendations on social media.
While complete opt-out features aren’t widely available, Teen Accounts restrict interactions to friends and followed accounts, reducing exposure to harmful content.
You can manually influence algorithms by following or unfollowing accounts and adjusting privacy settings to limit data collection.
Using these settings won’t eliminate algorithmic recommendations entirely, but it can help shape what content appears in your feed.
What Data Does Meta Collect From Teenage Users?
Meta collects various personal data from you as a teenage user. This includes your engagement metrics, like how much time you spend on platforms.
They gather information for targeted advertising purposes, even without explicit parental consent in some cases. Your browsing behaviors, content preferences, and interactions with posts are tracked.
Meta also monitors your messaging activity and location data, though teen accounts have some restrictions on data collection compared to adult accounts.
How Does Meta Respond to Concerns About Teen Mental Health?
Meta responds to teen mental health concerns through several key actions.
They’ve introduced Teen Accounts with enhanced safety features and restrictive content filters that block harmful material related to self-harm and eating disorders.
You’ll notice they require parental consent for users under 16 to change settings.
Meta also conducts research on platform impacts, though critics say they prioritize engagement over safeguards.
Their response comes amid multiple lawsuits from states and school districts alleging harm to teen mental health.