Meta Expands Teen Safety Features to Adult-Managed Kid Accounts on Instagram

Meta is rolling out additional safety measures for Instagram accounts featuring children, even when those accounts are managed by adults. While children under 13 are not permitted to create their own Instagram profiles, parents and guardians often run accounts on their behalf, sharing photos and videos. Meta acknowledges that most of these accounts are used harmlessly, but they have also become targets for predators who leave inappropriate comments or solicit explicit content via direct messages.

To combat this, Meta will soon apply its strictest messaging controls to adult-managed child accounts, preventing unwanted DMs. The company will also enable Hidden Words by default, allowing account owners to filter out offensive comments. Additionally, Meta will stop recommending these accounts to users blocked by teens, reducing the likelihood of predators discovering them. Suspicious users will find it harder to locate these accounts through search, and Meta will hide comments from potentially risky adults.

Meta has already taken action against rule-breaking accounts, removing 135,000 Instagram profiles earlier this year for leaving sexual comments or requesting explicit images from child-focused accounts. Another 500,000 related Facebook and Instagram accounts were also deleted. The company has pledged to continue aggressively enforcing its policies.

Last year, Meta introduced teen accounts on Instagram, automatically applying stricter privacy settings for users under 18. The feature expanded to Facebook and Messenger in April. The company is also testing AI-powered age detection to identify users who may have lied about their age, ensuring they are placed in the appropriate account category.

In recent months, Meta has introduced more safeguards for younger users. A June update added Location Notice, which alerts teens when the person they are chatting with appears to be in another country, a tactic often used by sextortion scammers. Authorities have reported a sharp rise in such cases, in which minors are pressured into sending explicit images. Meta also launched a nudity protection feature that blurs detected nude images in DMs, making it harder for scammers to manipulate victims.

Today, Meta is making safety resources more accessible for teens. A new Safety Tips icon in DMs allows users to quickly access options like blocking, restricting, or reporting suspicious accounts. Additionally, a combined block-and-report feature streamlines the process, letting users take both actions with a single tap.

These updates reflect Meta’s ongoing efforts to enhance online safety for younger users while addressing the risks posed by malicious actors.
