
Meta’s AI Now Scans Photos for Bone Structure to Catch Underage Users on Instagram and Facebook

Meta is taking a bold new step in age verification. The company now uses AI-driven visual analysis to scan photos and videos on Instagram and Facebook for physical clues such as height and bone structure. This goes far beyond simply checking what users type in their profiles.

The goal is straightforward: find and remove accounts belonging to children under 13 who may have signed up using a fake birthday. By analyzing visual cues, Meta aims to close a loophole that has long frustrated parents and regulators alike.

How Does the Visual Analysis Actually Work?

First, a key clarification: this is not facial recognition. The AI does not identify who someone is. Instead, it scans for general physical indicators — such as body proportions and skeletal features — to estimate a broad age range.

This visual analysis system works alongside Meta's existing text-based detection. That older method looks for contextual clues such as birthday mentions, references to school grades, and information in bios, posts, captions, and comments. Meta also plans to expand the text analysis to Instagram Reels, Instagram Live, and Facebook Groups.
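Meta has not published how its text-based detection works, but the idea of scanning posts for contextual age signals can be illustrated with a toy sketch. The patterns, function name, and scoring approach below are assumptions for illustration only, not Meta's actual rules:

```python
import re

# Illustrative patterns for age signals in user text (assumed, not Meta's).
UNDERAGE_PATTERNS = [
    # Birthday mentions like "happy 12th birthday"
    re.compile(r"\bhappy\s+\d{1,2}(?:st|nd|rd|th)\s+birthday\b", re.IGNORECASE),
    # School-grade references associated with under-13 users
    re.compile(r"\b(?:6th|7th|8th)\s+grade\b", re.IGNORECASE),
]

def underage_signals(text: str) -> list[str]:
    """Return every substring of `text` that matches a known age signal."""
    hits = []
    for pattern in UNDERAGE_PATTERNS:
        for match in pattern.finditer(text):
            hits.append(match.group(0))
    return hits
```

In a real system, matches like these would feed into a broader risk score rather than trigger action on their own.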

What Happens When an Account Is Flagged?

If an account is flagged as potentially underage, it gets deactivated immediately. The user then needs to verify their age to get it back. If they cannot, the account is permanently deleted. This visual analysis is currently live in select countries, with a broader rollout planned soon.
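The enforcement flow described above (flag, deactivate, then verify or delete) can be sketched in a few lines. The `Account` type and `handle_flag` function here are hypothetical illustrations of that policy, not Meta's implementation:

```python
from dataclasses import dataclass

@dataclass
class Account:
    status: str = "active"       # "active", "deactivated", or "deleted"
    age_verified: bool = False

def handle_flag(account: Account, verified: bool) -> Account:
    """Apply the described policy: flagged accounts are deactivated
    immediately; successful age verification restores them, and
    failure results in permanent deletion."""
    account.status = "deactivated"
    if verified:
        account.status = "active"
        account.age_verified = True
    else:
        account.status = "deleted"
    return account
```

The key policy point the sketch captures is that deactivation happens first, before the user gets a chance to verify.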

However, privacy advocates have raised concerns. Critics worry that scanning photos for bone structure could lead to false positives or misuse. Meta insists the technology is designed to protect children, not to profile them.

What Else Is Meta Doing for Teen Safety?

Beyond age verification, Meta is expanding its Teen Accounts system. This feature automatically places users the platform suspects are between 13 and 15 into a stricter account experience. That means private accounts by default, direct messages limited to people they already know, and harmful comments hidden automatically.

This expansion now covers Instagram in Brazil and 27 EU countries. It follows earlier content restrictions modeled on film ratings. Notably, Facebook in the US is getting this feature for the first time, with the UK and EU following in June. Meta has also given parents visibility into their kids’ AI chats as part of the same broader push.

In addition, Meta is rolling out new educational resources for teens. The company hopes these tools will help young users make smarter choices online.

Legal and Regulatory Pressure Mounts

These moves come as Meta faces mounting legal and regulatory pressure over child safety. The company recently paid a $375 million penalty in New Mexico over privacy violations. Meanwhile, the European Commission is investigating whether Meta’s platforms are doing enough to keep children off them.

This is not just about compliance. It is about rebuilding trust. Parents and lawmakers alike are demanding stronger protections. Meta’s new AI-driven approach is a direct response to that demand.

Yet questions remain. Can AI accurately estimate age from bone structure? Will false positives harm legitimate users? And how will Meta handle privacy concerns in regions with strict data protection laws? These are issues the company must address as it rolls out the technology globally.

Ultimately, Meta’s use of AI to scan for bone structure marks a significant shift in digital age verification. It is a powerful tool, but one that must be wielded carefully. The balance between safety and privacy has never been more delicate.
