Google is rolling out a new safety feature for its Messages app on Android. As reported by 9to5Google, the tech giant is introducing Sensitive Content Warnings, a feature that detects and blurs images containing nudity. It is designed to protect users from unsolicited explicit content while preserving privacy through on-device processing: a new Android system service called "SafetyCore" performs all classification and blurring locally on the user's phone, so no image data leaves the device.
How the Sensitive Content Warnings feature works
Automatic Detection & Blurring: Images flagged as containing nudity are blurred before being displayed.
On-Device Privacy: All image classification happens locally using Android’s SafetyCore system—no data is sent to Google servers.
User Controls: When receiving such images, users can:
- Tap to learn why nude images may be harmful
- Block the sender
- Choose to view or ignore the image
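The receive-side flow above can be sketched in a few lines. This is a minimal, hypothetical illustration; the class and method names below are invented for the example and are not Google's actual SafetyCore APIs, and the classifier is a placeholder stub standing in for the real on-device model.

```java
// Hypothetical sketch of the receive-side flow: classify locally,
// blur flagged images before display. Names are illustrative only.
public class SensitiveContentDemo {
    // Stand-in for the on-device classifier (SafetyCore performs this
    // locally in the real feature; this stub is a placeholder).
    static boolean classifiesAsNudity(byte[] image) {
        return image.length > 0 && image[0] == 1; // placeholder heuristic
    }

    // Decide how an incoming image is presented to the user.
    static String presentIncomingImage(byte[] image) {
        if (classifiesAsNudity(image)) {
            // Blurred first; the user can then learn more, block the
            // sender, or choose to view the image.
            return "blurred";
        }
        return "shown";
    }

    public static void main(String[] args) {
        System.out.println(presentIncomingImage(new byte[]{1})); // flagged image -> blurred
        System.out.println(presentIncomingImage(new byte[]{0})); // ordinary image -> shown
    }
}
```

The key design point the article describes is that the classification call never touches a network: everything above the blur decision runs on the phone.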
If a user tries to send or forward a flagged image, Google Messages displays a warning about the potential risks. The user must also confirm their intention before the image is sent, adding a layer of protection against accidental sharing.
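The send-side guard can be sketched the same way. As before, this is a hypothetical illustration: the names are invented for the example, and the classifier is a placeholder stub, not Google's actual API.

```java
// Hypothetical sketch of the send-side guard: a flagged image is only
// sent after the user explicitly confirms past the warning.
public class SendGuardDemo {
    // Placeholder stub for the on-device classifier.
    static boolean flaggedAsSensitive(byte[] image) {
        return image.length > 0 && image[0] == 1;
    }

    // Returns true if the image may be sent: either it is not flagged,
    // or the user has confirmed after seeing the warning.
    static boolean maySend(byte[] image, boolean userConfirmed) {
        if (!flaggedAsSensitive(image)) {
            return true; // no warning needed
        }
        return userConfirmed; // warning shown; require explicit confirmation
    }

    public static void main(String[] args) {
        System.out.println(maySend(new byte[]{1}, false)); // blocked pending confirmation
        System.out.println(maySend(new byte[]{1}, true));  // user confirmed, send proceeds
    }
}
```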
Parents can also manage the feature for children via the Family Link app, ensuring that young users are shielded from inappropriate content.
| User Type | Default Setting | Can Be Changed? | Managed Through |
| --- | --- | --- | --- |
| Adults (18+) | Off | Yes | Messages Settings |
| Unsupervised Teens (13–17) | On | Yes | Google Account Settings |
| Supervised Accounts (Children) | On | No | Family Link App |
This rollout is part of Google’s broader push to create a safer digital messaging environment. By combining AI-powered detection with user autonomy and strict privacy safeguards, Google Messages is setting a new standard for responsible communication tools.