Google Messages has started rolling out its sensitive content warnings for nude images, a feature it first announced late last year. When the AI-powered system detects a message containing nudity, it blurs the photo and shows an alert if a child tries to open, send, or forward it, and it points both parent and child to resources for getting help. All detection happens on the device, so images and data stay private.
Sensitive content warnings are enabled by default for supervised users and signed-in unsupervised teens, the company notes. Parents manage the feature for supervised users through the Family Link app, while unsupervised teens aged 13 to 17 can turn it off themselves in Google Messages settings. For everyone else, the feature is off by default.
When sensitive content warnings are enabled, flagged images are blurred and a "speed bump" prompt appears that lets the user block the sender and links to a resource page explaining why nude imagery can be harmful. The prompt then asks whether the user still wants to view the image, with "No, don't view" and "Yes, view" options. A similar prompt appears when a user tries to send a flagged image, so the feature warns children about sending nude photos rather than blocking them outright.
The feature is built on Google’s SafetyCore system, which classifies content on the device using AI without sending “personally identifiable data or any classified content or results to Google’s servers,” according to the company. The warnings have only just started appearing on Android devices and aren’t yet widely available, 9to5Google reports.
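To make the on-device flow concrete, here is a minimal Kotlin sketch of how a messaging client could combine a local classifier with the blur-and-warn behavior described above. The names (`NudityClassifier`, `Verdict`, `presentIncomingImage`) are illustrative assumptions, not Google's SafetyCore API; the only points taken from the article are that classification runs locally and that flagged images are blurred behind a prompt rather than blocked.

```kotlin
// Hypothetical sketch: names and types are illustrative, not Google's SafetyCore API.

enum class Verdict { SAFE, SENSITIVE }

// Stand-in for an on-device ML classifier; nothing is sent off the device.
fun interface NudityClassifier {
    fun classify(imageBytes: ByteArray): Verdict
}

sealed interface Presentation {
    data object ShowNormally : Presentation

    // Blur the image and show the "speed bump" prompt with
    // "No, don't view" / "Yes, view", a block-sender option, and a help link.
    data object BlurWithWarning : Presentation
}

fun presentIncomingImage(
    imageBytes: ByteArray,
    classifier: NudityClassifier,
    warningsEnabled: Boolean,
): Presentation =
    if (warningsEnabled && classifier.classify(imageBytes) == Verdict.SENSITIVE) {
        Presentation.BlurWithWarning
    } else {
        Presentation.ShowNormally
    }

fun main() {
    // Toy classifier for demonstration; a real implementation would run an ML model locally.
    val classifier = NudityClassifier { bytes ->
        if (bytes.size > 1_000_000) Verdict.SENSITIVE else Verdict.SAFE
    }
    val decision = presentIncomingImage(ByteArray(2_000_000), classifier, warningsEnabled = true)
    println(decision) // BlurWithWarning
}
```

The same check could gate outgoing images, which matches the article's point that sending is warned about, not prevented.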