Google Messages Introduces On-Device Nudity Warnings to Enhance Teen Safety

Google has begun rolling out a new safety feature in its Messages app that warns users about sensitive content involving nudity. First announced late last year, the feature uses on-device AI to detect potentially explicit images. When such an image is detected, it is automatically blurred and a warning is shown, a safeguard aimed particularly at young users. For children, the feature also offers the option to block the sender and surfaces educational resources that encourage safer online behavior.

This safety tool is turned on by default for supervised accounts and for teens signed in to their Google accounts. Parents can manage the settings through the Family Link app, while unsupervised teenagers aged 13 to 17 can disable it themselves in the Messages settings. For all other users, the feature is off unless manually enabled.

When an explicit image is detected, a prompt asks whether the user still wants to view or send it, with options such as "No, don't view" or "Yes, view." The feature doesn't fully prevent users from sending or receiving such content; rather, it acts as a "speed bump" to encourage safer decisions. Google emphasizes that all image analysis happens locally on the device through its SafetyCore system, protecting user privacy by never transmitting images to its servers.

The update is currently rolling out gradually to Android devices and may not yet be available to all users, according to reporting from 9to5Google. The move marks another step in Google's efforts to promote digital safety, especially for younger users in an increasingly connected world.