Apple's Messages Feature That Blurs Images Containing Nudity Rolls Out Internationally

by Victoria Marian Belmis / Apr 25, 2022 10:25 AM EDT
(Photo: Apple Software Chief Craig Federighi)

Apple's "communication safety in Messages" feature is being rolled out to the Messages app for iOS, iPadOS, and macOS users based in the UK, Canada, New Zealand, and Australia. The feature works by automatically blurring explicit images containing nudity sent to children using the company's messaging service.

A child in that situation will be promptly warned and reassured that it is okay to ignore the photo and leave the conversation. According to Apple's official site, the scanning of image attachments happens on the device, any resulting indication never leaves the device, and the feature does not impact the end-to-end encryption of messages.
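Apple has not published the implementation or any public API behind this feature, but the behavior it describes, classifying an image on the device and blurring it before display, follows a familiar pattern. Below is a minimal, purely illustrative Swift sketch: `imageContainsNudity` is a hypothetical stand-in for Apple's private on-device model, and only the Core Image blur calls are real API.

```swift
import CoreImage

// Hypothetical stand-in for Apple's private on-device classifier.
// In the real feature, an on-device machine-learning model makes this
// decision, and neither the image nor the result ever leaves the device.
func imageContainsNudity(_ image: CIImage) -> Bool {
    return false // placeholder only; the real model is not public
}

// Blur a flagged image before display, mirroring the described behavior.
func blurredIfFlagged(_ image: CIImage) -> CIImage {
    guard imageContainsNudity(image) else { return image }
    let blur = CIFilter(name: "CIGaussianBlur")!
    blur.setValue(image, forKey: kCIInputImageKey)   // real Core Image API
    blur.setValue(30.0, forKey: kCIInputRadiusKey)   // blur strength
    return blur.outputImage ?? image
}
```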

"You're not alone, and can always get help from someone you trust or with trained professionals," the pop-up message will display. "You can also block this person."

The safety feature is not enabled by default. Parents can activate the warnings for their children's accounts in their Family Sharing plan. Instructions on how to enable the feature are detailed on Apple's support page.

As with its initial US release, children will also have the option to message an adult they trust about a flagged photo. When Apple announced the feature last August, it suggested that the notification would happen automatically. Critics quickly pointed out the risk of that approach: it could out queer kids to their parents and could otherwise lead to abuse.

Alongside this rollout, the company is also expanding a separate guidance feature for Spotlight, Siri, and Safari. Searches for topics relating to child sexual abuse will direct users toward additional resources. For example, a user who asks Siri how to report child exploitation will be pointed to resources on where and how to file a report.

Apple also announced a third initiative last August: scanning photos for child sexual abuse material (CSAM) before they are uploaded to a user's iCloud account. That feature drew intense backlash from privacy advocates, who argued it risked introducing a backdoor that would undermine the security of Apple users. The company has yet to provide an update on the detection feature as it continues to address those concerns.
