iOS 18.2 Beta allows kids to report unsolicited nude photos and videos

Apple is making it easier to protect kids from sexually explicit content in Messages by expanding its Communication Safety features in the first beta of iOS 18.2. A new option not only prevents them from seeing nude photos and videos by dimming them, but also lets them report the messages to Apple.

The Tutor reports that the iOS Communication Safety feature doesn't search databases for photo matches and doesn't report to Apple or law enforcement. Instead, on-device machine learning analyzes images sent and received in the Messages app to identify those that might include nudity, blurring them and requiring the user to take extra steps to view them. No information about these photos ever leaves the child's device.
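Apple doesn't document the private mechanism Messages uses internally, but a similar on-device nudity check is available to third-party apps through the public SensitiveContentAnalysis framework (iOS 17 and later). The sketch below illustrates that approach under two assumptions: the app holds the Sensitive Content Analysis entitlement, and the incoming image has been saved to a local file URL.

import Foundation
import SensitiveContentAnalysis

/// Returns true when the on-device model flags the image as likely containing nudity.
/// Assumes the app carries the Sensitive Content Analysis entitlement and that the
/// user (or a developer test toggle) has enabled sensitive-content detection.
func shouldBlur(imageAt imageURL: URL) async throws -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // If the policy is .disabled, detection isn't enabled on this device, so never blur.
    guard analyzer.analysisPolicy != .disabled else { return false }

    // Analysis runs entirely on the device; nothing is uploaded.
    let analysis = try await analyzer.analyzeImage(at: imageURL)
    return analysis.isSensitive
}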

Now, Apple is adding an additional feature to Communication Safety in iOS 18.2 to allow kids to report any unsolicited nudity they receive. While the feature is initially limited to users in Australia, Apple says it will “roll out globally in the future”.

iPhone automatically detects images and videos containing nudity that kids might receive or attempt to send in iMessage, AirDrop, FaceTime, and Photos. Detection happens on the device to protect privacy.

If a nude image is detected, the child is shown two intervention screens before they can continue, along with resources and a way to contact a parent or guardian.

When the warning appears, users will also have the option to report the images and videos to Apple.

The device prepares a report containing the sensitive image or video, as well as any messages sent immediately before and after the image or video was received or sent. The report will include contact information from both accounts, and users can fill out a form describing what happened.
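Apple hasn't published the report format, but as a rough sketch the contents described above might map to a structure like the hypothetical one below; the type and field names are invented for illustration and do not reflect Apple's actual format.

import Foundation

// Hypothetical shape of a Communication Safety report; fields mirror the
// contents described above but the names are invented for this sketch.
struct CommunicationSafetyReport: Codable {
    let flaggedMedia: Data                 // the sensitive image or video
    let surroundingMessages: [String]      // messages sent immediately before and after
    let senderAccount: String              // contact information for both accounts
    let recipientAccount: String
    let userDescription: String?           // optional form describing what happened
}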

The report will be reviewed by Apple, which can take action on an account — such as disabling the user’s ability to send messages via iMessage — and also report the issue to law enforcement.

The feature is optional and does not compromise the security of iMessage's end-to-end encryption; Apple gains access to the contents of a conversation only if one of the participants chooses to report it.

The selection of Australia as the first region to receive the new feature coincides with new online safety codes coming into force in the country. By the end of this year, tech companies will be required to police child abuse and terrorist content on cloud and messaging services operating in Australia.