New Apple Tech Aims To Protect Children

Posted by Kirhat | Friday, August 06, 2021

Apple Messages
As a result of the pandemic, online activity has become a staple in almost every household. But alongside the benefits it brings, dangers to children lurk in the shadows. To address this risk, Apple will roll out new tools later this year that will warn children and parents if a child sends or receives sexually explicit photos through the Messages app.

The new feature is part of a handful of new technologies Apple is introducing that aim to limit the spread of Child Sexual Abuse Material (CSAM) across Apple's platforms and services.

As part of these developments, Apple will be able to detect known CSAM images on its mobile devices, like iPhone and iPad, and in photos uploaded to iCloud, while still respecting consumer privacy.
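
To convey the general idea of detecting "known" images, here is a minimal Swift sketch. It is illustrative only: the type names are hypothetical, and the real system is reported to rely on a perceptual hashing scheme and privacy-preserving cryptographic matching rather than the simple exact-hash lookup shown here, which does not by itself provide the privacy properties Apple describes.

```swift
import CryptoKit
import Foundation

// Illustrative sketch only: matching an image against a set of known digests.
// The names and the use of plain SHA-256 are assumptions for clarity, not Apple's design.
struct KnownImageMatcher {
    let knownHashes: Set<Data>   // Hypothetical database of known-image digests

    /// Returns true if the image's digest appears in the known-image set.
    func isKnownImage(_ imageData: Data) -> Bool {
        let digest = Data(SHA256.hash(data: imageData))
        return knownHashes.contains(digest)
    }
}
```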

The new Messages feature, meanwhile, is meant to give parents a more active and informed role in helping their children navigate online communication. Through a software update rolling out later this year, Messages will be able to use on-device machine learning to analyze image attachments and determine whether a photo being shared is sexually explicit. This technology does not require Apple to access or read the child's private communications, because all the processing happens on the device; nothing is passed back to Apple's servers in the cloud.
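
As a rough illustration of what purely on-device screening of an attachment could look like, here is a minimal Swift sketch. The classifier protocol, threshold, and result type are hypothetical assumptions for the example; Apple has not published its implementation, and nothing below reflects its actual API.

```swift
import Foundation

// Hypothetical on-device classifier; Apple's actual model and interface are not public.
protocol SensitiveImageClassifier {
    /// Returns a score in [0, 1] indicating how likely the image is sexually explicit.
    func explicitnessScore(for imageData: Data) -> Double
}

/// Outcome of screening a single attachment, decided entirely on the device.
enum AttachmentScreeningResult {
    case allowed
    case blurredWithWarning   // shown with a "This may be sensitive" label
}

struct MessageAttachmentScreener {
    let classifier: SensitiveImageClassifier
    let threshold: Double     // Hypothetical decision threshold

    /// All processing happens locally; no image data or score leaves the device.
    func screen(_ imageData: Data) -> AttachmentScreeningResult {
        let score = classifier.explicitnessScore(for: imageData)
        return score >= threshold ? .blurredWithWarning : .allowed
    }
}
```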

If a sensitive photo is discovered in a message thread, the image will be blocked, and a label will appear below the photo stating, "This may be sensitive," with a link to tap if the child still wants to view the photo.

If the child chooses to view the photo, another screen appears with more information. Here, a message informs the child that sensitive photos and videos "show the private body parts that you cover with bathing suits" and "it's not your fault, but sensitive photos and videos can be used to harm you."

It also suggests that the person in the photo or video may not want it to be seen and that it could have been shared without their knowledge.

These warnings aim to help guide the child to make the right decision by choosing not to view the content.

However, if the child clicks through to view the photo anyway, they'll then be shown an additional screen that informs them that if they choose to view the photo, their parents will be notified. The screen also explains that their parents want them to be safe and suggests that the child talk to someone if they feel pressured. It offers a link to more resources for getting help, as well.

There is still an option at the bottom of the screen to view the photo, but again, it is not the default choice. Instead, the screen is designed so that the option not to view the photo is highlighted.
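
The staged warnings described above can be thought of as a simple state machine. The sketch below is illustrative only, with hypothetical state names, and does not represent how Apple implements the flow.

```swift
/// Hypothetical model of the staged warnings described above (illustrative only).
enum SensitivePhotoFlow {
    case blurred                 // Photo blocked, "This may be sensitive" label shown
    case explanationShown        // Screen explaining what sensitive photos are
    case parentalWarningShown    // Screen noting that parents will be notified
    case photoViewed             // Child chose to view anyway; parents are notified
    case dismissed               // Child chose not to view at some step

    /// Advances the flow when the child taps "view"; declining at any stage dismisses it.
    func next(childTapsView: Bool) -> SensitivePhotoFlow {
        guard childTapsView else { return .dismissed }
        switch self {
        case .blurred:              return .explanationShown
        case .explanationShown:     return .parentalWarningShown
        case .parentalWarningShown: return .photoViewed
        case .photoViewed, .dismissed:
            return self
        }
    }
}
```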
