
iOS 15's anti-pornography features taken offline!

Fruit Powder Home: ten years of professional research into Apple iPhone technology! The Apple experts by your side~

As early as August this year, Apple announced three new child-safety features for the iOS 15 system, intended to protect children from pornographic and violent images: sensitive-image screening in the Messages app, detection of child sexual abuse material in iCloud, and expanded safety guidance in Siri and Search.


In short, Apple would review photos saved to the iPhone, images sent through iMessage, and pictures uploaded to iCloud in order to identify child sexual abuse material (CSAM) and curb its spread.

For example, within an iCloud Family, if a child receives a pornographic image in Messages, the picture is hidden and Messages warns the user that it is a sensitive image "not appropriate for children". If the child insists on viewing the picture anyway, the parents in the iCloud Family are notified, and the warning pop-up also contains a link to help information, as sketched below.
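To make that flow concrete, here is a minimal Swift sketch of the decision logic as described in this article. Every type, function name, and URL below is a hypothetical illustration, not an Apple API; it simply mirrors the hide-warn-notify sequence from the paragraph above.

```swift
import Foundation

// A minimal sketch of the Messages flow described above.
// All names here are hypothetical illustrations, not Apple APIs.

struct IncomingImage {
    let isFlaggedSensitive: Bool   // result of on-device image screening (assumed)
}

struct ChildAccount {
    let isChild: Bool
    let parentsReceiveAlerts: Bool
}

enum Presentation {
    case showNormally
    case hideWithWarning(helpLink: URL)
}

/// How Messages should present an incoming image for a given account.
func presentation(for image: IncomingImage, account: ChildAccount) -> Presentation {
    guard account.isChild, image.isFlaggedSensitive else {
        return .showNormally
    }
    // Sensitive image sent to a child: hide it behind an
    // "inappropriate for children" warning that links to help information.
    return .hideWithWarning(helpLink: URL(string: "https://example.com/help")!)
}

/// Called only if the child taps through the warning and insists on viewing.
func childInsistedOnViewing(account: ChildAccount) {
    if account.isChild && account.parentsReceiveAlerts {
        // Notify the parents in the iCloud Family (stubbed out here).
        print("Parents in the iCloud Family have been notified.")
    }
}
```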


Although Apple's intentions were good, these child-safety features drew widespread controversy and criticism. How could Apple, the company that has always made the biggest point of protecting user privacy, actively scan the photos that users save and send?

In response to the controversy over the new CSAM (Child Sexual Abuse Material) detection feature, Apple also made it clear that the feature leaves no backdoor: Apple would not scan the entire photo library on an iPhone looking for abusive images, but would instead use cryptographic hashing to compare pictures against a database of known material provided by the National Center for Missing & Exploited Children (NCMEC).
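The key point of that defence is that the device compares fingerprints of photos against a list of known hashes rather than inspecting photo content, and nothing is flagged until a threshold number of matches is reached. As a rough illustration only, here is a minimal Swift sketch of that idea: it assumes a precomputed hash set, uses a plain SHA-256 digest in place of Apple's perceptual NeuralHash and private set intersection, and all function names, the loader, and the threshold value are hypothetical.

```swift
import Foundation
import CryptoKit

// A minimal sketch, assuming a precomputed set of known-image hashes.
// Plain SHA-256 stands in for Apple's perceptual hashing here, purely to
// illustrate comparing fingerprints instead of reading photo contents.

func loadKnownHashDatabase() -> Set<String> {
    // Placeholder: in the design described above, the hashes of known CSAM
    // images are supplied by NCMEC, not generated on the device.
    return []
}

func fingerprint(of imageData: Data) -> String {
    // Hash the image bytes locally; the photo itself is never sent for this check.
    let digest = SHA256.hash(data: imageData)
    return digest.map { String(format: "%02x", $0) }.joined()
}

/// Flag only when the number of matched images crosses a threshold,
/// mirroring the threshold step Apple described before any review occurs.
func exceedsMatchThreshold(images: [Data], knownHashes: Set<String>, threshold: Int) -> Bool {
    let matches = images.filter { knownHashes.contains(fingerprint(of: $0)) }.count
    return matches >= threshold
}

// Example usage with an empty database and an illustrative threshold value.
let known = loadKnownHashDatabase()
let flagged = exceedsMatchThreshold(images: [], knownHashes: known, threshold: 30)
print("Exceeds threshold:", flagged)
```

The point of the threshold step, as Apple described it, is that no single match triggers anything on its own; only accounts that cross the threshold would ever be surfaced for further review.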


Then, in September, Apple announced that it would delay the launch of the CSAM child-protection features, saying it had decided to take more time over the coming months to gather feedback from iPhone users and make improvements before releasing these important child-safety features.

Fast forward to December: a few days ago, some users noticed that Apple had removed the description of the CSAM child-protection feature from its official website. It would seem that, after consulting users, most still did not approve of the feature, so Apple has pulled it from the iOS 15 system.


This also means that Apple will not be adding the feature any time soon, at least not in iOS 15, and users need not worry about Apple scanning their photo library and compromising their privacy.

Where there are Apple fans, there is Fruit Powder Home. To learn Apple usage tips and the latest Apple news, please follow: Fruit Powder Home!
