
A new iOS 15 feature is facing a collective backlash, and this time it is genuinely hard to defend...

Author: Gu Cheng

August 10 news: With about a month to go before the official release of iOS 15, Apple plans, in addition to the new features already shared in the beta, to launch a system process called "Hash" in the official version of iOS 15.


It is because of this feature that Apple has faced public opposition from more than 4,000 organizations and individuals, including security and privacy experts, cryptographers, researchers, professors, legal experts, and Apple customers.

Hashing is a process that transforms an input of any length into a fixed-length output via a hashing algorithm, and Apple plans to use such a process to detect "child abuse material" on the iPhone.
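As a minimal illustration of the general idea (not Apple's actual algorithm, which uses a perceptual image hash rather than an ordinary cryptographic one), a hash function maps inputs of any length to a digest of fixed length:

```python
import hashlib

# Two inputs of very different lengths both produce a 256-bit (64 hex character) digest.
short_digest = hashlib.sha256(b"cat photo").hexdigest()
long_digest = hashlib.sha256(b"x" * 1_000_000).hexdigest()

print(len(short_digest), len(long_digest))  # both digests are 64 characters long
```

Because the digest depends only on the content, identical files always hash to the same value, which is what makes comparison against a database of known hashes possible.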

The process compares detection results against a hash database provided by the National Center for Missing & Exploited Children (NCMEC) in the United States, to help find missing children and combat child abuse.


Precisely because the process may scan pictures on the phone, some people worry about privacy leaks, which is a major reason for the backlash against the feature.

Subsequently, in order to address these concerns, Apple issued a response about the feature.

According to the response, the feature will initially run only in the United States, and the process only scans pictures that have been uploaded to iCloud; users who turn off iCloud syncing will not be scanned.

The hashing process scans images that have been uploaded to iCloud, and only when a certain number of offending files is reached are they reported to Apple's database.
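The threshold mechanism described above can be sketched roughly as follows. This is a hypothetical simplification: the hash values, database, and threshold are all made-up stand-ins, and Apple's real system uses cryptographic techniques (threshold secret sharing) rather than a plain counter.

```python
# Stand-in for the database of known hashes (e.g. the NCMEC-provided database).
known_hashes = {"a1b2", "c3d4", "e5f6"}

def count_matches(image_hashes):
    """Count how many of a user's image hashes appear in the known-hash database."""
    return sum(1 for h in image_hashes if h in known_hashes)

THRESHOLD = 2  # made-up value: nothing is escalated until this many matches accrue

uploads = ["a1b2", "9999", "c3d4"]
flagged = count_matches(uploads) >= THRESHOLD
print(flagged)  # True: two matches meet the threshold
```

The point of the threshold is that a single accidental match reveals nothing; an account is only escalated once matches accumulate past the limit.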


In other words, Apple can only see the offending files that are reported to the database; other photos in iCloud still cannot be viewed by Apple.

At the same time, to avoid false positives, Apple has also assembled a team of 500 people to manually review these flagged files. If a review confirms a violation (child abuse or abduction material), Apple will, if necessary, send a report to NCMEC or notify law enforcement.

In addition to hashing, Apple will also launch a "Communication Safety" feature in the new system, which only applies to images sent or received in the Messages app by child accounts set up through Family Sharing.

Its main purpose is to prevent children from sending or receiving explicit pictures and, if necessary, to notify the parents on their iCloud family account.

Do you agree or disagree with such a feature?