Will users' conversations with Siri be sent to Apple by default? Apple's response: the bug has been fixed

When you talk to Siri, do you worry that the conversation might be heard by Apple employees?

Apple previously introduced a setting that lets users decline to share their Siri recordings with Apple, for privacy reasons. Recently, however, Apple discovered a bug: even when a user opted out, their conversations with Siri were still being shared with Apple. Apple says the bug has now been fixed.

On February 8, local time, the technology outlet ZDNet reported that, according to Apple, a bug in early versions of iOS 15 caused recordings of users' conversations with Siri to be shared with Apple even when users had opted out. To keep the problem from getting worse, Apple disabled the Siri recording-sharing setting for many users in iOS 15.2.

Apple said the bug has been fixed, and that in the second beta of iOS 15.4, users will be asked whether they want to share their Siri conversations and can opt in or out.

"In iOS 15.2, we turned off the 'Improve Siri and Dictation' setting for many Siri users, while we fixed a bug introduced in iOS 15. This vulnerability inadvertently enables the setting for a small subset of devices. Since we found the error, we have stopped reviewing and are removing all audio received from the affected device. Apple said in a statement.

Tech media outlet The Verge called this a "serious mistake," noting that Apple's statement does not say when users were affected or how many iPhones were affected by the bug. Without that transparency, users cannot tell whether their conversations with Siri were heard by Apple employees. The Verge argued that Apple should notify all affected users and urge them to update their phones as soon as possible.

In fact, as early as 2019, the Guardian reported that Apple was hiring outside contractors to manually review voice commands received by Siri in order to improve its performance, and that the recordings could contain personal information such as a user's location and contacts.

The Apple contractor who reported the matter said they chose to speak out over concerns that private information could be leaked, especially given how often Siri is activated accidentally. Among the recordings they heard were doctors discussing patients' medical histories, suspected drug deals partly masked by engine noise, and even sexual encounters recorded by an Apple Watch or HomePod speaker.

Apple subsequently admitted that it hired humans to review some Siri voice commands, but stressed that these recordings amounted to less than 1% of daily Siri activations, and it soon suspended the program.

Apple said that starting in the fall of 2019, it would by default "no longer retain recordings of users' interactions with Siri," though it would still use "computer-generated transcripts to help Siri improve." "Users can choose to participate in programs that help Apple improve Siri, or opt out at any time," the company said, adding, "We realize we haven't been fully living up to our high ideals, and for that we apologize."

Compiled by Nandu reporter Sun Chao