Tencent Technology
Authors: Wang Shuang, Chairman and Founder of Zhongwei Technology; Li Haidan, Tencent Technology
Edited by Zheng Kejun
Estimated reading time: 13 minutes

Key points

● Apple has put a series of security deployments in place for user data, including Private Cloud Compute (PCC), which uses encryption and privacy-preserving mechanisms to keep data secure in transmission and processing. Apple's "local-first" strategy keeps sensitive data on the device to reduce the risk of leakage, strictly controls app access permissions, and obfuscates users' IP addresses and other network identity information, a level of protection other manufacturers in the industry have not yet matched.

● Even though Apple has carefully built a security system for the iPhone, once third-party apps are connected, the privacy defense line may not hold as firmly. Third-party access inevitably carries risk, and it is a test every user faces. Responsibility for privacy and security here does not rest with Apple; it belongs to the relationship between users and third-party applications.

● Data security and personal-information protection are covered and emphasized by laws and regulations such as the EU's GDPR, the United States' CCPA, and China's Data Security Law, Cybersecurity Law, and Personal Information Protection Law. Rules in the EU in particular have grown stricter, which to some extent has pushed technology companies such as Apple to limit and reduce potential privacy risks from a compliance standpoint. Companies that violate the GDPR, for example, face hefty fines of up to 4% of annual global turnover.

● Looking at the trend line, the emergence of quantum computing poses new challenges to existing security measures. In the future we may need to introduce "post-quantum cryptography" to cope, but for now the technology still faces efficiency and performance bottlenecks, with many technical problems left to solve.
On September 10, Apple's launch event, themed "It's Glowtime", brought a range of new products. Front and center was Apple Intelligence, the core selling point of the iPhone 16, billed as the first AI iPhone in history. Cook declared excitedly: "The new iPhone marks the beginning of an exciting new era!"

The explosion of large language models has pushed AI into phones at speed, making them ever "smarter". Major Chinese phone makers, including Huawei, OPPO, and vivo, have all launched AI phones this year. Take the Huawei Pura 70 series: its AI features let users shoot more professional photos, automatically optimizing images and helping find the best angle and light, turning users into instant "photography masters". An AI assistant can also manage the user's schedule and reminders like a secretary and offer personalized suggestions.

But the large-model industry has exposed plenty of problems too, above all in data security. OpenAI, for example, caches users' queries and results, and data from non-paying users may be used for analysis and model optimization, which is one reason Musk lashed out after Apple and OpenAI announced their partnership in June this year. And after Samsung employees used OpenAI's services, core internal design information leaked through the large model.

Protecting users' privacy and security has therefore become paramount, and Apple knows it. At the event, Cook stressed repeatedly that data security must be assured: "We not only have more powerful phones; we also know how to protect users' privacy." But is the new iPhone really safe?
01
Apple's Privacy Approach: Private Cloud Compute and a Local-First Strategy
To defend user privacy and security, Apple's trump card is PCC (Private Cloud Compute), a confidential-computing technology. Many media outlets have misunderstood the name and rendered it as "private cloud computing", but the two are entirely different concepts.
Private Cloud Compute is about protecting the privacy of user data: advanced encryption and privacy-preserving mechanisms are applied while data is processed and analyzed, ensuring that it is neither leaked nor misused. Private cloud computing, by contrast, is chiefly about management and control within a dedicated server or data center.
PCC exists to serve cloud-side workloads. In transit and in storage, data crosses multiple networks and servers, which raises the likelihood of interception, attack, or misuse, so the cloud genuinely carries a higher risk of data leakage. Apple's thinking is therefore that PCC must ensure data is not exposed during transmission or processing, that data is used only for the operation the user requested, and that it is deleted immediately after processing rather than retained in the cloud.

In addition, Apple has adopted a "local-first" strategy. Its main advantage is that users are more willing to hand sensitive data, such as images, voice, and email, to on-device AI, without worrying that it will be cached or leaked by third parties. Think of it as keeping the user's data "locked" in the phone's "safe" instead of letting it "wander out". Even for complex tasks that must be handled in the cloud, Apple has a set of encryption and isolation technologies to keep user information from leaking, chiefly:

● Hardware Security Modules (HSMs): When iCloud data has to interact with the cloud, it is encrypted with multiple layers, and Apple uses HSMs to store and manage the keys, so user data stays safe even if the cloud is attacked.

● Data encryption: Data on Apple devices is encrypted both at rest and in transit. On iPhone, for example, files and databases are automatically encrypted at the hardware level using 256-bit AES (Advanced Encryption Standard); even if user data is stolen, it cannot be read without the decryption key.

● Secure Enclave: The Secure Enclave in iPhones and other Apple devices is a dedicated processor region for sensitive information such as fingerprints, Face ID, and passcodes. This data never passes directly through the main system, so malware and ordinary applications cannot reach it.

● Differential privacy: In some scenarios Apple randomizes data before it is collected, making it hard to associate any piece of data with a specific user, even for Apple itself.

● End-to-end encryption: Apple's iMessage and FaceTime encrypt data along the entire path between sender and receiver, so no intermediate node, Apple included, can read these communications.

● Privacy isolation: Applications run in a controlled sandbox environment that prevents them from reaching system data beyond their permissions, stopping malicious apps from accessing user data.

Beyond that, iOS as a closed system has natural advantages in privacy and security control and can be protected effectively by multiple means. App permissions are also tightly controlled, restricting access to the microphone, camera, contacts, photo album, and other features. When a user installs a new app, for instance, Apple strictly gates its access to these features, and the app can use them only with the user's explicit consent (a minimal sketch of this consent gate follows below). User privacy is better guaranteed as a result, and data stays out of third-party hands.
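As an illustration of that consent gate, here is a minimal Swift sketch of how an app must request camera access on iOS. The flow uses the real AVFoundation API; the surrounding app code is assumed.

```swift
import AVFoundation

// Minimal sketch: requesting camera access on iOS. The app must also
// declare NSCameraUsageDescription in its Info.plist, or the system
// terminates it on first access. iOS itself draws the consent dialog;
// the app only learns the user's decision.
func requestCameraAccess(completion: @escaping (Bool) -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        completion(true)                    // user already granted access
    case .notDetermined:
        AVCaptureDevice.requestAccess(for: .video) { granted in
            completion(granted)             // result of the system prompt
        }
    default:
        completion(false)                   // denied, or restricted by policy
    }
}
```

Until the completion handler reports true, capture code gets nothing; the gate is enforced by the system, not by app goodwill.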
At the same time, Apple takes additional steps to protect user privacy, such as obfuscating users' IP addresses and MAC addresses so that apps cannot track users through these identifiers. Put simply, Apple "hides" part of a user's online identity, making it hard for an app to learn exactly who the user is or what their data looks like.
02
On-Device Is Not a Security Fortress, and Third-Party Access Risks Remain
From a compliance perspective, if data stays on the device at all times (never uploaded to an external server or the cloud), the risk of violating privacy regulations such as the EU's GDPR (General Data Protection Regulation) and California's CCPA (California Consumer Privacy Act) is reduced.
The GDPR is a regulation the European Union introduced in 2018 to protect personal data privacy, and it applies to any company that collects or processes the data of EU residents, regardless of where the company is based. Its core requirements include data transparency and data subjects' rights to be informed, to consent, to access, and to delete. Individuals can ask a company to erase their data (the "right to be forgotten"), and a company must promptly notify users in the event of a breach. Data may be transferred from the EU to other countries, in particular non-EU countries, only if its security is assured. Companies must put strong security measures in place, and violators face hefty fines of up to 4% of annual global turnover.

The CCPA is a similar law enacted by the US state of California that primarily protects the privacy rights of California residents. It requires companies to disclose to consumers the types of data they collect, the purposes of collection, and whether the data will be sold. Consumers can ask a company to delete their data or to stop selling it. The CCPA also imposes strict requirements on the transfer and use of data, a notification obligation in the event of a breach, and fines for non-compliance.

These regulations draw clear boundaries and set high bars for the cross-border transfer, storage, and use of data; businesses must keep data in a secure environment at all times and respect users' privacy rights. Put simply, if data stays on the device and never travels abroad, the risk of violating these rules drops sharply. Security is always relative, and any data interaction carries some risk of leakage, but completing computation on the device wherever possible reduces transmission and cloud processing, shrinks the attack surface, and improves the product's overall security.

Yet despite Apple's local-first strategy, uncertain risks remain in everyday use. The iPhone can hand requests to a third-party large-model service such as OpenAI's, and users may not realize that certain requests have been sent to OpenAI's GPT products, potentially exposing sensitive information. The user believes the operation is happening locally, when in fact the query has already been forwarded to the cloud. Moreover, how third parties such as OpenAI process user data is not governed by Apple. Users may assume Apple protects their data throughout, when it is actually handled under the third party's privacy policy, which may be laxer than Apple's; that is a latent risk.

Our phones carry many apps, and different apps may call Apple's security features or run their own large models. Many users cannot tell the two apart, which raises the risk of sensitive information leaking. Meanwhile, Apple exposes APIs to these apps, so the information a user enters may be cached by an app and then forwarded to Apple's large-model interface.
In other words, the user's information passes through the third-party app first; if the app caches sensitive information, the risk of leakage remains. Keeping the privacy policies of different data handlers consistent is therefore a direction regulators still need to strengthen.
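To make the ambiguity concrete, here is a purely hypothetical Swift sketch; it is not Apple's or OpenAI's actual API, and the endpoint URL and onDeviceReply helper are illustrative stand-ins. The point is that the same call may be answered on-device or forwarded to a remote endpoint, and nothing at the call site tells the user which path ran.

```swift
import Foundation

enum AnswerSource { case onDevice, thirdPartyCloud }

// Stand-in for a local model; a real device-side model would run here.
func onDeviceReply(to prompt: String) -> String {
    "local answer to: \(prompt)"
}

func ask(_ prompt: String, fitsLocally: Bool) async throws -> (String, AnswerSource) {
    if fitsLocally {
        return (onDeviceReply(to: prompt), .onDevice)
    }
    // From this line on, the prompt leaves the device and falls under
    // the remote provider's privacy policy, not the platform's.
    var request = URLRequest(url: URL(string: "https://example-llm.invalid/v1/chat")!)
    request.httpMethod = "POST"
    request.httpBody = Data(prompt.utf8)
    let (data, _) = try await URLSession.shared.data(for: request)
    return (String(decoding: data, as: UTF8.self), .thirdPartyCloud)
}
```

The caller gets an answer either way; the opacity of which branch executed is exactly the risk described above.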
03
Third-Party Application Security: A Problem Apple Can't Solve Alone
Even though Apple has built a security system for the iPhone, once third-party apps are connected, the privacy defense line may not hold as firmly. But the responsibility here does not rest with Apple itself. Apple's core strength is that the iPhone provides a secure architecture, platform, and software and hardware capabilities; technically, it has pushed protection about as far as current capabilities allow, well beyond the system-level deployments of its peers, with real value for the user experience.

The security problem of third-party access is unavoidable, and every phone vendor faces it. Apple can do its utmost not to "do evil" with AI (here meaning improper behavior such as the unlawful use of user data and information) and can give app developers the relevant technical means and APIs. But the app is the entrance to user information: even if Apple itself never caches user prompts and other inputs, an app can cache that information before it enters Apple's hardened AI interface, and an app holding cached data may misbehave. This is where regulators must get involved. Data security is not something a phone maker can solve on its own; it requires continuous technical iteration under regulatory guidance and legal requirements, together with users raising their own security awareness.

On the user side, Apple still has plenty of room to improve user education and privacy protection. Every security measure comes at a cost, usually extra steps during use, which also adds a learning burden. Faced with consent screens or privacy terms, many users simply skip ahead and tap Confirm, so getting users to truly understand the risks and security points still needs work. When the iPhone is about to transmit data or invoke a third-party model, Apple should go beyond informing and authorizing: it should give clear prompts and warnings so users retain real decision-making power at the moment a risk arises. Better interaction design, for example, could weave key risk warnings into the flow of use, as sketched below.

On the app side, the sheer number of apps in the App Store makes it hard for Apple to review each one in detail, and some apps may break the law or abuse user data. Here national regulators (in China, bodies such as the Cyberspace Administration and the Ministry of Public Security) need to strengthen compliance supervision over how apps collect, use, and transmit information and demand stricter review mechanisms. When privacy is leaked or infringed, accountability should be stepped up and the cost of violations raised. This takes effort from many sides.
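Purely as an illustration of such a just-in-time warning, and not any actual Apple API or interface, here is a hypothetical UIKit sketch that surfaces the destination of a request before anything leaves the device. The wording and flow are assumptions.

```swift
import UIKit

// Hypothetical just-in-time consent prompt. Strings and flow are
// illustrative; the point is that the user decides at the moment
// data is about to leave the device.
func confirmCloudHandoff(on viewController: UIViewController,
                         proceed: @escaping (Bool) -> Void) {
    let alert = UIAlertController(
        title: "Send this request to a third-party model?",
        message: "Your request will leave this device and be handled under the provider's privacy policy.",
        preferredStyle: .alert
    )
    alert.addAction(UIAlertAction(title: "Cancel", style: .cancel) { _ in
        proceed(false)                      // nothing is transmitted
    })
    alert.addAction(UIAlertAction(title: "Send", style: .default) { _ in
        proceed(true)                       // caller may now forward the request
    })
    viewController.present(alert, animated: true)
}
```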
Looking across the large-model industry, privacy and security protection is one of Apple's core tenets, and Apple can offset its extra investment in security with revenue from phones and the App Store. Many other large-model companies are still in the "cash-burning" stage, where any increase in computing cost is a heavy burden. So these companies face a cost-benefit trade-off over when they can adopt Apple-grade security measures, which in turn amplifies privacy risks for users.

Security awareness is also crucial for cloud-side large-model companies. Everyone knows data security matters, but protecting it genuinely adds cost. High-end devices such as the H100 GPU actually ship with confidential-computing features that can better protect data privacy, yet many large-model companies leave them switched off, because enabling them slows processing and raises the cost of service. Companies are forced to weigh security against cost: to be safe is expensive, and to be cheap may compromise safety. That has become a major dilemma for large-model companies. They need the awareness to ensure users' sensitive information is neither misused nor leaked; enabling stronger encryption and confidential computing raises operating costs, but it also builds user trust.
04
The Smarter the Phone, the More Global Privacy-Law Challenges Arrive
Apple faces a dilemma in promoting its products worldwide: how to keep services personalized while protecting user privacy. Apple's personalized recommendations rely on analyzing user data, but privacy regulations such as the GDPR limit data collection and processing, so Apple must deliver attentive service while carefully staying within the rules. Many features of Apple's closed ecosystem are currently unavailable in the EU, such as parts of Apple Maps, FaceTime Audio, and some AI services, which can launch only in the United States, Canada, and a few other regions. Privacy protection and functionality often cannot be had at the same time. At the iPhone 16 launch, Apple again said that Apple Intelligence's arrival in China awaits approval from regulators.

Of course, privacy rules do not bind Apple alone; they are data-level laws that reach every company involved. The EU's GDPR, the United States' CCPA, and China's Data Security Law, Cybersecurity Law, and Personal Information Protection Law all cover data security and personal-information protection, and Apple's devices fall under their jurisdiction. On top of that come AI-specific privacy rules, especially in China and the EU. The Framework Convention on Artificial Intelligence and Human Rights, Democracy and the Rule of Law, opened for signature under the Council of Europe on September 5, 2024 with 57 countries involved, focuses on AI risk and security issues, including data security and privacy protection.

Ideally, an international body would design a data-security baseline that identifies common service capabilities, on top of which customized services could be built; that would help drive technological innovation. Organizations such as ISO and IEEE are drafting data-privacy and security standards at the international level, including subdivision standards for specific industries. Zhongwei Technology has taken part in formulating a number of standards related to data security and privacy protection, such as the recently released IEEE P3158 international standard for trusted data spaces.

Standardization is rolling out along two main lines. The first is standardizing the technical architecture: clarifying the boundaries of technology stacks and security protection and settling best practices, so that enterprises following these standards reach an internationally recognized security baseline. The second is the tiered classification of data and applications: data of different levels and categories can adopt different privacy and security policies, avoiding a one-size-fits-all approach. Much as Apple does, core data is processed on the phone wherever possible, while non-sensitive data can be sent to the cloud for computation; a toy sketch of this idea follows below.

On-device computing will only grow more common. As phone chips become more powerful, memory larger, and algorithms smarter, many complex computations can be completed directly on the phone, reducing dependence on the cloud. Beyond that, manufacturers such as Apple and Huawei need to invest more in infrastructure around the world, because not every country has mature cloud-computing infrastructure, and once data must cross borders, compliance problems can follow. In countries with strict privacy regimes, such as the EU member states, cross-border data transfer faces extra challenges, and meeting local legal requirements is essential.
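As a toy illustration of tiered handling, and not any published standard or vendor policy, the following Swift sketch classifies data first and then picks a processing location by sensitivity. The tiers and the routing rule are assumptions for illustration.

```swift
// Toy sketch of tiered data handling; tiers and routing are illustrative.
enum Sensitivity {
    case core           // e.g. biometrics, keys: should never leave the device
    case personal       // e.g. photos, messages: local by default
    case nonSensitive   // e.g. anonymized telemetry
}

func processingLocation(for tier: Sensitivity) -> String {
    switch tier {
    case .core:         return "on-device only"
    case .personal:     return "on-device, cloud fallback only with explicit consent"
    case .nonSensitive: return "cloud compute permitted"
    }
}

print(processingLocation(for: .core))   // "on-device only"
```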
05
Under Tightening Regulations Worldwide, How Does AI Keep Running?
In the future, countries will grow ever stricter on data security and privacy protection, because what is at stake is not only personal privacy but "data sovereignty": whoever holds a country's core data holds influence over its national security. Against the backdrop of rapidly developing large-model technology, data sovereignty matters more and more. To protect core data, countries will impose more restrictions, which also means public data will become harder to obtain, hardly friendly conditions for training general-purpose models.

For manufacturers, the demand for technical iteration and innovation never ends. Take Apple's iPhone: despite its "black tech", Apple does not apply heavyweight techniques such as homomorphic encryption to every privacy-protection step, reserving them instead for relatively simple tasks. For example, the cloud can identify spam calls by matching against homomorphically encrypted phone numbers, a kind of "lightweight protection" (a toy demonstration of the homomorphic idea appears below). For now, Apple leans more on confidential computing, which keeps data secure while preserving processing speed and flexibility. Every security measure has limits; demanding absolute security is as unrealistic as demanding that you exchange information without speaking and without a brain-computer interface. Today's security techniques are compromise schemes struck under given constraints. Those constraints will raise the cost of developing and using AI, and it will take a while for the related applications to spread.

For the AI industry, relying on big data and big models used to be an accelerator that produced results fast; now data resources are growing scarce, and the difficulty and cost of R&D keep rising. And with ever more data-protection regulations in place, using AI across borders becomes more complex. To deploy AI in countries without sufficient local infrastructure, data and computing power must be moved across borders, like running a marathon with extra weight on; the difficulty jumps several notches. Yet data constraints also bring new opportunities. AI models in vertical domains, for instance, can be trained more precisely on the "private data" inside an enterprise or industry, which is one reason China has an advantage in promoting vertical models.

Looking at the development trend, the emergence of quantum computing poses new challenges to existing security measures. Most cryptography today relies for its security on hard mathematical problems such as large-integer factorization and discrete logarithms, and quantum computing can bypass those hard computations as if a shortcut had suddenly been found. So in the future we may need to introduce "post-quantum cryptography", such as the lattice-based schemes used in homomorphic encryption. At present these techniques still face efficiency and performance bottlenecks and can be slow to compute, so many technical problems remain to be solved before quantum computing truly arrives. In short, the future development of AI technology is certain to meet more rules and more challenges.
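To give a feel for what "computing on encrypted data" means, here is a toy Swift demonstration using textbook (unpadded) RSA, which happens to be multiplicatively homomorphic. This is a deliberately insecure classroom example with tiny numbers; it is not Apple's scheme and not a lattice-based construction.

```swift
// Toy demo of the homomorphic property: Enc(a) * Enc(b) mod n == Enc(a * b).
// Textbook RSA with tiny, insecure parameters, for illustration only.
func modPow(_ base: Int, _ exponent: Int, _ modulus: Int) -> Int {
    var result = 1
    var b = base % modulus
    var e = exponent
    while e > 0 {
        if e & 1 == 1 { result = result * b % modulus }
        b = b * b % modulus
        e >>= 1
    }
    return result
}

let n = 61 * 53        // RSA modulus (3233); real keys are 2048+ bits
let pubExp = 17        // public exponent

func encrypt(_ m: Int) -> Int { modPow(m, pubExp, n) }

let a = 7, b = 6
let productOfCiphertexts = encrypt(a) * encrypt(b) % n

// A server could multiply the two ciphertexts without ever decrypting
// them, and the result equals the encryption of the plaintext product.
assert(productOfCiphertexts == encrypt(a * b))
print("Enc(7) * Enc(6) mod n == Enc(42): \(productOfCiphertexts == encrypt(a * b))")
```

Real private-lookup deployments rely on lattice-based schemes with formal security guarantees; the toy above only shows why an untrusted server can still do useful work on data it cannot read.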
No matter how smart AI becomes, it must leave a "safe channel" for user privacy.