Compiled by | Chen Junda
Edited by | Cheng Qian
Zhidong reported on August 1 that in the early morning of July 30, Apple released the iOS 18.1 beta for developers, offering a first look at features such as summarization, writing tools, photo search, and an AI-enhanced Siri. Beyond Siri, the updated Apple Intelligence features are spread throughout the system, from Mail to Safari to call recordings, making it a genuinely system-level AI.
Specifically, the summarization feature lets users see a summary of a web page within seconds, cutting out unnecessary reading. The writing tool is embedded in nearly every page that involves text input or editing, and can adjust tone or organize text into tables with a single tap. In addition, photos in the library can now be located quickly via search, and memory videos can be generated with one tap.
▲One-tap memory video generation (Source: YouTube)
Siri has also gotten smarter with this update, with a noticeably better grasp of situations and context. Testers at The Verge found that Siri no longer interrupts when the user hesitates or pauses, instead deciding whether to wait for the user to finish based on the context of the conversation.
However, testers also noted that the newly updated Apple Intelligence features were still slow to respond at times and could not understand complex text commands, though these are expected to improve in future updates. Apple has rarely opened a beta of a point release (18.1) before the base version (18.0) officially ships; doing so now may be intended to let developers help iron out potential Apple Intelligence issues ahead of the official release.
1. Safari has a built-in summary function to save reading time
Apple has embedded a web page summarization feature in Safari. Users simply tap the icon on the left side of the address bar to enter Reader mode, then tap the "Summary" button, and Apple Intelligence produces something akin to reading notes, which helps when users need to skim quickly.
▲Summarization function demonstration (Source: YouTube)
Summaries help users quickly judge whether an article is worth reading, and a promising summary can also entice readers to read the full article. The tool can likewise help users work through more complex papers and analytical articles, summarize transcribed recordings, and triage email.
▲The summary function is embedded in the call recording function (Source: Apple)
One drawback of the summarization tool is speed: it takes a few seconds to produce a summary, although comparable AI tools take about as long. Apple Intelligence returns a well-organized summary and can even pull quick bullet points out of a recorded conversation, so the few seconds of waiting seem worth it.
2. Adjust a message's tone and style, and organize text into tables
The writing tool is one of the Apple Intelligence features Apple highlighted at this year's WWDC. Judging from the actual interface, it offers proofreading, rewriting, and related functions, and can adjust the tone of text toward options such as "friendly", "professional", or "concise". It can also output text in a specified format, organizing it into bullet points, lists, or tables.
▲Writing tool interface (source: Mashable)
In the iOS 18.1 developer beta, Writing Tools are available on every text input and editing page: users select the text to modify, tap the "Writing Tools" button, and let Apple Intelligence rework the text in the specified way. If the result isn't satisfactory, tapping "Retry" generates a new version.
▲The writing tool can quickly modify the tone according to the user's instructions (Source: YouTube)
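For developers, this also means standard text views in their own apps pick up Writing Tools automatically, with some control over how the feature behaves. Below is a minimal Swift sketch, assuming the iOS 18 UIKit additions previewed at WWDC 2024 (writingToolsBehavior and allowedWritingToolsResultOptions); the view controller itself is purely illustrative.

```swift
import UIKit

// Minimal sketch: a text view that opts in to the system Writing Tools.
// Assumes the iOS 18 UIKit properties writingToolsBehavior and
// allowedWritingToolsResultOptions; the surrounding view controller is
// only for illustration.
final class NoteViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        textView.autoresizingMask = [.flexibleWidth, .flexibleHeight]

        // Request the full inline experience (proofread, rewrite, tone
        // adjustment) rather than the reduced overlay.
        textView.writingToolsBehavior = .complete

        // Let Apple Intelligence return formatted results such as lists
        // and tables, matching the formatting options described above.
        textView.allowedWritingToolsResultOptions = [.richText, .list, .table]

        view.addSubview(textView)
    }
}
```

An app that wants to keep AI rewriting out of a sensitive field could presumably set writingToolsBehavior to a more restrictive value instead.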
After trying Apple Intelligence's writing tool, Mashable's tester found that some of her messages failed to convey context or emotion; the writing tool helped her work around this, and she received a more positive reply after sending a message the tool had revised.
3. Quickly locate old photos and find text in pictures
Apple Intelligence's integration into the photo library is genuinely good news for the lazy. Some users never organize their photos after shooting, but Apple Intelligence spares them from scrolling endlessly through the library: typing a keyword such as "laptop" quickly locates the images containing that element.
▲Album search function (Source: Mashable)
The search can also detect text within photos, but this natural-language search is not yet mature. Mashable's tester found that for simple queries like "laptop" and "food," Apple Intelligence found the photos quickly, yet it could not grasp more complex searches such as "woman in red shirt."
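Conceptually, this kind of keyword search maps a query onto labels detected in each image on-device. The sketch below only illustrates that idea using the long-standing Vision framework image classifier; it is not how the Photos app search is actually implemented, and the confidence threshold is arbitrary.

```swift
import UIKit
import Vision

// Conceptual sketch: decide whether a photo matches a simple keyword such
// as "laptop" by running on-device image classification and comparing the
// detected labels against the query. Not Apple's actual Photos search.
func matches(image: UIImage, keyword: String) throws -> Bool {
    guard let cgImage = image.cgImage else { return false }

    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])

    // Keep reasonably confident labels and check them against the query.
    let labels = (request.results ?? [])
        .filter { $0.confidence > 0.3 }
        .map { $0.identifier.lowercased() }
    return labels.contains { $0.contains(keyword.lowercased()) }
}
```

A query like "woman in red shirt" is exactly where a flat label match like this breaks down, which hints at why the more complex searches still trip up the beta.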
The photo library can also use Apple Intelligence to generate memory videos. In the example below, it analyzed the user's recent photos and produced a short video about the user's cat, even punning on "Perfect" by naming the clip "Purr-fect."
▲Memory video generated from the photo library (Source: YouTube)
That said, the iOS 18 beta is still at an early stage, and photo search may improve further by the time iOS 18 officially ships later this year.
4. Siri is more useful, with improved contextual awareness
While Siri's full Apple Intelligence overhaul is still months away, the Siri in the developer beta has already improved in language comprehension. According to a hands-on report from The Verge, Siri now waits patiently, without interrupting, when the user hesitates or pauses mid-sentence, and it understands when the user asks a follow-up question.
Siri also gains a new way to interact: double-tapping the bottom of the screen lets users "send a message" to Siri and converse with it in text.
Mashable's tester believes Siri's biggest improvement is contextual awareness. For example, while browsing the web she can simply tell Siri, "Send this article to Jason," and Siri can forward the article she is reading.
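Contextual actions like this depend on apps exposing their capabilities to the system. As a purely hypothetical sketch of that plumbing, an app could declare an action through the App Intents framework; the intent name, parameter, and dialog below are illustrative and not Apple's actual Safari or Messages intents.

```swift
import AppIntents

// Hypothetical sketch: an app-defined action that Siri could invoke, built
// with the App Intents framework. Names and parameters are illustrative.
struct ShareArticleIntent: AppIntent {
    static var title: LocalizedStringResource = "Share Current Article"

    @Parameter(title: "Recipient")
    var recipient: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // In a real app this would look up the article currently on screen
        // and hand it off to the system share flow.
        return .result(dialog: "Sending the current article to \(recipient).")
    }
}
```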
Conclusion: Apple Intelligence is showing initial results, but many features may be delayed
Judging from the multiple Apple Intelligence features embedded in the iOS 18.1 developer beta, Apple has indeed deeply integrated AI technology at the system level and brought real convenience to users.
However, according to Mark Gurman, a well-known reporter on Apple leaks, ordinary users may have to wait longer for Apple Intelligence: Apple devices such as the iPhone 16 series launching this September may not ship with Apple Intelligence enabled. It may take a series of updates late this year and early next before most of the Apple Intelligence features demonstrated at WWDC are usable on US devices, and there is still no definite word on when Apple Intelligence will be available to users in China.
Source: Mashable, The Verge