An early reveal from Apple! New features such as Eye Tracking, Music Haptics, and Vocal Shortcuts are coming soon

Author: Qilu One Point

On Wednesday morning local time, Apple previewed a batch of new accessibility features that will arrive "later this year", which is to say with the upcoming iOS 18 release. The announcement comes ahead of Global Accessibility Awareness Day (May 16).

According to Apple's announcement, the features include letting users control iPhone and iPad with just their eyes, feel music through the Taptic Engine, reduce motion sickness with Vehicle Motion Cues, set up Vocal Shortcuts, and use the Personal Voice feature in Mandarin.

Apple said that once the new feature goes live, users will be able to operate iPhone and iPad with their eyes alone.

Powered by artificial intelligence, Eye Tracking gives users a built-in option to navigate iPad and iPhone with just their eyes. Designed for users with physical disabilities, it uses the front-facing camera to set up and calibrate in seconds, and thanks to on-device machine learning, all the data used to set up and control the feature is kept securely on the device and is not shared with Apple. Eye Tracking works across iPadOS and iOS apps and requires no additional hardware or accessories. With Eye Tracking, users can navigate through the elements of an app and use Dwell Control to activate each element, accessing additional functions such as physical buttons, swipes, and other gestures solely with their eyes.

Music Haptics is a new way for users who are deaf or hard of hearing to experience music on iPhone. With this accessibility feature turned on, the Taptic Engine in iPhone plays taps, textures, and refined vibrations along with the music. Music Haptics works across millions of songs in Apple Music and will also be made available to developers as an API, so that more users can experience music inside other apps.
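
Apple has not yet published the Music Haptics API mentioned above, so the sketch below is only an illustration of the underlying technique: Core Haptics, which already ships on iPhone, can drive the Taptic Engine with a pattern of taps and sustained textures. Everything here, including the beat pattern, is an assumption rather than the actual Music Haptics interface.

```swift
import CoreHaptics

// Illustrative only: a minimal Core Haptics pattern of a tap plus a soft
// continuous texture, the kind of output Music Haptics describes. This is
// not the Music Haptics API, which Apple has not documented yet.
final class HapticSketch {
    private var engine: CHHapticEngine?

    func start() throws {
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
        engine = try CHHapticEngine()
        try engine?.start()
    }

    func playBeat() throws {
        // A sharp transient "tap" followed by a softer continuous rumble.
        let tap = CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
            ],
            relativeTime: 0)
        let rumble = CHHapticEvent(
            eventType: .hapticContinuous,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.4),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.2)
            ],
            relativeTime: 0.1,
            duration: 0.5)
        let pattern = try CHHapticPattern(events: [tap, rumble], parameters: [])
        let player = try engine?.makePlayer(with: pattern)
        try player?.start(atTime: CHHapticTimeImmediate)
    }
}
```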

With Vocal Shortcuts, iPhone and iPad users can assign custom utterances that Siri understands to launch shortcuts and complete complex tasks. Another new feature, Listen for Atypical Speech, offers an option for expanding the range of speech recognition; it uses on-device machine learning to recognize a user's speech patterns. Designed for users whose speech is affected by conditions such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke, these features build on those introduced in iOS 17 and provide new personalization and control for users who cannot speak or are at risk of losing their ability to speak.
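
The announcement does not include developer documentation for Vocal Shortcuts. By assumption, the actions they trigger are ordinary Shortcuts actions, which apps already expose through the App Intents framework; the intent below (its name and behavior are hypothetical) is the kind of app action a user could bind a recorded phrase to.

```swift
import AppIntents

// Hypothetical example: an app action exposed to the Shortcuts app via
// App Intents. A user could then assign a custom spoken phrase to it with
// the new Vocal Shortcuts feature.
struct StartWorkoutIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Workout"
    static var description = IntentDescription("Begins a workout session in the app.")

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // App-specific logic would start the workout here.
        return .result(dialog: "Workout started.")
    }
}
```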

On iPhone 15 Pro, the "Set Up Vocal Shortcut" screen prompts the user to choose an action and record a phrase that teaches iPhone to recognize their voice.

On iPhone 15 Pro, the screen shows the prompt "Say 'ring' one last time" and asks the user to teach iPhone to recognize the phrase by repeating it three times.

On iPhone 15 Pro, the user receives a notification from a vocal shortcut that reads "Open Activity Ring".

Another practical addition is a system-wide Live Captions feature coming to Apple Vision Pro, which converts spoken dialogue from live conversations and from app audio into captions in real time. Vision Pro will also add the ability to move the captions using the window bar during immersive video.
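
Apple has not said how (or whether) third-party apps will hook into Live Captions on Vision Pro, so the snippet below is only a generic illustration of live transcription on Apple platforms using the existing Speech framework, not the Live Captions API itself.

```swift
import Speech
import AVFoundation

// Generic live-transcription sketch with the Speech framework; this is not
// the visionOS Live Captions API, which Apple has not documented publicly.
// A real app must also request speech-recognition and microphone permission
// (SFSpeechRecognizer.requestAuthorization) before starting.
final class CaptionSketch {
    private let audioEngine = AVAudioEngine()
    private let recognizer = SFSpeechRecognizer()
    private let request = SFSpeechAudioBufferRecognitionRequest()
    private var task: SFSpeechRecognitionTask?

    func start(onCaption: @escaping (String) -> Void) throws {
        request.shouldReportPartialResults = true   // emit text as it is spoken
        request.requiresOnDeviceRecognition = true  // keep audio on the device where supported

        // Feed microphone buffers into the recognition request.
        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            self.request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        task = recognizer?.recognitionTask(with: request) { result, _ in
            if let result {
                onCaption(result.bestTranscription.formattedString)
            }
        }
    }
}
```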

Apple also disclosed that updates to vision accessibility will include Reduce Transparency, Smart Invert, and Dim Flashing Lights for users who have low vision or who want to avoid bright lights and frequent flashing.

Apple will also introduce a new feature for iPhone and iPad, Vehicle Motion Cues, aimed at reducing motion sickness.

Apple says motion sickness is often caused by a sensory conflict between what a person sees and what they feel, so showing animated dots at the edges of the screen can reduce that conflict without interfering with the on-screen content. Apple devices can automatically recognize when the user is in a moving vehicle, and the feature can also be turned on and off in Control Center.
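
Apple has not described how Vehicle Motion Cues detects a moving car. One plausible building block, shown here purely as an assumption, is Core Motion's existing activity classification, which can already tell an app whether the user appears to be in automotive motion.

```swift
import CoreMotion

// Assumption-only sketch: Core Motion's activity classifier reports when the
// user appears to be in a vehicle. Apple has not said whether Vehicle Motion
// Cues actually uses this API.
final class VehicleDetector {
    private let manager = CMMotionActivityManager()

    func start(onChange: @escaping (Bool) -> Void) {
        guard CMMotionActivityManager.isActivityAvailable() else { return }
        manager.startActivityUpdates(to: .main) { activity in
            // `automotive` is true while the device appears to be in a moving car.
            onChange(activity?.automotive ?? false)
        }
    }

    func stop() {
        manager.stopActivityUpdates()
    }
}
```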

More features:

For visually impaired users, VoiceOver will include new voices, a flexible Voice Rotor, custom volume control, and the ability to customize VoiceOver keyboard shortcuts on Mac.

Magnifier will offer a new Reader Mode and the option to easily launch Detection Mode with the Action button.

Braille users will get a new way to start and stay in Braille Screen Input for faster control and text editing; Braille Screen Input will be available in Japanese; and there will be support for using the braille keyboard to enter multiple lines of braille text, as well as for choosing different input and output methods.

For low-vision users, Hover Typing will show larger text, in the user's preferred font and color, when typing in a text field.

For users at risk of losing their ability to speak, Personal Voice will be available in Mandarin. Users who have difficulty pronouncing or reading full sentences will be able to create a Personal Voice using shortened phrases.

For users with speech impairments, Live Speech will include categories and will work simultaneously with Live Captions.

For users with physical disabilities, Virtual Trackpad for AssistiveTouch will let users control the device by using a small region of the screen as a resizable trackpad.

Switch Control will add the option to use the cameras in iPhone and iPad to recognize finger-tap gestures and use them as switches.

Voice Control will support custom vocabularies and complex words.

A demonstration of the new Reader Mode in Magnifier on iPhone 15 Pro.
