
Apple announces new accessibility features, including the ability to control iPhone with eye movements

Author: Sina Technology

Sina Digital News reported on the morning of May 16 that Apple has announced new accessibility features, launching later this year, designed to make its devices easier to operate for people with hearing impairments and physical disabilities.

Eye tracking comes to iPad and iPhone


AI-powered Eye Tracking gives users a built-in option to navigate iPad and iPhone with just their eyes. Designed for users with physical disabilities, Eye Tracking uses the front-facing camera to set up and calibrate in seconds. Thanks to on-device machine learning, all data used to set up and control the feature is stored securely on the device and is not shared with Apple.

Eye Tracking works across iPadOS and iOS apps and requires no additional hardware or accessories. With Eye Tracking, users can navigate through the elements of an app and use Dwell Control to activate each one, accessing additional functions such as physical buttons, swipes, and other gestures using only their eyes.

The Taptic Engine lets people with hearing impairments "hear" music


Music Haptics is a new way for users who are deaf or hard of hearing to experience music on iPhone. When this accessibility feature is turned on, the Taptic Engine in iPhone plays taps, textures, and refined vibrations in time with the music. Music Haptics works across millions of songs in Apple Music and will be available as an API for developers, allowing more users to experience music in their apps.

New features for speech

With Vocal Shortcuts, iPhone and iPad users can assign custom utterances that Siri understands to launch shortcuts and complete complex tasks. Another new feature, Listen for Atypical Speech, offers an option to enhance the range of speech recognition. Listen for Atypical Speech uses on-device machine learning to recognize a user's speech patterns. Designed for users whose speech is affected by conditions such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke, these features build on those introduced in iOS 17 and provide a new level of personalization and control for users who cannot speak or are at risk of losing their ability to speak.

"AI has the potential to improve speech recognition for millions of people with atypical speech, and we are excited to see Apple bringing these new accessibility capabilities to users," said Mark Hasegawa-Johnson, head of the Speech Accessibility Project at the Beckman Institute for Advanced Science and Technology at the University of Illinois Urbana-Champaign. "As an advocate for accessibility, Apple has helped make the Speech Accessibility Project possible."

Vehicle Motion Cues can help reduce motion sickness


Vehicle Motion Cues is a new feature for iPhone and iPad that helps passengers reduce motion sickness. Research shows that motion sickness is often caused by a sensory conflict between what people see and what they feel, which can prevent some users from comfortably using iPhone or iPad while riding in a moving vehicle. Vehicle Motion Cues displays animated dots at the edges of the screen that represent changes in the vehicle's motion, helping to reduce that sensory conflict without interfering with the main content. Using the built-in sensors on iPhone and iPad, Vehicle Motion Cues recognizes when a user is in a moving vehicle and responds accordingly. The feature can be set to appear automatically on iPhone, or turned on and off in Control Center.

CarPlay voice control and more accessibility updates

Upcoming accessibility features in CarPlay include Voice Control, Color Filters, and Sound Recognition. With Voice Control, users can navigate CarPlay and control apps using only their voice. With Sound Recognition, drivers or passengers who are deaf or hard of hearing can turn on alerts to be notified of car horns and sirens. Color Filters make the CarPlay interface easier to use for colorblind users, and additional visual accessibility features such as Bold Text and Large Text are also available.

Upcoming accessibility features coming to visionOS

This year, accessibility features coming to visionOS will include systemwide Live Captions, which help all users, including those who are deaf or hard of hearing, follow spoken dialogue and audio from apps in real time. With Live Captions for FaceTime in visionOS, more users can easily enjoy the unique experience of connecting and collaborating using their Persona. Apple Vision Pro will add the ability to move captions using the window bar during Apple Immersive Video, as well as support for additional Made for iPhone (MFi) hearing devices and cochlear hearing processors. Updates to visual accessibility will include Reduce Transparency, Smart Invert, and Dim Flashing Lights for users with low vision or those who want to avoid bright lights and frequent flashing.


These features join the dozens of accessibility features already available in Apple Vision Pro, which offers a flexible input system and an intuitive interface designed for a wide range of users. Features such as VoiceOver, Zoom, and Color Filters give visually impaired users access to spatial computing, while features such as Guided Access can support users with cognitive disabilities. Users can control Vision Pro with any combination of their eyes, hands, or voice, and accessibility features including Switch Control, Sound Actions, and Dwell Control can also help those with physical disabilities.

"Apple Vision Pro is without a doubt the most accessible technology I've ever used," said Ryan Hudson-Peralta, a Detroit-based product designer, accessibility consultant, and co-founder of Equal Accessibility LLC. "As someone born without hands and unable to walk, I know the world was not designed with me in mind, so it has been incredible to see what visionOS can do. It's a testament to the power and importance of accessible and inclusive design."

Other updates

For visually impaired users, VoiceOver will include new voices, a flexible Voice Rotor, custom volume control, and the ability to customize VoiceOver keyboard shortcuts on Mac.

Magnifier will offer a new Reader Mode and the option to easily launch Detection Mode with the Action button.

Braille users will get a new way to start and stay in Braille Screen Input for faster control and text editing; Braille Screen Input will also be available in Japanese, with support for entering multiple lines of braille text and the option to choose different input and output methods.

For low-vision users, Hover Typing will show larger text when typing in a text field, displayed in the user's preferred font and color.

For users at risk of losing their ability to speak, Personal Voice will be available in Mandarin. Users who have difficulty pronouncing or reading full sentences will be able to create a Personal Voice using shortened phrases.

For users with speech impairments, Live Speech will include categories as well as simultaneous compatibility with Live Captions.

For users with physical disabilities, Virtual Trackpad for AssistiveTouch will allow users to control their device using a small region of the screen as a resizable trackpad.

Switch Control will include the option to use the cameras on iPhone and iPad to recognize finger-tap gestures as switches.

Voice Control will offer support for custom vocabularies and complex words.
