Key Takeaways
- Apple shared new accessibility features that will arrive on iPhone, iPad, and Vision Pro later this year.
- The brand debuted Eye Tracking, Music Haptics, Vocal Shortcuts, Vehicle Motion Cues, and more.
- Apple’s upcoming WWDC event will most likely shed more light on these features and other upcoming software updates.
Apple is set to have a big year in 2024 as it makes its first real push into AI with its upcoming hardware and software products. And while we don’t know exactly what’s coming from the brand, rumors have been swirling that it could be partnering with some big names to bring its vision to life.
With that said, the company is set to show off what it’s been working on at its annual developer conference on June 10. But ahead of the event, Apple has shared some upcoming features, like Eye Tracking, Music Haptics, and Vocal Shortcuts, that it will implement in a future update and that could really change the way some users interact with their Apple devices.
New features that could really make a huge difference
Apple took to its website to share news of new accessibility features coming to its devices later this year. Perhaps the most exciting of the pack is Eye Tracking, which will let users navigate their iPad and iPhone using only their eyes. The feature doesn’t appear to require any special hardware; it uses the existing front-facing cameras on these devices, along with a little help from AI. Apple states that the feature is easy to set up and will “calibrate in seconds” using on-device machine learning, with data kept “securely on device” and never shared with Apple.
Music Haptics will provide a new way for users to experience music through the iPhone’s Taptic Engine, which will play “taps, textures, and refined vibrations to the audio of the music.” While this will work with millions of songs on Apple Music, Apple says the API will be available for developers to use in their own apps to provide more accessibility features. In addition, with Vocal Shortcuts, users will be able to assign “custom utterances” that launch shortcuts and tasks. Apple will also enhance speech recognition to account for a wider range of speech patterns.
In addition to the above, Apple will add Vehicle Motion Cues, which can help reduce motion sickness when reading content in a moving vehicle. The cues appear as an overlay of small animated dots along the edges of the screen that reflect changes in the vehicle’s direction. CarPlay will also gain new accessibility features like Voice Control, Color Filters, and Sound Recognition, the last of which can alert drivers and passengers who are deaf or hard of hearing to nearby sirens and horns.
Apple’s visionOS will also get new accessibility features, including “systemwide Live Captions” across the menu system and supported apps. And if all of that wasn’t enough, Apple will add a variety of other updates, which can be seen in full in the list below.
- For users who are blind or have low vision, VoiceOver will include new voices, a flexible Voice Rotor, custom volume control, and the ability to customize VoiceOver keyboard shortcuts on Mac.
- Magnifier will offer a new Reader Mode and the option to easily launch Detection Mode with the Action button.
- Braille users will get a new way to start and stay in Braille Screen Input for faster control and text editing; Japanese language availability for Braille Screen Input; support for multi-line braille with Dot Pad; and the option to choose different input and output tables.
- For users with low vision, Hover Typing shows larger text when typing in a text field, and in a user’s preferred font and color.
- For users at risk of losing their ability to speak, Personal Voice will be available in Mandarin Chinese. Users who have difficulty pronouncing or reading full sentences will be able to create a Personal Voice using shortened phrases.
- For users who are nonspeaking, Live Speech will include categories and simultaneous compatibility with Live Captions.
- For users with physical disabilities, Virtual Trackpad for AssistiveTouch allows users to control their device using a small region of the screen as a resizable trackpad.
- Switch Control will include the option to use the cameras in iPhone and iPad to recognize finger-tap gestures as switches.
- Voice Control will offer support for custom vocabularies and complex words.
Of course, these updates won’t arrive until later this year, when Apple debuts the latest versions of iOS, iPadOS, and visionOS. And while this is a great sample of what’s to come, we’ll most likely hear a lot more when Apple holds its WWDC event on June 10. From what we can see so far, we can’t wait to hear what Apple has in store for iPhone, iPad, and Mac next month.