
Want to control your iPhone with your eyes? That, and more accessibility features, are coming to iPhone


Look guys, Apple really does care about accessibility for its iPhone customers. What other company would save up some potentially life-changing features to coincide with Global Accessibility Awareness Day? Certainly not Apple, which just announced a whole bunch of accessibility features that are slated to arrive “in the coming months.” Oh, wait.

Most notable among them is Eye Tracking, expected to arrive alongside the iPadOS 18 and iOS 18 updates due to be previewed at Apple’s WWDC conference next month. Eye Tracking is about as cool as it sounds, letting users control basic features of their iPhone or iPad simply by looking at the screen. macOS has offered similar controls for a while, so it’s about time Apple’s other products caught up.

What are you looking at, iPhone?

What’s powering Eye Tracking? AI, of course, along with Dwell Control, which laid the foundation for the feature in the accessibility keyboard on macOS. If Apple’s demo video is accurate, it appears the company has the feature down pat. All it takes is a couple of seconds to calibrate it using an iPad or iPhone’s front-facing camera, and you’re good to go. Gestures include swiping up and down and pressing buttons to help you navigate around. It all happens locally on your device, so nobody, not even Apple, can keep track of this data.
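For the curious, dwell control boils down to a simple rule: stare at one spot for long enough and the device treats it as a tap. Here’s a rough Swift sketch of that idea, with made-up names and timings, and no claim to be how Apple actually does it.

```swift
import Foundation
import CoreGraphics

// A hypothetical sketch of the "dwell" idea behind Eye Tracking: if the gaze
// point stays near one spot for long enough, treat it as a tap. This is not
// Apple's implementation; the names and timings here are invented.
final class DwellSelector {
    private let dwellDuration: TimeInterval = 1.0   // how long a gaze must hold
    private let tolerance: CGFloat = 30             // how far the gaze may drift (points)

    private var anchor: CGPoint?
    private var anchorTime: Date?

    /// Feed in gaze points (e.g. estimated from the front-facing camera).
    /// Returns the point to "tap" once the gaze has dwelled long enough.
    func update(gaze: CGPoint, now: Date = Date()) -> CGPoint? {
        guard let anchor, let anchorTime else {
            self.anchor = gaze
            self.anchorTime = now
            return nil
        }

        let dx = gaze.x - anchor.x
        let dy = gaze.y - anchor.y

        // The gaze drifted too far: start timing a new dwell target.
        if dx * dx + dy * dy > tolerance * tolerance {
            self.anchor = gaze
            self.anchorTime = now
            return nil
        }

        // The gaze held steady long enough: fire a selection and reset.
        if now.timeIntervalSince(anchorTime) >= dwellDuration {
            self.anchor = nil
            self.anchorTime = nil
            return anchor
        }
        return nil
    }
}
```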

Customers with hearing issues are also being looked after. There’s a new feature called Music Haptics that turns the iPhone into a vibrating music system. That’s all down to the iPhone’s Taptic Engine, which will play “taps, textures and refined vibrations to the audio of the music.” Don’t fret about availability, either: the feature already works across “millions of songs in the Apple Music catalog.”

Better yet, the Music Haptics API is available for developers who want to expand the reach of the feature to their own apps.
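Apple hadn’t published the shape of that API at the time of writing, but the underlying idea, transient taps timed to the music, is something developers can already approximate with the existing Core Haptics framework. A rough sketch, with made-up beat timings:

```swift
import CoreHaptics

// A rough Core Haptics sketch of the idea behind Music Haptics: play transient
// "taps" whose intensity follows the music. This uses the existing CoreHaptics
// framework, not the new Music Haptics API, whose exact shape Apple hadn't
// published at the time of writing.
func playBeatPattern() throws {
    let engine = try CHHapticEngine()
    try engine.start()

    // Pretend these came from beat/loudness analysis of a track.
    let beats: [(time: TimeInterval, intensity: Float)] = [
        (0.0, 1.0), (0.5, 0.6), (1.0, 1.0), (1.5, 0.6)
    ]

    let events = beats.map { beat in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: beat.intensity),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
            ],
            relativeTime: beat.time
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```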

Wait, there’s more

Vocal Shortcuts is another new feature making its way to the iPhone and iPad. It’ll let users assign custom “utterances” that Siri can understand in place of specific words or phrases. It’s complemented by ‘Listen for Atypical Speech’, another new feature that uses on-device machine learning to pick up on its users’ vocal intricacies and speech patterns.
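Apple hasn’t detailed how Vocal Shortcuts works under the hood, but the existing Speech framework already supports on-device recognition, which is enough to illustrate the concept of listening for a custom phrase. The utterance and the action below are placeholders, not anything Apple has shipped.

```swift
import Speech

// Illustration only: listening for a custom phrase with on-device recognition,
// using the existing Speech framework rather than the Vocal Shortcuts feature
// itself. "open the garage" and runAction() are placeholders.
func listenForCustomPhrase() {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en_US")),
          recognizer.supportsOnDeviceRecognition else { return }

    let request = SFSpeechAudioBufferRecognitionRequest()
    request.requiresOnDeviceRecognition = true   // keep the audio on the device

    _ = recognizer.recognitionTask(with: request) { result, _ in
        guard let result else { return }
        let spoken = result.bestTranscription.formattedString.lowercased()
        if spoken.contains("open the garage") {   // the custom "utterance"
            runAction()                           // stand-in for the real shortcut
        }
    }
    // In a real app, audio buffers from AVAudioEngine would be appended to
    // `request`, and microphone/speech permissions requested first.
}

func runAction() { /* trigger whatever the utterance maps to */ }
```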

“Artificial intelligence has the potential to improve speech recognition for millions of people with atypical speech, so we are thrilled that Apple is bringing these new accessibility features to consumers,” says Mark Hasegawa-Johnson, an investigator for the Speech Accessibility Project at the Beckman Institute. “The Speech Accessibility Project was designed as a broad-based, community-supported effort to help companies and universities make speech recognition more robust and effective, and Apple is among the accessibility advocates who made the Speech Accessibility Project possible.”


Read More: Meet the Proteus controller, designed for gamers with disabilities


And finally, if you’re the type to get motion sick using a phone or tablet in the car, Apple is working on that too. Vehicle Motion Cues can, apparently, reduce this phenomenon by overlaying animated dots on the edges of the screen that represent changes in the vehicle’s motion. It uses the device’s sensors to do this, and can even be set to switch on automatically when you’re sitting in a car.
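Apple hasn’t said exactly how Vehicle Motion Cues reads the car’s movement, but the existing Core Motion framework exposes the kind of accelerometer data that would drive something like it. Here’s a speculative SwiftUI sketch of the principle: shift a few dots in step with the device’s measured acceleration, so what you see matches what your inner ear feels. None of this is Apple’s implementation.

```swift
import CoreMotion
import SwiftUI

// A guess at the principle behind Vehicle Motion Cues, built on the existing
// Core Motion framework: read the device's acceleration and nudge on-screen
// dots accordingly. Scale factors and layout are invented for illustration.
final class MotionCueModel: ObservableObject {
    @Published var offset: CGSize = .zero
    private let motion = CMMotionManager()

    func start() {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 30.0
        motion.startDeviceMotionUpdates(to: .main) { [weak self] data, _ in
            guard let data else { return }
            // userAcceleration is in g; scale it into a small on-screen shift.
            self?.offset = CGSize(width: -data.userAcceleration.x * 40,
                                  height: data.userAcceleration.z * 40)
        }
    }
}

struct MotionCueDots: View {
    @StateObject private var model = MotionCueModel()

    var body: some View {
        // Dots along the edge of the screen that drift with the vehicle's motion.
        HStack(spacing: 24) {
            ForEach(0..<4, id: \.self) { _ in
                Circle()
                    .frame(width: 8, height: 8)
                    .offset(model.offset)
            }
        }
        .onAppear { model.start() }
    }
}
```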

Apple hasn’t landed on any exact dates for these features to debut on iPhone or iPad, but we’re guessing more info will turn up at Apple’s WWDC developer conference in June.
