Apple’s new accessibility features let you control an iPhone or iPad with your eyes



Apple just announced a number of new accessibility features coming to its software platforms in the coming months, including eye tracking, which the company says uses artificial intelligence to let people with physical disabilities navigate iOS and iPadOS more easily.

A new Music Haptics option will use the iPhone’s Taptic Engine vibration system to “reproduce refined touches, textures, and vibrations in music audio” for supported Apple Music tracks. Apple is also adding a feature to reduce motion sickness for people who are susceptible to it when using an iPhone in a moving vehicle.

All of these new accessibility options will likely roll out in iOS and iPadOS 18, although Apple will only say “later this year,” ahead of its WWDC event next month. The eye-tracking feature “uses the front camera to configure and calibrate in seconds, and with on-device machine learning, all data used to configure and control this feature is kept securely on the device and is not shared with Apple.” The company says it is designed to work across iOS and iPadOS apps without the need for extra hardware or accessories.

Music Haptics will allow people who are deaf or hard of hearing to “experience music on iPhone” by producing a variety of vibrations, taps, and other effects in rhythm with millions of tracks on Apple Music. Apple says developers will also be able to add the feature to their own apps through a new API.

These animated dots can help some people avoid sensory conflicts and thus reduce motion sickness.
GIF: Apple

Other upcoming accessibility features include vocal shortcuts, which will let anyone “assign custom expressions that Siri can understand to launch shortcuts and complete complex tasks.” A new “Hear Atypical Speech” feature uses machine learning to recognize someone’s unique speech patterns; it is “designed for users with acquired or progressive conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke.”

If you often feel motion sickness when using your device in a moving vehicle, Apple has a new feature to help reduce those unpleasant sensations:

With Vehicle Motion Cues, animated dots at the edges of the screen represent changes in vehicle movement to help reduce sensory conflict without interfering with the main content. Using sensors built into the iPhone and iPad, Vehicle Motion Cues recognizes when a user is in a moving vehicle and responds accordingly. The feature can be set to appear automatically on the iPhone, or it can be turned on and off in Control Center.

The company’s full press release contains a longer list of other accessibility features coming to Apple’s platforms in a few months. AI and machine learning appear throughout the text, offering further confirmation that iOS 18, iPadOS 18, and the company’s other software platforms will lean on AI-based features. Apple is reportedly in talks with OpenAI and Google about collaborating on some generative AI features.

But even outside of all that, these are big steps toward making Apple products more accessible to as many people as possible. The company announced them the day before Global Accessibility Awareness Day, which takes place on May 16.


