KUALA LUMPUR, July 16 — July is Disability Pride Month in New York. While it is not a nationally recognised observance in the US, it has gained popularity in many other US states as well as online.

It also seems like a good time to have a deeper look at Apple's latest accessibility features that were announced at its recent Worldwide Developers Conference (WWDC).

Having loved ones who are disabled, I have cared enough about tech accessibility to track it over the years — Google's Vinton Cerf was a particular joy to talk to about the company's approach to accessibility.

A history of accessibility


Accessibility features have long been part of Apple's offerings. Switch Control, a feature tailored for users with limited mobility, was introduced as far back as 2013 in iOS 7.

While Switch Control is still used to help users control iOS and iPadOS devices as well as Macs and Apple TVs, new accessibility features have emerged, and a particular "star" among them is the new AssistiveTouch for the Apple Watch.


Targeted at users with limited mobility, the feature lets Apple Watch wearers use gestures such as pinching and clenching their fists to control the watch.

I downloaded the latest watchOS 8 beta to try it, and my main gripe is that there isn't a detailed "how to" or manual covering all of AssistiveTouch's features.

Perhaps it's hidden in a developer guide somewhere, or maybe Apple will release more details once the Watch software update goes completely "live", but I didn't find the feature quite as intuitive as the videos suggest.

Enabling it is as easy as activating the feature in the Watch app. Once it's enabled, you will be prompted to try the various gestures — Pinch, Double Pinch, Clench and Double Clench — to get the hang of them.

I have hands stiffened by decades of carpal tunnel syndrome, so while the pinch gestures were not too hard, it was more difficult to ascertain just how hard I needed to clench to make things work.

There was, sadly, also some pain, which comes with the territory, but I don't know if Apple's Accessibility testers also factored in users with carpal tunnel or arthritis.

I also got a little bit confused figuring out the pointer and had a bit of trouble trying to navigate from one screen to another, with my choice of watch face also determining just how easy the process would be.

AssistiveTouch isn't perfect but I see why it would be a godsend for those who cannot easily reach over to, say, answer a call from the Watch, or who can barely move their hands.

Hopefully a more detailed FAQ comes out sooner rather than later.

It's in the eyes

Another feature that will be useful for users with very limited upper body mobility is eye-tracking support on iPadOS.

The feature will allow third-party MFi hardware to be used with iPads, in a way similar to existing eye-tracking devices used to control computers.

Eye-tracking allows users to control their devices with just their gaze, without the need for an external mouse or keyboard.

For iPads, this would help users for whom the standard touch interface is simply not accessible.

The voice has it

Apple's VoiceOver feature has made many apps more accessible to the blind, and the upcoming iOS 15 update makes it even more powerful.

VoiceOver will now be optimised for image searches, and be able to describe photos for users with low vision.

iOS 15's ability to detect text in a photo will now enable VoiceOver users to have that text read out loud for the very first time.

Also good news for those with vision disabilities: the Magnifier app now comes standard with iOS 15 on the Home Screen instead of being accessible only via Settings.

Besides that, you can now customise your accessibility settings by app, which comes in handy if you don't want to use the same blanket settings for all of them.

Sounds to listen to

For users who rely on Voice Control, Apple has announced support for more languages including Mandarin, Cantonese, French and German.

It's not just about the sounds but what you're hearing, too. iOS 15 will now support importing PDF or paper audiograms.

Audiograms are basically charts or graphs that depict the results of a hearing test. By importing audiograms, you can easily customise Headphone Accommodation settings to adjust frequencies and loudness based on the data.

Speaking of audio, Apple also announced Background Sounds, which plays one of six different types of background sounds: Balanced Noise, Bright Noise, Dark Noise, Ocean, Rain and Stream.

It can be turned on from the Accessibility settings under Audio/Visual, and users can decide whether to keep the sounds playing even when other media is playing, or to shut them off instead.

While it could be argued that Apple should simply have made it a standalone app, an alternative is to create a Shortcut for it and add that to the Home Screen.

What I'd personally like to see is Apple also making room in its accessibility approach for those with invisible disabilities such as chronic pain, and mental or neurological conditions.

In the meantime, Apple's guide for developers on creating more inclusive apps lays a foundation for developers to keep making tech accessible to anyone and everyone.

For more information on Accessibility features across Apple's ecosystem, check out their official site.