With the latest updates for its mobile devices, Apple continues to refine the experience for those of us who have low vision and use features such as Zoom, Spoken Content (formerly Speech) and VoiceOver. There are now actually two distinct operating systems, each optimized for the device it runs on. iOS 13 continues to be the operating system for the iPhone, while the iPad gets a new iPadOS optimized for multi-tasking on the bigger screen. Both operating systems support a new Dark Mode that provides improved contrast for those of us who need it.
Apple has also changed where the accessibility features are located in Settings. They now have their own pane at the root level. I like this change because it raises the profile of the accessibility features and makes them easier to find, but I do know that for some people it will take a while to adjust, given the muscle memory built up over many years of using iOS. That’s why I have started each video in this post with a reminder of this change. Let’s take a look at what else is new in iOS 13 (and iPadOS).
This first video starts with a new feature that is only available on the iPad running iPadOS. It’s called Pinned Zoom and can be found under the Zoom Region setting (along with the existing Window and Full Screen modes). Basically, this splits the iPad’s larger screen into two areas: one shows a 100% view, while the other shows a zoomed-in view. The location of the zoomed-in area can be changed, and this area can also be resized using a divider. My favorite trick is to tap the handle between the two areas to show a highlight in the 100% view. Moving this highlight with one finger allows me to easily pan in the zoomed-in area.
Those who use an external keyboard will love the fact that Zoom on iOS now supports many of the same keyboard shortcuts as the Mac (for example, Command + Option + 8 to zoom in and out). Each keyboard shortcut can be turned on independently (I use them all) by going to Zoom, Keyboard Shortcuts.
Finally, the Zoom controller is more customizable – you can change its color and set up some actions to quickly activate specific Zoom features. I can’t remember what the defaults were, but I now have mine set up as follows:
- a single tap shows the options menu
- a double-tap zooms in and out
- a triple-tap activates the new Speak on Touch feature (see the next section)
Let Your Fingers Do The Talking
As I hinted in the last section, there is a new feature for text to speech (now called Spoken Content instead of Speech) on iOS 13 and iPadOS. It is called Speak on Touch, and it lets you drag your finger over any part of the screen to hear the content under your finger read aloud. You can think of it as a more flexible way to use Speak Screen.
You will find Speak on Touch as a hand icon in the Speech Controller, which you can now have on display all the time. The rest of the Speech Controller looks a little different, too – instead of the turtle and hare icons for adjusting the speaking rate, there is now a single button you can tap to cycle through the different speeds (1.5X, 2X and 1/2 speed).
As with Zoom, the Speech Controller also has a couple of actions that can be activated with a long press or a double-tap. I have mine set to activate Speak on Touch when I double-tap the Speech Controller, while a long press activates Speak Screen and reads everything on the screen (a great option for anyone who is unable to perform the Speak Screen gesture – a two-finger swipe down from the top of the screen).
VoiceOver For One
You can probably sense a theme by now – VoiceOver is also more customizable in iOS 13 and iPadOS, starting with the fact that you can customize many of its gestures and keyboard shortcuts. Just go to VoiceOver, Commands. And don’t worry – if you go overboard and need to start over with a default configuration, that option is available.
VoiceOver also has a new Activities option that lets you set up custom VoiceOver settings for specific apps and contexts. For example, I have set up VoiceOver to use a faster speaking rate and a new, more natural-sounding female Siri voice when I am reading a website in Safari. As soon as I exit Safari, VoiceOver automatically switches back to the default Alex voice that I have been using for years.
The final new feature in VoiceOver is one that developers should find very helpful. There is now a caption panel that appears at the bottom of the screen and shows a textual representation of what VoiceOver is reading aloud. My only wish is that Apple soon provides some customization for the caption panel, as the text is currently too small for me.
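For developers, the caption panel simply displays whatever text VoiceOver speaks, so it offers a quick visual check that custom labels come through as intended. Here is a minimal Swift sketch of that idea – the button and its label text are hypothetical examples of my own, not something from Apple’s release notes:

```swift
import UIKit

// Hypothetical example: an image-only button given a spoken description.
// With VoiceOver's caption panel enabled, focusing this button shows the
// word "Share" in the panel, letting a sighted developer visually confirm
// exactly what VoiceOver announces for the control.
let shareButton = UIButton(type: .system)
shareButton.setImage(UIImage(systemName: "square.and.arrow.up"), for: .normal)
shareButton.accessibilityLabel = "Share"
shareButton.accessibilityHint = "Shares the current page."
```

Without an accessibilityLabel, a symbol-only button like this might be read by its image name, and the caption panel makes that kind of mistake easy to spot.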
There are other tweaks to VoiceOver in iOS 13 and iPadOS, but rather than discussing them here, I would recommend checking out AppleVis’ excellent rundown of iOS 13 improvements for individuals who are blind or deaf-blind. Hopefully, between AppleVis and this post, we have covered everything you need to know about the new vision supports in iOS 13/iPadOS.
All The Rest
The Accessibility pane has been reorganized and streamlined. Many of the low vision options that used to be at the root level of the pane (such as Bold Text and Larger Text) are now organized under a Display and Text category, and there is a new Differentiate Without Color option that should be helpful for those who can’t distinguish certain colors when they are used as the only visual cue.
The Reduce Motion setting (which removes animations that could cause problems for those with motion sensitivity) is now found in a new Motion category. There you will also find settings for replacing motion with a more subtle cross-fade, for turning off the automatic playback of full screen effects in Messages and for disabling automatically playing videos (such as App Store preview videos).
Both iOS and iPadOS now support the use of a mouse with a visible pointer. This feature is found under AssistiveTouch: after connecting a Bluetooth or USB mouse, go to Accessibility, Touch, AssistiveTouch and choose Pointer Style, where you can adjust both the size and color of the mouse pointer. If you also use Zoom with a connected pointing device, there is a setting for controlling how panning takes place: I prefer to have panning occur when I move the pointer to the edge of the window.
Did I miss anything? Have any questions? Hit me up on Twitter: @eyeonaxs.