Designed for (fill in the blank)

On the occasion of Global Accessibility Awareness Day (GAAD), Apple has created a series of videos highlighting the many ways its iOS devices empower individuals with disabilities to accomplish a variety of goals, from parenting to releasing a new album with a rock band. Each of the videos ends with the tagline “Designed for” followed by the name of the person starring in the video, closing with “Designed for Everyone.” In this brief post, I want to highlight some of the ways in which this is in fact true. Beyond the more specialized features highlighted in the videos (a speech generating app, the VoiceOver screen reader, Made for iPhone hearing aids and Switch Control), there are many other Apple accessibility features that can help everyone, not just people with disabilities:

  • Invert Colors: found under Accessibility > Display Accommodations, this feature was originally intended for people with low vision who need a higher contrast display. However, the higher contrast Invert Colors provides can be helpful in a variety of other situations. One that comes to mind is trying to read on a touch screen while outdoors in bright lighting. The increased contrast provided by Invert Colors can make the text stand out more from the washed out display in that kind of scenario.
  • Zoom: this is another feature that was originally designed for people with low vision, but it can also be a great tool for teaching. You can use Zoom to not only make the content easier to read for the person “in the last row” in any kind of large space, but also to highlight important information. I often will Zoom In (see what I did there, it’s the title of one of my books) on a specific app or control while delivering technology instruction live or on a video tutorial or webinar. Another use is for hide and reveal activities, where you first zoom into the prompt, give students some “thinking time” and then slide to reveal the part of the screen with the answer.
  • Magnifier: need to read the microscopic serial number on a new device, or the expiration date on that medicine you bought years ago and are not sure is still safe to take? No problem, Magnifier (new in iOS 10) to the rescue. A triple-click of the Home button will bring up an interface familiar to anyone who has taken a photo on an iOS device. Using the full resolution of the camera, you can not only zoom into the desired text, but also apply a color filter and even freeze the image for a better look.
  • Closed Captions: although originally developed to support the Deaf and hard of hearing communities, closed captions are probably the best example of universal design on iOS. Closed captions can also help individuals who speak English as a second language, as well as those who are learning how to read (by providing the reinforcement of hearing as well as seeing the words for true multimodal learning). They can also help make the information accessible in any kind of loud environment (a busy lobby, airport, bar or restaurant) where consuming the content has to be done without the benefit of the audio. Finally, closed captions can help when the audio quality is low due to the age of the film, or when the speaker has a thick accent. On Apple TV, there is an option to automatically rewind the video a few seconds and temporarily turn on the closed captions for the audio you just missed. Just say “what did he/she say?” into the Apple TV remote.
  • Speak Screen: this feature, found under Accessibility > Speech, is meant to help people with vision or reading difficulties, but the convenience it provides can help in any situation where looking at the screen is not possible – one good example is while driving. You can open up a news article in your favorite app that supports Speak Screen while at a stop light, then perform the special gesture (a two-finger swipe down from the top of the screen) to hear that story read aloud while you drive. At the next stop light, you can perform the gesture again and in this way catch up with all the news on your way to work! On the Mac, you can even save the output from the text to speech feature as an audio file. One way you could use this audio is to record instructions for any activity that requires you to perform steps in sequence – your own coach in your pocket, if you will! (Developers: a short code sketch of this speech engine follows this list.)
  • AssistiveTouch: you don’t need to have a motor difficulty to use AssistiveTouch. Just having your device locked into a protective case can pose a problem this feature can solve. With AssistiveTouch, you can bring up onscreen options for buttons that are difficult to reach due to the design of the case or stand. With a case I use for video capture (the iOgrapher), AssistiveTouch is actually required by design: to ensure light doesn’t leak into the lens, the designers of this great case covered up the sleep/wake button. The only way to lock the iPad screen after you are done filming is to select the “lock screen” option in AssistiveTouch. Finally, AssistiveTouch can be helpful on older phones with a failing Home button.
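
A bonus for the developers out there: the same speech engine that powers Speak Selection and Speak Screen is available to apps through the AVSpeechSynthesizer API in AVFoundation. Here is a minimal sketch; the sample text and rate are placeholder values I picked for illustration, not anything Apple prescribes.

```swift
import AVFoundation

// Speak a short instruction with the system text-to-speech engine.
let synthesizer = AVSpeechSynthesizer()
let utterance = AVSpeechUtterance(string: "Step 1: mix the dry ingredients.")

// Pick a U.S. English voice and slow the default rate down slightly,
// which tends to work well for step-by-step instructions.
utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
utterance.rate = AVSpeechUtteranceDefaultSpeechRate * 0.9

synthesizer.speak(utterance)
```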

While all of these features live in the Accessibility area of Settings, they are really “designed for everyone.” Sometimes the problem is not your own physical or cognitive limitations, but constraints imposed by the environment or the situation in which the technology use takes place.

How about you? Are there any other ways you are using the accessibility features to make your life easier even if you don’t have a disability?

3 Ways the iPhone Has Disrupted My Life for the Better

The 10th anniversary of the iPhone announcement in 2007 was mentioned on a number of podcasts I listen to this past week, and this got me into a reflective mood. I can remember vividly where I was when the announcement took place. At the time I was a graduate student at the University of South Florida, and I watched the announcement on the big screen in the iTeach Lounge where I worked as a graduate assistant.

I must admit that at first I was a bit skeptical. The first version of the iPhone was pretty expensive, and it took me a year after the launch to decide that I wanted to get in on the fun. If I remember correctly, it cost me $399 for 8GB of storage when I bought my first iPhone from Cingular Wireless (remember them?). As cool as that first iPhone was, it took two important developments to make me a true believer. The first one was the release of the App Store in 2008, which opened up a world of possibilities limited only by developers’ imaginations. The second was the accessibility support announced with the release of the iPhone 3GS. After my first iPhone contract with Cingular was up, I actually returned to a traditional flip phone for a little while for my next phone. Once the accessibility support was announced, though, I was locked in. I have been an iPhone owner ever since.

In addition to the App Store and the built-in accessibility support, there are three other ways in which the iPhone has significantly disrupted my life, going beyond just giving me access to information and communication on the go.

A Better Set of Eyes

The iPhone couldn’t have come at a better time for me. At the time, my vision loss was getting to the point where using a traditional DSLR camera was becoming harder and harder. As I detailed in an article for the National Federation of the Blind’s Future Reflections magazine, the built-in accessibility features of the iPhone have allowed me to continue with my passion for capturing the beauty in the world around me. The way I see it, the iPhone is now “a better set of eyes” for me. Most of the time, I can’t be sure that I have actually captured a decent image when I aim the phone at a scene. It is not until later, when I am reviewing the images more carefully at home, that I notice small details I didn’t even know were in the frame. You can see some examples of my photography on my Instagram page.

Instagram collage showing best nine images of 2016.

Going forward, this idea of the iPhone as my “better set of eyes” is going to be important to me beyond photography. As my vision loss progresses, I will be able to rely on the iPhone’s ever improving camera to recognize currency, capture and read aloud the text in menus, business cards and more, and tell me if my clothes are exactly the color I intended. I have no doubt that “computer vision” will continue to get better, and this gives me hope for the future. Already, the VoiceOver screen reader can recognize some objects in your images and describe them aloud. This technology was developed to make searching through large image libraries more efficient, but it will be helpful to people with visual impairments like me as well.

Independence at the Touch of a Button

The second major way the iPhone has disrupted my life for the better is by giving me back my independence in a big way, through apps such as Uber and Lyft. Now, I know you can use these apps on other smartphones, so they are not exclusive to the iPhone. However, when you really think about it, no iPhone means no App Store. No App Store means there is no incentive for other companies to copy what Apple did.

Uber has replaced the many frustrations I had with public transportation and taxis (lateness, high fares) with a much more convenient and less expensive solution. Yes, I know some of my blind friends have had a number of issues with Uber (such as outright discrimination from drivers who are not comfortable with a guide dog in their vehicles), but this would probably happen with taxicabs too.

My own experience with Uber has been mostly positive. The service allows me to easily get to doctor’s appointments, and it provides a reliable way to get to the airport so that I can do my work of spreading the message of accessibility and inclusive design for education to a broader audience beyond my local area. Uber and Lyft, and the iPhone as the platform that made them possible, have really opened up the world to me.

Can You Hear Me Now?

One of the big trends at the Consumer Electronics Show (CES) this year was the presence of Alexa, Amazon’s voice assistant, on all kinds of consumer appliances. Alexa joins Apple’s Siri, Microsoft’s Cortana and Google’s Assistant in heralding a future where voice and speech recognition replace the mouse and the touch screen as the primary input methods for our computing devices. We are not quite there yet, but the accuracy of these services will continue to improve and I am already seeing the potential with some of the home automation functions that are possible with the existing implementations (having my lights be automatically turned on when I arrive at home, for example).

Here, again, the iPhone deserves quite a bit of credit. The release of Siri as part of the iPhone 4S in 2011 brought the idea of speech recognition and voice control to the mainstream. Previously, its use was limited mostly to individuals with motor difficulties or niche markets like the medical and legal transcription fields. Siri helped popularize this method of voice interaction and made it more user friendly (remember when you had to sit for several minutes training speech recognition software to recognize just your voice?).

Looking Ahead

The smartphone is a mature technology, and some have questioned whether it has reached its apex and will soon give way to other options for content consumption and communication. One possibility would involve virtual, augmented or even mixed reality. Given the visual nature of AR and VR, this gives me some of the same cause for concern I had at the release of the iPhone back in 2007. However, just as Apple took a slab of glass and made it accessible when few people thought it could be done, with some creativity we can make AR and VR accessible too.

We have come a long way in just 10 years (sometimes I find it hard to remember that it has only been that long). In that time, Apple has shown that “inclusion promotes innovation.” Accessible touch screens, voice-controlled assistants and ride sharing services are just a few of the innovations that have developed within the accessible ecosystem the iPhone started. Thank you Apple, and congrats on the 10th anniversary of the iPhone. Here’s to the next 10, 20 or 30 years of innovation and inclusion.

Read to Me in Book Creator 5

Read to Me Screen in Book Creator

Book Creator for iPad recently added a new Read to Me text to speech feature that allows learners to hear their books read aloud within the app (without having to transfer the book to iBooks first). The feature also supports accessibility in two other ways:

  • automatic media playback: all embedded media can be played automatically. This is great for those with motor difficulties, and it also creates a better flow during reading (no need to stop and start the text to speech to hear the embedded media).
  • automatic page flips: again, this is a great feature for those who can’t perform the page flip gesture to turn the pages in a book.

These options can be configured through a Settings pane where it is possible to change the voice (you can choose any voice available for Speech in iOS), slow it down, or remove the word by word highlighting that goes along with it. For better focus, it is also possible to show one page at a time by unchecking the “Side by side pages” option under Display.
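
A side note for the developers reading along: this kind of word by word highlighting is typically built on top of the system speech engine, which reports the range of each word just before it is spoken. Below is a minimal sketch using the AVSpeechSynthesizer delegate; this is my own illustration of the technique, not Book Creator’s actual implementation, and the print statement stands in for real highlighting code.

```swift
import AVFoundation

class ReadAloudController: NSObject, AVSpeechSynthesizerDelegate {
    let synthesizer = AVSpeechSynthesizer()

    func speak(_ text: String) {
        synthesizer.delegate = self
        synthesizer.speak(AVSpeechUtterance(string: text))
    }

    // Called just before each word is spoken; characterRange marks the word.
    func speechSynthesizer(_ synthesizer: AVSpeechSynthesizer,
                           willSpeakRangeOfSpeechString characterRange: NSRange,
                           utterance: AVSpeechUtterance) {
        let word = (utterance.speechString as NSString).substring(with: characterRange)
        // A real app would apply a highlight attribute to this range
        // in its text view; here we just log the word.
        print("Speaking: \(word)")
    }
}
```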

I created a short video to show how the new feature works (with a bonus at the end: how to fix the pronunciations with the new pronunciation editor built into the Speech feature in iOS 10).

4 New Accessibility Features of iOS 10 You Should Know

Apple today released iOS 10, the latest version of its operating system for mobile devices such as the iPad and iPhone. This post is a quick review of some of the most significant enhancements to the accessibility support in iOS 10, starting with a brand new feature called Magnifier.

Magnifier

With Magnifier, users who have low vision can use the great camera on their devices to enlarge the text in menus, pill bottles, and other items where they might need a little support for their vision to read the content. Magnifier is found alongside Zoom (which enlarges onscreen content) in the Accessibility Settings. Once it is enabled, you can activate the Magnifier by triple-clicking the Home button.

While a number of existing apps such as Vision Assist and Better Vision provide similar functionality, having this functionality built into the OS should improve performance (through faster focusing, better clarity made possible by accessing the camera’s full native resolution, etc.). Magnifier has the following options:

  • a slider for adjusting the zoom level (or you can pinch in and out on the screen)
  • a shutter button that freezes the image for closer inspection – you can then pinch to zoom in on the captured image and drag on the screen with one finger to inspect a different part of it
  • a button for activating the device flash (on devices that have one) in torch mode so that you get a bit more light in a dark environment
  • a button for locking the focus at a given focal length
  • a button for accessing a variety of filters or overlays

The available filters include: white/blue, yellow/blue, grayscale, yellow/black, and red/black. For each of these, you can press the Invert button to reverse the colors, and you can do this while in the live view or with a frozen image. Each filter also provides a set of sliders for adjusting the brightness and contrast as needed.

Display Accommodations

Display Accommodations is a new section in the Accessibility Settings that brings together a few existing display options (Invert Colors, Grayscale, Reduce White Point) with a couple of new ones (Color Tint and options for three different types of color blindness).

Color filters pane has options for Grayscale and color blindness filters.

For those who have Irlen Syndrome (Visual Stress) there is a new option in iOS for adding a color tint over the entire display. Once you choose this option, you will be able to use a slider to specify the intensity and hue of the filter.

Color Filters with sliders for intensity and hue

Speech Enhancements

In addition to word by word highlighting, the text to speech options in iOS 10 (Speak Selection and Speak Screen) will now provide sentence by sentence highlighting as well. By choosing Highlight Content in the Speech Settings you can configure how the highlighting takes place: you can have only the words highlighted, only the sentences, or both, and you can choose whether the sentence highlight will be an underline or a background color (though you still can’t choose your own color).

A new Typing Feedback setting can help you if you find you are often entering the wrong text. You can choose to hear the last character or word you typed (or both). For the character feedback, you can specify a delay after which the character will be spoken and even whether a hint (“t, tango”) is provided. An additional setting allows you to hear the QuickType suggestions read aloud as you hover over them, to make sure you are choosing the right prediction.

The entire Speech system can also take advantage of some additional high quality voices: Allison, Ava, Fred, Susan, Tom and Victoria for U.S. English. Some of the voices (such as Allison) have both a default and an enhanced version, as has been the case with previously introduced voices, and you can preview each voice before downloading it by tapping a play button. An edit button allows you to remove voices you are not using if you are running low on space (you can always download them again).
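
If you are curious exactly which voices (and which versions) are installed on a given device, a developer can list them in a few lines; a quick sketch:

```swift
import AVFoundation

// List every text-to-speech voice installed on the device,
// noting whether the enhanced version has been downloaded.
for voice in AVSpeechSynthesisVoice.speechVoices() {
    let quality = (voice.quality == .enhanced) ? "enhanced" : "default"
    print("\(voice.name) (\(voice.language), \(quality))")
}
```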

VoiceOver Pronunciation Editor and New Voices

I’m sure the team at AppleVis will do a complete rundown of VoiceOver in iOS 10, so here I will just highlight one feature that I am really happy about: the new Pronunciation Editor. After all this time, I can finally get VoiceOver to get a little bit closer to the correct pronunciation for my name (the accent in Pérez still throws it off a little).

The Pronunciation Editor is found under VoiceOver > Speech > Pronunciations. Once there, press the Add (+) button, enter the phrase to be recognized and then either dictate or spell out the correct pronunciation. You can restrict the new pronunciation to specific Languages, Voices and apps, or choose All for each option for a more global availability.

In addition to the pronunciation editor, VoiceOver can take advantage of all the new voices for the Speech system in iOS 10: Allison, Ava, Fred, Susan, Tom and Victoria for U.S. English (each with an enhanced version). Like the Alex voice, you will have to download each of these new voices before you can use them, but you can preview each voice before downloading it.

These are just a few of the new accessibility features in iOS 10. Others include:

  • the ability to auto-select the speaker for a call when you are not holding the iPhone to your ear.
  • an option for routing the audio for VoiceOver: you can hear the speech on one channel and the sound effects in the other when wearing headphones.
  • Switch access for Apple TV, which allows you to navigate the interface of the Apple TV using the Switch Control feature on your iOS device.
  • a new option for Switch Control Recipes that allows you to create a hold on point action right from the scanner menu. Before, you could only create a tap action in this way.

And of course, there are other enhancements to the rest of the Apple ecosystem which I will cover in their own blog posts as they become available: Siri for the Mac, Taptic Time for Apple Watch, new activity settings on Apple Watch for wheelchair users, and more.

Finally, there is the new hardware Apple announced just last week, which will soon be shipping. Apple Watch has a faster processor and a better display (helpful for those with low vision), and the iPhone 7 and 7 Plus come with even better cameras (12 MP, with two cameras for 2X zooming on the larger model). As both a visually impaired photographer and someone who focuses on accessibility features that use the camera (Magnifier, Optical Character Recognition apps to convert print into digital content), I find this very exciting.

What are your thoughts on Apple’s recent announcements? Are you upgrading to the new devices? Which features have you most excited?

Let’s Get Cooking with Recipes for Switch Control

In my last post I focused on Recipes, a feature of Switch Control for iOS devices that can help switch users more efficiently perform repetitive actions such as flipping the pages of a book. This week, I will focus on how to set up these Recipes with step by step directions.

The first step to get the most out of Recipes is to connect a switch interface to your device. While iOS allows you to use the screen as a switch source (tapping on the screen will be recognized as a switch press), having a switch interface with at least two switches will provide more options: for example, you can set up one switch to flip the page in one direction, and the other to flip it in the opposite direction. Some of my favorite switch interfaces are as follows:

Each of these switch interfaces will allow you to connect at least two switches (typically a round button you press to perform an action on your device). The wireless switch interfaces will connect to your device over a Bluetooth connection. Pairing instructions will vary by device, but if you have paired a Bluetooth keyboard or headset to your device before, the steps will be familiar. The wired switch interfaces will typically use a Lightning connection.

Once you have your switch interface connected and you have configured at least two switch sources, you can proceed to create a new Recipe for flipping the pages in a book (Important: I highly recommend setting up the Accessibility Shortcut before trying these steps – this will allow you to triple-click the Home button if you get stuck at any time and need to turn off Switch Control):

  1. Go to Settings > General > Accessibility and choose Switch Control (under Interaction).
  2. Tap Recipes and choose the Turn Pages option.
  3. Tap Assign a Switch and follow the onscreen prompts to select one of your switches and assign the desired action (a Right to Left Swipe or a Left to Right Swipe).
  4. Repeat step 3 to assign the second action to a different switch (a swipe in the opposite direction).
  5. Navigate back to the screen listing your switches and their actions, then choose one of the switches and assign its Long Press action to Exit Recipe. This will allow you to switch back to the typical mode of operation for Switch Control when you are ready to step out of the Recipe.

That’s it. Your switches will be ready to use. Recipes are accessed through the Scanner Menu that pops up by default when you make a selection. You can review last week’s post to see this Recipe in action.

Want to learn more about Switch Control? You should really check out Handsfree on the iBookstore. This is a book I co-authored with switch master Christopher Hills. The book has more than 20 closed captioned videos and step by step instructions for every aspect of using Switch Control for access and inclusion.

5 Accessibility Features Every Learner Should Know

On the occasion of Global Accessibility Awareness Day (GAAD) this week (May 19th), I created this post to highlight some of the iOS accessibility features that can benefit a wide range of diverse learners, not just those who have been labeled as having a disability.

It’s built in.

Every iOS device comes with a standard set of accessibility features that are ready to use as soon as you take the device out of the box. Let’s take a look at a few of these features that can benefit all users in the spirit of Universal Design.

Get started by going to Settings > General > Accessibility!

#1: Closed Captions

Closed captions were originally developed for those with hearing difficulties, but they can help you if you speak English as a second language or just need them as a support for improved processing. Captions can also help if your speakers are not working, or the sound in the video is of poor quality.

In one UK study, 80% of caption users did not have any kind of hearing loss.

Learn how to enable and customize closed captions on your iOS device.
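
A side note for developers: apps can ask iOS whether the user has captions turned on and show captioned media by default. A minimal sketch, using the current Swift names for these UIKit calls:

```swift
import UIKit

// Ask iOS whether the user has turned on Closed Captions + SDH.
if UIAccessibility.isClosedCaptioningEnabled {
    // Show captioned media by default.
}

// Be notified if the user changes the setting while the app is running.
NotificationCenter.default.addObserver(
    forName: UIAccessibility.closedCaptioningStatusDidChangeNotification,
    object: nil,
    queue: .main
) { _ in
    // Re-check the setting and update the video player here.
}
```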

#2: Speech

All iOS devices support built-in text to speech with the option to turn on word highlighting. Starting with iOS 8, it is possible to use the more natural Alex voice, formerly available only on the Mac. TTS supports decoding, which frees you, the reader, to focus on the meaning of the text.

Breathe!: Alex takes a breath every once in a while to simulate the way we speak!

  • Learn how to enable and use Speak Selection on your iOS device.
  • Bonus tip!: Don’t want to make a selection first? No problem. Just bring up Siri and say “Speak Screen.” This will read everything on the screen!

#3: Safari Reader

Safari’s Reader is not really an accessibility feature (you will not find it in Settings) but it can help you if you find that you get distracted by all the ads when you are reading or doing research online. It is also a nice complement to the Speech features mentioned above. With iOS 9, you can now customize the appearance of the text and even change the background and font to make it easier to read when you surf the Web.

Left my heart in…: San Francisco is a new system font available in iOS 9. It is designed to be easier to read, and is one of the font options available for Reader.

Learn how to use Safari Reader when you surf the Web.

#4: Dictation

Whenever you see the iOS keyboard, you can tap the microphone icon to the left of the space bar to start entering text using just your voice. This can help you get your words down on the page (or is it the screen?) more efficiently.

Try It!: Dictation can handle complex words. Try this: Supercalifragilisticexpialidocious.

Dictation supports more than just entering text. Follow the link for a helpful list of additional Dictation commands.

#5: QuickType and 3rd Party Keyboards

QuickType is Apple’s name for the word prediction feature built into the iOS keyboard. Word prediction can help you if you struggle with spelling, and it can speed up your text entry as well. Starting with iOS 8, it is also possible to customize the built-in keyboard by installing a 3rd party app. The 3rd party keyboards add improved word prediction, themes for changing the appearance of the keys and more.

17 Seconds: World record for texting. Can you beat it?

Learn how to use QuickType and how to set up and use 3rd party keyboards.
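
For the developers in the audience, a 3rd party keyboard is just an app extension built around a subclass of UIInputViewController. The bare-bones sketch below (a single key that types one word, a toy example of my own rather than any shipping keyboard) shows the idea:

```swift
import UIKit

// A bare-bones custom keyboard: one key that types a word.
// Real keyboards draw a full layout and add prediction on top.
class HelloKeyboardViewController: UIInputViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        let key = UIButton(type: .system)
        key.setTitle("hello", for: .normal)
        key.frame = CGRect(x: 16, y: 16, width: 120, height: 44)
        key.addTarget(self, action: #selector(typeWord), for: .touchUpInside)
        view.addSubview(key)
    }

    @objc func typeWord() {
        // textDocumentProxy is the keyboard's channel to the active text field.
        textDocumentProxy.insertText("hello ")
    }
}
```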

Bonus Tips

Struggling to see the screen? Make sure to check out the Vision section in the Accessibility Settings. You can Zoom in to magnify what is shown on the screen, Invert Colors to enable a high contrast mode, make the text larger with Dynamic Type, and much more.

A Workflow for Independence – Logan’s Story

Recently I had the pleasure of meeting Logan Prickett, a second year student at Auburn University at Montgomery. Logan is an academically gifted STEM student and the inspiration behind The Logan Project at AUM, an initiative to develop software that will enable students who are blind or who have low vision to fully participate in all college-level math courses.

Luis and Logan.

At age 13, Logan suffered an anaphylactic reaction to the contrast dye in an MRI. His heart stopped beating on its own which left him without oxygen for 45 minutes. Logan believes that “a prayer chain that reached around the world was active during those 45 minutes and I credit God and those prayers for the heartbeat that brought me back to life.”

His time without oxygen left Logan blind, a wheelchair user, with fine motor control difficulties, and unable to speak above a whisper due to damage to his vocal cords that occurred during life-saving measures. Logan has the cognitive ability to do the work in his courses; he just needs a few technology supports in place to ensure his vision and motor challenges do not get in the way and prevent him from tapping his full potential. The goal of the Logan Project is thus to eliminate barriers for students with complex needs like Logan so that they can not only complete required math coursework but also pursue a career in a STEM field if they desire. This is a worthy goal given the underrepresentation of people with disabilities in STEM fields. You can learn more about it by typing The Logan Project into the search bar on the AUM website (aum.edu).

The Goal: Independent Communication

When I met with Logan and his team, the expressed goal was to get Logan started on the journey to independent communication, beginning with the ability to send and receive short messages with his family and those close to him. Logan had just acquired an iPhone 6 Plus, and we considered the use of Switch Control since Logan has enough motor control to press a switch. To accommodate his visual impairment, we decided that Logan would use Switch Control with manual scanning and the speech support turned on. This way he would be able to hear the items on the screen as he presses the switches to scan through them at a pace that works for him. The one problem with this setup is the possibility of fatigue from repeated switch presses. Siri seemed like a possibility for getting around this issue, but unfortunately Siri is not able to recognize Logan’s low whisper to allow him to quickly send a text message or initiate a FaceTime call. Surprisingly, FaceTime can pick up Logan’s whisper well enough that it can be understood on the other end of the call. Although he can be heard on an actual phone call as well, the audio on a FaceTime call is much better. Thus, if we could find a way to activate FaceTime with a minimum of effort, we would go a long way toward giving Logan an option for communication while he develops his Switch Control skills. That’s where the Workflow app comes in.

Workflow to the Rescue

I knew about the Workflow app because it made history as the first app to get an Apple Design Award for its attention to accessibility. In fact, at the Design Awards, members of Apple’s engineering team who are blind were the ones who actually did the demo of the app to show how well it works with the VoiceOver screen reader built into Apple’s mobile devices. You can watch the demo on Apple’s WWDC 2015 site (the Workflow demo starts at the 35 minute mark and goes through the 42 minute mark).

As the name suggests, Workflow is a utility for creating workflows that allow the user to chain together a series of actions to complete a given task. For example, as I often do tutorials with screenshots from my Apple Watch, I have created a workflow that automatically takes the latest Apple Watch screenshot saved to my Camera Roll on the iPhone and shares it to my computer using AirDrop so that I can quickly add it to a blog post or a presentation. This kind of workflow can save a lot of time and effort for tasks that you perform several times over the course of a day.

Workflow already includes many actions for built-in iOS apps such as Contacts, FaceTime and Messages. These actions can be chained together to create a workflow, with the output from one action used as the input for the next one in the chain. Thus, a workflow can consist of selecting an entry in the Contacts app and feeding its information into the FaceTime app to start a new call with the selected contact. In much the same way, the entry from the Contacts app can be combined with a Text action to start Messages, pre-fill the message body and automatically address the message. For Logan this kind of workflow would reduce the amount of work he would have to perform and allow him to send quick messages to his team, such as “I’m ready for pick up” or “class is running late.” There is even the possibility of sharing his location so that other team members can get an idea of where Logan is at different points in the day.
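
The chaining idea itself is simple enough to sketch in code. In the toy Swift example below, each action feeds its output to the next one, ending with a facetime:// URL of the kind iOS can open to start a call. The contact data is made up for illustration, and this is only a model of the concept, not how Workflow is actually built.

```swift
// A toy model of Workflow-style chaining: each action transforms
// the output of the previous one.
typealias Action = (String) -> String

let findContact: Action = { name in
    // Stand-in for the real Contacts action.
    let addresses = ["Mom": "mom@example.com"]
    return addresses[name] ?? name
}

let makeFaceTimeURL: Action = { address in
    "facetime://\(address)" // the URL scheme iOS uses for FaceTime calls
}

let actions = [findContact, makeFaceTimeURL]
let result = actions.reduce("Mom") { input, action in action(input) }
print(result) // facetime://mom@example.com
```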

Once a workflow has been created it is possible to add it as a shortcut on the Home Screen, with its own descriptive name, icon and color. By organizing these shortcuts on the Home Screen it is possible to create a simple communication system for Logan, giving him the ability to use Switch Control to independently start FaceTime calls, send quick messages and more.

Going Forward

The ultimate goal is to develop Logan’s ability to communicate independently, and this will require building up his skills as a new switch user. With time and practice, I have no doubt after getting to know Logan that he will become a proficient user of Switch Control. In the meantime, Workflow is a good option for building his confidence and giving him some good reasons to use those skills: communicating with those who are important to him with a minimum of effort. When he is ready, he could then add an augmentative and alternative communication (AAC) app such as Proloquo4Text to his arsenal of communication tools, as well as keyboards such as Keeble and Phraseboard that make it easier for switch users to enter text. Logan has demonstrated that he has the ability to do well in higher education; now we just have to figure out how to eliminate a few barriers that are standing in his way and preventing him from letting his ability shine.

What’s New in iOS 9 for Accessibility

With iOS 9, Apple continues to refine the user experience for those who have disabilities or just need additional supports to effectively interact with their iPhones and iPads. While there are only two new accessibility features in iOS 9 (Touch Accommodations and a new Keyboard pane for improved support of external Bluetooth keyboards), the existing features have received a number of enhancements. Probably the one that received the most attention in this update is Switch Control, which now includes a new scanning style, the ability to set separate actions for long presses, and Recipes for more easily performing repetitive actions such as turning the pages in a book in iBooks.

The first change you will notice when you go into the Accessibility pane in Settings is that things have been moved around just a bit. Really the only change is that the options for Interaction now follow those for Vision; everything else follows the same order as before. I like this reorganization: both VoiceOver and Switch Control significantly change how the user interacts with the device, so making Switch Control easier to get to in the Accessibility pane makes sense. The change also highlights the new Touch Accommodations feature by placing it near the top of the Accessibility pane.

This post is a short summary of each accessibility feature that is either brand new or enhanced in iOS 9, starting with the new Touch Accommodations feature.

Touch Accommodations

This brand new feature is largely targeted at people with motor difficulties who may have problems with the accuracy of their touches as they interact with the touchscreen on an iOS device. Touch Accommodations consists of three options: Hold Duration, Ignore Repeat and Tap Assistance. Before you start experimenting with these options, I would recommend setting up your Accessibility Shortcut so that Touch Accommodations is the only option listed. This way, if you get stuck while using Touch Accommodations, you can quickly triple-click the Home button on your device to exit out of the feature.

Hold Duration will require the user to touch the screen for a given duration before a touch is recognized. This can be helpful for someone who struggles with accidental presses. When Hold Duration is turned on, touching the screen will display a visual cue with a countdown timer. If the user lifts the finger before the countdown runs out, the touch is not recognized. With Ignore Repeat, multiple touches within the specified duration are treated as a single touch. This can be especially helpful when typing with the onscreen keyboard: a user with a tremor may end up tapping repeatedly on the same spot, resulting in many unwanted keypresses.

Tap Assistance can be set to use the Initial Touch Location or the Final Touch Location. The two options determine the spot on the screen where the touch is performed when you let go with your finger. With Initial Touch Location, you can tap and then move your finger around on the screen while a timer is displayed. If you let go with your finger during the countdown (which you can customize using the Tap Assistance Gesture Delay controls), the tap is performed where you first touched the screen. After the countdown expires, you can perform a gesture (a flick, swipe and so on) the way you are used to with iOS. With Final Touch Location, the touch is performed at the spot where you let go, as long as you do it within the countdown time. This can be a different spot than where you first touched the screen.

Additions to Switch Control

Switch Control is an iOS feature introduced in iOS 7 that provides access to touchscreen devices for a number of people who rely on external assistive devices. My friend Christopher Hills, with whom I am co-authoring a book on this feature (stay tuned on that front), is a good example of an expert user of Switch Control. Christopher has cerebral palsy and uses external switches to perform many of the gestures someone with typical motor functioning could do with their fingers on the touchscreen.

In iOS 9, Apple has continued the development of Switch Control with a number of new features:

  • A new Single Switch Step Scanning style: this style requires the switch source to be continuously pressed until the user gets to the desired item. Letting go of the switch then highlights that item and gives it focus. With the default tap behavior, the next tap brings up the scanner menu; within the scanner menu, letting go of the switch immediately selects the option that has focus. A Dwell Time timing option determines how long it takes before an item is highlighted and the user can make a selection.
  • A new Tap Behavior: the Always Tap option is similar to Auto Tap in that it allows the user to make a selection with the first tap of the switch. However, with Always Tap, the scanner menu is available from an icon at the end of the scanning sequence instead of through a double-tap of the switch.
  • A Long Press action: the user can specify a separate action to be performed when the switch is held down for a specified duration. This is a great way to exit out of the Recipes feature.
  • Recipes: the user can invoke a special mode for Switch Control where each press of the switch can perform the same action. A couple of actions are already included, such as tapping the middle of the screen or turning the pages in a book. These are primarily intended for use in iBooks. Creating a new recipe is as easy as giving it a name, assigning the switch that will be used to perform the action that will be repeated with each press, and choosing one of the built in actions or creating a custom one. Custom actions for Recipes can include a series of gestures and their timings. To exit out of the Recipe, the user has two options: setting a timeout after which the recipe will be ended if no switch presses take place, or setting the action for a long press of the switch to Exit Recipe.

A new option allows the switch user to combine tap behaviors when using the onscreen keyboard. With the Always Tap Keyboard Keys option, the keys will be selected with a single press of the switch even if the tap behavior is set to the default of showing the scanner menu at the first tap of the switch.

Customizable AssistiveTouch Menu

The layout of the AssistiveTouch menu can now be customized, with options for changing the number of items shown on the top level and swapping in icons for features on secondary menus that are used more often. The number of icons on the top level menu can be set to as few as one and as many as eight. Tapping on any of the icons in the Customize Top Level Menu pane will open a list of all of the features supported by AssistiveTouch. Selecting an item from the list will move that option to the top level menu. Change your mind? No problem, a Reset option is available (in fact, I would love to see similar Reset options for other features such as VoiceOver and Switch Control).

Better Support for Bluetooth Keyboards

Under Interaction, you will find a new Keyboard option. Tapping that option will open a separate pane with options intended for those who use an external Bluetooth keyboard with their iOS devices:

  • Key Repeat: turns off the key repeat (it is enabled by default) in order to prevent multiple characters from being entered when a key is held down on the keyboard. The options for customizing this feature include adjustments for the delay before a key that is held down starts repeating, as well as how quickly the key repeat will take place.
  • Sticky Keys: allows the user to press the modifier keys for a keyboard shortcut in sequence rather than having to hold them down all at once. The options for this feature include a quick way to turn it on by pressing the Shift key quickly five times, as well as playing a sound to alert the user when it has been turned on.
  • Slow Keys: changes how long the user has to hold down a key before it is recognized as a keypress (essentially a hold duration). The only option for this feature is adjusting how long the key has to be pressed before it is recognized.

The one option for the onscreen keyboard in the Keyboard pane addresses a usability problem by making the switch between lower case and upper case more prominent. By default, the keys on the onscreen keyboard are in lower case and only switch to uppercase when the shift key is pressed.

Tweaks to VoiceOver and Zoom

The Rotor in iOS 9 has two new options available: Text Selection and Typing Mode. The latter is not a new feature or input mode; it just can now be changed through the Rotor. The former lets the user more easily select text by character, word, line, or page (or select all): after choosing Text Selection in the Rotor, flick up or down with one finger to set the granularity, and a flick to the right will then select the text by the chosen granularity (word, line, etc.).

A new option allows users of external Bluetooth keyboards to change the VoiceOver keys from Control + Option to the Caps Lock key. Finally, users can now adjust the Double-tap Timeout at the bottom of the VoiceOver settings pane. This setting may be helpful to a VoiceOver user who also has motor difficulties and can’t perform the double-tap as quickly.

For Zoom, the only change is that the option for choosing different Zoom Filters is now available from the Zoom settings panel where before it could only be selected from the Zoom menu available after tapping the controller or the handle on the Zoom window.

Other Options

iOS 9 includes options for disabling the Shake to Undo feature as well as all system vibrations, both of which can be found under Interaction in the Accessibility pane.

As is often the case with iOS updates, a number of features that are not explicitly labeled as accessibility features can benefit those who use assistive technologies. One example is the new Siri Suggestions feature, which can be displayed with a swipe to the right from the Home screen. The suggestions include frequently contacted people, recently used apps, locations and more. Anything that puts less distance between users of VoiceOver and Switch Control and the information they need is a good thing in my book.

That’s it for this high level overview of the major (and some minor) changes in iOS 9 that impact those who rely on the accessibility features. I hope you have found it helpful.