Overview of VoiceOver on 4th Generation Apple TV (Video)

VoiceOver was already available on older Apple TV models, but the touchpad on the new Siri Remote allows it to be an even more robust accessibility solution on the new 4th generation model. This video provides an overview of the various gestures VoiceOver supports on the new Apple TV, including the Rotor gesture that can be used to change VoiceOver settings such as the speech rate.




Now on iBookstore: Supporting Students with Low Vision Using Apple Technology

My new book focusing on accessibility for Apple users who have low vision is now available for download from the iBookstore.

Cover of Supporting Students with Low Vision Using Apple Technology

The book includes more than 25 short video tutorials (closed-captioned) to go along with explanations of the built-in accessibility features of iOS devices, the Mac, Apple TV and even Apple Watch (I was only able to record one video on Zoom by visiting my local Apple Store, since I don’t yet have access to an Apple Watch – more videos on Apple Watch will be added in a future update). The book also has a section on apps for those with low vision, as well as some tips for creating more accessible iBooks content for those who have low vision. A final section focuses on accessories I use as a person with low vision (stands, bone conduction headphones and the like).

I hope you enjoy the book, find it a valuable resource and will provide me with feedback so that I can make future updates even better.

To celebrate the release of this new book, I have made a few updates to my previous book (A Touch of Light), which is now FREE.

Apple Watch Review Roundup and Thoughts

With the Apple Watch finally arriving in stores for trial and then initial deliveries starting on the 24th of April, a number of reviewers have had hands-on time with the device and shared their first impressions. These include Steven Aquino at iMore and David Woodbridge at AppleVis, both of whom have done an excellent job with their reviews. Apple has also created a nicely laid out page describing the key accessibility features that are available on the Apple Watch, which it divides into two categories: vision and hearing. To summarize, the Apple Watch continues Apple’s excellent track record of including accessibility and universal design features on all of its new products. On the new Apple Watch, these features include:

  • The VoiceOver screen reader
  • Zoom screen magnification
  • An Extra Large Watch Face option
  • Large Dynamic Type and Bold Text
  • Reduce Motion and Transparency
  • Grayscale and On/Off Labels for those with color difficulties
  • Mono Audio and balance control for those with hearing loss
  • An Accessibility Shortcut (triple-clicking the Digital Crown) to enable accessibility features such as Zoom and VoiceOver

Features can be enabled or disabled on the device itself or in the companion app that runs on the iPhone. From David’s review, it looks like learning the accessibility gestures of the Apple Watch will be fairly easy if you are already familiar with these features on the iPhone or iPad. For example, with Zoom, instead of using three fingers to zoom in/out and pan around the display, you use two fingers to account for the limited screen real estate. The zoom level is adjusted much as it was on iOS before a slider was added in iOS 8, except that instead of double-tapping and holding with three fingers and then sliding up or down, on the watch you do it with two fingers. Similarly, for VoiceOver you can flick left or right to move by item, move your finger over the screen to navigate by touch, or double-tap with one finger to activate a control or launch an app. This is one of the things I have always appreciated about Apple’s approach to accessibility: what you learn on one device usually translates to other similar devices, reducing the time it takes to become proficient with the accessibility features, even in a new product category like the Apple Watch.

There is surprisingly little mention of features for those with motor challenges on Apple’s page for the Apple Watch or in the reviews I have read. This is puzzling given that the Apple Watch relies on the Digital Crown as the main way of interacting with the device. People with limited motor skills may find the Digital Crown difficult to operate, though initial reviews indicate that it is much easier to turn than the crown on a traditional analog watch. Sure, the user can control the device with Siri’s voice recognition, but Siri may not work accurately in environments with a lot of ambient noise, or for people who have speech difficulties.

One area that has not gotten as much attention in the reviews I have read is the device’s potential as a communication aid. With a press of the side button, the user can bring up a list of key people (parent, caregiver, etc.) to communicate with, not only through a phone call or a message, but with sketches (quick drawings that animate on the other end if the other person also has an Apple Watch). For people with autism and related disabilities, this more visual way of communicating could be very helpful. Taps are also supported for custom messages between two Apple Watch wearers who have hearing difficulties (the taps are felt on each end of the conversation as a silent tap pattern on the wrist). The true potential of the Apple Watch as a communication device will probably not be known until the device’s app store matures. One app I would love to see would let a user with hearing loss hold up the watch to another person’s face and have the audio amplified through a Bluetooth headset (similar to the Live Listen feature for hearing aids on the iPhone).

Then there is the integration of the Apple Watch with environmental solutions Apple has developed: Apple Pay, HomeKit and iBeacons. As Steven Aquino suggests, Apple Pay and Apple Watch will make payment at places of business easier for those with motor difficulties, who will not have to fumble around trying to get the phone out of their pockets or a handbag to pay. The same goes for checking in for a flight at the airport with Passbook. Similarly, an app for Starwood Hotels will let you open your hotel room from the watch. HomeKit could also make the Apple Watch the controller for a number of devices, from lights to the alarm, the garage door and more (such as the Honeywell Lyric app for controlling a thermostat). As for iBeacons, their integration with the Apple Watch could be used to develop educational activities that add a kinesthetic component to learning (hints or prompts that are activated based on the user’s location in relation to a classroom-based beacon and more).

I believe the App Store will be key to unlocking the full potential of this new product, and if the iPhone is any indication it will grow rapidly, especially after Apple holds its Worldwide Developers Conference this summer.

As for me, I will probably not be getting this first version of the Apple Watch. I want to see how the battery life holds up, as this is an important consideration for an accessibility device you need to rely on for an entire day. Any device, no matter how fancy, is of limited use if the battery dies when you most need it. If history holds, future versions will be better in this area. The Apple Watch is also expensive at $350 (for the Sport version). As a person with a limited tech budget, I have to think about which device gives me the most bang for the buck. As a photographer, that device remains the iPhone, which I hope to upgrade as soon as my current contract runs out this fall. Wouldn’t it be nice if in the future you could get both devices as a bundle, with a discount for buying them together? I wouldn’t hold my breath on that one…

New YouTube Video: Picto4Me Google App

Screenshot of Picto4Me app with simple Yes/No Board

As I was preparing for my Lunch and Learn with MACUL’s special ed folks today (I will be a featured speaker at MACUL in March), I came across a neat Chrome app that I want to share with all of you. The app is Picto4Me (free). It lets you create simple communication boards with speech support (it looks like it uses Google’s text-to-speech engine) and scanning support. It is a very easy tool to learn, and while it may not be as sophisticated as some of the commercial solutions out there, it can be a nice option for schools implementing Chromebooks. Here is a quick video overview of the app:

What do you think of this app? Let me know in the comments.

I Am More Powerful Book Project and World Usability Day

The following blog post is cross-posted on the Red Jumper blog.

World Usability Day

Thursday, November 13th is World Usability Day. According to the World Usability Day website, WUD is:

A single day of events occurring around the world that brings together communities of professional, industrial, educational, citizen, and government groups for our common objective: to ensure that the services and products important to life are easier to access and simpler to use. It is about celebration and education – celebrating the strides we have made in creating usable products and educating the masses about how usability impacts our daily lives.

For me, usability and accessibility are personal. The small steps people take that make things like websites, documents and technology products easier to use for people of all levels of ability have a big impact on my day to day life. Without usability and accessibility, it would not have been possible for me to complete my education or do the advocacy work I do today through this blog, my YouTube videos or my presentations.

I Am More Powerful Than You Think

To celebrate World Usability Day, a group of us are releasing a book with the title “I Am More Powerful Than You Think.” The idea behind the book is to show how technology empowers us as people of different levels of ability to pursue our dreams as students, teachers and world citizens.

There are a number of ways you can access the book:

  • you can download it from the iBookstore (this is the easiest way).
  • you can download a copy from Dropbox and open it in your Chrome web browser using the Readium app for Chrome to read the book on any device that can run that web browser, or
  • you can watch a YouTube preview which will auto-advance through each page and auto-play the embedded media. This is a great feature recently added to Book Creator. It is a great way to share the work broadly with the popular YouTube service and also a great way to collaborate. I used this feature to share drafts of the project with my collaborators as we went along.

Authoring the Book

To build the book, I used the Book Creator app from Red Jumper Studio. Why Book Creator? It is very easy to learn and use (great usability), includes features for making content accessible, and offers the flexibility we needed to tell our story in a way that models Universal Design for Learning principles. With UDL, information is provided in a variety of formats so that people can access it in the way that works best for them. Book Creator allowed us to each tell our stories of empowerment in three different ways:

  • a text blurb that can be read aloud with word highlighting, using the Speak Selection (text to speech) feature built into iOS devices. I tried to make sure the text was large enough (24px) for anyone with low vision to see.
  • a sound recording of the same blurb. Book Creator makes it very easy to record audio by tapping the Add (+) button, choosing Add Sound and then recording right into the device. As an alternative, you can also add a recording from iTunes, which I did for a few of the sound clips which were emailed to me by the rest of our team.
  • video: video was really important for this project. Video connects with people in a way that other formats just can’t. It has an emotional impact that is important for working toward change. One tip I learned after contacting the Book Creator team is that you need to make sure the video is in the right format. If you import video from the Camera Roll as I did, it will be in the QuickTime (.mov) format, and this will cause the book to be rejected when you submit it to the iBookstore (it will still work when you preview it in iBooks, but if you want to share it widely I recommend uploading it to the iBookstore). It’s a simple fix: with the video selected, open the Inspector and choose Format > M4V. That will ensure your video is in the right format for the iBookstore.
Changing the video format in Book Creator

The videos were actually how this idea first came to be. It all started with a series of tweets and emails after the release of Apple’s Powerful commercial for the iPhone, which ends with the phrase “I’m more powerful than you think.”  Not long after, Christopher Hills released his own More Powerful video and created the hashtag #iAmMorePowerfulThanYouThink on Twitter.

After that it was on. I created my own More Powerful video and asked other people in my personal learning network if they would like to contribute. We ended up with five beautiful videos covering a range of experiences from around the world:

  • I am a visually impaired photographer based in Florida and use my iOS devices for photography and creative expression.
  • Carrie is a special education teacher in Illinois and she uses technology to improve access to education for her students.
  • Christopher is in Australia and he’s a certified Final Cut Pro X video editor with his own video production business.
  • Daniela is in Spain and runs an accessibility consultancy.
  • Sady is a student at Full Sail University in Florida, but she lives in North Dakota and is able to pursue her cinematography degree online.
Contributors to I Am More Powerful Than You Think

We are diverse in terms of age, gender, geographic location and how we use technology, but we are united by a common mission: to show the world that technology can make a difference in people’s lives when it includes accessibility from the start.

For each person, there is also a page that documents our online presence using the Hyperlink feature in Book Creator. You can visit our websites, follow us on Twitter, view our YouTube videos and Instagram photos and more. This was important because the many places we post in and participate online are a big part of our stories as well. They build a narrative of what we do and how we do it that is important to understanding the impact of technology in our lives.

Accessibility Options

A nice picture accompanies the contact information, and it includes an accessibility description that can be read aloud by VoiceOver for someone who is blind. It was great of the Red Jumper team to include this feature in Book Creator, making books authored with the app more accessible to those who use a screen reader. It is important that accessibility be included not just in the app used to create the books (as it is with Book Creator) but in the content that app outputs. With accessibility descriptions, we can ensure that’s the case. You can learn how to add an accessibility description in Book Creator by watching this tutorial on my YouTube channel.
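Under the hood, an accessibility description typically travels with the book as the image’s alternative text in the exported ePub markup. As a rough sketch (the file name and wording here are hypothetical, not taken from our book), it might look like this:

```html
<!-- Hypothetical XHTML inside an exported ePub: the accessibility
     description entered in Book Creator becomes the image's alt text,
     which VoiceOver reads aloud in place of the picture -->
<img src="images/contributor-photo.jpg"
     alt="Portrait of a contributor smiling while holding an iPad" />
```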

Get Involved

We don’t want this book to be the end of this conversation. If you have a story of how technology makes you or someone you work with more powerful, we would love to hear it. Drop me a line or post a link to your story on Twitter with the hashtag #iAmMorePowerfulThanYouThink so we can find it.

The best way to share your story is to use Book Creator to build a few pages according to the template in our book. A nice feature of Book Creator is that you can email a book to another person who can collect several submissions and combine them into one book on one device. This feature makes it very easy to collaborate on global projects like this one. Along with the fact that you can use the app on both iOS and Android, this made it a great choice for us to quickly and easily publish this project. A big thanks to the Red Jumper team for continuing to build on the accessibility and usability of this great app.

New iTunes U Course on UDL and iOS Devices

I recently published a course on iTunes U called nABLEing All Learners with iOS Devices. The course is organized according to the nABLE framework or heuristic I use to help me with technology integration. In designing the course I have tried to incorporate a number of UDL principles as a model:

  • Multiple pathways (UDL Checkpoint 8.2: Vary demands and resources to optimize challenge). Throughout the course there are “Dig Deeper” posts that encourage learners to explore a given topic in more depth. This gives the learner some choice: skip these Dig Deeper posts and go through the course at a basic level designed to provide the most essential content, or follow the links and other resources available in these posts to go through the course at a more advanced level. The choice is there for the learner.
  • Accessible Materials (UDL Guidelines 1 and 2): I have addressed this guideline in a number of ways. All of the videos I have authored include closed captions (and soon I will be uploading transcripts as well). With the ebooks and other documents, I have paid attention to accessibility by adding tags for screen readers in the case of PDF, and by including options such as accessibility descriptions in the case of ePub. For third-party content, I have tried to choose accessible content as much as possible (the resources from CAST are great in this respect).
  • Prompts for reflection and discussion: Throughout the course, I have made use of the new Discussions feature of iTunes U to prompt learners to reflect on their learning. I am going to keep these discussions, but I think in the future I will add some activities with apps (apptivities if you will) to make the learning even more concrete.

I invite you to visit the course and enroll if you are interested in learning more about UDL and how to support its implementation with the wonderful accessibility features available on iOS.

iOS 7.1 Accessibility Update

The great news this week is that Apple has released iOS 7.1 with a number of improvements, as reported in this Macworld article. A number of publications, including Macworld, are reporting that the use of the camera as a switch is a new feature (in fact, this feature was already included when iOS 7 was released in the fall).

AppleVis also has a really nice writeup aimed at blind and low vision users that includes a list of the different VoiceOver bugs that have been fixed in this update.

I thought I was the only one having issues with the camera when I updated to iOS 7.0.6 recently. On my iPhone 5s, the camera was no longer recognizing faces and announcing them with VoiceOver as it had done before. It is great to see that this is fixed in iOS 7.1 (which I was able to test, and I am happy to report it works the way I am used to again).

As a quick summary, I made the following video, which compares some of the features as they work on iOS 7.0.6 (on the left) and iOS 7.1 (on the right). Features discussed in the video include:

  • Reduce motion now works in more places, such as with the multi-tasking animation.
  • Bold text also now works in more places, including the keyboard. This is more noticeable when the dark keyboard is shown, as it is when doing a search with Spotlight. I too wish there were an option to always use the dark keyboard (as was the case in an earlier beta), but this is an improvement for those who need additional contrast with the keyboard.
  • Button shapes can be added to make them more perceptible.
  • Increase contrast now includes options for darkening the colors and reducing the white point.

Accessibility news from the Apple iPad Announcement

This week Apple gave us quite a bit to look over. I am still trying to catch up with the many updates that were made available when Apple unveiled not only new Macs and iPads, but also a new version of OS X (now free!) and its iLife and iWork apps for the Mac (yes, now those are free too!). After spending a few days playing with Mavericks, as the new version of OS X is now called (I do miss the felines), here are a few noteworthy additions from an accessibility perspective:

  • Switch Control is now available for OS X. This feature was introduced in iOS 7, and for the most part it works the same way on the Mac as it does on iOS. Rather than writing a lengthy description of this new feature, I created a video for you:
  • Caption styles. This is another feature that first appeared in iOS 7 and now works in pretty much the same way on the Mac. You can create custom styles to make captions easier to read on your Mac. Again, I have created a video that shows how this works:

  • Creation of Speakable Items with Automator. Automator is a Mac app that allows you to create workflows for automating repetitive tasks on your computer. Speakable Items is found under Interacting in the Accessibility area of System Preferences, and it lets you control your Mac with your voice. You can perform commands such as launching apps, checking your email and more. With Mavericks, you can now create your own Speakable Items using Automator. This video shows you how. For students with physical and motor challenges, being able to automate actions so that they can be performed with speech opens up a lot of possibilities.
  • Improved dictation. In Mountain Lion, Dictation worked well, but it was limited to short phrases and only worked when you had an Internet connection. In Mavericks, Dictation can work while you are offline, and it has been improved so that you can speak your text continuously. As before, you start Dictation by pressing the Function key twice, but you no longer have to do that again to see your text in your editing software; you can just continue speaking and the text will appear as you speak. I see so many applications of this feature for working with students who have writing difficulties, since they will now get almost real-time feedback as they write. The one thing to note is that enabling this feature requires an 800 MB download so that it can work offline. To me, that’s a small price to pay for adding this cool new feature to my Mac.

Now, Mavericks was not the only big announcement. New versions of iWork and iLife, as well as iBooks Author were also announced. And iBooks and Maps finally come to the Mac. I really like the simpler design of all the iWork apps, and their support for VoiceOver has improved. However, there were two other changes that I found especially exciting:

  • The iWork apps now allow you to enter an accessibility description for your images in the new Format pane. This is huge for giving people the option to create more accessible documents. I also found that when I exported my Pages documents as ePub books, the image descriptions were preserved. This fix addresses what I saw as a big shortcoming with the old version of Pages.
  • Embedded closed captioned videos are supported. I do a lot of presentations, and when I present I try to model what I preach by including captions in my videos. However, in the past I had to jump through a few hoops to get my captions to show up (such as creating a captioned video file and then screen recording it before adding it to Keynote). No need to do that anymore. I can just drag my video that includes captions into my Keynote deck and it will even do the optimization in case I want to add the Keynote file into an iBooks Author project.

Speaking of iBooks Author: it now appears to preserve the captions when you add a Media interactive. This was a big problem before, where you had to use Compressor (not the friendliest program for the teachers I often work with) to combine the original video with a captions file created with MovieCaptioner. Now I can just export my video out of MovieCaptioner using the SCC Embed with QT option and drag it right into an iBooks Author project, and it works with no error warning. iBooks Author will do the compression (optimization) for me. One tip is to make sure your video matches the specs for video on the iPad as much as possible; otherwise, this optimization, which you cannot disable, will take quite a long time. Previewing your captions in a book is easier too, since iBooks is included with Mavericks and you do not have to connect your iPad to preview your book.

The new iBooks app for the Mac is pretty much what you would expect if you have used the iOS version. All of the supports our students need are there: highlighting, notes, dictionary lookup, study cards for multi-touch books, etc. I really like that you can see the notes in the margin by pressing Command+3, which works really well in full screen mode to create a nice reading experience. Another nice feature is that you can open two books at once, which helps if you have a second book you need to keep referring to while reading. Speak Selection is available from a contextual menu when you select text, but I was surprised that word highlighting is not included. This is one of my favorite features of Speak Selection on iOS and what makes it such a valuable tool; I hope it gets added soon. My other beef is that some of the buttons at the top when you’re reading a book are missing labels for VoiceOver. Overall, I think having iBooks on the Mac will be welcome news to many educators, and I’m really excited about the convergence of the two platforms. It makes things much easier for those of us who need accessibility support, as we are not really learning different platforms given all the similarities between iOS and OS X.

On the hardware front, I was most excited about the new iPad mini with Retina. After having the original mini, I don’t see myself going back to the larger iPad. I just love its portability, and it does everything I need it to do. Having Retina is not a huge deal for me (my own retinas don’t really know the difference), but having a better chip will make a difference if it leads to improved performance for VoiceOver, Speak Selection and all those accessibility features I love to use. I can’t wait to get my hands on a 32GB model.

After doing all of the updates on the many devices I own and use, I’m still learning about all that is new. Did I miss anything? Let me know and I will look into it. I’m always learning.

Overview of Accessibility Features in iOS 7

Update: My good friend and fellow ADE Daniela Rubio has created a similar post for our Spanish speaking friends on her Macneticos blog.

The long wait is over. It’s finally here: iOS 7, the latest and radically redesigned version of Apple’s mobile operating system. Along with the redesigned interface, iOS 7 has a number of new and updated accessibility features, which I will outline here (with videos to come soon). I will organize them according to the kinds of supports they provide.

The first thing you notice is that it is now easier to navigate to the accessibility area in the Settings. In iOS 6, Accessibility was toward the bottom of the General pane. In iOS 7, it is much closer to the top of the pane, so you don’t have to scroll. A small change, but one that will hopefully get more people to explore these settings and become aware of the powerful assistive technology built into their devices. It will also aid navigation for the people who actually use features like VoiceOver and Switch Control.

Visual Supports

  • Large cursor for VoiceOver: you can now choose to have a larger, thicker cursor when VoiceOver is enabled. This is great for me, as I always had a difficult time seeing the old cursor’s faint outline. This option is found at the bottom of the VoiceOver pane.
  • Enhanced voices and language support: The Language Rotor option for VoiceOver has been replaced with a Languages and Dialects pane which provides a lot more flexibility. In this pane, you can specify a default dialect for your language (U.S. English, Australian English, etc.) and add languages to the rotor like you could in iOS 6. For each dialect or language, you can now download enhanced versions of the voices as well as separately control the speech rate.
  • VoiceOver’s option to use phonetics now has a few options (off, character and phonetics, and phonetics only), whereas before you could only turn the feature on and off.
  • You can use a switch to disable the VoiceOver sound effects. These are the sound cues that let you know when you are at the edge of the screen and so on.
  • New options in the VoiceOver rotor: you can add an option for turning sound effects on and off to the rotor, and there is a new handwriting option. Updated (09/18/13, 3pm): The handwriting option allows you to enter text using your handwriting. For example, you can open the Notes app and start entering text by using the screen as a canvas where you write your text. The handwriting mode supports a number of gestures: a two finger swipe left deletes, a two finger swipe right adds a space, and a three finger swipe right adds a new line. You can also switch between lower case (the default), upper case, punctuation and numbers by swiping up with three fingers. For navigation on the Home screen, you can enter a letter and VoiceOver will announce the number of apps that start with that letter (even if they are not on the current screen). If there are several apps that start with the same letter, you can swipe up or down with two fingers to navigate the list, then double-tap with one finger to open the desired app when it is announced. The handwriting option also works on the lock screen, where you can use it to enter the numbers for your passcode (it even defaults to numbers). In Safari, you can use the handwriting feature to navigate by item type (for example, you can write “h” for headings or “l” for links, then swipe up or down with two fingers to navigate the various headings, links, etc.).
  • Updated (09/18/13, 3pm): VoiceOver has a new gesture for accessing the help from anywhere in iOS: a four finger double-tap will allow you to practice VoiceOver gestures. When you’re done, a second four finger double-tap will exit the VoiceOver help.
  • Enhanced braille support: VoiceOver now supports Nemeth Code for equations, and there is an option for automatic braille translation (supporting U.S., Unified and United Kingdom options).
  • The Large Text option is now called Dynamic Type and it can work with any app that supports the feature rather than the limited set of built-in apps in previous versions of iOS. The size of the text is controlled using a slider rather than by choosing from a list and a live preview shows how the text will appear.
  • Bold type and other visual appearance adjustments: overall, iOS 7’s new design has less contrast than previous versions. However, in addition to larger type, there are a number of adjustments you can make to the UI to make items on the screen easier to see. You can make text bold (requires a restart), increase the contrast when text appears against certain backgrounds, remove the parallax motion effect, and enable On/Off labels. I’m guessing the On/Off labels are aimed at people who are color blind: the feature adds a small mark to indicate when a control is in the on or off position, which is helpful because green is used quite a bit throughout the interface and changes in state could be difficult to perceive for those who cannot distinguish this color.

Auditory Supports

The big addition here is a Subtitles and Captions pane. This pane brings the Closed Captioning support under the Accessibility area of the Settings, whereas before it was found under Videos. It is a global setting that will control closed captions throughout iOS.

In addition to providing a global control for closed captions, the Subtitles & Captioning pane allows you to select from several presets that make captions more attractive and easier to read. You can go further and specify your own styles for captions, with many options ranging from the font, text size, color and opacity of the captions themselves to the color and opacity of the box they sit on.

Learning Supports

Guided Access now allows disabling the Sleep/Wake and Volume buttons in iOS 7. You can also access the other options in your triple-click home shortcut (which has now been renamed the Accessibility Shortcut) while Guided Access is enabled. This will allow you to use VoiceOver, Zoom and other accessibility features along with Guided Access.

Like VoiceOver, Speak Selection has enhanced language support, including selection of different speaking rates for each of the supported languages and dialects as well as enhanced quality voices that are available for download as needed.

Both of these features are also supposed to get new APIs, which I will verify once I can locate apps that implement them. For Speak Selection, a new speech API will allow apps to tap into the built-in voice support of iOS. The idea is that by not having to include as much voice data, apps can be smaller and take up less space on the device. In the case of Guided Access, a new API will allow developers to hide parts of the screen to reduce distractions, building on the previous version’s ability to disable touch in certain areas of the screen.
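To illustrate the speech API mentioned above, here is a minimal sketch (in modern Swift; the API shipped in iOS 7 as part of AVFoundation) of an app handing text to the system’s built-in voices instead of bundling its own voice data:

```swift
import AVFoundation

// AVSpeechSynthesizer speaks text using the system voices, so an app
// no longer needs to ship its own voice data.
let synthesizer = AVSpeechSynthesizer()
let utterance = AVSpeechUtterance(string: "Hello from the built-in voices.")
utterance.voice = AVSpeechSynthesisVoice(language: "en-US") // choose a language/dialect
utterance.rate = AVSpeechUtteranceDefaultSpeechRate         // speaking rate is adjustable
synthesizer.speak(utterance)
```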

The built-in dictionary feature now supports additional languages which can be downloaded and managed in the Define popover. When you select a word in a foreign language and tap Define, iOS will open the definition in the appropriate language if you have that dictionary downloaded. This is a nice feature for language learners.

Motor Supports

Probably the biggest addition to accessibility in iOS 7 is Switch Control. This feature has the potential to do for people with motor and cognitive impairments what VoiceOver has done for the blind community. With Switch Control, items on the screen are highlighted sequentially by a cursor, and when the desired item is highlighted it can be activated by tapping the screen or a separate adaptive device connected to the iOS device over Bluetooth. A menu can also be brought up to access scrolling, saved gestures and a number of device functions such as clicking the Home button. Switch Control is highly configurable in iOS 7:

  • You can enable auto scanning and adjust its timing parameters, including the number of times the cursor will loop, how long you have to hold down the switch to activate an item (hold duration) and so on.
  • You can adjust the visual appearance and audio effects: for the visual appearance, you can choose a large cursor and select from a number of colors for the scanning cursor (I actually wish this feature were available for VoiceOver as well). For audio, you can choose to hear an audio cue when the cursor advances, as well as enable speech and adjust the speaking rate. This last feature may be helpful to someone who needs to use a switch device but also has low vision and needs audio cues for the items on the screen.
  • You can add multiple switch sources, and each switch source supports three options: external, screen and camera. The first two are pretty self-explanatory: you either tap an external device or the iOS device’s screen to activate an item. I set my iPad up to interpret a tap on the screen as a select action and my external switch (a Pretorian Bluetooth switch/joystick device) to pause scanning. The last option is pretty interesting: the camera can be set to recognize head movements as an action, and you can assign different actions to a right or a left head turn. When a head movement is added as a switch source, an option for adjusting the head movement sensitivity becomes available. One thing to note is that you should probably have your iOS device on a stand if you plan to use the camera as a switch source; otherwise, moving the device may prevent the camera from recognizing your face as intended.


Although not considered an accessibility feature, the improved Siri personal assistant, with higher quality male and female voices, could come in handy for people with disabilities when they wish to look up information or control their devices quickly. For example, Siri recognizes a number of new commands: you can turn some settings on and off with a simple command (“turn Bluetooth on” or “enable Do Not Disturb”), or navigate to specific areas of the Settings with a voice command (“open accessibility settings” or “go to accessibility settings”).

Similarly, the new Touch ID feature (currently available only on the iPhone 5S) should make it easier for individuals who are blind or who have cognitive disabilities to access the information on their devices. As great as VoiceOver is, entering text has never been a strength, even when it is just a few digits on the lock screen. Using the fingerprint reader built into the Home button of the iPhone 5S (and hopefully future iPads) will make it easier to unlock the device while also ensuring privacy. For individuals with cognitive disabilities, the passcode becomes one less thing to remember.

On the iPhone, the Control Center includes a Torch feature that uses the flash to provide a constant source of light. I can see this feature being useful for those who need to scan documents in order to perform OCR. Along with the improved cameras in the new phones released with iOS 7, the additional light could improve the performance of the scanning apps used by many people with print disabilities.

iOS 7 also added automatic updates for the apps you own. This could have accessibility implications: an app that is accessible in its current version may become inaccessible after an update. To prevent this from happening, you can turn off automatic updates in Settings > iTunes & App Store > Updates. The App Store also supports redeeming gift cards using the camera (a feature already available on the Mac with iTunes). For individuals with low vision, the redeem codes on iTunes gift cards can be difficult to read, and being able to scan the code with the camera makes the process much easier.

Of the new accessibility features, I am most excited about the captioning styles and Switch Control. These two features build on Apple’s strong support for the blind community to extend accessibility to even more people (especially so in the case of Switch Control and its potential impact for people with motor and cognitive disabilities). What are your thoughts? What are you most excited about in iOS 7 with regard to accessibility?

Accessibility Descriptions for Images in Book Creator for iPad

I commend the team at Red Jumper Studio, the creators of Book Creator for iPad, for adding an option that lets book authors add accessibility descriptions for images in version 2.7 of their app. This was already one of my favorite apps for content creation on the iPad, as it makes it really easy to create ebooks for the iPad that include images, videos and audio recordings. I created the following short video that shows how to add accessibility descriptions in Book Creator for iPad:
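For context on what an accessibility description does under the hood, here is a sketch in modern Swift (an illustration only, not Book Creator’s actual implementation, and the asset name is hypothetical): on iOS, VoiceOver speaks the accessibilityLabel attached to an element, so an image with a description is announced meaningfully instead of simply as an image.

```swift
import UIKit

// VoiceOver reads an element's accessibilityLabel aloud. Giving an
// image a label is what makes it meaningful to VoiceOver users.
let imageView = UIImageView(image: UIImage(named: "chapter-photo")) // hypothetical asset
imageView.isAccessibilityElement = true
imageView.accessibilityLabel = "A student reading an ebook on an iPad"
```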