Designed for (fill in the blank)

On the occasion of Global Accessibility Awareness Day (GAAD), Apple has created a series of videos highlighting the many ways its iOS devices empower individuals with disabilities to accomplish a variety of goals, from parenting to releasing a new album with a rock band. Each of the videos ends with the tagline “Designed for” followed by the name of the person starring in the video, closing with “Designed for Everyone.” In this brief post, I want to highlight some of the ways in which this is in fact true. Beyond the more specialized features highlighted in the videos (a speech generating app, the VoiceOver screen reader, Made for iPhone hearing aids and Switch Control), there are many other Apple accessibility features that can help everyone, not just people with disabilities:

  • Invert Colors: found under Accessibility > Display Accommodations, this feature was originally intended for people with low vision who need a higher contrast display. However, the higher contrast Invert Colors provides can be helpful in a variety of other situations. One that comes to mind is trying to read on a touch screen while outdoors in bright lighting. The increased contrast provided by Invert Colors can make the text stand out more from the washed out display in that kind of scenario.
  • Zoom: this is another feature that was originally designed for people with low vision, but it can also be a great tool for teaching. You can use Zoom not only to make the content easier to read for the person “in the last row” of any kind of large space, but also to highlight important information. I often will Zoom In (see what I did there, it’s the title of one of my books) on a specific app or control while delivering technology instruction live or in a video tutorial or webinar. Another use is for hide and reveal activities, where you first zoom into the prompt, give students some “thinking time,” and then slide over to reveal the part of the screen with the answer.
  • Magnifier: need to read the microscopic serial number on a new device, or the expiration date on that medicine you bought years ago and are not sure is still safe to take? No problem, Magnifier (new in iOS 10) to the rescue. A triple-click of the Home button will bring up an interface familiar to anyone who has taken a photo on an iOS device. Using the full resolution of the camera, you can not only zoom into the desired text, but also apply a color filter and even freeze the image for a better look.
  • Closed Captions: although originally developed to support the Deaf and hard of hearing communities, closed captions are probably the best example of universal design on iOS. Closed captions can also help individuals who speak English as a second language, as well as those who are learning how to read (by providing the reinforcement of hearing as well as seeing the words for true multimodal learning). They can also help make the information accessible in any kind of loud environment (a busy lobby, airport, bar or restaurant) where consuming the content has to be done without the benefit of the audio. Finally, closed captions can help when the audio quality is low due to the age of the film, or when the speaker has a thick accent. On Apple TV, there is an option to automatically rewind the video a few seconds and temporarily turn on the closed captions for the audio you just missed. Just say “what did he/she say?” into the Apple TV remote.
  • Speak Screen: this feature, found under Accessibility > Speech, is meant to help people with vision or reading difficulties, but the convenience it provides can help in any situation where looking at the screen is not possible – one good example is while driving. You can open up a news article in your favorite app that supports Speak Screen while at a stop light, then perform the special gesture (a two finger swipe from the top of the screen) to hear that story read aloud while you drive. At the next stop light, you can perform the gesture again and in this way catch up with all the news on your way to work! On the Mac, you can even save the output from the text to speech feature as an audio file (a minimal code sketch of this idea appears right after this list). One way you could use this audio is to record instructions for any activity that requires you to perform steps in sequence – your own coach in your pocket, if you will!
  • AssistiveTouch: you don’t need to have a motor difficulty to use AssistiveTouch. Just having your device locked into a protective case can pose a problem this feature can solve. With AssistiveTouch, you can bring up onscreen options for buttons that are difficult to reach due to the design of the case or stand. With a case I use for video capture (the iOgrapher), AssistiveTouch is actually required by design. To ensure light doesn’t leak into the lens, the designers of this great case covered up the sleep/wake button. The only way to lock the iPad screen after you are done filming is to select the “lock screen” option in AssistiveTouch. Finally, AssistiveTouch can be helpful on older phones with a failing Home button.
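
As mentioned in the Speak Screen item above, the Mac can save spoken text to an audio file. The built-in route is a menu option, but for anyone curious, here is a minimal programmatic sketch using AppKit’s NSSpeechSynthesizer; the output path and the instruction text are made-up examples, not part of any Apple feature.

```swift
import AppKit

// Minimal sketch: render text to speech into an audio file on macOS.
// The output path and the instructions string are hypothetical.
let synthesizer = NSSpeechSynthesizer()
let outputURL = URL(fileURLWithPath: "/tmp/workout-coach.aiff")
let instructions = "Step one: stretch for five minutes. Step two: jog for ten minutes."

if synthesizer.startSpeaking(instructions, to: outputURL) {
    // Rendering happens asynchronously; in a command-line tool, keep the
    // run loop alive long enough for the file to finish writing.
    RunLoop.main.run(until: Date().addingTimeInterval(10))
    print("Saved spoken audio to \(outputURL.path)")
}
```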

While all of these features live in the Accessibility area of Settings, they are really “designed for everyone.” Sometimes the problem is not your own physical or cognitive limitations, but the constraints imposed by the environment or the situation in which the technology use takes place.

How about you? Are there any other ways you are using the accessibility features to make your life easier even if you don’t have a disability?

5 Accessibility Features Every Learner Should Know

On the occasion of Global Accessibility Awareness Day (GAAD) this week (May 19th), I created this post to highlight some of the iOS accessibility features that can benefit a wide range of diverse learners, not just those who have been labeled as having a disability.


It’s built in.

Every iOS device comes with a standard set of accessibility features that are ready to use as soon as you take the device out of the box. Let’s take a look at a few of these features that can benefit all users in the spirit of Universal Design.

Get started by going to Settings > General > Accessibility!

#1: Closed Captions

Closed captions were originally developed for those with hearing difficulties, but they can help you if you speak English as a second language or just need them as a support for improved processing. Captions can also help if your speakers are not working, or the sound in the video is of poor quality.

In one UK study, 80% of caption users did not have any kind of hearing loss.

Learn how to enable and customize closed captions on your iOS device.

#2: Speech

All iOS devices support built-in text to speech, with the option to turn on word highlighting. Starting with iOS 8, it is possible to use the more natural Alex voice formerly available only on the Mac. Text to speech takes over the work of decoding, freeing you, the reader, to focus on the meaning of the text. (For the developers out there, a minimal code sketch of this kind of speech appears at the end of this section.)

Breathe!: Alex takes a breath every once in a while to simulate the way we speak!

  • Learn how to enable and use Speak Selection on your iOS device.
  • Bonus tip!: Don’t want to make a selection first? No problem. Just bring up Siri and say “Speak Screen.” This will read everything on the screen!
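
For readers who also build apps, here is a rough sketch of how this kind of speech with word highlighting can be driven in code. This is illustrative only, not Apple’s implementation of Speak Selection: it uses AVFoundation’s public AVSpeechSynthesizer API, and the print statement stands in for whatever highlighting a real app would draw. Alex must already be downloaded, so the sketch falls back to a default voice.

```swift
import AVFoundation

final class Reader: NSObject, AVSpeechSynthesizerDelegate {
    private let synthesizer = AVSpeechSynthesizer()

    override init() {
        super.init()
        synthesizer.delegate = self
    }

    func speak(_ text: String) {
        let utterance = AVSpeechUtterance(string: text)
        // Use Alex if it has been downloaded; otherwise fall back to a default voice.
        utterance.voice = AVSpeechSynthesisVoice(identifier: AVSpeechSynthesisVoiceIdentifierAlex)
            ?? AVSpeechSynthesisVoice(language: "en-US")
        synthesizer.speak(utterance)
    }

    // Called just before each word is spoken; a real app would highlight
    // this range in its text view instead of printing it.
    func speechSynthesizer(_ synthesizer: AVSpeechSynthesizer,
                           willSpeakRangeOfSpeechString characterRange: NSRange,
                           utterance: AVSpeechUtterance) {
        let word = (utterance.speechString as NSString).substring(with: characterRange)
        print("Speaking: \(word)")
    }
}
```

The per-word callback is what makes the highlighting possible: the app receives the range of each word just before it is spoken and can mark it on screen in sync with the voice.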

#3: Safari Reader

Safari’s Reader is not really an accessibility feature (you will not find it in Settings), but it can help you if you get distracted by all the ads when you are reading or doing research online. It is also a nice complement to the Speech features mentioned above. With iOS 9, you can customize the appearance of the text, changing the background and font to make reading easier when you surf the Web.

Left my heart in…San Francisco is a new system font available in iOS 9. It is designed to be easier to read, and is one of the font options available for Reader.

Learn how to use Safari Reader when you surf the Web.

#4: Dictation

Whenever you see the iOS keyboard, you can tap the microphone icon to the left of the space bar to start entering text using just your voice. This can help you get your words down on the page (or is it the screen?) more efficiently.

Try It!: Dictation can handle complex words. Try this: Supercalifragilisticexpialidocious.

Dictation supports more than just entering text. Follow the link for a helpful list of additional Dictation commands.

#5: QuickType and 3rd Party Keyboards

QuickType is Apple’s name for the word prediction feature now built into the iOS keyboard. Word prediction can help you if you struggle with spelling, and it can speed up your text entry as well. Starting with iOS 8, it is possible to customize the built-in keyboard by installing a 3rd party app. The 3rd party keyboards add improved word prediction, themes for changing the appearance of the keys, and more.

17 Seconds: World record for texting. Can you beat it?

Learn how to use QuickType and how to set up and use 3rd party keyboards.

Bonus Tips

Struggling to see the screen? Make sure to check out the Vision section in the Accessibility settings. You can use Zoom to magnify what is shown on the screen, Invert Colors to enable a high contrast mode, make the text larger with Dynamic Type, and much more.


7 accessibility features every teacher should know for back to school.

It’s that time of the year again. The supplies and textbooks have come in. The room is decorated. Soon students will be walking through the door and it’s off to the races. A new school year is upon us.

If you are lucky, you have a classroom set of iPads, or you may be in a BYOD situation where students are bringing their own devices to school. Did you know about some of the features built into the iPad and other iOS devices that can help you empower all learners to access the curriculum this year? No? Well, that’s what this post is about. You don’t need to have any students with IEPs or Section 504 plans to take advantage of these features. They are called universal design features because they can benefit any number of learners. Here are the top seven.

Embiggen the text

Yes, I know that’s not a real word (except maybe on The Simpsons) but it means to make the text larger. On iOS devices, this is easy to do, and by making the text bigger you will allow your learners to focus all of their energy on understanding the content rather than squinting and struggling to see it. To make the text larger, go to Settings > Display and Brightness > Text Size and use the slider to adjust the size as needed. Not big enough? No problem. Go to General > Accessibility > Larger Text instead. There you can turn on Larger Accessibility Sizes for even bigger text. While you are at it, you may as well turn on Bold Text to make that text really stand out.

Larger text option in Accessibility settings.

It’s like a negative

Well, at least to you…your students probably don’t know what a negative from the film days is, seeing as the only photos they look at are probably on Instagram or Facebook. In any case, reading on screen can be very tiring for our eyes, with all that light coming at us from the display. As our learners spend more of the day in front of the screen, one thing we can do is reverse the colors to help with eye strain. It’s really simple to turn this feature on and off, so why not try it? If it doesn’t work for you, you can easily go back to the default black on white scheme. Invert Colors can be found under Settings > General > Accessibility, or even better, just use Siri to turn this feature on and off by saying “Turn on Invert Colors.” The kids will love that trick.

New note in Notes app with Invert Colors turned on.

Let’s hear it for Alex

Alex is not a person, though if you spend some time listening to him reading content on the iPad you may begin to think he is. Alex is the built-in high quality voice that has been available on the Mac for a number of years. Guess what? Now it’s available as a download for use with the text to speech feature built into iOS, officially called Speak Selection. This feature can even highlight the words as it reads them aloud, which can be a big help to some struggling readers. The video explains Speak Selection in more detail.

Speak Selection works great with the Reader feature built into Safari, which removes ads and other distractions from the page. In the upcoming iOS 9 release the Reader feature gains controls for adjusting the text size as well as changing the background and font for even better legibility.

Let’s hear that again

Don’t want to select the text first? No problem. Speak Screen is activated with a special two finger gesture and will read everything that is on the screen (and I do mean everything). Once you turn on Speak Screen in the Speech settings, you can perform a two-finger swipe from the top of the screen (except on the Home screen) to hear everything read aloud. Even better, have Siri help out. Just say “Speak Screen” and it will start reading. You even get an onscreen controller for adjusting the speaking speed.

You complete me

Although it is not technically an accessibility feature, the word prediction built into iOS 8 (QuickType) can be a big help for learners who struggle with spelling or just have a hard time producing text. This feature should be turned on by default but if not you can enable it by going to Settings > General > Keyboard and making sure Predictive is turned on. When you start typing a word, suggestions should pop up in a strip just above the onscreen keyboard.

QuickType bar in iOS onscreen keyboard

Say it with me…Dictation is awesome.

Again, this is not technically an accessibility feature, but it can help those who struggle with typing on the onscreen keyboard by giving them another option: their voice. Just make sure it’s your announcer’s voice: speak clearly and enunciate after you tap the microphone icon to the left of the space bar to start dictating. You can even use a number of commands, such as “comma,” “period,” and “new line.”

Microphone icon to left of space bar activates Dictation in iOS.

CC is not just for copy

It also stands for closed captioning, a feature that is built into many videos for those who are unable to hear the audio. Closed captions can benefit a number of other learners: English language learners, struggling readers, and anyone learning a topic with specialized vocabulary where seeing the words as well as hearing them could be helpful (science, for example). And as a bonus, you will have a fallback for when your speakers just don’t work (because technology never fails, right?). You can enable the captions for a video that has them by going to Settings > General > Accessibility and choosing Subtitles & Captioning. You can even change the appearance of the captions to make them easier to read.

Have an Apple TV in your classroom? It too has support for captions. Just go to Settings > General > Accessibility > Closed Captions + SDH to turn them on. Just as with your iOS device, you can change the appearance of the captions on Apple TV.

There you have it. A few more things to have in your tool belt as you work to ensure access to learning for all of your students this year, which I hope will be a great one!

Global Accessibility Awareness Day: The Need for an Ecosystems Approach to Accessibility in Education

On the occasion of Global Accessibility Awareness Day, I am excited about the many online and face to face events that will mark this important step toward ensuring a more accessible and inclusive environment for those of us who have special needs. I will be presenting a session on the use of photography as a tool for disability advocacy as part of Inclusive Design 24, a free 24-hour event sponsored by The Paciello Group and Adobe Systems. Photography has long been a passion of mine, and I welcome any opportunity to share how I use it as an educator and advocate to challenge perceptions of ability/disability. I will also be sharing resources and insights during a #GAADILN Twitter chat sponsored by the Inclusive Learning Network of the International Society for Technology in Education (ISTE).

I love Global Accessibility Awareness Day (or GAAD, as I will refer to it from now on), but if there is one thing I would change, it is the name of the event. To me it should be Global Accessibility Action Day. With many of these types of events, the focus is on raising awareness of the needs of people with disabilities, as if we have not been asking for our rights for decades now (the ADA is more than two decades old, you know). GAAD gets it right by focusing on action. Organizations such as Shaw Trust Accessibility Services, Deque Systems and Accessibility Partners are offering a number of free services such as document audits, app accessibility consultations and website user testing. Many others are providing webinars and live presentations that aim at more than raising awareness by providing practical information on how to make documents, websites and apps more accessible. A review of the full list of events available on the GAAD website makes it clear that this event is about more than just awareness; it is about taking the next step for accessibility.

In my own field of education, I see much progress being made, but I also see the need for an ecosystems approach to inclusion and accessibility. When I think of ecology, I think about systems with a number of parts working together as one, the whole being greater than the sum of its parts. When it comes to students with disabilities, a number of technologies are now available as built-in options on the mobile devices many of them own. While I am a witness to the impact these technologies can have on the lives of students with disabilities (having been one who used these technologies myself), I believe their impact is limited when they are used in isolation rather than as part of a more comprehensive system.

What I would like to see is a change in thinking to focus on a systems approach that addresses what I see as the three As of accessibility:

  • Accessibility Features: companies such as Apple now include a comprehensive toolkit for accessibility that is built into the core of the operating system. This means that when I take my new Mac, iPhone or Apple Watch out of the box, it is ready for me to use without the need to purchase or install additional software. Not only that, but as my vision gets worse I know that I will be able to take my device out of the box and set it up independently, without having to wait for someone with better eyesight to help me. These built-in accessibility features have been life-changing for me. Without them I’m not sure I would have been able to pursue higher education and complete my master’s and doctoral studies. I also would not be able to do the photography that brings so much joy and beauty into my life. Unfortunately, not all educators know about even the most basic of these features built into the technology their districts have spent so much money to purchase. I am often surprised when I do presentations around the country (and sometimes in other parts of the world) by how little awareness there is among educators of the potential they literally hold in their hands to change a student’s life. We need to do better in this area of professional development to allow these tools to have an even greater impact on education for all students: not just students with disabilities, but any student who struggles with the curriculum and needs additional support.
  • Accessible Apps: the built-in accessibility features provide a great baseline for addressing the needs of people with disabilities, but they can’t do it all. There is just too much diversity and variability for that to be the case: not just in the traits and needs of users, but in the settings and tasks where technology is used. For this reason, it is often necessary to extend the capabilities of the built-in accessibility features by installing apps that provide greater customization options. A great example is the Voice Dream Reader app. While iOS has a robust text to speech feature with word highlighting that now supports a high quality Alex voice, Voice Dream Reader allows for even greater customization. The user can adjust the color of both the word and sentence highlighting, something which cannot be done with the built-in Speak Selection feature of iOS. For those who are blind and use the VoiceOver screen reader, the developer has done an excellent job of labeling all of the app’s controls. A companion Voice Dream Writer app even provides a special mode for VoiceOver users to make it easier for them to enter and edit text, showing a strong commitment to usability for all users on the part of this developer. Other examples of developers who are doing exemplary work when it comes to creating accessible apps include AssistiveWare (developers of Proloquo2Go, Proloquo4Text and Pictello, all apps with excellent support for VoiceOver and Switch Control) and Red Jumper (developers of the popular Book Creator app). The latter added an Accessibility option for images and other objects to help students and educators create accessible content with the app. Unfortunately, these developers are still the exception rather than the rule. With too many apps, swiping through with VoiceOver results in hearing “button” over and over with no indication of what the button actually does; worse, buttons for key actions sometimes can’t even be selected. Without attention to accessibility from app developers, the accessibility features can’t work to their full potential (a minimal code sketch after this list shows what proper labeling looks like). No matter how good the voice built into VoiceOver is (and Alex is pretty good), it does me no good if I can’t select the buttons within an app and determine what they do.
  • Accessible Content: the same problems that exist with inaccessible apps come into play with much of the content that is available online for students. Too many videos lack captions (or include only automatic computer generated captions that contain too many errors to be useful), and too many ebooks include images that are not labeled with accessibility descriptions for those who can’t see them. Without these accessibility descriptions, which can be easily added in authoring tools such as iBooks Author, a blind student taking a science class or an economics class will not be able to access the diagrams and other graphics that are so often used in these fields. Again, adding in features such as accessibility descriptions allows the built-in accessibility feature, in this case VoiceOver, to work to its full potential. There are many wonderful examples of books that include accessibility, as well as resources to help educators develop their own accessible books with easy to learn and use tools such as iBooks Author. These include Creating Accessible iBooks Textbooks with iBooks Author from the National Center for Accessible Media and Inclusive Design for iBooks Author by my friend and fellow Apple Distinguished Educator Greg Alchin. For a great example of an engaging and accessible book, one need not look any further than Reach for the Stars, a multi-touch book from SAS that makes astronomy come alive not only for blind students but for anyone who wants to learn about our universe using all of their senses.
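
As promised above, here is a rough idea of what proper labeling looks like from the developer’s side. This is a minimal sketch, not code from any of the apps mentioned: the controls, image names, and wording are hypothetical, but the UIKit accessibility properties are the real mechanism VoiceOver reads from.

```swift
import UIKit

// Hypothetical controls; the accessibility properties are what VoiceOver reads.
let shareButton = UIButton(type: .system)
shareButton.setImage(UIImage(named: "share-icon"), for: .normal)
// Without a label, VoiceOver can only announce "button".
shareButton.accessibilityLabel = "Share photo"

let diagram = UIImageView(image: UIImage(named: "water-cycle"))
// Image views are not exposed to VoiceOver by default.
diagram.isAccessibilityElement = true
diagram.accessibilityLabel = "Diagram of the water cycle, showing evaporation, condensation, and precipitation."
```

The same idea carries over to content: an image description added in a tool like iBooks Author plays the role of the accessibilityLabel above, giving VoiceOver something meaningful to say about a diagram.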

As shown by the following diagram, when the three components are present (robust accessibility features, accessible apps, and accessible content) we get a synergy that results in an even greater impact than each tool or feature can have on its own: this is the sweet spot for accessibility in education.

Three overlapping circles labeled as Accessibility Features, Apps and Accessible Content, with the spot where they converge labeled as the Sweet Spot.

To ensure accessibility in education, we all must work together to realize the advantages of an accessibility ecosystem: companies such as Apple and others who are building accessibility into their products, app developers, and content authors. As AssistiveWare’s David Niemeijer so nicely stated in his own GAAD post, when we take accessibility into account we really are designing for everyone, because we will all one day get old and need the ability to customize the text size and other features of our devices to account for our aging vision and hands.

Furthermore, to quote from a recent Apple commercial, “inclusion promotes innovation.” Thinking about accessibility from the start, in line with the principles of universal design, requires us to be even more creative as we seek to solve problems of access that may someday result in usability improvements for everyone.

A great example of that is the recently released Apple Watch. Since it has a small screen that makes it difficult to enter text, much of the interaction with the Apple Watch takes place through the Siri personal assistant. The voice recognition technology that makes Siri possible actually had its origins in the disability community, but now it can be used to account for the constraints of a smart watch and its small screen.

The Apple Watch is also a great example of an ecosystems approach to accessibility and its benefits. This device includes many of the same accessibility features that are available on the iPhone and the iPad, which are the same features I can use on my Mac. What this means is that if I get a new Apple Watch I will already know how to use these features, with a few modifications to account for the smaller screen. Similarly, a blind student who has been using his or her iPhone can easily transfer the use of many VoiceOver gestures to the trackpad built into Mac laptops or the Magic Trackpad used on iMacs.

Why is an ecosystems approach like this so important? Ultimately it is because I as a person with a disability need accessibility 24/7, 365 days a year, most likely for the rest of my life (unless a cure is found for my condition). My need for accessibility doesn’t stop when I get up from my desk at home and walk out the door. I need accessibility as I order a ride from a ride sharing service on my smart phone (which has Zoom and VoiceOver built in), as I take and share the photos that bring so much joy to my life and capture the beauty I encounter in the places I am lucky to visit (through accessible apps such as Instagram), and as I continue to develop my skills and knowledge base by reading ebooks about my field that I download from the iBookstore and read with iBooks (accessible content). For someone like me, accessibility is needed across a number of settings and situations if I am to be independent and continue to make a valuable contribution to society. Only an ecosystems approach can provide the kind of comprehensive accessibility I and many others who have disabilities need to live a fulfilling life.

iOS 6 Accessibility Features Overview

At today’s Worldwide Developers Conference (WWDC), Apple announced iOS 6 with a number of accessibility enhancements. I am not a developer (yet!), so I don’t have a copy of the OS to check out; this post is based primarily on what I read on the Apple website and on social media. A few of these features (word highlighting for Speak Selection, dictionary enhancements, custom alerts) were tucked away in a single slide Scott Forstall showed, with little additional information on the Apple website. So far, these are the big features announced today:

  • Guided Access: for children with autism, this feature will make it easier to stay on task. Guided Access enables a single app mode where the Home button can be disabled, so an app is not closed by mistake. In addition, this feature will make it possible to disable touch in certain areas of an app’s interface (navigation, settings button, etc.). This feature could be used to remove some distractions, and to simplify the interface and make an app easier to learn and use for people with cognitive disabilities. Disabling an area of the interface is pretty easy: draw around it with a finger and Guided Access will figure out which controls you mean. I loved how Scott Forstall pointed out the other applications of this technology for museums and other educational settings (such as testing), a great example of how inclusive design is for more than just people with disabilities.
  • VoiceOver integrated with AssistiveTouch: many people have multiple disabilities, and having this integration between two already excellent accessibility features will make it easier for these individuals to work with their computers by providing an option that addresses multiple needs at once. I work with a wounded veteran who is missing most of one hand, has limited use of the other, and is completely blind. I can’t wait to try out these features together with him.
  • VoiceOver integrated with Zoom: people with low vision have had to choose between Zoom and VoiceOver. With iOS 6, we won’t have to make that choice. We will have two features to help us make the most of the vision we have: Zoom to magnify and VoiceOver to hear content read aloud and rest our vision.
  • VoiceOver integrated with Maps: the VoiceOver integration with Maps should give people who are blind another tool for even greater independence, by making it easier for us to navigate our environment.
  • Siri’s ability to launch apps: this feature makes Siri even more useful for VoiceOver users, who now have two ways to open an app: by touch or by voice.
  • Custom vibration patterns for alerts: brings the same feature that has been available on the iPhone for phone calls to other alerts. Great for keeping people with hearing disabilities informed of what’s happening on their devices (Twitter and Facebook notifications, etc.).
  • FaceTime over 3G: this will make video chat even more available to people with hearing disabilities.
  • New Made for iPhone hearing aids: Apple will work with hearing aid manufacturers to introduce new hearing aids with high-quality audio and long battery life.
  • Dictionary improvements: for those of us who work with English language learners, iOS 6 will support Spanish, French and German dictionaries. There will also be an option to create a personal dictionary in iCloud to store your own vocabulary words.
  • Word highlights in Speak Selection: the ability to highlight the words as they are spoken aloud by text to speech benefits many students with learning disabilities. In iOS 6, Speak Selection (introduced in iOS 5) gains the same capabilities as many third party apps.

These are the big features that were announced, but there were some small touches that are just as important. One of these is the deep integration of Facebook into iOS. Facebook is one of those apps I love and hate at the same time. I love the amount of social integration it provides for me and other people with disabilities, but I hate how often the interface changes and how difficult it is to figure it out with VoiceOver each time an update takes place. My hope is that Apple’s excellent support for accessibility in built-in apps will extend to the new Facebook integration, providing a more accessible alternative to the Facebook app that will continue to support our social inclusion in mainstream society. You can even use Siri to post a Facebook update.

Aside from the new features I mentioned above, I believe the most important accessibility feature shown today is not a built-in feature or an app, but the entire app ecosystem. It is that app ecosystem that has resulted in apps such as AriadneGPS and Toca Boca, both featured in today’s keynote. The built-in features, while great, can only go so far in meeting the diverse needs of people with disabilities, so apps are essential to ensure that accessibility is implemented in a way that is flexible and customized as much as possible to each person. My hope is that Apple’s focus on accessibility apps today will encourage even more developers to focus on this market.

Another great accessibility feature that often gets ignored is the ease with which iOS can be updated to take advantage of new features such as Guided Access and the new VoiceOver integration. As Scott Forstall showed on a chart during the keynote, only about 7% of Android users have upgraded to version 4.0, compared to 80% for iOS 5. What that means is that almost every iOS user out there is taking advantage of AssistiveTouch and Speak Selection, but only a very small group of Android users are taking advantage of the accessibility features in the latest version of Android.

Big props to Apple for all the work they have done to include accessibility in their products, but more importantly for continuing to show people with disabilities in a positive light. I loved seeing a blind person in the last keynote video for Siri. At this keynote, Apple showed another blind person “taking on an adventure” by navigating the woods near his house independently. As a person with a visual disability myself, I found that inspiring. I salute the team at Apple for continuing to make people with disabilities more visible to the mainstream tech world, and for continuing to support innovation through inclusive design (both internally and through its developer community).

iPhoto App and Accessibility

This weekend I finally had a chance to try out the new iPhoto app Apple released along with the iPad 3 (or, as they are calling it, “the new iPad”). As an aspiring photographer, I was impressed with the many options for organizing, editing, and sharing photos Apple has packed into this app, which costs only $4.99 in the App Store. There have been many reviews of the new app posted online already, so I will not add another one here. However, I do have a unique perspective on the new app that I would like to share. Not only do I like to take photos (calling myself a photographer might be a stretch, but it’s a hobby I enjoy and continue to try to get better at every day), but I also have a visual disability, so I am part of a small community of blind photographers.

When I opened the iPhoto app on my iPhone, the first thing I did was turn on the built-in VoiceOver screen reader to hear how it would do with the new photo editing app. Frankly, I was not surprised to find the new iPhoto app as accessible with VoiceOver as it is. I have come to expect accessible products from Apple over the last few years, and I’m proud to be associated with the company as an Apple Distinguished Educator. However, as I dug deeper into the iPhoto app with VoiceOver, the level of attention to detail in providing accessibility was still pretty impressive. For example, the brushes used to retouch photos (repair, lighten, darken, etc.) are all accessible through VoiceOver gestures, as are the exposure and color correction controls and the various effects. When I selected the crop tool, VoiceOver told me to pinch to resize the photo, and as I did so it told me how much I was zooming in as well as how far the image was offset (“image scaled to 15x, image offset by 15% x and 48% y”).

On the iPad, there is a dedicated help button that opens up a series of overlays indicating what each button does. Not only was every part of the overlay accessible, but so was the entire help system built into the iPad version of the app. The attention to detail is all the more impressive to me because there are so few blind photographers who would take advantage of an app such as iPhoto. What it does show is the level of commitment Apple has to accessibility: it will go to great lengths to add accessibility even when only a few people will benefit from it.

In a recent blog post, accessibility advocate Joe Clark called out a number of hot new apps (Readability, Clear, Path, and Flipboard) for what he called irresponsible web development that results in accessibility barriers. Well, to me this new iPhoto app shows that you can design an app that is not only visually appealing, feature-packed, and easy to learn and use, but also accessible to people with visual disabilities. I hope more developers start to realize that accessibility does not have to compete with good design; the two complement each other.

When I first loaded the iPhoto app on my iPhone (the first device I installed the app on), I was too impatient to go on the Web and read about the app before I started to work with it. That’s just the kind of user I am; I like to get right in and try things out. Well, on the iPhone the Help button from the iPad version of the app is missing. Most of the icons make sense, but in some cases I was unsure, so I turned on VoiceOver and moved my finger around the screen to have it announce what each button was for (or at least give me a better idea). In that case, compatibility with VoiceOver helped me learn the app much faster without having to consult the help, and that got me thinking. As these devices (phones, tablets, and whatever comes next) continue to get smaller and their interfaces use more visuals (tiles, buttons, etc.) and less text, the ability to hear the help may become an essential aspect of learning how to use an interface. In this way, features like VoiceOver would actually enhance the usability of a particular app for everyone – what universal design is all about.
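
To make that “hear the help” idea concrete, here is a small sketch of the mechanism involved. The control and wording are hypothetical, not taken from iPhoto: in UIKit, a label names a control and a hint explains what it does, and VoiceOver reads both as you touch it.

```swift
import UIKit

// Hypothetical retouching control; the names and wording are illustrative only.
let repairBrush = UIButton(type: .custom)
repairBrush.accessibilityLabel = "Repair brush"
// VoiceOver reads the hint after a short pause, so exploring the screen
// by touch doubles as a built-in help system.
repairBrush.accessibilityHint = "Removes blemishes where you drag your finger."
```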


Overview of new accessibility features in iOS 5

With iOS 5, Apple has introduced a number of features to make its mobile devices even more accessible to people with disabilities:

  • VoiceOver enhancements: iOS 5 includes an updated voice for VoiceOver, the built-in screen reader for people who have visual disabilities. I have found the new voice to be a great improvement over the old one, especially when reading long passages of text in apps such as iBooks. Another improvement is that the triple-click home option is set to toggle VoiceOver by default. Along with the PC-free setup introduced with iOS 5, this small change has made it possible for someone with a visual disability to independently configure his or her iOS device out of the box, without any help from a sighted person. The Mac-cessibility website has an excellent overview of the many new changes in VoiceOver that I highly recommend reading.
  • Camera app compatibility with VoiceOver: this is a neat feature that will make photography more accessible to people with low vision and those who are blind. With VoiceOver on, if you launch the Camera app it will announce how many faces are in the frame. In my testing this worked pretty well, and I’ve used it successfully on the iPad and the iPod touch. It should work even better on the iPhone, which has a better sensor and optics. Combined with the ability to turn on the camera app from the lock screen on some devices (iPhone and iPod touch) by double-tapping the home button and the fact that you can use the volume up button as a shutter release, Apple has done a lot to make photography more accessible to people with visual disabilities.
  • Speak Selection (text to speech): this is one of my favorite features introduced with iOS 5. It provides another modality for students with learning disabilities who can benefit from hearing the text read aloud to them. To use it, go into Settings, General, Accessibility, tap Speak Selection and choose On. Once you’ve enabled this feature, when you select text a popup will show the option to Speak the text using the VoiceOver voice. Note that you can control the speaking rate for Speak Selection independently from VoiceOver.
  • Balance controls for audio: in addition to mono-audio, which combines both channels of stereo audio into a single mono channel, there is now an option for controlling the left/right balance of stereo sound. On the iPhone, there is also a special Hearing Aid mode that is supposed to make the device more compatible with hearing aids.
  • Handling of incoming calls: you can choose to automatically route incoming calls to the iPhone’s speakerphone, or to a headset.
  • New alert types: on the iPhone, you can use one of five unique vibration patterns to identify who is calling if you have a hearing disability, or you can create your own pattern by tapping it on the screen. These custom vibration patterns can be assigned in the Contacts app by opening a contact’s information, choosing Edit, Vibration and then Create New Vibration. There is also an option to have the LED flash go off when you get a notification, a new message, and so on.
  • AssistiveTouch: this was one of the most anticipated accessibility features in iOS 5. AssistiveTouch was designed to make iOS devices easier to use for people with motor difficulties. For example, someone who is not able to tap the Home button to exit an app can now bring up an overlay menu with icons for many of the hardware functions of their device, including the Home button. AssistiveTouch also includes options allowing for single finger use of many of the multi-touch gestures (including the new four finger gestures available only on the iPad and the pinch gesture used for zooming). To use AssistiveTouch, choose Settings, General, Accessibility and turn on AssistiveTouch. You will know it is enabled when you see a floating circular icon on the screen. Tapping this icon will open the overlay menu with the AssistiveTouch options. Note that you can move the AssistiveTouch icon to another area of the screen if it gets in the way. Please note that AssistiveTouch is not compatible with VoiceOver. I really wish the two features could work in tandem, as that would be helpful to users with multiple disabilities.
  • Custom gestures: AssistiveTouch includes an option to create your own gestures. Update: I was able to create a few useful gestures after watching this video from Cult of Mac. I created one for scrolling up on a page and one for scrolling down. Now when I’m reading a long web page, instead of having to swipe up or down to scroll, I can bring up the AssistiveTouch overlay menu, select the new gesture from the Favorites group and tap once on the screen to scroll.
  • Typing shortcuts: under Settings, General, Keyboard you can create shortcuts for common phrases. For example, you could create a shortcut that would enable you to enter an email signature by simply typing the letters “sig” and pressing the space bar. This feature should provide a big productivity boost to anyone who has difficulty entering text on their mobile device.
  • Siri and Dictation (iPhone 4S only): the new personal assistant uses voice recognition and artificial intelligence to respond to a range of user queries that can be made using everyday language rather than preset commands. The Apple website has a video that demos some of the capabilities of Siri. One of the amazing things about Siri is that it works without any training from the user. Along with Siri, the iPhone 4S also includes an option to dictate text by tapping a microphone button on the keyboard. The ability to use your voice to control the device can be helpful to people with many different types of disabilities, including those that make it difficult to input text. One of the things I have found especially frustrating when using VoiceOver on iOS devices is inputting text, so I hope this new dictation feature makes that easier. I will have a chance to test it out more thoroughly once I get my own iPhone 4S (currently out of stock in my area). Update: I finally got my hands on an iPhone 4S and I tried using the dictation feature with VoiceOver. It is working really well for me. I find the microphone button on the onscreen keyboard by moving my finger over it, double-tap to start dictation (as indicated by a tone) and then double-tap with two fingers to stop it. Even better, after I’m done dictating, if I move the phone away from my mouth, it automatically stops listening! I love this feature.
  • Dictionary: while it is not listed as an accessibility feature, having a system dictionary is a new feature that is great for providing additional language support to students with learning disabilities. To use this feature, select a word and a popup will show the Define option, which allows you to look the word up using the same dictionary that was previously available only in iBooks.
  • iMessage: a new messaging service built into the Messages app makes it possible to send free messages to any owner of an iOS device. Many people with hearing disabilities rely on text messaging as a convenient means of communication. iMessage will be especially helpful to those who are on a limited text messaging plan.
  • Reminders app: The new Reminders app has a simple interface that makes it a nice option for people who need help keeping track of assignments and other tasks. On the iPhone 4 or iPhone 4S, tasks can be tied to a location using the phone’s GPS capabilities. One use of this feature could be to set up a reminder for a person to take their medication when they get to a specific location, for example.
  • AirPlay mirroring (iPad 2, requires an Apple TV): along with iOS 5, a recent firmware update for the Apple TV enables mirroring to a projector or TV using AirPlay. I can see this option being helpful in a class where there are students in wheelchairs who have difficulty moving around the room. Using AirPlay mirroring, the teacher could bring the iPad 2 to the student and the rest of the class could still see what is displayed on the projector or TV.

The new accessibility features make iOS 5 a must-have update for anyone who has a disability, as well as for those who work with individuals with disabilities. For schools and other educational institutions, the accessibility features of iOS make Apple mobile devices an ideal choice for implementing mobile learning while complying with legal requirements such as Section 504, Section 508 and the Americans with Disabilities Act.

Disclosure: I am an Apple Distinguished Educator.