Tag Archives: universal design

10+ Accessibility Features Coming with iOS 11

 

Slide from Apple's keynote at WWDC showing a big list of upcoming iOS 11 features that were not discussed during the keynote.

At its recent Worldwide Developers Conference (WWDC), Apple gave developers and the public a preview of the next version of iOS, its operating system for mobile devices such as the iPad, iPhone and iPod Touch. This post will not provide a full overview of all the changes coming in iOS 11, but will instead focus on a few key ones that will have the most impact for people like me who rely on the built-in accessibility of these devices.

A public beta will be coming later this summer, and I can’t wait to get my hands on it to start testing some of these new features. For now, this overview is based on everything I have read on the Web (shout out to AppleVis for a great overview of the new features for those with visual impairments), what I saw in the videos available through the WWDC app after each day of the conference, and updates from people I follow on social media who were at WWDC (you should definitely follow Steven Aquino, who provided excellent coverage of all things iOS accessibility from WWDC).

Without further delay, here are ten new or enhanced iOS accessibility features coming soon to an iPad or iPhone near you:

  1. Smart Invert Colors: In iOS 11, Invert Colors will no longer be an all-or-nothing affair. The new Smart Invert option leaves images and video alone. This fixes a problem I have always had with Invert Colors: sometimes there is text in a graphic or video that is essential for understanding, and with Invert Colors as it currently exists that text can be difficult to read. iOS 11 takes care of that (a developer-side sketch of the new opt-out follows this list).
  2. Enhanced Dynamic Type: Dynamic Type has been enhanced to reduce clipping and overlapping at larger text sizes, and it will work in more of the UI for apps that support it. In areas of the UI where text is not resized dynamically, such as tab bars, a tap and hold on the selected control will show it at a larger size in the middle of the screen (a sketch of the new font-scaling hook for developers also follows this list).
  3. VoiceOver descriptions for images: VoiceOver will be able to detect text that’s embedded in an image, even if the image lacks alternative text (or as Apple calls it, an accessibility description). VoiceOver will also announce some of the items in a photo that has not been described (tree, dog, sunset, etc.), much like the Camera app already does when you take a photo with VoiceOver turned on.
  4. A more customizable Speech feature: you can now customize the colors for the word and sentence highlighting that is available for Speech features such as Speak Selection and Speak Screen. These features are helpful for learners who struggle with decoding print and need the content read aloud. The highlighting can also help with attention and focus while reading, and it’s nice to see we can now change the color for a more personalized reading experience.
  5. Type for Siri: In addition to using your voice, iOS 11 also allows you to interact with Siri by typing your requests. This is not only an accessibility feature (it can help those with speech difficulties who are not easily understood by Siri) but a privacy convenience for everyone else. Sometimes you are in a public place and don’t want those around you to know what you are asking Siri.
  6. More options for captions: for videos that include closed captions, you can now enable an additional style that makes the text larger and adds an outline to make it stand out from the background content. Along with this new style, you can now turn on spoken captions or convert the captions to Braille. This last option could make the content more accessible to individuals with multiple disabilities.
  7. Switch Control enhancements: typing can take a lot of time and effort when using switch access technologies. With iOS 11, Apple hopes to make this process easier by providing better Switch Control word prediction as well as a “scan same key after tap” option (this will repeat the same key without requiring the user to scan to it again, which can take some time). Other Switch Control enhancements for better overall usability include:
    • Point Mode has an additional setting for more precise selections: this will add a third scan to refine the selection at an even slower pace, and early reports are that it selects the actual point rather than the surrounding accessibility element (button, etc.).
    • Scanner Menu option for Media Controls: recognizing that media playback is a popular activity for switch users (just as it is for everybody else), a new category for Media Controls has been added to the scanner menu. I assume that this feature will work on any app with playback controls, which would make it a great option to use with Voice Dream Reader or any other app with playback controls at the bottom of the screen (which require a lot of scanning to access).
  8. Improved PDF accessibility support: while I am not a big fan of PDF as a format, there are still a lot of legacy PDF documents out there, so it is nice to see improved support for PDF accessibility in iOS 11. One of the common uses of PDFs is to collect information through forms, and with iOS 11 Apple promises better support for forms as well as for tagged (properly marked up) PDF documents.
  9. Better Braille support: as reported by AppleVis, the Braille improvements in iOS 11 include better text editing and more customizable actions that can be performed through shortcuts on a Braille display.
  10. A redesigned Control Center: you will have more ways to get to the features you use the most with the new Control Center, which will now allow you to add widgets for the Accessibility Shortcut, the Magnifier and text resizing.
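For the developers in the audience, the opt-out that makes Smart Invert possible is a per-view UIKit property. Here is a minimal Swift sketch based on what Apple has documented so far; the view controller and image view names are my own, and I have not tested this against the beta:

```swift
import UIKit

final class PhotoViewController: UIViewController {
    let photoView = UIImageView() // hypothetical image view

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(photoView)
        // New in iOS 11: a view that opts out is left alone by Smart Invert,
        // which is how photos and video can keep their natural colors.
        photoView.accessibilityIgnoresInvertColors = true
    }
}
```

The Dynamic Type improvements also come with a new developer hook, UIFontMetrics, for scaling custom fonts to the user's preferred text size. A rough sketch (the font name is just an example):

```swift
import UIKit

let label = UILabel()
let baseFont = UIFont(name: "Avenir-Book", size: 17) ?? UIFont.systemFont(ofSize: 17)
// UIFontMetrics (new in iOS 11) scales a custom font to the user's text size.
label.font = UIFontMetrics(forTextStyle: .body).scaledFont(for: baseFont)
// Re-scale automatically whenever the user changes the setting.
label.adjustsFontForContentSizeCategory = true
label.numberOfLines = 0 // wrap instead of clipping at the largest sizes
```

With that in place, the label should track the Larger Text slider automatically rather than clipping at bigger sizes.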

As with most releases of iOS, there are a number of features that can be beneficial to users with disabilities even if they are not found in the Accessibility area of Settings. These include:

  1. An improved Siri voice: we got a taste of how much more natural the new Siri voice will sound, but there was not a lot of detail provided. It is not clear if this voice will be available to VoiceOver users, or if it will be incorporated into the rest of the Speech features such as Speak Selection and Speak Screen when iOS 11 finally ships to the general public in the fall.
  2. Siri translation: Siri can now translate from English into a few languages – Chinese, French, German, Italian, or Spanish.
  3. Persistent Reader View in Safari: when you long-press the Reader icon, you can choose to have Reader activate on all websites or just the current one. Reader view is helpful for removing ads and other distractions that compete for attention, especially for people with learning difficulties such as ADHD.
  4. Apple TV remote in Control Center: It is also possible to add an onscreen Apple TV remote to the Control Center. This will be helpful for Switch Control or VoiceOver users who may prefer this option to using the physical Apple TV remote.
  5. One handed keyboard: this option is intended to help anyone who is trying to enter text while one hand is busy with groceries, etc., but it can also be helpful to someone who is missing a limb. Tapping the Globe icon that provides access to third party keyboards will now show an option for moving the keyboard to either side of the screen, where it is easier to reach with one hand.
  6. One handed Zoom in Maps: this feature is intended for drivers to have better access to Maps while on the road, but as with the one-handed keyboard, others will benefit from this change as well. As someone who often has one hand busy with a white cane, I welcome all of these features that make the iPhone easier to use with just one hand.
  7. Redesigned onscreen keyboard: Letters, numbers, symbols, and punctuation marks are now all on the same keyboard. Switching between them is as easy as a simple flicking gesture on the desired key.
  8. Easier setup of new devices: anything that reduces the amount of time entering settings is helpful to switch and screen reader users. When setting up a new iOS device, there’s now an option in iOS 11 to hold it near an existing device to automatically copy over settings, preferences, and iCloud Keychain.
  9. More customizable AirPod tap controls: AirPods can now be customized with separate double tap gestures for the left and right AirPod. One can be set to access Siri, for example, while the other can be set to play the next track. Previously, double-tap settings were applied to both AirPods. This will be helpful for individuals who rely on AirPods to access Siri as an accessibility option.
  10. Less restrictive HomeKit development: it will now be possible to develop a HomeKit device without being part of Apple’s HomeKit licensing program. All that will be required is a developer account. The catch is that any HomeKit device developed this way cannot go up for sale. That should be fine for assistive tech makers who just want to experiment and prototype solutions for their clients without the investment required to be part of the official HomeKit program (a minimal sketch of what that experimentation could look like follows this list). As The Verge suggests, this could also encourage more developers to dip their toes into HomeKit development, which will hopefully lead to more options for those of us who depend on smart homes for improved accessibility.
  11. QR Code support in the Camera: QR codes can be a helpful way to provide access to online resources without requiring a lot of typing of URLs and the like. They are frequently used in the classroom for this purpose, so I know teachers will find having this feature as a built-in option a welcome addition.
  12. SOS: There’s an Emergency SOS option in the Settings app that allows users to turn on an “Auto Call” feature. This will immediately dial 911 when the Sleep/Wake button is pressed five times. A similar feature has been available on the Apple Watch since the introduction of watchOS 3, and it’s nice to see it on the iPhone as well.
  13. Core Bluetooth support for Apple Watch: while this is an Apple Watch feature, I’m mentioning it here because the Apple Watch is still very closely tied to its paired iPhone. With watchOS 4, which was also previewed at WWDC, Apple Watch Series 2 is getting support for connecting directly to Bluetooth low energy accessories like those that are used for glucose tracking and delivery. Furthermore, the Health app will have better support for diabetes management in conjunction with Core Bluetooth, including new metrics related to blood glucose tracking and insulin delivery.
  14. Indoor navigation in Maps: Maps has been a big help for me whenever I find myself in an area I don’t know. I love the walking directions and how well they integrate with the Apple Watch so that I can leave my phone in my pocket as I navigate with haptic feedback and don’t give off that lost tourist look. With iOS 11, these features will be extended to indoor spaces such as major malls and airports.
  15. A redesigned App Store: the screenshots I have seen point to a bigger, bolder design for the App Store, which will be welcome news to those of us with low vision. If you like how Apple News looks now, you will be pleased with the redesigned App Store.
  16. Built-in screen recording: I rely on Screenflow to record my video tutorials, but having screen recording as a built-in feature will be convenient for quick videos. This will be great for providing tech support to parents, or for documenting accessibility bugs to developers.
  17. Person to Person payments in Messages: anything that allows payments without the need to use inaccessible currency is A-OK with me.
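Since item 10 is explicitly about development, here is a minimal Swift sketch of the kind of experiment it enables. This is my own illustration, not Apple sample code; the class name is invented, and a real app still needs the HomeKit capability and a usage description before HMHomeManager will return anything:

```swift
import HomeKit

// Minimal HomeKit experiment: list every home and accessory once the
// Home database loads. Prototyping like this should now require only a
// developer account (devices built this way cannot be sold).
final class HomeLister: NSObject, HMHomeManagerDelegate {
    private let manager = HMHomeManager()

    override init() {
        super.init()
        manager.delegate = self // delegate fires once homes are loaded
    }

    func homeManagerDidUpdateHomes(_ manager: HMHomeManager) {
        for home in manager.homes {
            print("Home: \(home.name)")
            for accessory in home.accessories {
                print("  Accessory: \(accessory.name)")
            }
        }
    }
}
```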

The Notes app has been greatly enhanced in iOS 11 to make it an even better tool for diverse learners who need options for how they capture, store and retrieve information:

  • Instant Markup: adding an annotation to a PDF document or screenshot will be as easy as picking up the Apple Pencil and touching the screen to start drawing/annotating.
  • Instant Notes: tapping the lock screen with Apple Pencil will create a quick handwritten note that will appear in Notes once the device is unlocked.
  • Inline Drawing: when you begin to draw or annotate in Notes, the text around the annotation will move out of the way. You can add inline drawings in Mail as well.
  • Searchable annotations in Notes: everything you write with the Apple Pencil in Notes will now be searchable, making it much easier to find the highlights in long notes taken during a lecture or long presentation.
  • Document Scanner: the new Document Scanner in Notes will detect the edges of a document to scan it automatically, crop it, and remove glare and tilt to produce a cleaner image. The result is a better scan that you can then pass off to a different app for even better optical character recognition (OCR). I am hoping this feature is just a start, and eventually we will get built-in OCR in iOS; the new Vision framework, sketched below, suggests the groundwork is being laid.
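On that OCR wish: iOS 11 also introduces the Vision framework, which can at least find regions of text in an image, even if recognizing the characters still takes a third-party engine. A hedged Swift sketch (the function name is mine, and I have not run this against the beta):

```swift
import UIKit
import Vision

// Vision (iOS 11) detects where text appears in an image; it does not
// yet recognize the characters themselves.
func detectTextRegions(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }
    let request = VNDetectTextRectanglesRequest { request, _ in
        guard let results = request.results as? [VNTextObservation] else { return }
        print("Found \(results.count) text region(s)")
    }
    request.reportCharacterBoxes = false // bounding boxes per region are enough here
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```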

A major focus with iOS 11 is improved support for iPad productivity. This includes support for Drag and Drop in an enhanced Split View, as well as a new multi-tasking view with what appears to be the equivalent of Spaces on the Mac. With Apple’s excellent track record of accessibility, I’m confident these features will have the same level of accessibility as the rest of iOS.

I can’t wait to try out iOS 11 when the public beta becomes available to start enjoying some of these features on at least one of my devices (not the one I use to get work done, of course – at least not until late in the beta cycle when everything is much more stable).

How about you – which of these iOS 11 features has you most excited? Which ones do you think you will use the most?

 


Designed for (fill in the blank)

On the occasion of Global Accessibility Awareness Day (GAAD), Apple has created a series of videos highlighting the many ways its iOS devices empower individuals with disabilities to accomplish a variety of goals, from parenting to releasing a new album for a rock band. Each of the videos ends with the tagline “Designed for” followed by the name of the person starring in the video, closing with “Designed for Everyone.” In this brief post, I want to highlight some of the ways in which this is in fact true. Beyond the more specialized features highlighted in the videos (a speech generating app, the VoiceOver screen reader, Made for iPhone hearing aids and Switch Control), there are many other Apple accessibility features that can help everyone, not just people with disabilities:

  • Invert Colors: found under Accessibility > Display Accommodations, this feature was originally intended for people with low vision who need a higher contrast display. However, the higher contrast Invert Colors provides can be helpful in a variety of other situations. One that comes to mind is trying to read on a touch screen while outdoors in bright lighting. The increased contrast provided by Invert Colors can make the text stand out more from the washed out display in that kind of scenario.
  • Zoom: this is another feature that was originally designed for people with low vision, but it can also be a great tool for teaching. You can use Zoom to not only make the content easier to read for the person “in the last row” in any kind of large space, but also to highlight important information. I often will Zoom In (see what I did there, it’s the title of one of my books) on a specific app or control while delivering technology instruction live or on a video tutorial or webinar. Another use is for hide and reveal activities, where you first zoom into the prompt, give students some “thinking time” and then slide to reveal the part of the screen with the answer.
  • Magnifier: need to read the microscopic serial number on a new device, or the expiration date on that medicine you bought years ago and are not sure is still safe to take? No problem, Magnifier (new in iOS 10) to the rescue. A triple-click of the Home button will bring up an interface familiar to anyone who has taken a photo on an iOS device. Using the full resolution of the camera, you can not only zoom into the desired text, but also apply a color filter and even freeze the image for a better look.
  • Closed Captions: although originally developed to support the Deaf and hard of hearing communities, closed captions are probably the best example of universal design on iOS. Closed captions can also help individuals who speak English as a second language, as well as those who are learning how to read (by providing the reinforcement of hearing as well as seeing the words for true multimodal learning). They can also help make the information accessible in any kind of loud environment (a busy lobby, airport, bar or restaurant) where consuming the content has to be done without the benefit of the audio. Finally, closed captions can help when the audio quality is low due to the age of the film, or when the speaker has a thick accent. On Apple TV, there is an option to automatically rewind the video a few seconds and temporarily turn on the closed captions for the audio you just missed. Just say “what did he/she say?” into the Apple TV remote. (For developers, a sketch of how apps expose caption tracks follows this list.)
  • Speak Screen: this feature, found under Accessibility > Speech, is meant to help people with vision or reading difficulties, but the convenience it provides can help in any situation where looking at the screen is not possible – one good example is while driving. You can open up a news article in your favorite app that supports Speak Screen while at a stop light, then perform the special gesture (a two finger swipe from the top of the screen) to hear that story read aloud while you drive. At the next stop light, you can perform the gesture again and in this way catch up with all the news on your way to work! On the Mac, you can even save the output from the text to speech feature as an audio file. One way you could use this audio is to record instructions for any activity that requires you to perform steps in sequence – your own coach in your pocket, if you will! (A text-to-speech sketch for developers also follows this list.)
  • AssistiveTouch: you don’t need to have a motor difficulty to use AssistiveTouch. Just having your device locked into a protective case can pose a problem this feature can solve. With AssistiveTouch, you can bring up onscreen options for buttons that are difficult to reach due to the design of the case or stand. With a case I use for video capture (the iOgrapher), AssistiveTouch is actually required by design: to ensure light doesn’t leak into the lens, the designers of this great case covered up the sleep/wake button. The only way to lock the iPad screen after you are done filming is to select the “lock screen” option in AssistiveTouch. Finally, AssistiveTouch can be helpful on older phones with a failing Home button.
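For developers who want their own video players to honor captions the way the system does, AVFoundation exposes caption and subtitle tracks as a “legible” media selection group. A rough Swift sketch (untested, and the function name is mine):

```swift
import AVFoundation

// Turn on a caption/subtitle track for a player item, preferring tracks
// authored for accessibility (closed captions / SDH).
func enableCaptions(on playerItem: AVPlayerItem) {
    let asset = playerItem.asset
    guard let group = asset.mediaSelectionGroup(forMediaCharacteristic: .legible)
        else { return } // this asset has no caption or subtitle tracks
    let accessible = AVMediaSelectionGroup.mediaSelectionOptions(
        from: group.options,
        withMediaCharacteristics: [.transcribesSpokenDialogForAccessibility])
    playerItem.select(accessible.first ?? group.options.first, in: group)
}
```

And the speech side of these features has an in-app counterpart: AVSpeechSynthesizer, the same system text-to-speech engine that powers features like Speak Selection. A minimal sketch:

```swift
import AVFoundation

// Read a passage aloud with the system text-to-speech engine.
let synthesizer = AVSpeechSynthesizer()

func speak(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate
    synthesizer.speak(utterance)
}
```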

While all of these features live in the Accessibility area of Settings, they are really “designed for everyone.” Sometimes the problem is not your own physical or cognitive limitations, but the constraints imposed by the environment or the situation in which the technology is used.

How about you? Are there any other ways you are using the accessibility features to make your life easier even if you don’t have a disability?

5 Accessibility Features Every Learner Should Know

On the occasion of Global Accessibility Awareness Day (GAAD) this week (May 19th), I created an infographic using Piktochart to highlight some of the iOS accessibility features that can benefit a wide range of diverse learners, not just those who have been labeled as having a disability.

[Infographic: 5 Accessibility Features Every Learner Should Know]

This post is an alternative representation for those who cannot access it as an infographic.

It’s built in.

Every iOS device comes with a standard set of accessibility features that are ready to use as soon as you take the device out of the box. Let’s take a look at a few of these features that can benefit all users in the spirit of Universal Design.

Get started by going to Settings > General > Accessibility!

#1: Closed Captions

Closed captions were originally developed for those with hearing difficulties, but they can help you if you speak English as a second language or just need them as a support for improved processing. Captions can also help if your speakers are not working, or the sound in the video is of poor quality.

In one UK study, 80% of caption users did not have any kind of hearing loss.

Learn how to enable and customize closed captions on your iOS device.

#2: Speech

All iOS devices support built-in text to speech with the option to turn on word highlighting. Starting with iOS 8, it is possible to use the more natural Alex voice, formerly available only on the Mac. TTS supports decoding, freeing you, the reader, to focus on the meaning of the text.

Breathe!: Alex takes a breath every once in a while to simulate the way we speak!

  • Learn how to enable and use Speak Selection on your iOS device.
  • Bonus tip!: Don’t want to make a selection first? No problem. Just bring up Siri and say “Speak Screen.” This will read everything on the screen!

#3: Safari Reader

Safari’s Reader is not really an accessibility feature (you will not find it in Settings) but it can help you if you find that you get distracted by all the ads when you are reading or doing research online. It is also a nice complement to the Speech features mentioned above. With iOS 9, you can now customize the appearance of the text and even change the background and font to make it easier to read when you surf the Web.

Left my heart in…San Francisco is a new system font available in iOS 9. It is designed to be easier to read, and is one of the font options available for Reader.

Learn how to use Safari Reader when you surf the Web.

#4: Dictation

Whenever you see the iOS keyboard, you can tap the microphone icon to the left of the space bar to start entering text using just your voice. This can help you get your words down on the page (or is it the screen?) more efficiently.

Try It!: Dictation can handle complex words. Try this: Supercalifragilisticexpialidocious.

Dictation supports more than just entering text. Follow the link for a helpful list of additional Dictation commands.

#5: QuickType and 3rd Party Keyboards

QuickType is Apple’s name for the word prediction feature now built into the iOS keyboard. Word prediction can help you if you struggle with spelling, and it can speed up your text entry as well. Starting with iOS 8, it is now possible to customize the built-in keyboard by installing a 3rd party app. The 3rd party keyboards add improved word prediction, themes for changing the appearance of the keys and more.

17 Seconds: World record for texting. Can you beat it?

Learn how to use QuickType and how to set up and use 3rd party keyboards.

Bonus Tips

Struggling to see the screen? Make sure to check out the Vision section in the Accessibility settings. You can Zoom in to magnify what is shown on the screen, Invert Colors to enable a high contrast mode, make the text larger with Dynamic Type, and much more.


7 Accessibility Features Every Teacher Should Know for Back to School

It’s that time of the year again. The supplies and textbooks have come in. The room is decorated. Soon students will be walking through the door and it’s off to the races. A new school year is upon us.

If you are lucky, you have a classroom set of iPads, or you may be in a BYOD situation where students are bringing their devices to school. Did you know about some of the features built into the iPad and other iOS devices that can help you empower all learners to access the curriculum this year? No? Well that’s what this post is about. You don’t need to have any students with IEPs or Section 504 plans to take advantage of these features. They are called universal design features because they can benefit any of a number of learners. Here are the top seven.

Embiggen the text

Yes, I know that’s not a real word (except maybe on The Simpsons), but it means to make the text larger. On iOS devices this is easy to do, and by making the text bigger you will allow your learners to focus all of their energy on understanding the content rather than squinting and struggling to see it. To make the text larger, go to Settings > Display and Brightness > Text Size and use the slider to adjust the size as needed. Not big enough? No problem. Go to General > Accessibility > Larger Text instead, where you can turn on even larger accessibility sizes. While you are at it, you may as well turn on Bold Text to make that text really stand out.

Larger text option in Accessibility settings.

It’s like a negative

Well, at least to you…your students probably don’t know what a negative from the film days is, seeing as the only photos they look at are probably on Instagram or Facebook. In any case, reading on screen can be very tiring for our eyes, with all that light coming at us from the screen. As our learners spend more of the day in front of a screen, one thing we can do is reverse the colors to help with eye strain. It’s really simple to turn this feature on and off, so why not try it? If it doesn’t work you can easily go back to the default black on white scheme. Invert Colors can be found under Settings > General > Accessibility, or even better, just use Siri to turn the feature on and off by saying “Turn on Invert Colors.” The kids will love that trick.

New note in Notes app with Invert Colors turned on.

Let’s hear it for Alex

Alex is not a person, though if you spend some time listening to him reading content on the iPad you may begin to think he is. Alex is the built-in high quality voice that has been available on the Mac for a number of years. Guess what? Now it’s available as a download for use with the text to speech feature built into iOS, officially called Speak Selection. This feature can even highlight the words as it reads them aloud, which can be a big help to some struggling readers. The video explains Speak Selection in more detail.

Speak Selection works great with the Reader feature built into Safari, which removes ads and other distractions from the page. In the upcoming iOS 9 release the Reader feature gains controls for adjusting the text size as well as changing the background and font for even better legibility.

Let’s hear that again

Don’t want to select the text first? No problem. Speak Screen is activated with a special two finger gesture and will read everything that is on the screen (and I do mean everything). Once you turn on Speak Screen in the Speech settings, you can perform a two-finger swipe from the top of the screen (except on the Home screen) to hear everything read aloud. Even better, have Siri help out. Just say “Speak Screen” and it will start reading. You even get an onscreen controller for adjusting the speaking speed.

You complete me

Although it is not technically an accessibility feature, the word prediction built into iOS 8 (QuickType) can be a big help for learners who struggle with spelling or just have a hard time producing text. This feature should be turned on by default but if not you can enable it by going to Settings > General > Keyboard and making sure Predictive is turned on. When you start typing a word, suggestions should pop up in a strip just above the onscreen keyboard.

QuickType bar in iOS onscreen keyboard

Say it with me…Dictation is awesome.

Again, this is not technically an accessibility feature, but it can help those who struggle with typing on the onscreen keyboard by giving them another option: their voice. Just make sure it’s your announcer’s voice by speaking clearly and enunciating as you tap the microphone icon to the left of the space bar to start dictating. You can even use a number of commands, such as “comma,” “period,” and “new line.”

Microphone icon to left of space bar activates Dictation in iOS.

CC is not just for copy

It also stands for closed captioning, a feature that is built into many videos for those who are unable to hear the audio. Closed captions can benefit a number of other learners: English language learners, struggling readers, and anyone learning a topic with specialized vocabulary where seeing the words as well as hearing them could be helpful (science, for example). And as a bonus, you will have a fallback for when your speakers just don’t work (because technology never fails, right?). You can enable the captions for a video that has them by going to Settings > General > Accessibility and choosing Subtitles & Captioning. You can even change the appearance of the captions to make them easier to read.

Have an Apple TV in your classroom? It too has support for captions. Just go to Settings > General > Accessibility > Closed Captions + SDH to turn them on. Just as with your iOS device, you can change the appearance of the captions on Apple TV.

There you have it. A few more things to have in your tool belt as you work to ensure access to learning for all of your students this year, which I hope will be a great one!

Global Accessibility Awareness Day: The Need for an Ecosystems Approach to Accessibility in Education

On the occasion of Global Accessibility Awareness Day, I am excited about the many online and face to face events that will mark this important step toward ensuring a more accessible and inclusive environment for those of us who have special needs. I will be presenting a session on the use of photography as a tool for disability advocacy as part of Inclusive Design 24, a free 24-hour event sponsored by The Paciello Group and Adobe Systems. Photography has long been a passion of mine, and I welcome any opportunity to share how I use it as an educator and advocate to challenge perceptions of ability/disability. I will also be sharing resources and insights during a #GAADILN Twitter chat sponsored by the Inclusive Learning Network of the International Society for Technology in Education (ISTE).

I love Global Accessibility Awareness Day (or GAAD, as I will refer to it from now on), but if there is one thing I would change, it is the name of the event. To me it should be Global Accessibility Action Day. With many of these types of events, the focus is on raising awareness of the needs of people with disabilities, as if we have not been asking for our rights for decades now (the ADA is more than two decades old, you know). GAAD gets it right by focusing on action. Organizations such as Shaw Trust Accessibility Services, Deque Systems and Accessibility Partners are offering a number of free services such as document audits, app accessibility consultations and website user testing. Many others are providing webinars and live presentations that aim at more than raising awareness by providing practical information on how to make documents, websites and apps more accessible. A review of the full list of events available on the GAAD website makes it clear that this event is about more than just awareness; it is about taking the next step for accessibility.

In my own field of education, I see much progress being made, but I also see the need for a more comprehensive, ecosystems approach to inclusion and accessibility. When I think of ecology, I think about systems with a number of parts working together as one, the whole being greater than the sum of its parts. When it comes to students with disabilities, a number of technologies are now available as built-in options on the mobile devices many of them own. While I am a witness to the impact these technologies can have on the lives of students with disabilities (having been a student who used these technologies myself), I believe their impact is limited when they are used in isolation rather than as part of a more comprehensive system.

What I would like to see is a change in thinking to focus on a systems approach that addresses what I see as the three As of accessibility:

  • Accessibility Features: companies such as Apple now include a comprehensive toolkit for accessibility that is built into the core of the operating system. This means that when I take my new Mac, iPhone or Apple Watch out of the box, it will be ready for me to use without the need to purchase or install additional software. Not only that, but as my vision gets worse I know that I will be able to take my device out of the box and set it up independently, without having to wait for someone with better eyesight to help me. These built-in accessibility features have been life-changing for me. Without them I’m not sure I would have been able to pursue higher education and complete my master’s and doctoral studies. I also would not be able to do the photography that brings so much joy and beauty into my life. Unfortunately, not all educators know about even the most basic of these features built into the technology their districts have spent so much money to purchase. I am often surprised when I do presentations around the country (and sometimes in other parts of the world) by how little awareness there is among educators of the potential they hold, literally, in their hands to change a student’s life. We need to do better in this area of professional development to allow these tools to have an even greater impact on education for all students: not just students with disabilities, but any student who struggles with the curriculum and needs additional support.
  • Accessible Apps: the built-in accessibility features provide a great baseline for addressing the needs of people with disabilities, but they can’t do it all. There is just too much diversity and variability for that to be the case: not just in the traits and needs of users, but in the settings and tasks where technology is used. For this reason, it is often necessary to extend the capabilities of the built-in accessibility features by installing apps that provide greater customization options. A great example is the Voice Dream Reader app. While iOS has a robust text to speech feature with word highlighting that now supports the high quality Alex voice, Voice Dream Reader allows for even greater customization. The user can adjust the color of both the word and sentence highlighting, something which cannot be done with the built-in Speak Selection feature of iOS. For those who are blind and use the VoiceOver screen reader, the developer has done an excellent job of labeling all of the app’s controls. A companion Voice Dream Writer app even provides a special mode for VoiceOver users to make it easier for them to enter and edit text, showing a strong commitment to usability for all users on the part of this developer. Other examples of developers who are doing exemplary work when it comes to creating accessible apps include AssistiveWare (developers of Proloquo2Go, Proloquo4Text and Pictello, all apps with excellent support for VoiceOver and Switch Control) and Red Jumper (developers of the popular Book Creator app). The latter added an Accessibility option for images and other objects to help students and educators create accessible content with the app. Unfortunately, these developers are still the exception rather than the rule. With too many apps, swiping through with VoiceOver results in hearing “button” over and over with no indication of what the button actually does. Worse, many of the buttons for key actions sometimes can’t even be selected. Without attention to accessibility from app developers, the accessibility features can’t work to their full potential. No matter how good the voice built into VoiceOver is (and Alex is pretty good), it does me no good if I can’t select the buttons within an app and determine what they do. (A short sketch of what labeling a control involves follows this list.)
  • Accessible Content: the same problems that exist with inaccessible apps come into play with much of the content that is available online for students. Too many videos lack captions (or include only automatic, computer generated captions that contain too many errors to be useful), and too many ebooks include images that are not labeled with accessibility descriptions for those who can’t see them. Without these accessibility descriptions, which can be easily added in authoring tools such as iBooks Author, a blind student taking a science class or an economics class will not be able to access the diagrams and other graphics that are so often used in these fields. Again, adding in features such as accessibility descriptions allows the built-in accessibility feature, in this case VoiceOver, to work to its full potential. There are many wonderful examples of books that include accessibility, as well as resources to help educators develop their own accessible books with easy to learn and use tools such as iBooks Author. These include Creating Accessible iBooks Textbooks with iBooks Author from the National Center for Accessible Media and Inclusive Design for iBooks Author by my friend and fellow Apple Distinguished Educator Greg Alchin. For a great example of an engaging and accessible book, one need not look any further than Reach for the Stars, a multi-touch book from SAS that makes astronomy come alive not only for blind students but for anyone who wants to learn about our universe using all of their senses.
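To make the app side of this concrete, here is roughly what “labeling all of the app’s controls” amounts to in code. A minimal Swift sketch; the button, label text and image name are invented for illustration:

```swift
import UIKit

// To VoiceOver, an unlabeled icon-only control is just "button."
// A label (what it is) and a hint (what it does) fix that.
let playButton = UIButton(type: .custom)
playButton.setImage(UIImage(named: "play-icon"), for: .normal) // hypothetical asset
playButton.accessibilityLabel = "Play"
playButton.accessibilityHint = "Reads the current chapter aloud."
```

A few lines per control are often the whole difference between an app VoiceOver users can navigate and one they abandon.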

As shown by the following diagram, when the three components are present (robust accessibility features, accessible apps, and accessible content) we get a synergy that results in an even greater impact than each tool or feature can have on its own: this is the sweet spot for accessibility in education.

Three overlapping circles labeled as Accessibility Features, Apps and Accessible Content, with the spot where they converge labeled as Sweet Spot.

To ensure accessibility in education, we all must work together to realize the advantages of an accessibility ecosystem: companies such as Apple and others who are building accessibility into their products, app developers, and content authors. As AssistiveWare’s David Niemeijer so nicely stated in his own GAAD post, when we take accessibility into account we really are designing for everyone, because we will all one day get old and require the ability to customize the text size and other features of our devices to account for our aging vision and hands.

Furthermore, to quote from a recent Apple commercial, “inclusion promotes innovation.” Thinking about accessibility from the start, in line with the principles of universal design, requires us to be even more creative as we seek to solve problems of access that may someday result in usability improvements for everyone.

A great example of that is the recently released Apple Watch. Since it has a small screen that makes it difficult to enter text, much of the interaction with the Apple Watch takes place through the Siri personal assistant. The voice recognition technology that makes Siri possible actually had its origins in the disability community, but now it can be used to account for the constraints of a smart watch and its small screen.

The Apple Watch is also a great example of an ecosystems approach to accessibility and its benefits. This device includes many of the same accessibility features that are available on the iPhone and the iPad, which are the same features I can use on my Mac. What this means is that if I get a new Apple Watch, I will already know how to use these features, with a few modifications to account for the smaller screen. Similarly, a blind student who has been using his or her iPhone can easily transfer many VoiceOver gestures to the trackpad built into Mac laptops or the Magic Trackpad used with iMacs.

Why is an ecosystems approach like this so important? Ultimately, it is because I as a person with a disability need accessibility 24/7, 365 days a year, most likely for the rest of my life (unless a cure is found for my condition). My need for accessibility doesn’t stop when I get up from my desk at home and walk out the door. I need accessibility as I order a ride from a ride sharing service on my smart phone (which has Zoom and VoiceOver built in), as I take and share the photos that bring so much joy to my life and capture the beauty I encounter in the places I am lucky to visit (through accessible apps such as Instagram), and as I continue to develop my skills and knowledge base by reading ebooks about my field that I download from the iBookstore and read with iBooks (accessible content). For someone like me, accessibility is needed across a number of settings and situations if I am to be independent and continue to make a valuable contribution to society. Only an ecosystems approach can provide the kind of comprehensive accessibility that I and many others who have disabilities need to live a fulfilling life.

iOS 6 Accessibility Features Overview

At today’s Worldwide Developers Conference (WWDC), Apple announced iOS 6 with a number of accessibility enhancements. I am not a developer (yet!), so I don’t have a copy of the OS to check out; this post is based primarily on what I read on the Apple website and on social media. A few of these features (word highlighting for Speak Selection, dictionary enhancements, custom alerts) were tucked away in a single slide Scott Forstall showed, with little additional information on the Apple website. So far, these are the big features announced today:

  • Guided Access: for children with autism, this feature will make it easier to stay on task. Guided Access enables a single app mode where the home button can be disabled, so an app is not closed by mistake. In addition, this feature will make it possible to disable touch in certain areas of an app’s interface (navigation, settings button, etc.). This feature could be used to remove some distractions, and to simplify the interface and make an app easier to learn and use for people with cognitive disabilities. Disabling an area of the interface is pretty easy: draw around it with a finger and it will figure out which controls you mean. I loved how Scott Forstall pointed out the other applications of this technology for museums and other education settings (testing), a great example of how inclusive design is for more than just people with disabilities. (A sketch of the developer-facing hooks for Guided Access follows this list.)
  • VoiceOver integrated with AssistiveTouch: many people have multiple disabilities, and having this integration between two already excellent accessibility features will make it easier for these individuals to work with their computers by providing an option that addresses multiple needs at once. I work with a wounded veteran who is missing most of one hand, has limited use of the other, and is completely blind. I can’t wait to try out these features together with him.
  • VoiceOver integrated with Zoom: people with low vision have had to choose between Zoom and VoiceOver. With iOS 6, we won’t have to make that choice. We will have two features to help us make the most of the vision we have: Zoom to magnify and VoiceOver to hear content read aloud and rest our vision.
  • VoiceOver integrated with Maps: the VoiceOver integration with Maps should provide another tool for even greater independence for people who are blind, by making it easier for us to navigate our environment.
  • Siri’s ability to launch apps: this feature makes Siri even more useful for VoiceOver users, who now have two ways to open an app: by touch or by voice.
  • Custom vibration patterns for alerts: brings the same feature that has been available on the iPhone for phone calls to other alerts. Great for keeping people with hearing disabilities informed of what’s happening on their devices (Twitter and Facebook notifications, etc.).
  • FaceTime over 3G: this will make video chat even more available to people with hearing disabilities.
  • New Made for iPhone hearing aids: Apple will work with hearing aid manufacturers to introduce new hearing aids with high-quality audio and long battery life.
  • Dictionary improvements: for those of us who work with English language learners, iOS 6 will support Spanish, French and German dictionaries. There will also be an option to create a personal dictionary in iCloud to store your own vocabulary words.
  • Word highlights in Speak Selection: the ability to highlight words as they are spoken aloud by text to speech benefits many students with learning disabilities. Speak Selection (introduced in iOS 5) gains the same capabilities as many third party apps in iOS 6.
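For the curious, apps can also detect when they are running inside a Guided Access session and adapt accordingly (hiding their own settings screen during a test, for example). A small sketch, written in today’s Swift spelling of the hooks that shipped alongside this feature:

```swift
import UIKit

// Ask whether the app is currently in a Guided Access (single app) session.
if UIAccessibility.isGuidedAccessEnabled {
    print("Running in single app mode")
}

// React when a teacher or proctor turns Guided Access on or off.
NotificationCenter.default.addObserver(
    forName: UIAccessibility.guidedAccessStatusDidChangeNotification,
    object: nil, queue: .main) { _ in
    print("Guided Access state changed")
}
```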

These are the big features that were announced, but there were some small touches that are just as important. One of these is the deep integration of Facebook into iOS. Facebook is one of those apps I love and hate at the same time. I love the amount of social integration it provides for me and other people with disabilities, but I hate how often the interface changes and how difficult it is to figure it out with VoiceOver each time an update takes place. My hope is that Apple’s excellent support for accessibility in built-in apps will extend to the new Facebook integration, providing a more accessible alternative to the Facebook app that will continue to support our social inclusion in mainstream society. You can even use Siri to post a Facebook update.

Aside from the new features I mentioned above, I believe the most important accessibility feature shown today is not a built-in feature or an app, but the entire app ecosystem. It is that app ecosystem that has resulted in apps such as AriadneGPS and Toca Boca, both featured in today’s keynote. The built-in features, while great, can only go so far in meeting the diverse needs of people with disabilities, so apps are essential to ensure that accessibility is implemented in a way that is flexible and customized as much as possible to each person. My hope is that Apple’s focus on accessibility apps today will encourage even more developers to focus on this market.

Another great accessibility feature that often gets ignored is the ease with which iOS can be updated to take advantage of new features such as Guided Access and the new VoiceOver integration. As Scott Forstall showed on a chart during the keynote, only about 7% of Android users have upgraded to version 4.0, compared to 80% of iOS users on iOS 5. What that means is that almost every iOS user out there is taking advantage of AssistiveTouch and Speak Selection, but only a very small group of Android users are taking advantage of the accessibility features in the latest version of Android.

Big props to Apple for all the work they have done to include accessibility in their products, but more importantly for continuing to show people with disabilities in a positive light. I loved seeing a blind person in the last keynote video for Siri. At this keynote, Apple showed another blind person “taking on an adventure” by navigating the woods near his house independently. As a person with a visual disability myself, I found that inspiring. I salute the team at Apple for continuing to make people with disabilities more visible to the mainstream tech world, and for continuing to support innovation through inclusive design (both internally and through its developer community).

iPhoto App and Accessibility

This weekend I finally had a chance to try out the new iPhoto app Apple released along with the iPad 3 (or, as they are calling it, “the new iPad”). As an aspiring photographer, I was impressed with the many options for organizing, editing, and sharing photos Apple has packed into this app, which costs only $4.99 in the App Store. There have been many reviews of the new app posted online already, so I will not add another one here. However, I do have a unique perspective on the new app that I would like to share. Not only do I like to take photos (calling myself a photographer might be a stretch, but it’s a hobby I enjoy and continue to try to get better at every day), but I also have a visual disability, so I am part of a small community of blind photographers.

When I opened the iPhoto app on my iPhone, the first thing I did was turn on the built-in VoiceOver screen reader to hear how it would do with the new photo editing app. Frankly, I was not surprised that the new iPhoto app is as accessible with VoiceOver as it is. I have come to expect accessible products from Apple over the last few years, and I’m proud to be associated with the company as an Apple Distinguished Educator. However, as I dug deeper into the iPhoto app with VoiceOver, the level of attention to detail in providing accessibility was still pretty impressive. For example, the brushes used to retouch photos (repair, lighten, darken, etc.) are all accessible through VoiceOver gestures, as are the exposure and color correction controls and the various effects. When I selected the crop tool, VoiceOver told me to pinch to resize the photo, and as I did so it told me how much I was zooming in as well as how far the image was offset (“image scaled to 15x, image offset by 15% x and 48% y”).
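That crop-tool behavior maps onto VoiceOver’s “adjustable” trait, which any developer can adopt. Here is a sketch of how a zoom control might expose itself (the class and the announcement wording are my own invention, not Apple’s code):

```swift
import UIKit

// An "adjustable" element announces its value and responds to
// one-finger swipe up/down from VoiceOver users.
final class ZoomControl: UIControl {
    var scale: CGFloat = 1.0 {
        didSet { accessibilityValue = "Image scaled to \(Int(scale))x" }
    }

    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true
        accessibilityLabel = "Zoom"
        accessibilityTraits = .adjustable
        accessibilityValue = "Image scaled to 1x"
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    // VoiceOver calls these for swipe up / swipe down on the element.
    override func accessibilityIncrement() { scale += 1 }
    override func accessibilityDecrement() { scale = max(1, scale - 1) }
}
```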

On the iPad, there is a dedicated help button that opens up a series of overlays indicating what each button does. Not only was every part of the overlay accessible, but so was the entire help built into the iPad version of the app. The attention to detail is all the more impressive to me because there are so few blind photographers who would take advantage of an app such as iPhoto. What it does show is the level of commitment Apple has to accessibility, because it will go to great lengths to add accessibility even when only a few people will benefit from it.

In a recent blog post, accessibility advocate Joe Clark called out a number of hot new apps (Readability, Clear, Path, and Flipboard) for what he called irresponsible web development that results in accessibility barriers. Well, to me this new iPhoto app shows that you can design an app that is not only visually appealing, feature-packed and easy to use and learn, but also accessible to people with visual disabilities. I hope more developers start to realize that accessibility does not have to compete with good design, but that both complement each other.

When I first loaded the iPhoto app on my iPhone (that was the first device I installed the app on) I was too impatient to go on the Web and read about the app before I started to work with it. That’s just the kind of user I am, I like to get right in and try things out. Well, on the iPhone app the Help button from the iPad version of the app is missing. Most of the icons make sense, but in some cases I was unsure, so what I did was turn on VoiceOver and move my finger around the screen to have it announce what each button was for (or to at least give me a better idea). In that case, compatibility with VoiceOver helped me learn the app much faster without having to consult the help, and that got me to thinking. As these devices (phones, tablets, and whatever comes next) continue to get smaller and the interfaces start to use more visuals (tiles, buttons, etc.) and less text, the ability to hear the help may become an essential aspect of learning how to use the interface. In this way, features like VoiceOver would actually enhance the usability of a particular app for everyone – what universal design is all about.