7 Apple Watch Apps for Diverse Learners

Over on my YouTube Channel, I have posted a few video tutorials focusing on the built-in accessibility features of Apple Watch.

I also discuss these accessibility features in more detail in my recently updated book on low vision supports for Apple users, Zoom In (available as a free download on the iBookstore). VoiceOver, Zoom and many of the accessibility features familiar to users of iOS devices are included in Apple Watch. These accessibility features ensure users with a variety of special needs can personalize and use their Apple wearables.

As with iOS devices, Apple Watch also supports apps that provide even more flexibility in how users can interact with the wearable. With the release of watchOS 2, these apps can now run natively on the device itself, promising faster loading times and better overall performance. More importantly, apps can now use many of the hardware features available on Apple Watch, such as the Taptic Engine, the Digital Crown, the various sensors (heart rate sensor and accelerometer) and the microphone. Basically, apps can do much more than they could with the initial release of Apple Watch, opening the door for developers to be even more creative in how they address the needs of users with special needs. This post focuses on my favorite apps for such users.
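To give a sense of what this hardware access looks like on the developer side, here is a minimal WatchKit sketch that plays a haptic tap on the Taptic Engine. This is my own illustration of the public API, not code from any app discussed here:

```swift
import WatchKit

// Minimal watchOS sketch: with watchOS 2, an app running natively on the
// watch can drive the Taptic Engine directly through WatchKit.
class DemoInterfaceController: WKInterfaceController {
    @IBAction func notifyTapped() {
        // Play one of the built-in haptic patterns on the wearer's wrist.
        WKInterfaceDevice.current().play(.notification)
    }
}
```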

With any of the apps mentioned here, you install the app from the App Store just as you would for an iPhone app. You then open the Watch app, go to My Watch, tap the name of the app and make sure the toggle for “Show App on Apple Watch” is set to on. For some apps, you will also have the option to include it in Glances, the Apple Watch feature that lets you access information with a swipe up from the current watch face.

For your convenience, I have collected all of the resources mentioned in this post into an AppoLearning Collection. Have anything to add (apps, ideas for use in the classroom)? Let me know and I can add you as a collaborator to this collection (a new feature available on AppoLearning).

Children with Autism: A Visual Schedule

Visual schedules are supports that allow children and adults with autism and related disabilities to better handle transitions in their daily routines. These schedules use pictures to indicate to learners what they are to do and where they are to go next, helping to ease anxiety around transitions and building their ability to act independently. According to developer Enuma (formerly Locomotion Apps), Visual Schedule ($12.99) is the first picture-based scheduler for the Apple Watch.

Current activity shown on Apple Watch.

On the Apple Watch, the app will display each activity as it is happening, with a progress indicator to let the learner know how much time is left before transitioning. Swiping up on any activity will display any associated tasks so that they can be checked off. Swiping to the left will then show what activities come next.

Building the visual schedules themselves takes place on the paired iPhone, where most of the work is done through a drag-and-drop interface. From the Today view, tapping Edit in the upper right corner will display icons for 14 preset activities on the right side of the screen.

Visual Schedule today view in edit mode, showing activities that can be added to daily schedule.

You can also tap Add (+) at the bottom of the list to create your own custom activity (which can use a photo from your Camera Roll as its icon). For each activity, you can double-tap its icon in the Edit view to specify the color of the label as well as any associated tasks.

To build the visual schedule itself, you drag the activity icons onto the Today view and use the provided handles to adjust each activity’s duration.

Proloquo2Go

Proloquo2Go ($249.99) is a robust symbol-based communication app that has been available for iOS devices for some time. The price listed is not for the Apple Watch app itself, but rather for the communication app that runs on iOS and includes an Apple Watch component. Developer AssistiveWare originally created the Apple Watch app when the watch was announced, and at that time the app only allowed the wearable to be used as a switch for a paired iPhone running Proloquo2Go. With watchOS 2, Proloquo2Go now also provides basic communication capabilities on the Apple Watch app.

When you first launch Proloquo2Go on the Apple Watch you will be prompted to select the mode you want the app in: Switch or Communication. To change the mode after the initial selection, you will have to go into the Proloquo2Go options on the paired iPhone and change it there. You can see how the Switch feature works in a video I have posted on my YouTube channel.

Proloquo2Go Phrase Builder.

The new Communication option works by providing a basic set of phrases to start, along with a Builder, accessed through a Force Touch, for building new phrases. The Builder works in much the same way as customizing the various Complications on the Apple Watch faces. The window is divided into three cells. You use the Digital Crown to select a sentence starter from the first cell, then repeat the process to select a noun or adjective from one of the available categories to complete the phrase (again, using the Digital Crown to navigate the choices). When the sentence is selected, it is displayed upside down so that the Apple Watch can be shown to the other person. I found this to work best when I turned off the option for “Wake Screen on Wrist Raise” in the Apple Watch settings. Otherwise, the screen would go to sleep as soon as I turned my wrist to display the message. Hopefully a future version of the Apple Watch app can include text to speech; according to AssistiveWare, its absence is a limitation imposed by Apple.
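For the technically curious, the Digital Crown-driven selection described above follows a standard WatchKit picker pattern. The sketch below is purely illustrative; it is not AssistiveWare's code, and the phrases are made up:

```swift
import WatchKit

// Sketch of a Digital Crown-driven list, the WatchKit pattern behind
// interfaces like the Builder's cells (illustrative only).
class PhraseInterfaceController: WKInterfaceController {
    @IBOutlet var starterPicker: WKInterfacePicker! // wired up in the storyboard

    override func awake(withContext context: Any?) {
        super.awake(withContext: context)
        let starters = ["I want", "I feel", "I need"] // hypothetical sentence starters
        starterPicker.setItems(starters.map { starter -> WKPickerItem in
            let item = WKPickerItem()
            item.title = starter
            return item
        })
    }

    // Called as the wearer turns the Digital Crown through the choices.
    @IBAction func starterChanged(_ index: Int) {
        print("Selected starter at index \(index)")
    }
}
```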

Proloquo4Text

Proloquo4Text ($119.99) is a text-based communication app from AssistiveWare that, like Proloquo2Go, includes an Apple Watch component. However, unlike the Proloquo2Go Apple Watch app, this one does not include a phrase builder. You can store a number of phrases in a special Apple Watch folder in the iOS app, and these phrases are then available for selection on the Apple Watch. As with the phrase builder in the Proloquo2Go app, the phrases are displayed upside down when selected.

Wunderlist

For learners who have executive functioning challenges that make it difficult to stay organized, a good to-do list app with reminders can be a helpful support. Surprisingly, there is no Reminders app for Apple Watch, though you can use Siri to create a new reminder that shows up in the Reminders app on a paired iPhone.

Wunderlist Home View.

Wunderlist (free) is an option if you would like to both create and view reminders on the Apple Watch. It is a free service for accessing to-dos from just about any device (a full list of supported platforms is available on the Wunderlist site). On Apple Watch, the Wunderlist app provides a simple Home View with four options for viewing to-dos: Inbox, Today, Starred and Assigned to Me. The Glances feature is also supported, so you can access your to-dos for the current day with a quick swipe up from the watch face.

To create a new to-do item, you use the Force Touch feature of Apple Watch. You press and hold firmly to reveal the Add (+) button, then use your voice to dictate the text for the new to-do and tap Done when you’re finished.
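Both halves of this interaction, the Force Touch menu and the voice dictation, map to single WatchKit calls. Here is a minimal sketch of the pattern; it is my own illustration, not Wunderlist's code:

```swift
import WatchKit

// Sketch of the Force Touch + dictation pattern (not Wunderlist's code):
// a menu item revealed by a firm press, which opens the system dictation UI.
class ToDoInterfaceController: WKInterfaceController {
    override func awake(withContext context: Any?) {
        super.awake(withContext: context)
        // The Add item appears when the wearer presses firmly on the screen.
        addMenuItem(with: .add, title: "Add", action: #selector(addToDo))
    }

    @objc func addToDo() {
        // Present the system text input controller, which offers dictation.
        presentTextInputController(withSuggestions: nil, allowedInputMode: .plain) { results in
            if let text = results?.first as? String {
                print("New to-do: \(text)")
            }
        }
    }
}
```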

On the companion iPhone app you can then add details such as the due date, set up a reminder (which will use the Taptic Engine on the Apple Watch to get your attention with a gentle tap) and organize the to-dos into lists that can be accessed by scrolling with the Digital Crown on the Apple Watch. The idea is that the Apple Watch app is an interface for creating quick to-dos and checking them off as they are completed, while the iPhone app provides more options for managing them.

Evernote

Evernote launch screen on Apple Watch.

Evernote (free) is one of my favorite apps for collecting random bits of information before they get lost. It has become my replacement for all the Post-it notes I used to keep around my desk. With the Apple Watch app, you can create quick notes, use your voice to search through all the notes in your account, and see a list of recently viewed or updated notes. Like Wunderlist, Evernote supports reminders for notes that are time sensitive. However, with Evernote you can indicate the time for the reminder right on the Apple Watch itself as you create the note (though the options are limited to “this evening,” “tomorrow,” “next week” and “next month”). I find Evernote to be a nice complement to Wunderlist and I use both: Wunderlist for to-dos and Evernote for quick notes I will need to refer to at a later time but don’t necessarily have to act on right away. Together the two apps are great supports for staying organized and minimizing the risk of losing important information.

Just Press Record

Just Press Record opens with a big record button.

Just Press Record ($4.99) is a new audio recording app made possible by the access developers now have to the microphone with watchOS 2. Just Press Record can record audio directly from the Apple Watch’s microphone and play it back with the built-in speaker. The interface couldn’t be simpler: a nice big microphone button you press to start your recording. A nice touch is that you can see the waveform as you record, and when you are finished you can preview the recording before you choose to save the file to your iCloud account. You can even record when your iPhone is not within range of the Apple Watch (the recording will be synced the next time you connect the two devices). This app is useful as another option for students to capture their thoughts and ideas using just their speech. It could even be used for students to reflect on their progress at regular intervals (at the end of each day or week). Recordings can be shared with the teacher from the iCloud account the app uses to store the recordings.
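For developers wondering what this new microphone access looks like, recording on watchOS 2 runs through a single system-provided WatchKit controller. The sketch below is my own illustration (with a hypothetical file name), not Just Press Record's implementation:

```swift
import WatchKit

// Sketch of watchOS 2 microphone access (illustrative; not the app's code).
class RecorderInterfaceController: WKInterfaceController {
    @IBAction func recordTapped() {
        // Hypothetical destination for the recording.
        let url = FileManager.default.temporaryDirectory.appendingPathComponent("memo.m4a")

        // The system-provided recorder UI records straight from the watch mic.
        presentAudioRecorderController(withOutputURL: url,
                                       preset: .narrowBandSpeech,
                                       options: nil) { didSave, error in
            if didSave {
                print("Saved recording to \(url)")
            } else if let error = error {
                print("Recording failed: \(error)")
            }
        }
    }
}
```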

iTranslate

iTranslate app ready to translate to Spanish.

iTranslate (free, with in-app purchases) is a nice app to have if you have English language learners in your class and your command of their native language is not as strong as you would like it to be. When the app launches, you can use Force Touch to change the language (more than 90 are supported). Once you have set your language, tap the middle of the screen and use your voice to speak the phrase you want translated. You can then play the translation back using the Apple Watch speaker. This is not the fastest app (especially at launch), but hopefully its performance will continue to improve over time.

The number of apps for Apple Watch will continue to grow as developers become more comfortable with the device. What is exciting to me is the ability for developers to tap into the hardware features of the device with watchOS 2. I look forward to seeing how the developer community takes advantage of the microphone, and I hope that text to speech is soon made available to third-party apps as well. That would make many of these apps even more useful. What do you think?

New iPad Gestures for Cursor Movement and Text Selection

With iOS 9, Apple has added a new way to select text with the onscreen keyboard. Using a two-finger drag gesture, it is now much easier (at least for me) to place the cursor right where I want it. A two-finger tap selects the word closest to the cursor, and another two-finger drag extends the selection.

I have found this method of text selection to be much faster than the old one, where you had to tap and hold to get a magnifying glass that allowed you to place the cursor and then choose from editing options in a popover menu. The new gestures work very well with the new Shortcut Bar that appears above the onscreen keyboard on the iPad. This Shortcut Bar provides shortcuts for editing and formatting options such as cut, copy, paste, bold, underline and italicize. Finally, if you use Zoom, you can have it follow the cursor as you move within the text area by making sure Follow Focus is enabled in the Zoom settings (Settings > General > Accessibility > Zoom).

Here is a brief video showing the new cursor movement and text selection gestures for the iPad in action. At the end of the video I show how these gestures can work with Zoom.

Reader View in Safari for iOS 9

The Reader view in Safari has been one of my favorite features in iOS, and in iOS 9 it has been updated to provide more options for customization. Not only can you adjust the text size, but you can also choose from four different backgrounds and select different fonts, including the new San Francisco font introduced in iOS 9.

The Reader feature behaves much as it did in previous versions of iOS, in that you access it by tapping a special icon on the left side of the browser’s address bar, which is only visible when you are on a website or blog that supports the feature (usually news sites and other sites that update frequently).

Reader Icon.

Once you are in the Reader view, which removes a lot of the extra content (such as site navigation and many ads), you will see an additional icon on the right side (a big A and a little A).

Tapping on this icon will reveal a popover with text and background options that are now available in iOS 9.

I have posted a short video on my YouTube channel that shows the iOS 9 Reader view for Safari in action:

The Reader view is great for not only removing a lot of the clutter on a page to make it easier for learners to focus on the content, but also for making it easier to use some of the accessibility features such as Speak Selection (the built-in text to speech feature of iOS).

On the usability of touch screens for screen reader users

Recently, Katie Sherwin of the Nielsen Norman Group published an article on the NNG website summarizing her experience with the VoiceOver screen reader for iOS devices and her suggestions for designing better interactions for blind users. The article has some good design suggestions overall: creating cleaner copy with streamlined code is always a good thing. So is including alternative text for images, making sure all interactions work with the keyboard, and clearly indicating the hierarchy of the content with headings that separate it into logical sections. On these suggestions, I am in full agreement with the author.

Where I disagree with her is on the representation of what the experience of using a mobile device is really like for actual blind users. The author herself acknowledges that she only started experimenting with the screen reader after attending a conference and seeing how blind users interacted with their devices there. It is not clear how much time she has had to move beyond the most basic interactions with VoiceOver. Thus she states that “screen readers also present information in strict sequential order: users must patiently listen to the description of the page until they come across something that is interesting to them; they cannot directly select the most promising element without first attending to the elements that precede it.”

This may be accurate if we are talking about someone who has just started using VoiceOver on an iPhone or iPad. It ignores the existence of the Rotor gesture familiar to many power users of VoiceOver. With this gesture, users actually can scan the structure of a web page for the content they want to focus on. They can see how the page is organized with headings and other structural elements such as lists, form elements and more. Many users of VoiceOver also use the Item Chooser (a triple-tap with two fingers) to get an alphabetical list of the items on the screen. Both of these features, the Rotor and the Item Chooser, allow users to scan for content, rather than remaining limited to the sequential kind of interaction described in the NNG article.

As for the point about the cognitive load of the gestures used on a touch screen device like the iPhone, it should be pointed out that the number of gestures is actually quite small compared with the extensive list of keyboard shortcuts needed on other platforms. I do agree with the author that typing remains a challenge when using the onscreen keyboard, but there are other options available to make text entry easier: the user can choose to use any of the many Bluetooth keyboards available on the market for a more tactile experience; dictation is built in and has a pretty good level of accuracy for those who prefer using their voice; and new input modes introduced in iOS 8 and 9 allow for handwriting recognition as well as Braille input.

To help new users with the learning curve (and the cognitive load), Apple provides a built-in help feature that is only available in the VoiceOver settings when the feature is active. Once a user goes into the help, he or she can perform gestures to hear a description of what they do. Another benefit for users is the fact that many of the gestures are the same across the Apple ecosystem. Thus, a VoiceOver user can transfer much of what they have learned on an iOS device to the Mac (which has a trackpad roughly the size of an iPhone), the new Apple TV with its touchpad remote, and even the Apple Watch (with a few modifications to account for the limited screen real estate). Finally, I have found that learning the gestures is as much a matter of muscle memory as it is about remembering the gestures and what they do. The more time you spend performing the gestures, the easier they become. As with any learned skill, practice makes a difference.

Again, there is a lot of good advice in this article as it relates to the need for more inclusive designs that minimize unnecessary cognitive load for users. However, a key point that is missing from that advice is the need to get feedback on designs from actual people who are blind. The way a blind user of VoiceOver interacts with his or her iOS device will often be a lot different from the way a developer just becoming familiar with the feature will do so (same goes for a usability expert).

What’s New in iOS 9 for Accessibility

With iOS 9, Apple continues to refine the user experience for those who have disabilities or just need additional supports to effectively interact with their iPhones and iPads. While there are only two new accessibility features in iOS 9 (Touch Accommodations and a new Keyboard pane for improved support of external Bluetooth keyboards), the existing features have received a number of enhancements. Probably the one that received the most attention in this update is Switch Control, which now includes a new scanning style, the ability to set separate actions for long presses, and Recipes for more easily performing repetitive actions such as turning the pages in a book in iBooks.

The first change you will notice when you go into the Accessibility pane in Settings is that things have been moved around just a bit. Really the only change is that the options for Interaction now follow those for Vision. Everything else follows the same order as before. I like this reorganization: both VoiceOver and Switch Control significantly change how the user interacts with the device, and the new order should make it easier to navigate to Switch Control in the Accessibility pane. It also works to highlight the new Touch Accommodations feature by placing it near the top of the Accessibility pane.

This post is a short summary of each accessibility feature that is either brand new or enhanced in iOS 9, starting with the new Touch Accommodations feature.

Touch Accommodations

This brand new feature is largely targeted at people with motor difficulties who may have problems with the accuracy of their touches as they interact with the touchscreen on an iOS device. Touch Accommodations consists of three options: Hold Duration, Ignore Repeat and Tap Assistance. Before you start experimenting with these options, I would recommend setting up your Accessibility Shortcut so that Touch Accommodations is the only option listed. This way, if you get stuck while using Touch Accommodations you can quickly triple-click the Home button on your device to exit out of the feature.

Hold Duration requires the user to touch the screen for a given duration before a touch is recognized. This can be helpful for someone who struggles with accidental presses. When Hold Duration is turned on, touching the screen will display a visual cue with a countdown timer. If the user lifts the finger before the countdown runs out, the touch is not recognized. With Ignore Repeat, multiple touches within the specified duration are treated as a single touch. This can be especially helpful when typing with the onscreen keyboard. A user with a tremor may end up tapping repeatedly on the same spot, resulting in many unwanted keypresses.

Tap Assistance can be set to use the Initial Touch Location or the Final Touch Location. The two options determine the spot on the screen where the touch is performed when you lift your finger. With Initial Touch Location, you can tap and then move your finger around on the screen while a timer is displayed. If you lift your finger during the countdown (which you can customize using the Tap Assistance Gesture Delay controls), the tap is performed where you first touched the screen. After the countdown expires, you can perform a gesture (a flick, swipe and so on) the way you are used to with iOS. With Final Touch Location, the touch is performed at the spot where you lift your finger, as long as you do so within the countdown time. This can be a different spot from where you first touched the screen.
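To make the logic of these options concrete, here is a conceptual sketch of Hold Duration and Ignore Repeat using public UIKit APIs. This is emphatically not Apple's implementation, just an approximation of the behavior described above:

```swift
import UIKit

// Conceptual sketch only: NOT Apple's implementation of Touch
// Accommodations, just an approximation of Hold Duration and
// Ignore Repeat using public UIKit APIs.
final class AccommodatedTapView: UIView {
    private let holdDuration: TimeInterval = 0.5        // like the Hold Duration setting
    private let ignoreRepeatWindow: TimeInterval = 0.3  // like the Ignore Repeat setting
    private var lastAcceptedTap = Date.distantPast

    override init(frame: CGRect) {
        super.init(frame: frame)
        // A long-press recognizer approximates Hold Duration: the touch
        // only "counts" once the finger has stayed down long enough.
        let press = UILongPressGestureRecognizer(target: self,
                                                 action: #selector(handlePress(_:)))
        press.minimumPressDuration = holdDuration
        addGestureRecognizer(press)
    }

    required init?(coder: NSCoder) { fatalError("Not used in this sketch") }

    @objc private func handlePress(_ gesture: UILongPressGestureRecognizer) {
        guard gesture.state == .began else { return }
        // Ignore Repeat: collapse touches that arrive within the window.
        let now = Date()
        guard now.timeIntervalSince(lastAcceptedTap) > ignoreRepeatWindow else { return }
        lastAcceptedTap = now
        print("Accepted tap at \(gesture.location(in: self))")
    }
}
```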

Additions to Switch Control

Switch Control is an iOS feature introduced in iOS 7 that provides access to touchscreen devices for the many people who rely on external assistive devices. My friend Christopher Hills, with whom I am co-authoring a book on this feature (stay tuned on that front), is a good example of an expert user of Switch Control. Christopher has cerebral palsy and uses external switches to perform many of the gestures someone with typical motor functioning would perform with their fingers on the touchscreen.

In iOS 9, Apple has continued the development of Switch Control with a number of new features:

  • A new Single Switch Step Scanning style: this style requires the switch source to be continuously pressed until the user gets to the desired item. Letting go of the switch will then highlight that item and give it focus. With the default tap behavior, the next tap will bring up the scanner menu; within the scanner menu, letting go of the switch will immediately select the option that has focus. A Dwell Time timing option determines how long it takes before an item is highlighted and the user can make a selection. (A minimal sketch of this scanning pattern appears below.)
  • A new Tap Behavior: the Always Tap option is similar to Auto Tap in that it allows the user to make a selection with the first tap of the switch. However, with Always Tap, the scanner menu is available from an icon at the end of the scanning sequence instead of through a double-tap of the switch.
  • A Long Press action: the user can specify a separate action to be performed when the switch is held down for a specified duration. This is a great way to exit out of the Recipes feature.
  • Recipes: the user can invoke a special mode for Switch Control where each press of the switch performs the same action. A couple of actions are already included, such as tapping the middle of the screen or turning the pages in a book; these are primarily intended for use in iBooks. Creating a new recipe is as easy as giving it a name, assigning the switch that will be used to perform the action that will be repeated with each press, and choosing one of the built-in actions or creating a custom one. Custom actions for Recipes can include a series of gestures and their timings. To exit out of a Recipe, the user has two options: setting a timeout after which the recipe will end if no switch presses take place, or setting the action for a long press of the switch to Exit Recipe.

A new option allows the switch user to combine tap behaviors when using the onscreen keyboard. With the Always Tap Keyboard Keys option, the keys will be selected with a single press of the switch even if the tap behavior is set to the default of showing the scanner menu at the first tap of the switch.
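To make the Single Switch Step Scanning behavior described above more concrete, here is a simplified conceptual model. It is my own sketch, not Apple's code, and it collapses highlighting and selection into the switch release:

```swift
import Foundation

// Conceptual model of Single Switch Step Scanning, not Apple's code.
// Holding the switch advances focus at a fixed dwell interval; releasing
// the switch acts on the focused item (simplified here to one step).
final class StepScanner {
    private let items: [String]
    private let dwellTime: TimeInterval
    private var focusedIndex = 0
    private var timer: Timer?

    init(items: [String], dwellTime: TimeInterval = 1.0) {
        self.items = items
        self.dwellTime = dwellTime
    }

    // Call when the switch is pressed down: focus steps through the items.
    func switchDown() {
        timer = Timer.scheduledTimer(withTimeInterval: dwellTime, repeats: true) { [weak self] _ in
            guard let self = self else { return }
            self.focusedIndex = (self.focusedIndex + 1) % self.items.count
            print("Focused: \(self.items[self.focusedIndex])")
        }
    }

    // Call when the switch is released: the focused item gets acted on.
    func switchUp() {
        timer?.invalidate()
        print("Highlighted for selection: \(items[focusedIndex])")
    }
}
```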


Customizable AssistiveTouch Menu

The layout of the AssistiveTouch menu can now be customized, with options for changing the number of items shown on the top level and swapping in icons for features on secondary menus that are used more often. The number of icons on the top level menu can be set to as few as one and as many as eight. Tapping any of the icons in the Customize Top Level Menu pane will open a list of all of the features supported by AssistiveTouch. Selecting an item from the list will move that option to the top level menu. Change your mind? No problem, a Reset option is available (in fact, I would love to see similar Reset options for other features such as VoiceOver and Switch Control).

Better Support for Bluetooth Keyboards

Under Interaction, you will find a new Keyboard option. Tapping that option will open a separate pane with options intended for those who use an external Bluetooth keyboard with their iOS devices:

  • Key Repeat: turns off the key repeat (it is enabled by default) in order to prevent multiple characters from being entered when a key is held down on the keyboard. The options for customizing this feature include adjustments for the delay before a key that is held down starts repeating, as well as how quickly the key repeat will take place.
  • Sticky Keys: allows the user to press the modifier keys for a keyboard shortcut in sequence rather than having to hold them down all at once. The options for this feature include a quick way to turn it on by pressing the Shift key quickly five times, as well as playing a sound to alert the user when it has been turned on.
  • Slow Keys: changes how long the user has to hold down a key before it is recognized as a keypress (essentially a hold duration). The only option for this feature is to adjust the length of time the key has to be pressed before it is recognized.

The one option for the onscreen keyboard in the Keyboard pane addresses a usability problem by making the switch between lowercase and uppercase more prominent. By default, the keys on the onscreen keyboard are shown in lowercase and only switch to uppercase when the Shift key is pressed.

Tweaks to VoiceOver and Zoom

The Rotor in iOS 9 has two new options available: Text Selection and Typing Mode. The latter is not a new feature or input mode, it just can now be changed through the Rotor. With the former, the user can more easily select text by character, word, line, or page (or select all) by flicking up or down with one finger after choosing Text Selection in the Rotor. A flick to the right will then select the text at the chosen granularity (word, line, etc.).

A new option allows users of external Bluetooth keyboards to change the VoiceOver keys from Control + Option to the Caps Lock key. Finally, users can now adjust the Double-tap Timeout at the bottom of the VoiceOver settings pane. This setting may be helpful to a VoiceOver user who also has motor difficulties and can’t perform the double-tap as quickly.

For Zoom, the only change is that the option for choosing different Zoom Filters is now available from the Zoom settings pane; before, it could only be selected from the Zoom menu that appears after tapping the controller or the handle on the Zoom window.

Other Options

iOS 9 includes options for disabling the Shake to Undo feature as well as all system vibrations, both of which can be found under Interaction in the Accessibility pane.

As is often the case with iOS updates, a number of features that are not explicitly labeled as accessibility features can benefit those who use assistive technologies. One example is the new Siri Suggestions feature, which can be displayed with a swipe to the right from the Home screen. The suggestions include frequently contacted people, recently used apps, locations and more. Anything that puts less distance between users of VoiceOver and Switch Control and the information they need is a good thing in my book.

That’s it for this high-level overview of the major (and some minor) changes in iOS 9 that impact those who rely on the accessibility features. I hope you have found it helpful.

7 accessibility features every teacher should know for back to school

It’s that time of the year again. The supplies and textbooks have come in. The room is decorated. Soon students will be walking through the door and it’s off to the races. A new school year is upon us.

If you are lucky, you have a classroom set of iPads, or you may be in a BYOD situation where students are bringing their devices to school. Did you know about some of the features built into the iPad and other iOS devices that can help you empower all learners to access the curriculum this year? No? Well that’s what this post is about. You don’t need to have any students with IEPs or Section 504 plans to take advantage of these features. They are called universal design features because they can benefit any of a number of learners. Here are the top seven.

Embiggen the text

Yes, I know that’s not a real word (except maybe on The Simpsons), but it means to make the text larger. On iOS devices, this is easy to do, and by making the text bigger you will allow your learners to focus all of their energy on understanding the content rather than squinting and struggling to see it. To make the text larger, go to Settings > Display & Brightness > Text Size and use the slider to adjust the size as needed. Not big enough? No problem. Go to Settings > General > Accessibility > Larger Text instead. There you can turn on Larger Accessibility Sizes for even bigger text. While you are at it, you may as well turn on Bold Text to make that text really stand out.

Larger text option in Accessibility settings.

It’s like a negative

Well, at least to you; your students probably don’t know what a negative from the film days is, seeing as the only photos they look at are probably on Instagram or Facebook. In any case, reading on screen can be very tiring for our eyes, with all that light coming at us from the screen. As our learners spend more of the day in front of the screen, one thing we can do is reverse the colors to help with eye strain. It’s really simple to turn this feature on and off, so why not try it? If it doesn’t work for you, you can easily go back to the default black-on-white scheme. Invert Colors can be found under Settings > General > Accessibility, or even better, just use Siri to turn the feature on and off by saying “Turn on Invert Colors.” The kids will love that trick.

New note in Notes app with Invert Colors turned on.

Let’s hear it for Alex

Alex is not a person, though if you spend some time listening to him read content on the iPad, you may begin to think he is. Alex is the high-quality built-in voice that has been available on the Mac for a number of years. Guess what? Now it’s available as a download for use with the text to speech feature built into iOS, officially called Speak Selection. This feature can even highlight the words as it reads them aloud, which can be a big help to some struggling readers. The video below explains Speak Selection in more detail.

Speak Selection works great with the Reader feature built into Safari, which removes ads and other distractions from the page. In the upcoming iOS 9 release the Reader feature gains controls for adjusting the text size as well as changing the background and font for even better legibility.

Let’s hear that again

Don’t want to select the text first? No problem. Speak Screen is activated with a special two-finger gesture and will read everything that is on the screen (and I do mean everything). Once you turn on Speak Screen in the Speech settings, you can perform a two-finger swipe down from the top of the screen (except on the Home screen) to hear everything read aloud. Even better, have Siri help out. Just say “Speak Screen” and it will start reading. You even get an onscreen controller for adjusting the speaking speed.

You complete me

Although it is not technically an accessibility feature, the word prediction built into iOS 8 (QuickType) can be a big help for learners who struggle with spelling or just have a hard time producing text. This feature should be turned on by default but if not you can enable it by going to Settings > General > Keyboard and making sure Predictive is turned on. When you start typing a word, suggestions should pop up in a strip just above the onscreen keyboard.

QuickType bar in iOS onscreen keyboard

Say it with me…Dictation is awesome.

Again, this is not technically an accessibility feature, but it can help those who struggle with typing on the onscreen keyboard by giving them another option: their voice. Just make sure to use your announcer’s voice, speaking clearly and enunciating, after you tap the microphone icon to the left of the space bar to start dictating. You can even use a number of commands, such as “comma,” “period,” and “new line.”

Microphone icon to left of space bar activates Dictation in iOS.

CC is not just for copy

It also stands for closed captioning, a feature built into many videos for those who are unable to hear the audio. Closed captions can benefit a number of other learners: English language learners, struggling readers, and anyone learning a topic with specialized vocabulary where seeing the words as well as hearing them could be helpful (science, for example). And as a bonus, you will have a fallback for when your speakers just don’t work (because technology never fails, right?). You can enable the captions for a video that has them by going to Settings > General > Accessibility and choosing Subtitles & Captioning. You can even change the appearance of the captions to make them easier to read.

Have an Apple TV in your classroom? It too has support for captions. Just go to Settings > General > Accessibility > Closed Captions + SDH to turn them on. Just as with your iOS device, you can change the appearance of the captions on Apple TV.

There you have it. A few more things to have in your tool belt as you work to ensure access to learning for all of your students this year, which I hope will be a great one!

Apple Watch Accessibility Video Tutorials

Late last week, I was finally able to get my hands on an Apple Watch for hands-on exploration of the accessibility features I have been reading about. Right away, I customized the new watch with the extra-large watch face in a bright green to go along with the green sport band I am using for daily wear. I also got a bright green case, since I know how often I walk into things due to my lack of peripheral vision. The case only cost about $5 on Amazon and is made by Poetic. It is a simple affair: a piece of rubber that goes around the watch with openings for the sensors on the back, the Digital Crown and the side button.

In addition to my first Apple Watch accessibility video on Zoom (much of which I recorded while at my local Apple Store), I have now created three other videos focusing on VoiceOver and the other accessibility features built into Apple Watch. These videos are available on my YouTube channel as a separate playlist (I am still working on the captions but hope to have them ready by the end of the week).

As for my early experiences with the device: I have been really enjoying the notifications that I get on my wrist with the Taptic Engine (I have Prominent Haptic turned on for even more of a tingle), and how I can quickly archive and dismiss emails with one touch once I glance at the watch. As someone who uses a white cane when out in public, not having to fumble around to get my phone out of my pocket when I get a notification will be really handy.

So far I have been very choosy with the apps I have installed on the Apple Watch. Aside from the ones from Apple, there are only a few third-party apps on my watch:

  • Overcast for controlling podcast playback on my iPhone.
  • American Airlines because I have some upcoming travel and want to test out use of the watch for check-in.
  • MLB At Bat (need to keep up on my Mets, of course).
  • Proloquo2Go (to test out the feature that uses the watch as a switch).
  • Evernote for quick notes.

Maybe the fact that I don’t have so many apps has helped me get pretty good battery life: as I write this at 9:30 PM, I have about 70% of my battery left.

A couple of nifty features that I have found useful as someone with poor eyesight:

  • pinging my iPhone so that I can locate it by sound. As someone with poor vision (and sometimes equally bad memory), I tend to misplace my phone around the house quite often. This minor feature has already saved me quite a bit of time. It would be so nice if a future Apple TV remote included support for this feature. That’s another item I frequently misplace (and sometimes lose forever; there’s currently one stuck inside my couch where I can’t get to it).
  • being prompted to stand. For most people these are stand breaks; for me they are eye rest breaks. I do a lot of work on the computer and sometimes forget that I need to take regular breaks to avoid eye fatigue. The stand reminder is great for getting me out of my chair and getting me to focus my eyes on something other than the screen to give them a rest.
  • remote shutter for the Camera on iPhone. I recently learned that you can use the side button to trigger the shutter for the Camera on your iPhone from Apple Watch. This is supposed to work with any photography app that supports the volume buttons on iOS (which is the way I often take photos, so that I have something tactile to work with), but so far I have only tried it with the built-in Camera app. The only other photography app I had installed on my Apple Watch was ProCamera, but it was very slow so I removed it.

I look forward to continuing my Apple Watch journey over the next few weeks as I engage in quite a bit of travel. In fact, I am going to try an experiment. For the first time, I plan to travel without my Mac and will rely on only an iPad, my iPhone and Apple Watch for all of my computing needs (including accessibility). That includes presenting at a couple of conferences in addition to all of the travel-related uses of tech (checking in, getting transportation from Uber or Lyft, etc.). It should be an exciting experiment.

Now on iBookstore: Supporting Students with Low Vision Using Apple Technology

My new book focusing on accessibility for Apple users who have low vision is now available for download from the iBookstore.

Cover of Supporting Students with Low Vision Using Apple Technology

The book includes more than 25 short video tutorials (closed-captioned) to go along with explanations of the built-in accessibility features of iOS devices, the Mac, Apple TV and even Apple Watch (I was only able to record one video on Zoom by visiting my local Apple Store, since I don’t yet have access to an Apple Watch; more videos on Apple Watch will be added in a future update). The book also has a section on apps for those with low vision, as well as some tips for creating more accessible iBooks content for those who have low vision. A final section focuses on accessories that I use as a person with low vision (stands, bone conduction headphones and the like).

I hope you enjoy the book, find it a valuable resource and will provide me with feedback so that I can make future updates even better.

To celebrate the release of this new book, I have made a few updates to my previous book (A Touch of Light), which is now FREE.

Sneak Peek: New Ebook on Apple Accessibility Supports for Low Vision

Out of all the amazing accessibility features built into my Apple devices, the ones that are most meaningful to me are those that are intended for people with low vision. These are the features I use most frequently since I still have some vision left and I am not a full time VoiceOver user.

To share what I have learned about these features with the rest of the educational technology and assistive technology communities, I have authored a new multi-touch book: Supporting Students with Low Vision Using Apple Technology. I had hoped to have the book available on the iBookstore in time for Global Accessibility Awareness Day, but with more than 25 videos that needed captioning, it took longer than I expected. In the meantime, I am providing a sneak peek of this work in progress, available for download from my Dropbox account. A word of caution: the file is 345 MB due to the videos.

Cover of Supporting Students with Low Vision Using Apple Technology

The book explores the concept of an ecosystems approach to accessibility, which I discussed in my Global Accessibility Awareness Day post. It focuses not only on the accessibility features found throughout the Apple ecosystem (on iOS, the Mac, Apple TV and even Apple Watch), but also on a number of apps designed to meet the needs of those with low vision and on techniques for creating more accessible content for low vision readers.

I hope you like this multi-touch book, and I welcome any feedback related to it: things I missed, things that need to be clearer, anything at all. Here is the intro video I created for it with PowToon:

Global Accessibility Awareness Day: The Need for an Ecosystems Approach to Accessibility in Education

On the occasion of Global Accessibility Awareness Day, I am excited about the many online and face-to-face events that will mark this important step toward ensuring a more accessible and inclusive environment for those of us who have special needs. I will be presenting a session on the use of photography as a tool for disability advocacy as part of Inclusive Design 24, a free 24-hour event sponsored by The Paciello Group and Adobe Systems. Photography has long been a passion of mine, and I welcome any opportunity to share how I use it as an educator and advocate to challenge perceptions of ability/disability. I will also be sharing resources and insights during a #GAADILN Twitter chat sponsored by the Inclusive Learning Network of the International Society for Technology in Education (ISTE).

I love Global Accessibility Awareness Day (or GAAD, as I will refer to it from now on), but if there is one thing I would change, it is the name of the event. To me it should be Global Accessibility Action Day. With many of these types of events, the focus is on raising awareness of the needs of people with disabilities, as if we have not been asking for our rights for decades now (the ADA is more than two decades old, you know). GAAD gets it right by focusing on action. Organizations such as Shaw Trust Accessibility Services, Deque Systems and Accessibility Partners are offering a number of free services such as document audits, app accessibility consultations and website user testing. Many others are providing webinars and live presentations that aim at more than raising awareness by providing practical information on how to make documents, websites and apps more accessible. A review of the full list of events available on the GAAD website makes it clear that this event is about more than just awareness; it is about taking the next step for accessibility.

In my own field of education, I see much progress being made, but I also see the need for a more comprehensive, ecosystems approach to inclusion and accessibility. When I think of ecology, I think about systems with a number of parts working together as one, the combined effect of those parts being greater than what each achieves on its own. When it comes to students with disabilities, a number of technologies are now available as built-in options on the mobile devices many of them own. While I am a witness to the impact these technologies can have on the lives of students with disabilities (having used these technologies myself), I believe their impact is limited when they are used in isolation rather than as part of a more comprehensive system.

What I would like to see is a change in thinking to focus on a systems approach that addresses what I see as the three As of accessibility:

  • Accessibility Features: companies such as Apple now include a comprehensive toolkit for accessibility that is built into the core of the operating system. This means that when I take my new Mac, iPhone or Apple Watch out of the box, it will be ready for me to use without the need to purchase or install additional software. Not only that, but as my vision gets worse, I know that I will be able to take my device out of the box and set it up independently, without having to wait for someone with better eyesight to help me. These built-in accessibility features have been life-changing for me. Without them I’m not sure I would have been able to pursue higher education and complete my master’s and doctoral studies. I also would not be able to do the photography that brings so much joy and beauty into my life. Unfortunately, not all educators know about even the most basic of these features that are built into the technology their districts have spent so much money to purchase. I am often surprised when I do presentations around the country (and sometimes in other parts of the world) by how little awareness there is among educators of the potential they literally hold in their hands to change a student’s life. We need to do better in this area of professional development to allow these tools to have an even greater impact on education for all students: not just students with disabilities, but any student who struggles with the curriculum and needs additional support.
  • Accessible Apps: the built-in accessibility features provide a great baseline for addressing the needs of people with disabilities, but they can’t do it all. There is just too much diversity and variability for that to be the case: not just in the traits and needs of users, but in the settings and tasks where technology is used. For this reason, it is often necessary to extend the capabilities of the built-in accessibility features by installing apps that provide greater customization options. A great example is the Voice Dream Reader app. While iOS has a robust text to speech feature with word highlighting that now supports the high-quality Alex voice, Voice Dream Reader allows for even greater customization. The user can adjust the color of both the word and sentence highlighting, something which cannot be done with the built-in Speak Selection feature of iOS. For those who are blind and use the VoiceOver screen reader, the developer has done an excellent job of labeling all of the app’s controls. A companion Voice Dream Writer app even provides a special mode for VoiceOver users to make it easier for them to enter and edit text, showing a strong commitment to usability for all users on the part of this developer. Other examples of developers who are doing exemplary work when it comes to creating accessible apps include AssistiveWare (developers of Proloquo2Go, Proloquo4Text and Pictello, all apps with excellent support for VoiceOver and Switch Control) and Red Jumper (developers of the popular Book Creator app). The latter added an Accessibility option for images and other objects to help students and educators create accessible content with the app. Unfortunately, these developers are still the exception rather than the rule. With too many apps, swiping through with VoiceOver results in hearing “button” over and over with no indication of what the button actually does (a minimal sketch of the kind of labeling that fixes this appears after this list). Worse, many of the buttons for key actions sometimes can’t even be selected. Without attention to accessibility from app developers, the accessibility features can’t work to their full potential. No matter how good the voice built into VoiceOver is (and Alex is pretty good), it does me no good if I can’t select the buttons within an app and determine what they do.
  • Accessible Content: the same problems that exist with inaccessible apps come into play with much of the content that is available online for students. Too many videos lack captions (or include only automatic, computer-generated captions that contain too many errors to be useful), and too many ebooks include images that are not labeled with accessibility descriptions for those who can’t see them. Without these accessibility descriptions, which can be easily added in authoring tools such as iBooks Author, a blind student taking a science class or an economics class will not be able to access the diagrams and other graphics that are so often used in these fields. Again, adding in features such as accessibility descriptions allows the built-in accessibility feature, in this case VoiceOver, to work to its full potential. There are many wonderful examples of books that include accessibility, as well as resources to help educators develop their own accessible books with easy-to-learn tools such as iBooks Author. These include Creating Accessible iBooks Textbooks with iBooks Author from the National Center for Accessible Media and Inclusive Design for iBooks Author by my friend and fellow Apple Distinguished Educator Greg Alchin. For a great example of an engaging and accessible book, one need not look any further than Reach for the Stars, a multi-touch book from SAS that makes astronomy come alive not only for blind students but for anyone who wants to learn about our universe using all of their senses.
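To illustrate how small the fix for the unlabeled-button problem mentioned above can be, here is a minimal sketch using Apple's UIAccessibility properties. The button and asset name are hypothetical, not taken from any app discussed here:

```swift
import UIKit

// Minimal sketch of how a developer labels a control so VoiceOver
// announces more than just "button".
func makeRecordButton() -> UIButton {
    let recordButton = UIButton(type: .custom)
    recordButton.setImage(UIImage(named: "mic-icon"), for: .normal) // hypothetical asset name

    // Without a label, VoiceOver announces this control as just "button".
    recordButton.accessibilityLabel = "Record"
    // The hint, read after a short pause, explains what the control does.
    recordButton.accessibilityHint = "Starts a new audio recording"
    return recordButton
}
```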

As shown by the following diagram, when the three components are present (robust accessibility features, accessible apps, and accessible content) we get a synergy that results in an even greater impact than each tool or feature can have on its own: this is the sweet spot for accessibility in education.

Three overlapping circles labeled Accessibility Features, Apps and Accessible Content, with the spot where they converge labeled as the Sweet Spot.

To ensure accessibility in education, we all must work together to realize the advantages of an accessibility ecosystem: companies such as Apple and others who are building accessibility into their products, app developers and content authors. As AssistiveWare’s David Niemeijer so nicely stated in his own GAAD post, when we take accessibility into account we really are designing for everyone, because we will all one day get old and require the ability to customize the text size and other features of our devices to account for our aging vision and hands.

Furthermore, to quote from a recent Apple commercial, “inclusion promotes innovation.” Thinking about accessibility from the start, in line with the principles of universal design, requires us to be even more creative as we seek to solve problems of access that may someday result in usability improvements for everyone.

A great example of that is the recently released Apple Watch. Since it has a small screen that makes it difficult to enter text, much of the interaction with the Apple Watch takes place through the Siri personal assistant. The voice recognition technology that makes Siri possible actually had its origins in the disability community, but now it can be used to account for the constraints of a smart watch and its small screen.

The Apple Watch is also a great example of an ecosystems approach to accessibility and its benefits. This device includes many of the same accessibility features that are available on the iPhone and the iPad, which are in turn the same features I can use on my Mac. What this means is that if I get a new Apple Watch, I will already know how to use these features, with a few modifications to account for the smaller screen. Similarly, a blind student who has been using his or her iPhone can easily transfer many VoiceOver gestures to the trackpad built into Mac laptops or the Magic Trackpad used with iMacs.

Why is an ecosystems approach like this so important? Ultimately, it is because I, as a person with a disability, need accessibility 24/7, 365 days a year, most likely for the rest of my life (unless a cure is found for my condition). My need for accessibility doesn’t stop when I get up from my desk at home and walk out the door. I need accessibility as I order a ride from a ride-sharing service on my smartphone (which has Zoom and VoiceOver built in), as I take and share the photos that bring so much joy to my life and capture the beauty I encounter in the places I am lucky to visit (through accessible apps such as Instagram), and as I continue to develop my skills and knowledge base by reading ebooks about my field that I download from the iBookstore and read with iBooks (accessible content). For someone like me, accessibility is needed across a number of settings and situations if I am to be independent and continue to make a valuable contribution to society. Only an ecosystems approach can provide the kind of comprehensive accessibility that I and many others who have disabilities need to live a fulfilling life.