Designed for (fill in the blank)

On the occasion of Global Accessibility Awareness Day (GAAD), Apple has created a series of videos highlighting the many ways its iOS devices empower individuals with disabilities to accomplish a variety of goals, from parenting to releasing a new album for a rock band. Each of the videos ends with the tagline “Designed for” followed by the name of the person starring in the video, closing with “Designed for Everyone.” In this brief post, I want to highlight some of the ways in which this is in fact true. Beyond the more specialized features highlighted in the videos (a speech generating app, the VoiceOver screen reader, Made for iPhone hearing aids and Switch Control), there are many other Apple accessibility features that can help everyone, not just people with disabilities:

  • Invert Colors: found under Accessibility > Display Accommodations, this feature was originally intended for people with low vision who need a higher contrast display. However, the higher contrast Invert Colors provides can be helpful in a variety of other situations. One that comes to mind is trying to read on a touch screen while outdoors in bright lighting. The increased contrast provided by Invert Colors can make the text stand out more from the washed out display in that kind of scenario.
  • Zoom: this is another feature that was originally designed for people with low vision, but it can also be a great tool for teaching. You can use Zoom not only to make the content easier to read for the person “in the last row” of any kind of large space, but also to highlight important information. I often will Zoom In (see what I did there, it’s the title of one of my books) on a specific app or control while delivering technology instruction live or in a video tutorial or webinar. Another use is for hide-and-reveal activities, where you first zoom into the prompt, give students some “thinking time” and then slide over to reveal the part of the screen with the answer.
  • Magnifier: need to read the microscopic serial number on a new device, or the expiration date on that medicine you bought years ago and are not sure is still safe to take? No problem, Magnifier (new in iOS 10) to the rescue. A triple-click of the Home button will bring up an interface familiar to anyone who has taken a photo on an iOS device. Using the full resolution of the camera, you can not only zoom into the desired text, but also apply a color filter and even freeze the image for a better look.
  • Closed Captions: although originally developed to support the Deaf and hard of hearing communities, closed captions are probably the best example of universal design on iOS. Closed captions can also help individuals who speak English as a second language, as well as those who are learning how to read (by providing the reinforcement of hearing as well as seeing the words for true multimodal learning). They can also help make the information accessible in any kind of loud environment (a busy lobby, airport, bar or restaurant) where consuming the content has to be done without the benefit of the audio. Finally, closed captions can help when the audio quality is low due to the age of the film, or when the speaker has a thick accent. On Apple TV, there is an option to automatically rewind the video a few seconds and temporarily turn on the closed captions for the audio you just missed. Just say “what did he/she say?” into the Apple TV remote.
  • Speak Screen: this feature, found under Accessibility > Speech, is meant to help people with vision or reading difficulties, but the convenience it provides can help in any situation where looking at the screen is not possible – one good example is while driving. You can open up a news article in your favorite app that supports Speak Screen while at a stop light, then perform the special gesture (a two-finger swipe from the top of the screen) to hear that story read aloud while you drive. At the next stop light, you can perform the gesture again and in this way catch up with all the news while on your way to work! On the Mac, you can even save the output from the text to speech feature as an audio file. One way you could use this audio is to record instructions for any activity that requires you to perform steps in sequence – your own coach in your pocket, if you will!
  • AssistiveTouch: you don’t need to have a motor difficulty to use AssistiveTouch. Just having your device locked into a protective case can pose a problem this feature can solve. With AssistiveTouch, you can bring up onscreen options for buttons that are difficult to reach due to the design of the case or stand. With a case I use for video capture (the iOgrapher), AssistiveTouch is actually required by design. To ensure light doesn’t leak into the lens, the designers of this great case covered up the sleep/wake button. The only way to lock the iPad screen after you are done filming is to select the “lock screen” option in AssistiveTouch. Finally, AssistiveTouch can be helpful on older phones with a failing Home button.

While all of these features are listed in the Accessibility area of Settings, they are really “designed for everyone.” Sometimes the problem is not your own physical or cognitive limitations, but constraints imposed by the environment or the situation in which the technology use takes place.

How about you? Are there any other ways you are using the accessibility features to make your life easier even if you don’t have a disability?

7 Apple Watch Apps for Diverse Learners

Over on my YouTube Channel, I have posted a few video tutorials focusing on the built-in accessibility features of Apple Watch.

I also discuss these accessibility features in more detail in my recently updated book on low vision supports for Apple users, Zoom In (available as a free download on the iBookstore). VoiceOver, Zoom and many of the accessibility features familiar to users of iOS devices are included in Apple Watch. These accessibility features ensure users with a variety of special needs can personalize and use their Apple wearables.

As with iOS devices, Apple Watch also supports apps that provide even more flexibility for users in how they can use the wearable. With the release of watchOS 2, these apps can now run natively on the device itself, promising faster loading times and better performance overall. More importantly, apps can now use many of the hardware features available on Apple Watch, such as the Taptic Engine, the Digital Crown, the various sensors (heart rate sensor and accelerometer) and the microphone. Basically, apps can do much more than they could with the initial release of Apple Watch, opening the door for developers to be even more creative in how they address the needs of users who have special needs. This post focuses on my favorite apps for such users.

With any of the apps mentioned here, you install the app from the App Store just as you would for iPhone apps. You then open the Watch app, go to My Watch, tap the name of the app and make sure the toggle for “Show App on Apple Watch” is set to on. For some apps, you will also have an option to include it in Glances, the Apple Watch feature that allows you to access information with a swipe up from the current watch face.

For your convenience, I have collected all of the resources mentioned in this post into an AppoLearning Collection. Have anything to add (apps, ideas for use in the classroom)? Let me know and I can add you as a collaborator to this collection (a new feature available on AppoLearning).

Children with Autism: A Visual Schedule

Visual schedules are supports that allow children and adults with autism and related disabilities to better handle transitions in their daily routines. These schedules use pictures to indicate to learners what they are to do and where they are to go next, helping to ease anxiety around transitions and building their ability to act independently. According to developer Enuma (formerly Locomotion Apps), Visual Schedule ($12.99) is the first picture-based scheduler for the Apple Watch.

On the Apple Watch, the app will display each activity as it is happening, with a progress indicator to let the learner know how much time is left before transitioning. Swiping up on any activity will display any associated tasks so that they can be checked off. Swiping to the left will then show which activities come next.

Building the visual schedules themselves takes place on the paired iPhone, where most of the work is done through a drag and drop interface. From the Today view, tapping Edit in the upper right corner will display icons for 14 preset activities on the right side of the screen.

Visual Schedule today view in edit mode, showing activities that can be added to daily schedule.

You can also tap Add (+) at the bottom of the list to create your own custom activity (which can use a photo from your Camera Roll as its icon). For each activity, you can double-tap its icon in the Edit view to specify the color of the label as well as any associated tasks.

To build the visual schedule itself, you drag the activity icons onto the Today view and use the provided handles to adjust each activity’s duration.

Proloquo2Go

Proloquo2Go ($249.99) is a robust symbol-based communication app that has been available for iOS devices for some time. The price listed is not for the Apple Watch app itself, but rather for the communication app that runs on iOS and which includes an Apple Watch component. Developer AssistiveWare originally created the Apple Watch app when the watch was announced, and at that time the app only allowed the wearable to be used as a switch for a paired iPhone running Proloquo2Go. With watchOS 2, Proloquo2Go now also provides basic communication capabilities in the Apple Watch app.

When you first launch Proloquo2Go on the Apple Watch you will be prompted to select the mode you want the app in: Switch or Communication. To change the mode after the initial selection, you will have to go into the Proloquo2Go options on the paired iPhone and change it there. You can see how the Switch feature works in a video I have posted on my YouTube channel.

The new Communication option works by providing a basic set of phrases to start, plus a Builder, accessed through a Force Touch, for building new phrases. The Builder works in much the same way you customize the various Complications on the Apple Watch faces. The window is divided into three cells. You use the Digital Crown to select a sentence starter from the first cell, then repeat the process to select a noun or adjective from one of the available categories to complete the phrase (again, using the Digital Crown to navigate the choices). When the sentence is selected, it is displayed upside down so that the Apple Watch can be shown to the other person. I found this to work best when I turned off the option for “Wake Screen on Wrist Raise” in the Apple Watch settings. Otherwise, the screen would go to sleep as soon as I turned my wrist to display the message. Hopefully in the future the Apple Watch app can include text to speech, which is currently not possible due to what AssistiveWare describes as a limitation imposed by Apple.

Proloquo4Text

Proloquo4Text ($119.99) is a text-based communication app from AssistiveWare that, like Proloquo2Go, includes an Apple Watch component. However, unlike the Proloquo2Go Apple Watch app, this one does not include a phrase builder. You can choose to store a number of phrases in a special Apple Watch folder in the iOS app, and these phrases are then available for selection on the Apple Watch. As with the phrase builder in the Proloquo2Go app, the phrases are displayed upside down when selected.

Wunderlist

For learners who have executive functioning challenges that make it difficult to stay organized, a good to do list app with reminders can be a helpful support. Surprisingly, there is no Reminders app for Apple Watch, though you can use Siri to create a new reminder that shows up in the Reminders app on a paired iPhone.

Wunderlist (free) is an option if you would like to both create and view reminders on the Apple Watch. It is a free service for accessing to dos from just about any device (a full list of supported platforms is available on the Wunderlist site). On Apple Watch, the Wunderlist app provides a simple Home view with options for viewing to dos, including Inbox, Today and Starred. The Glances feature is also supported, so you can access your to dos for the current day with a quick swipe up from the watch face.

To create a new to do item, you use the Force Touch feature of Apple Watch. You press and hold firmly to reveal the Add (+) button, then use your voice to dictate the text for the new to do and tap Done when you’re finished.

On the companion iPhone app you can then add details such as the due date, set up a reminder (which will use the Taptic Engine on the Apple Watch to get your attention with a gentle tap) and organize the to dos into lists that can be accessed by scrolling with the Digital Crown on the Apple Watch. The idea is that the Apple Watch app is an interface for creating quick to dos and checking them off as they are completed, while the iPhone app provides more options for managing said to dos.

Evernote

Evernote (free) is one of my favorite apps for collecting random bits of information before they can get lost. It has become my replacement for all the Post-it notes I used to keep around my desk. With the Apple Watch app, you can create quick notes, use your voice to search through all the notes you have on your account, and see a list of recently viewed or updated notes. Like Wunderlist, Evernote supports reminders for notes that are time sensitive. However, with Evernote you can indicate the time for the reminder right on the Apple Watch itself as you create the note (though the options are limited to “this evening,” “tomorrow,” “next week” and “next month”). I find Evernote to be a nice complement to Wunderlist and I use both: Wunderlist for to dos and Evernote for quick notes I will need to refer to at a later time but don’t necessarily have to act on right away. Together the two apps are great supports for staying organized and minimizing the risk of losing important information.

Just Press Record

Just Press Record ($4.99) is a new audio recording app made possible by the access developers now have to the microphone with watchOS 2. Just Press Record can record audio directly from the Apple Watch’s microphone and play it back with the built-in speaker. The interface couldn’t be simpler: a nice big microphone button you press to start your recording. A nice touch is that you can see the waveform as you record, and when you are finished you can preview the recording before you choose to save the file to your iCloud account. You can even record when your iPhone is not within range of the Apple Watch (the recording will be synced the next time you connect the two devices). This app is useful as another option for students to capture their thoughts and ideas using just speech. It could even be used for students to reflect on their progress at regular intervals (at the end of each day or week). Recordings can be shared with the teacher from the iCloud account the app uses to store the recordings.
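For developers curious about the plumbing, watchOS exposes microphone recording through a system-provided recorder sheet in WatchKit. The following is a minimal sketch of that call in modern Swift syntax; the controller class and file name are illustrative, and this is the kind of API an app like Just Press Record builds on, not its actual code:

```swift
import WatchKit

class RecorderInterfaceController: WKInterfaceController {
    func startRecording() {
        // Record into the app's Documents directory (file name is illustrative).
        let outputURL = FileManager.default
            .urls(for: .documentDirectory, in: .userDomainMask)[0]
            .appendingPathComponent("memo.wav")

        // Presents the system recording UI; the completion handler reports
        // whether the user chose to save the recording.
        presentAudioRecorderController(withOutputURL: outputURL,
                                       preset: .wideBandSpeech,
                                       options: nil) { didSave, error in
            if didSave {
                print("Saved recording to \(outputURL.lastPathComponent)")
            } else if let error = error {
                print("Recording failed: \(error.localizedDescription)")
            }
        }
    }
}
```

The `.wideBandSpeech` preset trades audio fidelity for small file sizes, a sensible default for voice memos that sync between watch and phone.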

iTranslate

iTranslate (free, in-app purchase) is a nice app to have if you have English Language Learners in your class and your command of their native language is not as strong as you would like it to be. When the app launches, you can use Force Touch to change the language (more than 90 are supported). Once you have set your language, tap the middle of the screen and speak the phrase you want translated. You can then play the translation back using the Apple Watch speaker. This is not the fastest app (especially at launch), but hopefully its performance will continue to improve over time.

The number of apps for Apple Watch will continue to grow as developers become more comfortable with the device. What is exciting to me is the ability for developers to tap into the hardware features of the device with watchOS 2. I look forward to seeing how the developer community will take advantage of the microphone, and I hope that text to speech is soon made available to third-party apps as well. That would make many of these apps even more useful. What do you think?

iOS 6 Accessibility Features Overview

At today’s Worldwide Developers Conference (WWDC), Apple announced iOS 6 with a number of accessibility enhancements. I am not a developer (yet!), so I don’t have a copy of the OS to check out; this post is based primarily on what I read on the Apple website and on social media. A few of these features (word highlighting for Speak Selection, dictionary enhancements, custom alerts) were tucked away in a single slide Scott Forstall showed, with little additional information on the Apple website. So far, these are the big features announced today:

  • Guided Access: for children with autism, this feature will make it easier to stay on task. Guided Access enables a single app mode where the home button can be disabled, so an app is not closed by mistake. In addition, this feature will make it possible to disable touch in certain areas of an app’s interface (navigation, settings button, etc.). This feature could be used to remove some distractions, and to simplify the interface and make an app easier to learn and use for people with cognitive disabilities. Disabling an area of the interface is pretty easy: draw around it with a finger and it will figure out which controls you mean. I loved how Scott Forstall pointed out the other applications of this technology for museums and other education settings (testing), a great example of how inclusive design is for more than just people with disabilities.
  • VoiceOver integrated with AssistiveTouch: many people have multiple disabilities, and having this integration between two already excellent accessibility features will make it easier for these individuals to work with their devices by providing an option that addresses multiple needs at once. I work with a wounded veteran who is missing most of one hand, has limited use of the other, and is completely blind. I can’t wait to try out these features together with him.
  • VoiceOver integrated with Zoom: people with low vision have had to choose between Zoom and VoiceOver. With iOS 6, we won’t have to make that choice. We will have two features to help us make the most of the vision we have: Zoom to magnify and VoiceOver to hear content read aloud and rest our vision.
  • VoiceOver integrated with Maps: the VoiceOver integration with Maps should provide another tool for even greater independence for people who are blind, by making it easier for us to navigate our environment.
  • Siri’s ability to launch apps: this feature makes Siri even more useful for VoiceOver users, who now have two ways to open an app, using touch or with their voice.
  • Custom vibration patterns for alerts: brings the same feature that has been available on the iPhone for phone calls to other alerts. Great for keeping people with hearing disabilities informed of what’s happening on their devices (Twitter and Facebook notifications, etc.).
  • FaceTime over 3G: this will make video chat even more available to people with hearing disabilities.
  • New Made for iPhone hearing aids: Apple will work with hearing aid manufacturers to introduce new hearing aids with high-quality audio and long battery life.
  • Dictionary improvements: for those of us who work with English language learners, iOS 6 will support Spanish, French and German dictionaries. There will also be an option to create a personal dictionary in iCloud to store your own vocabulary words.
  • Word highlights in Speak Selection: the ability to highlight words as they are spoken aloud by text to speech benefits many students with learning disabilities. With iOS 6, Speak Selection (introduced in iOS 5) has the same capabilities as many third-party apps.
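For developers curious how apps implement this kind of synchronized highlighting, here is a minimal sketch using the delegate callback on AVFoundation’s speech synthesizer (an API Apple shipped in a later iOS release; the class name and highlight color are illustrative, not Apple’s own Speak Selection implementation):

```swift
import AVFoundation
import UIKit

// A minimal sketch of word-by-word highlighting during text to speech.
// The synthesizer calls the delegate just before speaking each word,
// handing it the character range of that word in the utterance.
final class SpeakingHighlighter: NSObject, AVSpeechSynthesizerDelegate {
    private let synthesizer = AVSpeechSynthesizer()
    private let textView: UITextView

    init(textView: UITextView) {
        self.textView = textView
        super.init()
        synthesizer.delegate = self
    }

    func speak(_ text: String) {
        textView.text = text
        synthesizer.speak(AVSpeechUtterance(string: text))
    }

    // Highlight the word that is about to be spoken.
    func speechSynthesizer(_ synthesizer: AVSpeechSynthesizer,
                           willSpeakRangeOfSpeechString characterRange: NSRange,
                           utterance: AVSpeechUtterance) {
        let highlighted = NSMutableAttributedString(string: utterance.speechString)
        highlighted.addAttribute(.backgroundColor,
                                 value: UIColor.yellow,
                                 range: characterRange)
        textView.attributedText = highlighted
    }
}
```

The same callback can drive karaoke-style highlighting in a reading app or synchronized captions, which is why so many third-party literacy apps adopted it.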

These are the big features that were announced, but there were some small touches that are just as important. One of these is the deep integration of Facebook into iOS. Facebook is one of those apps I love and hate at the same time. I love the amount of social integration it provides for me and other people with disabilities, but I hate how often the interface changes and how difficult it is to figure it out with VoiceOver each time an update takes place. My hope is that Apple’s excellent support for accessibility in built-in apps will extend to the new Facebook integration, providing a more accessible alternative to the Facebook app which will continue to support our social inclusion into mainstream society. You can even use Siri to post a Facebook update.

Aside from the new features I mentioned above, I believe the most important accessibility feature shown today is not a built-in feature or an app, but the entire app ecosystem. It is that app ecosystem that has resulted in apps such as AriadneGPS and Toca Boca, both featured in today’s keynote. The built-in features, while great, can only go so far in meeting the diverse needs of people with disabilities, so apps are essential to ensure that accessibility is implemented in a way that is flexible and customized as much as possible to each person. My hope is that Apple’s focus on accessibility apps today will encourage even more developers to focus on this market.

Another great accessibility feature that often gets ignored is the ease with which iOS can be updated to take advantage of new features such as Guided Access and the new VoiceOver integration. As Scott Forstall showed on a chart during the keynote, only about 7% of Android users have upgraded to version 4.0, compared to 80% of iOS users on iOS 5. What that means is that almost every iOS user out there can take advantage of AssistiveTouch and Speak Selection, but only a very small group of Android users are taking advantage of the accessibility features in the latest version of Android.

Big props to Apple for all the work they have done to include accessibility in their products, but more importantly for continuing to show people with disabilities in a positive light. I loved seeing a blind person in the last keynote video for Siri. At this keynote, Apple showed another blind person “taking on an adventure” by navigating the woods near his house independently. As a person with a visual disability myself, I found that inspiring. I salute the team at Apple for continuing to make people with disabilities more visible to the mainstream tech world, and for continuing to support innovation through inclusive design (both internally and through its developer community).

iPhoto App and Accessibility

This weekend I finally had a chance to try out the new iPhoto app Apple released along with the iPad 3 (or, as they are calling it, “the new iPad”). As an aspiring photographer, I was impressed with the many options for organizing, editing, and sharing photos Apple has packed into this app, which costs only $4.99 in the App Store. There have been many reviews of the new app posted online already, so I will not add another one here. However, I do have a unique perspective on the new app that I would like to share. Not only do I like to take photos (calling myself a photographer might be a stretch, but it’s a hobby I enjoy and continue to try to get better at every day), but I also have a visual disability, so I am part of a small community of blind photographers.

When I opened the iPhoto app on my iPhone, the first thing I did was turn on the built-in VoiceOver screen reader to hear how it would do with the new photo editing app. Frankly, I was not surprised that the new iPhoto app would be as accessible with VoiceOver as it is. I have come to expect accessible products from Apple over the last few years, and I’m proud to be associated with it as an Apple Distinguished Educator. However, as I dug deeper into the iPhoto app with VoiceOver, the level of attention to detail in providing accessibility was still pretty impressive. For example, the brushes used to retouch photos (repair, lighten, darken, etc.) are all accessible through VoiceOver gestures, as are the exposure and color correction controls and the various effects. When I selected the crop tool, VoiceOver told me to pinch to resize the photo, and as I did so it told me how much I was zooming in as well as how far the image was offset (“image scaled to 15x, image offset by 15% x and 48% y”).
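That kind of running commentary is something any app can provide by posting VoiceOver announcements as a gesture progresses. A minimal sketch of the idea, using the modern UIAccessibility API (which postdates this post; the wording is illustrative, not Apple’s actual strings):

```swift
import UIKit

// A minimal sketch of the live feedback iPhoto's crop tool gives
// VoiceOver users: post an announcement each time the gesture
// meaningfully changes the scale or offset of the image.
func announceCropChange(scale: CGFloat, offsetX: Int, offsetY: Int) {
    let message = "Image scaled to \(Int(scale))x, offset by \(offsetX)% x and \(offsetY)% y"
    UIAccessibility.post(notification: .announcement, argument: message)
}
```

In practice an app would throttle these announcements (for example, only at the end of a pinch) so VoiceOver does not talk over itself during a continuous gesture.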

On the iPad, there is a dedicated help button that opens up a series of overlays indicating what each button does. Not only was every part of the overlay accessible, but so was the entire help built into the iPad version of the app. The attention to detail is all the more impressive to me because there are so few blind photographers who would take advantage of an app such as iPhoto. What it does show is the level of commitment Apple has to accessibility: it will go to great lengths to add accessibility even when only a few people will benefit from it.

In a recent blog post, accessibility advocate Joe Clark called out a number of hot new apps (Readability, Clear, Path, and Flipboard) for what he called irresponsible web development that results in accessibility barriers. Well, to me this new iPhoto app shows that you can design an app that is not only visually appealing, feature-packed and easy to use and learn, but also accessible to people with visual disabilities. I hope more developers start to realize that accessibility does not have to compete with good design, but that both complement each other.

When I first loaded the iPhoto app on my iPhone (that was the first device I installed the app on) I was too impatient to go on the Web and read about the app before I started to work with it. That’s just the kind of user I am, I like to get right in and try things out. Well, on the iPhone app the Help button from the iPad version of the app is missing. Most of the icons make sense, but in some cases I was unsure, so what I did was turn on VoiceOver and move my finger around the screen to have it announce what each button was for (or to at least give me a better idea). In that case, compatibility with VoiceOver helped me learn the app much faster without having to consult the help, and that got me to thinking. As these devices (phones, tablets, and whatever comes next) continue to get smaller and the interfaces start to use more visuals (tiles, buttons, etc.) and less text, the ability to hear the help may become an essential aspect of learning how to use the interface. In this way, features like VoiceOver would actually enhance the usability of a particular app for everyone – what universal design is all about.
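The labeling that made this possible is something developers opt into with a couple of properties. A minimal sketch, in modern UIKit syntax, of how an icon-only button is given a spoken name and hint (the button and image names here are illustrative, not iPhoto’s actual code):

```swift
import UIKit

// A minimal sketch of labeling an icon-only control so VoiceOver can
// announce it. Without a label, VoiceOver can only say "button".
let cropButton = UIButton(type: .system)
cropButton.setImage(UIImage(named: "crop-icon"), for: .normal)

// The label is the short spoken name; the hint describes the result.
cropButton.accessibilityLabel = "Crop"
cropButton.accessibilityHint = "Trims the photo to the selected area"
```

This is exactly what lets a sighted but impatient user turn on VoiceOver and "read" an unfamiliar toolbar, the universal-design payoff described above.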


Overview of new accessibility features in iOS 5

With iOS 5, Apple has introduced a number of features to make their mobile devices even more accessible to people with disabilities:

  • VoiceOver enhancements: iOS 5 includes an updated voice for VoiceOver, the built-in screen reader for people who have visual disabilities. I have found the new voice to be a great improvement over the old one, especially when reading long passages of text in apps such as iBooks. Another improvement is that the triple-click home option is set to toggle VoiceOver by default. Along with the PC-free setup introduced with iOS 5, this small change has made it possible for someone with a visual disability to independently configure his or her iOS device out of the box, without any help from a sighted person. The Mac-cessibility website has an excellent overview of the many new changes in VoiceOver that I highly recommend reading.
  • Camera app compatibility with VoiceOver: this is a neat feature that will make photography more accessible to people with low vision and those who are blind. With VoiceOver on, if you launch the Camera app it will announce how many faces are in the frame. In my testing this worked pretty well, and I’ve used it successfully on the iPad and the iPod touch. It should work even better on the iPhone, which has a better sensor and optics. Combined with the ability to turn on the camera app from the lock screen on some devices (iPhone and iPod touch) by double-tapping the home button and the fact that you can use the volume up button as a shutter release, Apple has done a lot to make photography more accessible to people with visual disabilities.
  • Speak Selection (text to speech): this is one of my favorite features introduced with iOS 5. It provides another modality for students with learning disabilities who can benefit from hearing text read aloud to them. To use it, go into Settings, General, Accessibility, tap Speak Selection and choose On. Once you’ve enabled this feature, when you select text a popup will show the option to Speak the text using the VoiceOver voice. Note that you can control the speaking rate for Speak Selection independently from VoiceOver.
  • Balance controls for audio: in addition to mono audio, which combines both channels of stereo audio into a single mono channel, there is now an option for controlling the left/right balance of stereo sound. On the iPhone, there is now also a special Hearing Aid mode that is supposed to make the device more compatible with hearing aids.
  • Handling of incoming calls: you can choose to automatically route incoming calls to the speaker phone feature of the phone, or to a headset.
  • New alert types: on the iPhone, you can use one of five unique vibration patterns to identify who is calling if you have a hearing disability, or you can create your own pattern by tapping it on the screen. These custom vibration patterns can be assigned in the Contacts app by opening a contact’s information, choosing Edit, Vibration and then Create New Vibration. There is also an option to have the LED flash go off when you get a notification, a new message, and so on.
  • AssistiveTouch: this was one of the most anticipated accessibility features in iOS 5. AssistiveTouch was designed to make iOS devices easier to use for people with motor difficulties. For example, someone who is not able to tap the Home button to exit an app can now bring up an overlay menu with icons for many of the hardware functions of their device, including the Home button. AssistiveTouch also includes options allowing for single-finger use of many of the multi-touch gestures (including the new four-finger gestures available only on the iPad and the pinch gesture used for zooming). To use AssistiveTouch, choose Settings, General, Accessibility and turn on AssistiveTouch. You will know AssistiveTouch is enabled when you see a floating circular icon on the screen. Tapping this icon will open the overlay menu with the AssistiveTouch options. Note that you can move the AssistiveTouch icon to another area of the screen if it gets in the way. Please note that AssistiveTouch is not compatible with VoiceOver. I really wish the two features could work in tandem, as this would be helpful to users with multiple disabilities.
  • Custom gestures: AssistiveTouch includes an option to create your own gestures. Update: I was able to create a few useful gestures after watching this video from Cult of Mac. I created one for scrolling up on a page and one for scrolling down. Now when I’m reading a long web page, instead of having to swipe up or down to scroll, I can bring up the AssistiveTouch overlay menu, select the new gesture from the Favorites group and tap once on the screen to scroll.
  • Typing shortcuts: under Settings, General, Keyboard you can create shortcuts for common phrases. For example, you could create a shortcut that would enable you to enter an email signature by simply typing the letters “sig” and pressing the space bar. This feature should provide a big productivity boost to anyone who has difficulty entering text on their mobile device.
  • Siri and dictation (iPhone 4S only): the new personal assistant uses voice recognition and artificial intelligence to respond to a range of user queries that can be made using everyday language rather than preset commands. The Apple website has a video that demos some of the capabilities of Siri. One of the amazing things about Siri is that it works without any training from the user. Along with Siri, the iPhone 4S also includes an option to dictate text by tapping a microphone button on the keyboard. The ability to use your voice to control the device can be helpful to people with many different types of disabilities, including those that make it difficult to input text. One of the things I have found especially frustrating when using VoiceOver on iOS devices is inputting text, so I hope this new dictation feature makes that easier. I will have a chance to test it out more thoroughly once I get my own iPhone 4S (currently out of stock in my area). Update: I finally got my hands on an iPhone 4S and tried using the dictation feature with VoiceOver. It is working really well for me. I find the microphone button on the onscreen keyboard by moving my finger over it, double-tap to start dictation (as indicated by a tone) and then double-tap with two fingers to stop it. Even better, after I’m done dictating, if I move the phone away from my mouth it automatically stops listening! I love this feature.
  • Dictionary: while it is not listed as an accessibility feature, having a system dictionary is a new feature that is great for providing additional language supports to students with learning disabilities. To use this feature, select a word and a popup will show the Define option, which lets you look it up using the same dictionary that was previously available only in iBooks.
  • iMessage: a new add-on for the Messages app makes it possible to send free messages to any owner of an iOS device. Many people with hearing disabilities rely on text messaging as a convenient means of communication. iMessage will be especially helpful to those who are on a limited text messaging plan.
  • Reminders app: the new Reminders app has a simple interface that makes it a nice option for people who need help keeping track of assignments and other tasks. On the iPhone 4 or iPhone 4S, tasks can be tied to a location using the phone’s GPS capabilities. One use of this feature could be to set up a reminder for a person to take their medication when they arrive at a specific location.
  • AirPlay mirroring (iPad 2, requires an Apple TV): along with iOS 5, a recent firmware update for the Apple TV enables mirroring to a projector or TV using AirPlay. I can see this option being helpful in a class where there are students in wheelchairs who have difficulty moving around the room. Using AirPlay mirroring, the teacher could bring the iPad 2 to the student while the rest of the class could still see what is displayed on the projector or TV.
The new accessibility features make iOS 5 a must-have update for anyone who has a disability, as well as for those who work with individuals with disabilities. For schools and other educational institutions, the accessibility features of iOS make Apple mobile devices an ideal choice for implementing mobile learning while complying with legal requirements such as Section 504, Section 508 and the Americans with Disabilities Act.
Disclosure: I am an Apple Distinguished Educator.

Favorite Free Apps on the new Mac App Store

When the new Apple Mac App Store launched on January 6th, I was at first really disappointed with the choice of free software available. However, there was a lot to like about the App Store itself. One thing I really like about the Mac App Store is that it simplifies the software update process by making it extremely easy to update all of your purchased/downloaded software with one click (much the same way you update apps on an iPad or iPhone). I also like that it is tied to your iTunes account, so you can install the same software across several machines and keep them in sync without having to spend endless hours downloading the same software on each machine.

Now, I have not had a chance to do an extensive review of the accessibility of the app (it is not part of iTunes but its own app, accessed through the Apple menu or the Dock), but so far it appears to be good. The secret appears to be using the rotor to quickly move between the different sections. In any case, I would think that a single app that supports VoiceOver, even if not perfectly, would be a much better option for someone with a visual impairment than having to visit each individual website to purchase/download individual apps.

Of the paid apps, the standouts are RapidWeaver (a web design program I used to design my own website), Pixelmator (a graphic editor that should have most of the features needed by the average person who doesn’t want to mortgage their house for Photoshop) and the unbundled iLife ’11 and iWork ’09 apps (don’t use Numbers? Fine, don’t buy that one). Some of the software is available at a reduced price (Pixelmator is half price on the App Store). If you are a photographer, Aperture for only $80 (instead of $200) is a steal.

But this post is about the free apps, so here are the ones I have installed so far that I like:

  • Caffeine is a tiny program that runs in the menu bar and allows you to suspend your energy saver settings. It is perfect for when you’re doing a presentation or watching web video and don’t want to be interrupted by the screen saver, screen dimming and other energy saving features. Using the menu bar icon is much faster than opening the display preferences.
  • DropCopy allows you to copy files between any Apple devices, including your laptop or desktop and your iPad, iPhone or iPod touch (you will need to install a free companion app).
  • MindNode for Mac is a simple brainstorming/concept mapping app for those who are visual learners. The app doesn’t have all the bells and whistles of other programs such as Inspiration, but it presents a simple interface that is perfect for brainstorming ideas.
  • Alfred is now my favorite way to search my Mac and launch applications. It works much like Quicksilver. Press a key and a text box will open in the middle of the screen where you can type in your search term. I like that it is much simpler and appears faster than Quicksilver, which never really caught on with me.
  • TextWrangler is a pretty good text editor with features usually found in much more expensive editors (search and replace across multiple files, FTP and SFTP support, etc.).

So far the only program I’ve downloaded that I was not happy with has been Smart Recorder. I just didn’t find it that useful or easy to use. However, it is still on my list of Purchases, so if I change my mind and find a use for it, it will be there waiting for me to install it with just one click.

You will notice that my list has a heavy focus on utilities. Your list may be different depending on how you use your Mac.

ePub and Pages Tutorial

I am working on a screencast of how to create ePub documents for the iPad using Pages ’09.

Three things to keep in mind when creating ePub documents with Pages:

  • The ePub export feature is only available for word processing documents.
  • Your images should be added inline. A good way to ensure all images (as well as video/audio files) are inline is to always use the Insert menu instead of dragging in a file from the Media Browser or the Desktop.
  • Use the styles menu to add styles to headings. You can then use the TOC tab of the Document Inspector to choose which items you want listed in the table of contents automatically created by Pages ’09. The top item in the TOC will be used to divide your document into chapters.
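To give a sense of what those heading styles turn into, here is a rough sketch of the navigation file (toc.ncx) inside an EPUB 2 package; the file names, ids and labels are illustrative only, not necessarily what Pages emits:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<ncx xmlns="http://www.daisy.org/z3986/2005/ncx/" version="2005-1">
  <navMap>
    <!-- A top-level heading style becomes a chapter entry -->
    <navPoint id="ch1" playOrder="1">
      <navLabel><text>Chapter 1</text></navLabel>
      <content src="chapter-1.xhtml"/>
      <!-- Lower heading styles nest as sub-entries within the chapter -->
      <navPoint id="ch1-s1" playOrder="2">
        <navLabel><text>Section 1.1</text></navLabel>
        <content src="chapter-1.xhtml#s1"/>
      </navPoint>
    </navPoint>
  </navMap>
</ncx>
```

Reading apps like iBooks build their table of contents view from this file, which is why consistent heading styles in Pages matter so much.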

The final thing I want to emphasize is that this format is really intended for text. When VoiceOver is used to read the ePub document in iBooks, images will only be described for readers who are blind if descriptive text is added as a caption underneath each image (such as “Figure 1a. …”). This is essential for accessibility.

Update: You can get around the fact that Pages does not let you insert alt text for images by renaming your image file to match the desired alt text before you add it to the Pages document.