HazeOver as a low vision aid

HazeOver is a $4.99 Mac app marketed as a distraction aid. The idea is that it dims all other windows so you can focus on the content in the foreground window (a blog post like this one, a paper you are drafting for school, etc.). The developers have prepared a short demo video that shows how the app works.

 

While that may be a good way to use this utility, for me it has become a helpful low vision aid as well. I often have a difficult time finding the mouse cursor and popup windows if they are out of my field of view (currently about 7 or 8 degrees, depending on the day). I have been using Mousepose to help with the mouse cursor problem. Even with the mouse cursor set to the largest size Mac OS allows, I still have a difficult time locating it on the screen, especially when I have a dual monitor setup. I have found that the spotlight Mousepose puts around the mouse cursor when I press a special key (which I have set to F1) makes this task much easier.

HazeOver does pretty much the same thing, but for popup windows. When one of these windows pops up on the screen, the focus is assigned to it and all other windows are dimmed. In the HazeOver preferences, you can determine whether you want just one window highlighted or all front windows within the active app. I find the one window setting to be the most helpful with popups. You can adjust the level of dimming at any time using a slider that can be accessed by clicking the Menu Bar icon. For the best performance, HazeOver asks for access to Mac OS as an assistive device.

A free trial of HazeOver is available from the developer’s site if you want to try it out first before you buy it on the Mac App Store.

 

10 Apple TV Apps for the Classroom

The Apple TV is already popular in schools that have adopted the iPad as a learning tool. Its support for AirPlay makes it possible for teachers to show apps on their iPads to the entire class, and it allows learners to show their work to their peers right from their seat. The release of the fourth generation device with access to an App Store promises to expand the possibilities for Apple TV in the classroom.

App Store on Apple TV

While at launch the app selection is limited, if the iOS and Mac app stores are any indication, this situation will quickly change. For now, the primary challenge is finding apps. Discovery would be greatly enhanced by an option to browse the store by categories, including one devoted specifically to education (as I was writing this, a new Categories section showed up on my Apple TV, so it looks like this issue should be addressed soon). For now, we have to wade through the many fireplace apps. Another issue is that it is not possible to easily share apps since there is no version of Safari for the Apple TV. Thus, I can only provide a list with some brief descriptions and my experience with each app, but no links to help you quickly add the apps to your device.

A quick tip: make sure to look under Purchased when you go into the App Store on your Apple TV. It turns out that some apps are universal. This means the developer can create one version that is available on both iOS and Apple TV (the device is running a version of iOS after all). I was able to find and quickly install a couple of apps this way.

Another quick tip: Make sure you have your iOS device or computer nearby as you navigate the App Store and install apps. As an alternative to entering your login information on the Apple TV, some apps will ask you to go to a special web page on another device, where you enter a code displayed on the Apple TV.

Remember that with many of the video apps, you can use Siri to turn on the captions. Just say “turn on the captions.” You can also just flick down with one finger to display an overlay with additional information about the current program and options for captions and subtitles (as well as AirPlay).

Overlay with access to captions and subtitles for current video program.

In Settings, the captions can be customized to make them easier for everyone in the classroom to follow. Captions are usually available for content from TED and PBS, whereas it varies on YouTube (most of the content there still relies on automatic captions, which are not always accurate, unfortunately).

Without further delay, here is a list of the most useful apps I have found so far:

  1. YouTube: My first source for learning about a new topic. If you used this app on the old Apple TV, you will not see much of a difference with this version. Once you complete the login process, which requires you to enter a code on another device as per quick tip 2 above, you will see your subscriptions, watch history and the like.
  2. TED: Again, there is not much difference between this offering and the other TED apps. As with YouTube, you will need to log in on a different device and enter a code in order to save talks that you want to watch later.
  3. PBS Video: With access to a deep library of PBS video content from shows such as Nature, Frontline, NOVA and more, the PBS Video app can be helpful in a variety of subjects, from social studies and science to language arts.
  4. PBS Kids: The Kids app features popular shows such as Sesame Street, Curious George, Arthur and more.
  5. Coursera: This app provides access to the videos that make up most of the content for courses offered on the Coursera platform. You are not able to display the PDF documents and other resources. Even with that limitation, I have been able to find a couple of good courses that look interesting: Design Thinking from UVA and Ignite Your Everyday Creativity from SUNY. For most of these courses, you can watch for free or choose to get credit by paying a fee. I did not go through the process of enrolling for credit with the courses I am exploring, so I can’t speak to how that works.
  6. Storehouse: This is one of my favorite storytelling apps on iOS, but the Apple TV version is quite limited in my opinion. It only allows you to show the photos you have added to a story, not the quotes or captions. Even so, students can use it to create short five-frame stories that use imagery to convey a message or tell a story in a different way.
  7. Montessori Spelling: As the name implies, this app allows young learners to practice their spelling. After being shown a photo that represents the word and hearing it spoken aloud, the learner sees blank lines representing the number of letters needed. Using the Apple TV remote, the learner then selects the letters in the correct order to get auditory feedback (the word is repeated and stars are shown on the screen). The Settings include options for selecting the level of difficulty, the letter placement (right space or next space) and the keyboard (capital letters, script or cursive). English, Spanish, French and Italian are the supported languages. Not a complicated or highly interactive app, but then again few of the learning apps I have seen so far on the new Apple TV are.
  8. Dictionary: This is one of those things that should just be built into Siri, but it’s not, so there’s an app for that. The one thing I like is the display of photos from Flickr with each definition. That can definitely help learners who prefer or need visuals for understanding. There is a word of the day feature, but each time I tried it I got booted to the Home Screen. Unfortunately, the text is very small even on a reasonably sized TV, and there are no options to increase it within the app.
  9. MyTalk Tools: For those of you who work with students with communication difficulties (or parents of kids who have such difficulties) there is at least one Augmentative and Alternative Communication (AAC) app on the Apple TV app store. I am still not sure how helpful this kind of app will be on this platform but hey, it’s available as an option. Maybe it will allow for quick communication while a child is watching a program or interacting with an app on the Apple TV (by double-tapping the Home button to switch back and forth between the AAC app and the other content or app). MyTalk is a $99 app for iOS (though a lite version is available if you want to try it as I did). It is on the iOS device that you will configure the communication boards available on the Apple TV after syncing through a MyTalk account. For each cell in the communication board, you can record your own voice and change the photo to either one you have saved to the Camera Roll or one you take with the camera of your iOS device. It looks like the free version will only allow you to replace existing cells, not create new ones.
  10. White Noise: I didn’t really go out looking for this app. It was shown to me when I looked in my Purchased category in the App Store (because I already own it on iOS and it is a universal app). I’m thinking this would be a good app to help learners simmer down and focus if they get too rowdy. It plays soothing sounds, from ocean waves to forest sounds to raindrops and more. Since the app will continue playing in the background even after you exit it, you can combine it with the amazing screen savers Apple has provided for the ultimate chill-out experience.

You will notice I have not included any math apps. Overall, I was not too pleased with the three I tried (each was only $0.99): Math Champions, Math for Kids, and Math Drills. Each has some drills limited to basic operations. Beyond selecting the correct answer from a list and getting the typical auditory feedback (“Correct!”), there was not much in the way of interactivity or an immersive game experience. This is an area where I hope a few developers will look at creating something that is unique to the platform and incorporates more engaging gamification elements (a story, a mission, etc.). I did find some calculator and unit conversion apps, but again I feel this is something that should be easy for Siri to perform rather than require a separate app (in fact, it can already do all this on iOS devices).

That’s it for my initial tour of the Apple TV App Store after just a couple of days of owning the device. Have you found some useful apps I have left off the list? Let me know in the comments or tweet them at me (@_luisfperez).

A Workflow for Independence – Logan’s Story

Recently I had the pleasure of meeting Logan Prickett, a second year student at Auburn University at Montgomery. Logan is an academically gifted STEM student and the inspiration behind The Logan Project at AUM, an initiative to develop software that will enable students who are blind or who have low vision to fully participate in all college-level math courses.

Luis and Logan.

At age 13, Logan suffered an anaphylactic reaction to the contrast dye in an MRI. His heart stopped beating on its own which left him without oxygen for 45 minutes. Logan believes that “a prayer chain that reached around the world was active during those 45 minutes and I credit God and those prayers for the heartbeat that brought me back to life.”

His time without oxygen left Logan blind, a wheelchair user, with fine motor control difficulties, and unable to speak above a whisper due to damage to his vocal cords that occurred during life saving measures. Logan has the cognitive ability to do the work in his courses; he just needs a few technology supports in place to ensure his vision and motor challenges do not get in the way and prevent him from tapping his full potential. The goal of the Logan Project is thus to eliminate barriers for students with complex needs like Logan’s so that they can not only complete required math coursework but also pursue a career in a STEM field if they desire. This is a worthy goal given the underrepresentation of people with disabilities in STEM fields. You can learn more about it by typing The Logan Project into the search bar on the AUM website (aum.edu).

The Goal: Independent Communication

When I met with Logan and his team the expressed goal was to get Logan started on the journey to independent communication, beginning with the ability to send and receive short messages with his family and those close to him. Logan had just acquired an iPhone 6 Plus and we considered the use of Switch Control since Logan has enough motor control to press a switch. To accommodate his visual impairment, we decided that Logan would use Switch Control with manual scanning and the speech support turned on. This way he would be able to hear the items on the screen as he presses the switches to scan through them at a pace that works for him. The one problem with this setup is the possibility of fatigue from repeated switch presses. Siri seemed like a possibility for getting around this issue, but unfortunately Siri is not able to recognize Logan’s low whisper to allow him to quickly send a text message or initiate a FaceTime call. Surprisingly, FaceTime can pick up Logan’s whisper well so that it can be understood on the other end of the call. Although he can be heard with an actual phone call as well, the audio with a FaceTime call is much better. Thus, if we could find a way to activate FaceTime with a minimum of effort we would go a long way toward giving Logan an option for communication while he develops his Switch Control skills. That’s where the Workflow app comes in.

Workflow to the Rescue

I knew about the Workflow app because it made history as the first app to receive an Apple Design Award for its attention to accessibility. In fact, at the Design Awards, members of Apple’s engineering team who are blind were the ones who actually demoed the app to show how well it works with the VoiceOver screen reader built into Apple’s mobile devices. You can watch the demo on Apple’s WWDC 2015 site (the Workflow demo starts at the 35-minute mark and goes through the 42-minute mark).

As the name suggests, Workflow is a utility for creating workflows that allow the user to chain together a series of actions to complete a given task. For example, as I often do tutorials with screenshots from my Apple Watch, I have created a workflow that automatically takes the latest Apple Watch screenshot saved to my Camera Roll on the iPhone and shares it to my computer using Airdrop so that I can quickly add it to a blog post or a presentation. This kind of workflow can save a lot of time and effort for tasks that you perform several times over the course of a day.

Workflow already includes many actions for built-in iOS apps such as Contacts, FaceTime and Messages. These actions can be chained together to create a workflow, with the output from one action used as the input for the next one in the chain. Thus, a workflow can consist of selecting an entry in the Contacts app and feeding its information into the FaceTime app to start a new call with the selected contact. In much the same way, the entry from the Contacts app can be combined with a Text action to start Messages, pre-fill the message body and automatically address the message. For Logan this kind of workflow would reduce the amount of work he would have to perform and allow him to send quick messages to his team, such as “I’m ready for pick up” or “class is running late.” There is even the possibility of sharing his location so that other team members can get an idea of where Logan is at different points in the day.
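Conceptually, this chaining works like a simple pipeline: each action receives the previous action's output as its input. Here is a minimal sketch in Python of that idea (Workflow itself is a visual tool with no scripting language, so the function names and the fields they return are hypothetical stand-ins, not real Workflow or iOS APIs):

```python
# Conceptual sketch of Workflow-style action chaining.
# Each "action" is a function that takes the previous action's output;
# a workflow just runs the actions in order, passing results along.

def select_contact(_):
    # Stand-in for a "Select Contact" action; the returned fields are invented.
    return {"name": "Mom", "phone": "+1-555-0100"}

def compose_text(contact):
    # Pre-fill a quick message addressed to the selected contact.
    return {"to": contact["phone"], "body": "I'm ready for pick up"}

def send_message(message):
    # Stand-in for handing the pre-filled message off to the Messages app.
    return f"Message to {message['to']}: {message['body']}"

def run_workflow(actions):
    result = None
    for action in actions:
        result = action(result)  # output of one action feeds the next
    return result

print(run_workflow([select_contact, compose_text, send_message]))
```

In this model, Logan's "I'm ready for pick up" shortcut is just one fixed chain saved under a descriptive name, which is why a single switch activation can trigger the whole sequence.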

Once a workflow has been created it is possible to add it as a shortcut on the Home Screen, with its own descriptive name, icon and color. By organizing these shortcuts on the Home Screen it is possible to create a simple communication system for Logan, giving him the ability to use Switch Control to independently start FaceTime calls, send quick messages and more.

Going Forward

The ultimate goal is to develop Logan’s ability to communicate independently, and this will require building up his skills as a new switch user. With time and practice, I have no doubt after getting to know Logan that he will become a proficient user of Switch Control. In the meantime, Workflow is a good option for building his confidence and giving him some good reasons to use those skills: communicating with those who are important to him with a minimum of effort. When he is ready, he could then add an augmentative and alternative communication (AAC) app such as Proloquo4Text to his arsenal of communication tools, as well as keyboards such as Keeble and Phraseboard that make it easier for switch users to enter text. Logan has demonstrated that he has the ability to do well in higher education; now we just have to figure out how to eliminate a few barriers that are standing in his way and preventing him from letting his ability shine.

7 Apple Watch Apps for Diverse Learners

Over on my YouTube Channel, I have posted a few video tutorials focusing on the built-in accessibility features of Apple Watch.

I also discuss these accessibility features in more detail in my recently updated book on low vision supports for Apple users, Zoom In (available as a free download on the iBookstore). VoiceOver, Zoom and many of the accessibility features familiar to users of iOS devices are included in Apple Watch. These accessibility features ensure users with a variety of special needs can personalize and use their Apple wearables.

As with iOS devices, Apple Watch also supports apps that provide even more flexibility for users in how they can use the wearable. With the release of watchOS 2, these apps can now run natively on the device itself, promising faster loading times and better performance overall. More importantly, apps can now use many of the hardware features available on Apple Watch, such as the Taptic Engine, the Digital Crown, the various sensors (heart rate sensor and accelerometer) and the microphone. Basically, apps can do much more than they could with the initial release of Apple Watch, opening the door for developers to be even more creative in how they address the needs of users who have special needs. This post focuses on my favorite apps for such users.

With any of the apps mentioned here, you install the app from the App Store just as you would for iPhone apps. You then open the Watch app, go to My Watch, tap the name of the app and make sure the toggle for “Show App on Apple Watch” is set to on. For some apps, you will also have an option to include it in Glances, the Apple Watch feature that allows you to access information with a swipe up from the current watch face.

For your convenience, I have collected all of the resources mentioned in this post into an AppoLearning Collection. Have anything to add (apps, ideas for use in the classroom)? Let me know and I can add you as a collaborator to this collection (a new feature available on AppoLearning).

Children with Autism: A Visual Schedule

Visual schedules are supports that allow children and adults with autism and related disabilities to better handle transitions in their daily routines. These schedules use pictures to indicate to learners what they are to do and where they are to go next, helping to ease anxiety around transitions and building their ability to act independently. According to developer Enuma (formerly Locomotion Apps), Visual Schedule ($12.99) is the first picture-based scheduler for the Apple Watch.

Current activity shown on Apple Watch.

On the Apple Watch, the app will display each activity as it is happening, with a progress indicator to let the learner know how much time is left before transitioning. Swiping up on any activity will display any associated tasks so that they can be checked off. Swiping to the left will then show what activities come next.

Building the visual schedules themselves takes place on the paired iPhone, where most of the work is done through a drag and drop interface. From the Today view, tapping Edit in the upper right corner will display icons for 14 preset activities on the right side of the screen.

Visual Schedule today view in edit mode, showing activities that can be added to daily schedule.

You can also tap Add (+) at the bottom of the list to create your own custom activity (which can use a photo from your Camera Roll as its icon). For each activity, you can double-tap its icon in the Edit view to specify the color of the label as well as any associated tasks.

To build the visual schedule itself, you drag the activity icons onto the Today view and use the provided handles to adjust each activity’s duration.

Proloquo2Go

Proloquo2Go ($249.99) is a robust symbol-based communication app that has been available for iOS devices for some time. The price listed is not for the Apple Watch app itself, but rather for the communication app that runs on iOS and includes an Apple Watch component. Developer AssistiveWare originally created the Apple Watch app when the watch was announced, and at that time the app only allowed the wearable to be used as a switch for a paired iPhone running Proloquo2Go. With watchOS 2, Proloquo2Go now also provides basic communication capabilities on the Apple Watch.

When you first launch Proloquo2Go on the Apple Watch you will be prompted to select the mode you want the app in: Switch or Communication. To change the mode after the initial selection, you will have to go into the Proloquo2Go options on the paired iPhone and change it there. You can see how the Switch feature works in a video I have posted on my YouTube channel.

Proloquo2Go Phrase Builder.

The new Communication option works by providing a basic set of phrases to start, plus a Builder, accessed through a Force Touch, for building new phrases. The Builder works very much the same way you customize the various Complications on the Apple Watch faces. The window is divided into three cells. You use the Digital Crown to select a sentence starter from the first cell, then repeat the process to select a noun or adjective from one of the available categories to complete the phrase (again, using the Digital Crown to navigate the choices). When the sentence is selected, it is displayed upside down so that the Apple Watch can be shown to the other person. I found this worked best when I turned off the option for “Wake Screen on Wrist Raise” in the Apple Watch settings. Otherwise, the screen would go to sleep as soon as I turned my wrist to display the message. Hopefully in the future the Apple Watch app can include text to speech; according to AssistiveWare, this is currently a limitation imposed by Apple.

Proloquo4Text

Proloquo4Text ($119.99) is a text-based communication app from AssistiveWare that, like Proloquo2Go, includes an Apple Watch component. However, unlike the Proloquo2Go Apple Watch app, this one does not include a phrase builder. You can choose to store a number of phrases in a special Apple Watch folder in the iOS app, and these phrases are then available for selection on the Apple Watch. As with the phrase builder in the Proloquo2Go app, the phrases are displayed upside down when selected.

Wunderlist

For learners who have executive functioning challenges that make it difficult to stay organized, a good to do list app with reminders can be a helpful support. Surprisingly, there is no Reminders app for Apple Watch, though you can use Siri to create a new reminder that shows up in the Reminders app on a paired iPhone.

Wunderlist Home View.

Wunderlist (free) is an option if you would like to both create and view reminders on the Apple Watch. It is a free service for accessing to dos from just about any device (a full list of supported platforms is available on the Wunderlist site). On Apple Watch, the Wunderlist app provides a simple Home View with four options for viewing to dos: Inbox, Today, Starred and Surp to Me. The Glances feature is also supported, so you can access your to dos for the current day with a quick swipe up from the watch face.

To create a new to do item, you use the Force Touch feature of Apple Watch. You press and hold firmly to reveal the Add (+) button, then use your voice to dictate the text for the new to do and tap Done when you’re finished.

On the companion iPhone app you can then add details such as the due date, set up a reminder (which will use the Taptic Engine on the Apple Watch to get your attention with a gentle tap) and organize the to dos into lists that can be accessed by scrolling with the Digital Crown on the Apple Watch. The idea is that the Apple Watch app is an interface for creating quick to dos and checking them off as they are completed, while the iPhone app provides more options for managing said to dos.

Evernote

Evernote launch screen on Apple Watch.

Evernote (free) is one of my favorite apps for collecting random bits of information before they can get lost. It has become my replacement for all the Post-it notes I used to keep around my desk. With the Apple Watch app, you can create quick notes, use your voice to search through all the notes on your account, and see a list of recently viewed or updated notes. Like Wunderlist, Evernote supports reminders for notes that are time sensitive. However, with Evernote you can indicate the time for the reminder right on the Apple Watch itself as you create the note (though the options are limited to “this evening,” “tomorrow,” “next week” and “next month”). I find Evernote to be a nice complement to Wunderlist and I use both: Wunderlist for to dos and Evernote for quick notes I will need to refer to later but don’t necessarily have to act on right away. Together the two apps are great supports for staying organized and minimizing the risk of losing important information.

Just Press Record

Just Press Record opens with a big record button.

Just Press Record ($4.99) is a new audio recording app made possible by the access developers now have to the microphone with watchOS 2. Just Press Record can record audio directly from the Apple Watch’s microphone and play it back with the built-in speaker. The interface couldn’t be simpler: a nice big microphone button you press to start your recording. A nice touch is that you can see the waveform as you record, and when you are finished you can preview the recording before you choose to save the file to your iCloud account. You can even record when your iPhone is not within range of the Apple Watch (the recording will be synced the next time you connect the two devices). This app is useful as another option for students to capture their thoughts and ideas using just speech. It could even be used for students to reflect on their progress at regular intervals (at the end of each day or week). Recordings can be shared with the teacher from the iCloud account the app uses to store the recordings.

iTranslate

iTranslate app ready to translate to Spanish.

iTranslate (free, in-app purchase) is a nice app to have if you have English Language Learners in your class and your command of their native language is not as strong as you would like it to be. When the app launches, you can use Force Touch to change the language (more than 90 are supported). Once you have set your language, tap the middle of the screen and use your voice to speak the phrase you want translated. You can then play the translation back using the Apple Watch speaker. This is not the fastest app (especially at launch), but hopefully it will continue to improve in performance over time.

The number of apps for Apple Watch will continue to grow as developers become more comfortable with the device. What is exciting to me is the ability for developers to tap into the hardware features of the device with watchOS 2. I look forward to seeing how the developer community will take advantage of the microphone, and I hope that text to speech is soon made available to third-party apps as well. That would make many of these apps even more useful. What do you think?

Accessibility Options in Voice Dream Writer App

This week, Winston Chen and the Voice Dream team released a new app, Voice Dream Writer. I am highlighting it here not only because Voice Dream Reader is one of my favorite apps for students who need reading supports such as text to speech, word highlighting and customized text, but also because of the attention to accessibility from the Voice Dream team in this new app. Not only are the controls and the interface nicely labeled for VoiceOver users, but there are even a few features specially designed to make things easier for them.

Screenshot of Voice Dream Writer interface on iPad with VoiceOver controls highlighted.

When VoiceOver is turned on, the app recognizes this and adds three buttons for editing text to the interface (they appear in the toolbar located just above the onscreen keyboard, on the left side). These buttons are:

  • Cursor: allows the user to move the cursor by flicking up or down with one finger.
  • Cursor movement unit: changes how the cursor movement takes place by allowing the user to choose from characters, words or sentences.
  • Select text: selects text based on the cursor movement unit. For example, flicking up with sentences as the cursor movement unit will select the text one sentence at a time.

All of these controls are adjustable. A flick up or down with one finger will change the value (for the cursor movement unit) or navigate to/select the next item (for the cursor and select text buttons).

A three-finger swipe gesture is also supported for cursor movement and selection: a three-finger swipe up will move the cursor to the beginning of the document and a three-finger swipe down to the end, and three-finger swipes up or down will select the text from the cursor position to the beginning or end of the document.

Another nice feature of the app is the way it makes it easy to find misspelled words: open Tools in the upper right and choose Find Misspelled Words. You can then flick down with one finger to navigate the misspelled words in your document. When you get to a word you want to fix, you have two options: you can double-tap with one finger to edit it with the onscreen keyboard, or you can swipe from the right with three fingers to use the Word Finder with a phonetic search. The phonetic search will bring up a list of words that closely match the one that is misspelled in your document. You can then choose the correctly spelled word from the list and double-tap with one finger to make the correction.

I did a short video to demonstrate some of these options in the Voice Dream Writer app. I hope you find it helpful. For more information about the app, make sure to check out the Voice Dream website.

A SAMR and UDL Framework

As I was traveling to Macworld 2013, where I presented a session on iBooks Author, I had some time when I was trapped on a plane without Wi-Fi (the horror!). Rather than reading the magazine in front of me, I gave in to my urge to combine two frameworks I am really passionate about: the SAMR model developed by Dr. Ruben Puentedura and the UDL framework developed by CAST. Below is an image showing the framework I developed and some apps that address each level. This was just a quick brainstorm on a long plane ride, but I do appreciate your feedback.

Image: SAMR and UDL framework slides.

 

Update: Here is a text version that should be more accessible with a screen reader (with app and feature matching):

n: needs assessment and profile
Determine the current level of performance and desired outcomes.

A: access to content and tools
The technology eliminates barriers that prevent access to information.

  • Proloquo2Go
  • FaceTime
  • VoiceOver
  • AssistiveTouch
  • Closed Captioning Support
  • Dictation (built-in with iOS)
  • Dragon Dictation
B: building supports and scaffolds for learner variability
The technology includes scaffolds and supports that account for learner differences.
  • iBooks
  • AppWriter US
  • Speak It!
  • Typ-O HD
  • Evernote
  • Notability

L: leveraging multimedia
The technology provides multiple means of expression.
  • Book Creator
  • Creative Book Builder
  • StoryKit
  • SonicPics
  • StoryRobe
  • Pictello

E: expression and creativity
The technology unleashes creative potential and disrupts perceptions of disability.
  • Camera
  • iMovie
  • Garageband
  • iPhoto
  • Instagram

iOS 6 Accessibility Features Overview

At today’s Worldwide Developers Conference (WWDC), Apple announced iOS 6 with a number of accessibility enhancements. I am not a developer (yet!), so I don’t have a copy of the OS to check out; this post is primarily based on what I read on the Apple website and on social media. A few of these features (word highlighting for Speak Selection, dictionary enhancements, custom alerts) were tucked away in a single slide Scott Forstall showed, with little additional information on the Apple website. So far, these are the big features announced today:

  • Guided Access: for children with autism, this feature will make it easier to stay on task. Guided Access enables a single-app mode in which the home button can be disabled, so an app is not closed by mistake. In addition, this feature will make it possible to disable touch in certain areas of an app’s interface (navigation, a settings button, etc.). This could be used to remove distractions, and to simplify the interface and make an app easier to learn and use for people with cognitive disabilities. Disabling an area of the interface is pretty easy: draw around it with a finger and Guided Access will figure out which controls you mean. I loved how Scott Forstall pointed out other applications of this technology, such as museums and testing in education settings, a great example of how inclusive design benefits more than just people with disabilities.
  • VoiceOver integrated with AssistiveTouch: many people have multiple disabilities, and integration between these two already excellent accessibility features will make it easier for those individuals to work with their devices by providing an option that addresses multiple needs at once. I work with a wounded veteran who is missing most of one hand, has limited use of the other, and is completely blind. I can’t wait to try out these features together with him.
  • VoiceOver integrated with Zoom: people with low vision have had to choose between Zoom and VoiceOver. With iOS 6, we won’t have to make that choice. We will have two features to help us make the most of the vision we have: Zoom to magnify and VoiceOver to hear content read aloud and rest our vision.
  • VoiceOver integrated with Maps: the VoiceOver integration with Maps should provide another tool for even greater independence for people who are blind, by making it easier for us to navigate our environment.
  • Siri’s ability to launch apps: this feature makes Siri even more useful for VoiceOver users, who now have two ways to open an app, by touch or by voice.
  • Custom vibration patterns for alerts: brings the same feature that has been available on the iPhone for phone calls to other alerts. Great for keeping people with hearing disabilities informed of what’s happening on their devices (Twitter and Facebook notifications, etc.).
  • FaceTime over 3G: this will make video chat even more available to people with hearing disabilities.
  • New Made for iPhone hearing aids: Apple will work with hearing aid manufacturers to introduce new hearing aids with high-quality audio and long battery life.
  • Dictionary improvements: for those of us who work with English language learners, iOS 6 will support Spanish, French and German dictionaries. There will also be an option to create a personal dictionary in iCloud to store your own vocabulary words.
  • Word highlights in Speak Selection: the ability to highlight words as they are spoken aloud by text to speech benefits many students with learning disabilities. With iOS 6, Speak Selection (introduced in iOS 5) now has the same capabilities as many third-party apps.
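A side note for developers: apps can detect when Guided Access is on and adapt their interface accordingly. Here is a minimal sketch in Swift using the current UIAccessibility API names (which differ from the C-style functions that originally shipped with iOS 6); the view controller and the disabled button are hypothetical examples, not part of any app mentioned above.

```swift
import UIKit

final class QuizViewController: UIViewController {
    // Keep the observer token so the observation stays alive.
    private var guidedAccessObserver: NSObjectProtocol?

    override func viewDidLoad() {
        super.viewDidLoad()
        updateForGuidedAccess()

        // Re-check whenever Guided Access is toggled from the triple-click menu.
        guidedAccessObserver = NotificationCenter.default.addObserver(
            forName: UIAccessibility.guidedAccessStatusDidChangeNotification,
            object: nil,
            queue: .main
        ) { [weak self] _ in
            self?.updateForGuidedAccess()
        }
    }

    private func updateForGuidedAccess() {
        // While Guided Access is on (e.g. during a test session),
        // disable a hypothetical "exit to menu" button.
        let locked = UIAccessibility.isGuidedAccessEnabled
        navigationItem.rightBarButtonItem?.isEnabled = !locked
    }
}
```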

These are the big features that were announced, but there were some small touches that are just as important. One of these is the deep integration of Facebook into iOS. Facebook is one of those apps I love and hate at the same time. I love the amount of social integration it provides for me and other people with disabilities, but I hate how often the interface changes and how difficult it is to figure out with VoiceOver each time an update takes place. My hope is that Apple’s excellent support for accessibility in built-in apps will extend to the new Facebook integration, providing a more accessible alternative to the Facebook app and continuing to support our social inclusion in mainstream society. You can even use Siri to post a Facebook update.

Aside from the new features I mentioned above, I believe the most important accessibility feature shown today is not a built-in feature or an app, but the entire app ecosystem. It is that app ecosystem that has resulted in apps such as AriadneGPS and Toca Boca, both featured in today’s keynote. The built-in features, while great, can only go so far in meeting the diverse needs of people with disabilities, so apps are essential to ensure that accessibility is implemented in a way that is flexible and customized as much as possible to each person. My hope is that Apple’s focus on accessibility apps today will encourage even more developers to focus on this market.

Another great accessibility feature that often gets ignored is the ease with which iOS can be updated to take advantage of new features such as Guided Access and the new VoiceOver integration. As Scott Forstall showed on a chart during the keynote, only about 7% of Android users have upgraded to version 4.0, compared to 80% of iOS users on iOS 5. What that means is that almost every iOS user out there can take advantage of AssistiveTouch and Speak Selection, but only a very small group of Android users can take advantage of the accessibility features in the latest version of Android.

Big props to Apple for all the work they have done to include accessibility in their products, but more importantly for continuing to show people with disabilities in a positive light. I loved seeing a blind person in the last keynote video for Siri. At this keynote, Apple showed another blind person “taking on an adventure” by navigating the woods near his house independently. As a person with a visual disability myself, I found that inspiring. I salute the team at Apple for continuing to make people with disabilities more visible to the mainstream tech world, and for continuing to support innovation through inclusive design (both internally and through its developer community).

iPhoto App and Accessibility

This weekend I finally had a chance to try out the new iPhoto app Apple released along with the iPad 3 (or, as Apple is calling it, “the new iPad”). As an aspiring photographer, I was impressed with the many options for organizing, editing, and sharing photos Apple has packed into this app, which costs only $4.99 in the App Store. There have been many reviews of the new app posted online already, so I will not add another one here. However, I do have a unique perspective on the new app that I would like to share. Not only do I like to take photos (calling myself a photographer might be a stretch, but it’s a hobby I enjoy and continue to try to get better at every day), but I also have a visual disability, so I am part of a small community of blind photographers.

When I opened the iPhoto app on my iPhone, the first thing I did was turn on the built-in VoiceOver screen reader to hear how it would do with the new photo editing app. Frankly, I was not surprised that the new iPhoto app is as accessible with VoiceOver as it is. I have come to expect accessible products from Apple over the last few years, and I’m proud to be associated with the company as an Apple Distinguished Educator. Even so, as I dug deeper into the iPhoto app with VoiceOver, the level of attention to detail was still pretty impressive. For example, the brushes used to retouch photos (repair, lighten, darken, etc.) are all accessible through VoiceOver gestures, as are the exposure and color correction controls and the various effects. When I selected the crop tool, VoiceOver told me to pinch to resize the photo, and as I did so it told me how much I was zooming in as well as how far the image was offset (“image scaled to 15x, image offset by 15% x and 48% y”).

On the iPad, there is a dedicated help button that opens a series of overlays indicating what each button does. Not only was every part of the overlay accessible, but so was the entire help system built into the iPad version of the app. The attention to detail is all the more impressive to me because there are so few blind photographers who would take advantage of an app such as iPhoto. What it shows is the level of commitment Apple has to accessibility: it will go to great lengths to add accessibility even when only a few people will benefit from it.

In a recent blog post, accessibility advocate Joe Clark called out a number of hot new apps (Readability, Clear, Path, and Flipboard) for what he called irresponsible web development that results in accessibility barriers. To me, this new iPhoto app shows that you can design an app that is not only visually appealing, feature-packed, and easy to learn and use, but also accessible to people with visual disabilities. I hope more developers start to realize that accessibility does not have to compete with good design; the two complement each other.
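And the cost of entry is low. As a small illustration, giving an icon-only control a VoiceOver label takes just a line or two of code. A minimal Swift/UIKit sketch (the button, image name, and wording are hypothetical examples, not iPhoto’s actual code):

```swift
import UIKit

// An icon-only button, like iPhoto's retouching brushes, needs a label
// and hint so VoiceOver can announce what it does instead of staying silent.
let repairButton = UIButton(type: .custom)
repairButton.setImage(UIImage(named: "brush-repair"), for: .normal)
repairButton.accessibilityLabel = "Repair brush"
repairButton.accessibilityHint = "Brush over a blemish to remove it"

// Custom controls that draw their own content should also declare a trait
// so VoiceOver announces them as buttons.
repairButton.accessibilityTraits = .button
```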

When I first loaded the iPhoto app on my iPhone (the first device I installed the app on), I was too impatient to go on the Web and read about the app before I started to work with it. That’s just the kind of user I am: I like to get right in and try things out. On the iPhone, the Help button from the iPad version of the app is missing. Most of the icons make sense, but in some cases I was unsure, so I turned on VoiceOver and moved my finger around the screen to have it announce what each button was for (or at least give me a better idea). In that case, compatibility with VoiceOver helped me learn the app much faster without having to consult the help, and that got me thinking. As these devices (phones, tablets, and whatever comes next) continue to get smaller, and as their interfaces use more visuals (tiles, buttons, etc.) and less text, the ability to hear the help may become an essential part of learning an interface. In this way, features like VoiceOver can actually enhance the usability of an app for everyone – which is what universal design is all about.

 

2X the Productivity with DisplayPad

OK, I may be exaggerating about the 2X boost in productivity, but DisplayPad has been one of the most useful apps I have purchased recently. With DisplayPad ($4.99), you can set up your iPad to serve as a portable secondary display for your laptop. When working with programs that have a lot of settings panels, such as Photoshop and other Adobe software, having a second display can save you a lot of time.

The app is really simple to set up and use. You will need to download a helper program that runs on your Mac. To start using DisplayPad, click the helper program’s menu bar icon and choose your iPad (both devices need to be on the same Wi-Fi network). Your iPad will then act just like any secondary display. Tapping with one finger is the same as clicking with the mouse, and tapping twice is the same as right-clicking. You can also drag with two fingers to scroll, just like on your laptop’s trackpad. I have been very happy with the performance; there is very little lag when you drag windows from your laptop screen to the iPad. I have even used DisplayPad alongside my regular secondary display to create a triple-display setup when I use my laptop at home.

oMoby for iPod touch

Alena Roberts recently featured oMoby on her Blind Perspective blog. oMoby is intended as a shopping app, but as Alena has suggested, it can have other uses for people with visual impairments. oMoby uses the iPhone or iPod touch 4G camera to snap a picture of any product you come across while shopping. The picture is then compared against a database, and oMoby provides you with a list of possible products. The fact that you don’t have to scan a bar code for this app to work makes it easier to use for someone with low vision who might not be able to properly locate the bar code.

I had to try oMoby for myself to see how it works, and I was blown away by how well this app performs. Just like Alena, I wanted to see if I could use the app as a replacement for a currency identifier. I tried the app with both $1 and $5 bills, and both times it accurately identified them. It had a more difficult time with coins (it could not correctly identify a quarter, only telling me that it was a silver coin). While not perfect for this purpose, oMoby is pretty good considering it is a free app.

The other thing to point out about oMoby is that I found it to work well with VoiceOver, the screen reader now included with all new iPod touch and iPhone models. So, to sum up, oMoby is a shopping app that has the potential to replace a much more expensive currency identifier used by those with visual impairments, and it does so for free. Not a bad deal.

Here is the link for it on the App Store.