New Video Tutorial: Accessibility Features of Apple TV

I love my Apple TV and use it not just for entertainment purposes but also as a learning tool that allows me to subscribe to a number of podcasts in order to stay up to date with the world of technology. Apple TV includes two great accessibility/universal design features found under Settings > General > Accessibility:

  • VoiceOver: the same screen reader that ships with Macs and iOS devices (and the recently released Apple Watch) is included with the Apple TV to provide spoken menus for someone who is blind or has low vision. You can adjust the speaking rate of VoiceOver, or set it to use a pitch change to indicate when you are navigating within the same screen or moving away to a different screen.
  • Closed Captions: as on iOS and OS X, the captions can even be customized with either preset styles or by creating your own custom styles. You can customize the text (font, color, text size), background (color and opacity) and even add special text styles such as highlighting or a drop shadow.

An accessibility menu is available as a shortcut for turning these features on and off without having to go back into Settings. Once you enable it in the accessibility settings, you can hold down the Menu button on the Apple TV remote until a menu pops up on the screen with options for VoiceOver and Closed Captions (as well as the button's usual function, which is to return to the main menu).

In addition to the included remote with tactile buttons, Apple TV can be controlled with the free Remote app for iOS. This app supports the VoiceOver and Switch Control accessibility features of iOS. You can even have VoiceOver use Alex (male voice) on the iOS device and Samantha (female voice) on the Apple TV so you can tell them apart.

Here is a video from my YouTube channel that provides an overview of the accessibility options included with Apple TV:

https://youtu.be/torM_GxfF4M

New Video Tutorial: Overview of Chrome OS Accessibility Features

Although I personally use Apple products in my day-to-day work, it is great to see that accessibility is being considered by most of the industry when it comes to the devices available for students. A great example is the Chromebook, a low-cost device that is very popular in education right now. The Chromebook runs Chrome OS, a streamlined operating system that emphasizes access to cloud-based tools and resources. In this video tutorial, I provide a quick overview of the accessibility features built into Chrome OS, including: a screen reader (ChromeVox), a screen magnifier, an option for enlarging the cursor, a high contrast mode and more.

Accessibility Options in Voice Dream Writer App

This week, Winston Chen and the Voice Dream team released the new Voice Dream Writer app. I am highlighting it here not only because Voice Dream Reader is one of my favorite apps for students who need reading supports such as text to speech, word highlighting and customized text, but also because of the attention to accessibility the Voice Dream team has shown in this new app. Not only are the controls and the interface nicely labeled for VoiceOver users, but there are even a few features specially designed to make things easier for them.

Screenshot of Voice Dream Writer interface on iPad with VoiceOver controls highlighted.

When VoiceOver is turned on, the app recognizes this and adds three buttons for editing text to the interface (they appear in the toolbar located just above the onscreen keyboard, on the left side). These buttons are:

  • Cursor: allows the user to move the cursor by flicking up or down with one finger.
  • Cursor movement unit: changes how the cursor movement takes place by allowing the user to choose from characters, words or sentences.
  • Select text: selects text based on the cursor movement unit. For example, flicking up with sentences as the cursor movement unit will select the text one sentence at a time.

All of these controls are adjustable. A flick up or down with one finger will change the value (for the cursor movement unit) or navigate to/select the next item (for the cursor and select text buttons).

A three-finger swipe gesture is also supported for cursor movement and selection: a three-finger swipe up will move the cursor to the beginning of the document and a three-finger swipe down to the end, and three-finger swipes up or down will select the text from the cursor position to the beginning or end of the document.

Another nice feature is how easy the app makes it to find misspelled words: open Tools in the upper right and choose Find Misspelled Words. You can then flick down with one finger to navigate the misspelled words in your document. When you get to a word you want to fix, you have two options: double-tap with one finger to edit it with the onscreen keyboard, or swipe from the right with three fingers to use the Word Finder with a phonetic search. The phonetic search brings up a list of words that closely match the misspelled one in your document. You can then choose the correctly spelled word from the list and double-tap with one finger to make the correction.

I did a short video to demonstrate some of these options in the Voice Dream Writer app. I hope you find it helpful. For more information about the app, make sure to check out the Voice Dream website.

Switch Access now available in Android

After waiting a couple of weeks for the elusive Android 5.0 Lollipop update for my 2013 Nexus tablet, I decided to do things the hard way using Google's instructions for loading a factory image. If you know how to open a Terminal window and issue a few commands, you should not find it too difficult to load the factory image, though I did run into a few roadblocks that I was able to address with quick Google searches. Since I am on a Mac, I had to modify some of the commands a bit, but that was not too hard to do, and after a couple of tries I was able to get the 5.0 image loaded on my device.
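For the curious, the process boils down to a handful of `adb` and `fastboot` commands issued from that Terminal window. Here is a rough sketch of the typical sequence wrapped in Python with a dry-run mode; the image file names are hypothetical, the exact steps vary by device, and unlocking the bootloader wipes the device, so always follow Google's official instructions rather than this sketch:

```python
import subprocess

# Typical sequence for flashing a Nexus factory image.
# File names are hypothetical; get the real ones from your downloaded image.
FLASH_STEPS = [
    ["adb", "reboot", "bootloader"],                       # reboot into the bootloader
    ["fastboot", "oem", "unlock"],                         # WARNING: wipes the device
    ["fastboot", "flash", "bootloader", "bootloader.img"], # update the bootloader first
    ["fastboot", "reboot-bootloader"],                     # reload the new bootloader
    ["fastboot", "update", "image.zip"],                   # flash the full system image
]

def flash(dry_run=True):
    """Print (dry run) or actually run each step; returns the step count."""
    for cmd in FLASH_STEPS:
        if dry_run:
            print(" ".join(cmd))
        else:
            subprocess.run(cmd, check=True)
    return len(FLASH_STEPS)
```

On a Mac the main tweak is usually just prefixing the tools with `./` (or adding them to your PATH), which is the kind of small command modification I mentioned above.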

According to this article and video on Android Central, most of the accessibility features for Lollipop (Android 5.0) have been carried over from 4.4 KitKat. However, the article glosses over a significant addition to Android accessibility: Switch Access. Switch Access is not intended for those with visual impairments, as stated in the Android Central article. Rather, it allows people with physical and/or cognitive limitations to use a touch screen device with assistance from an adaptive switch. Google’s own description does a much better job of explaining how Switch Access works:

Switch access enables you to interact with your Android device using one or more switches that work like keyboard keys. Switch access can be helpful for users with mobility limitations that prevent them from interacting directly with the Android device.

I had a chance to try out the new Switch Access with one of my favorite switch interfaces, the Blue 2 from Ablenet (which by the way has a great guide on how to set up Switch Access in PDF format). While Switch Access is nowhere near as robust as Switch Control on iOS devices, kudos to Google for taking an important step that will ensure even more people can enjoy the use of Android phones and tablets. The fact that two of the major mobile platforms now have switch access as an option is a big step forward for ensuring accessibility for all users.

Switch Access in Android 5.0 has only a few configuration options in its current incarnation. For example, you are not able to change the appearance of the scanning cursor, which is a very faint green outline around the currently selected item. You can't increase the size of the cursor either, and I found it difficult to see, especially when it appeared against certain backgrounds. It would be nice if there were a large cursor option (along with the ability to change the color for those who can't perceive green that well). Timing options are also limited: you can only adjust the speed at which the cursor moves in auto-scanning mode, while options such as hold duration or pause on first item, which could be helpful to certain users, are missing. Still, I see this version of Switch Access as a first step in the right direction, and I'm sure these options will be added over time.

Switch Access in Android 5.0 Lollipop can be used in two different ways: you can use it with a single switch by turning on the auto-scanning option, or you can add multiple switches and assign different actions to each of the switch buttons. With single-switch use and auto-scanning, pressing the switch starts the scan and pressing it a second time makes a selection. With multiple switches, you can assign different actions to each switch, such as "next" to move the cursor when you press one switch and "click" to make a selection when you press the other. With additional switches, you can assign actions such as scrolling forward or backward, going to the home screen, opening notifications or settings, and going to recent apps. While Switch Access is running you can still interact with the touch screen in the same way you would if it were not turned on. I could see that being useful when you need to work with someone who is not familiar with Switch Access and how it works.

I created this brief video to demonstrate how Switch Access is configured and how it works in Android 5.0 Lollipop. For mirroring I am using the Mirror beta app, which sends a stream I can display and record on my Mac with Reflector and ScreenFlow. I wish you could remove the watermark (I'm even willing to pay for this free app so that the watermark doesn't get in the way) but I really like that you can show your taps and touches with this app. It would be really nice if you could do this on iOS devices.

 

A Visually Impaired Photographer Experiments with the GoPro

As many of you who follow me online know, I am very passionate about photography and the possibilities it presents to us as people with disabilities to tell our own stories and exercise our creativity. I love to use my iPhone to take photos because it incorporates so many accessibility features that help me with my photography, such as Zoom, VoiceOver and Invert Colors. However, I am always looking for new options to expand my photographic horizons, and the GoPro action camera is one option that has fascinated me for some time. I believe that having a disability doesn't mean you should not be able to ski, sky dive, or do anything else you set your mind to, and the GoPro has become the go-to camera for action sports and an active lifestyle.

I started out with the least expensive option in the GoPro lineup, the new entry-level Hero, which retails for $129. However, after about a week with the camera, I returned it and opted for the older Hero 3 White model, which I think is a better fit for my needs. The new entry-level Hero has a number of shortcomings due to its low price and limited feature set. However, if you're an educator looking for an inexpensive camera for recording classroom activities (science experiments, plays and performances, etc.), this is a nice camera and there are ways to get around its limitations:

  • it does not have an LCD screen for framing shots and adjusting camera settings like the more expensive GoPro 4 Silver. I don't think this is a significant drawback, since using the LCD screen outdoors would be difficult anyway due to glare. The camera's wide field of view makes it likely that you will capture the shot you want even when you can't frame it with a viewfinder. For someone who has tunnel vision, the wide FOV is actually one of the things that made the GoPro so attractive to me. GoPro does sell an add-on LCD screen, but I'm not sure if it is supported on the new Hero. Regardless, using the add-on screen will probably reduce battery life.
  • it does not support Wifi connectivity. With other GoPro cameras (like the Hero 3 White I eventually traded up to), you can set up a Wifi connection between the camera and a smartphone to control the camera and see what you are capturing. However, as with an add-on LCD screen, a drawback to Wifi connectivity is that it drains the battery much faster.
  • it has a built-in battery that cannot be replaced or swapped out to extend the length of time the camera can be used in the field. A workaround is to use any of a number of smartphone or tablet external batteries that match the needs of the GoPro (5V and 1-1.5 amps). An external battery will allow you to capture longer time lapses, where the camera has to be turned on for extended periods of time. I was also fortunate to find an old power adapter for a Kodak Zi8 camera that allows me to plug the GoPro into a wall outlet to charge it much faster than through the USB port on a computer.
  • it does not support the higher resolutions of more expensive GoPro cameras. The Hero tops out at 1080p (30 fps) and also supports 720p at 30 or 60 fps. It does not support 4K, which is fine by me, as the higher resolutions result in huge files I can't possibly store or process on my MacBook Air.

Despite its limitations, I still think the new GoPro Hero is a nice entry-level camera for use in educational settings. It provides access to the many possibilities for using this type of camera to support learning (examples of which are featured on this website by Lisa Tossey) but at a very reasonable price. However, from an accessibility perspective, the biggest problem is not the lack of an LCD viewfinder with features such as large text or a high contrast mode. Rather, it is the fact that there is no option for spoken feedback other than a series of beeps as you advance through the various menu screens, which are displayed in a small window on the front of the camera. If the camera had Wifi connectivity, I could probably use VoiceOver and other accessibility features on my iPhone to get better access to the camera menus and settings.

This possibility convinced me to exchange the Hero for the older Hero 3 White, which does support Wifi. I was able to download the free GoPro app from the App Store, and it has some VoiceOver compatibility. I'm convinced that with a few tweaks this app could be made very accessible to the blind. For the most part the buttons have VoiceOver labels that can be spoken aloud to a blind user, but these labels could be improved so that they are clearer and easier to understand when read aloud. For example, I don't need to hear the following when I choose the option for reviewing the clips on my camera: "GoPro app, list view icon, cam roll." Just describing it as Camera Roll would be sufficient. Surprisingly, the shutter release button is the only control with no label at all (it just says "button"). In any case, through the Wifi connection I will still be able to use Zoom and other accessibility features on my iPhone even if the app does not have great VoiceOver support.

With the Hero 3 White I lose the following features, which are only available in the current-generation cameras: Quick Capture, Super Wide Capture and Auto Low Light. Quick Capture allows the capture of video with a single tap of the top button and time lapse with an extended press, while Super Wide extends the FOV slightly and Auto Low Light gives the camera better dynamic range in low-light situations. Of these three features, only Super Wide would be significantly helpful to me. I don't shoot in the kinds of environments where Auto Low Light would come in handy (due to my difficulties with navigating low-light environments), and Quick Capture is a nice-to-have but not an essential feature.

The Hero 3 White also has a slightly lower top frame rate for photos, topping out at 3 fps compared to 5 fps for the new Hero, as well as a smaller-capacity battery. However, I can compensate for the smaller battery by purchasing a number of inexpensive add-on batteries (retailing for $15-20 each), which the Hero 3 White supports but the new Hero does not. The swappable batteries make up somewhat for the battery drain that results from using the camera with a Wifi connection to my iPhone for accessibility support.

Along with Wifi connectivity, the Hero 3 White has an HDMI port (for connecting the camera to an HD TV), the ability to connect an external microphone for improved audio (using an adapter), support for higher-capacity memory cards (topping out at 64GB as opposed to 32GB with the new Hero) and, again, swappable batteries. The Hero 3 White also has more customizable time-lapse settings, allowing for intervals from half a second to a full 60 seconds; the new Hero, on the other hand, is fixed at a single interval of half a second. Both cameras are very similar in terms of mounting options and underwater performance (with a top depth of 131 feet in each case).

I have had great fun with the GoPro cameras during the short time I have owned them, and I really think the Hero 3 White will be a better action camera for me than the entry-level Hero (at least until I can get one of the higher-priced models like the GoPro 4 Silver). I end this post with a photo I took with my new GoPro during a recent visit to the beach.

Go Pro selfie taken at the beach in St. Petersburg, Florida.

 

iBeacons Experiment with Beacondo

As detailed in a blog post on AT Mac, iBeacon is a new technology that has a lot of potential for people with disabilities. iBeacons are small devices that emit a low-power Bluetooth signal which can be recognized by an iOS device and used to trigger an action such as opening a website, playing a video or sound, and more. One use case already being implemented is using iBeacons to provide environmental cues that help people who are blind navigate an airport or other place of public accommodation.
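Under the hood, an iBeacon advertisement is essentially a UUID plus two numbers (major and minor), and an app maps that triple to an action. Here is a minimal sketch of that lookup in Python; the UUID and action names are made up for illustration, and a real iOS app would use Core Location to do the actual detection:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Beacon:
    uuid: str   # region identifier shared by a whole deployment
    major: int  # e.g., a building or room group
    minor: int  # e.g., an individual station within that group

# Hypothetical mapping from detected beacons to app actions.
ACTIONS = {
    Beacon("2F234454-CF6D-4A0F-ADF2-F4911BA9FFA6", 1, 1): "play_welcome_audio",
    Beacon("2F234454-CF6D-4A0F-ADF2-F4911BA9FFA6", 1, 2): "open_intro_page",
}

def action_for(uuid, major, minor):
    """Return the action bound to a detected beacon, or None if unknown."""
    return ACTIONS.get(Beacon(uuid, major, minor))
```

So when the phone ranges a beacon advertising (that UUID, 1, 2), the app would open the intro page; an unrecognized beacon simply does nothing.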

I had been curious about iBeacons for a while, and even purchased a single iBeacon from Radius Networks to try the new technology out. The one I got was only $29 and it works while plugged into a USB port for power. Other iBeacons have their own battery and don’t have to be plugged in, providing more flexibility of installation. In the future, I will probably buy the $99 3-pack from Estimote for this reason.

In preparation for a session at Closing the Gap focusing on how Augmented Reality and iBeacons can be used to provide UDL supports, I finally took the plunge and started experimenting with my new iBeacon. I created a simple content delivery app using the Beacondo software, which is available as a free download along with an SDK. I followed along with the tutorials on the Beacondo site, and a couple of hours later I had a working iBeacon app inspired by a similar one I saw demoed at this year's Apple Distinguished Educator Institute in San Diego. At Closing the Gap, I will use this app to introduce iBeacons to the participants as they walk around the room and learn what an iBeacon is, the different types of iBeacons available for purchase, and how they are being implemented in education (with links to the websites of my ADE colleagues Paul Hamilton and Jonathan Nalder, who are the true experts in this area).

I couldn’t believe how easy it was to create the app with Beacondo. I just followed these steps:

  • Downloaded the free Xcode software from Apple.
  • Downloaded the Beacondo Designer software and the SDK.
  • After watching the tutorials, opened Beacondo and got started customizing the various screens in the template included with the SDK. I had to include any videos and images I wanted in the app inside my project directory so that they would be available in the various pulldown menus inside Beacondo Designer.
  • Clicked on Build and chose Xcode to preview the app using the iPhone simulator.
  • Rinsed and repeated as needed to get my content looking the way I wanted.
  • When I had the app looking just the way I wanted, it was time to add the iBeacon and assign an action as demonstrated in this video.
  • Did a final build for Beacondo Viewer, an iOS app that allows you to open your app for testing on your device. Building for Beacondo Viewer exports the app as a zip file that can be easily shared online.
  • Uploaded the app as a zip file to Dropbox and created a QR code using Kaywa QR Generator, my favorite tool for creating QR codes.
  • Opened Beacondo Viewer and chose the Scan from QR Code option, then scanned the QR code I had created earlier.

The first few times I did this, I could not get the app to open in Beacondo Viewer. A quick email to Beacondo revealed that I had to change the ending of my Dropbox link from "dl=0" to "dl=1": Beacondo cannot download the app's zip file if it encounters a "Download now" screen, and changing the end of the URL gets around that. With that small change I was able to download the app to Beacondo Viewer, and the next time I walked into my room I was greeted with an audio message I had recorded, and the app opened directly to a page explaining what an iBeacon is, just as I would want it to do for participants at our Closing the Gap session.
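The fix is easy to apply by hand, but for anyone scripting their uploads, here is a small sketch of the URL rewrite. It assumes only that Dropbox's `dl` query parameter controls whether you get the preview page (`dl=0`) or the raw file (`dl=1`):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def direct_download(url):
    """Rewrite a Dropbox share link so it serves the file directly (dl=1)."""
    parts = urlsplit(url)
    query = dict(parse_qsl(parts.query))
    query["dl"] = "1"  # dl=0 shows the "Download now" page; dl=1 returns the file
    return urlunsplit(parts._replace(query=urlencode(query)))
```

For example, `direct_download("https://www.dropbox.com/s/abc123/myapp.zip?dl=0")` gives back the same link ending in `dl=1`, which Beacondo Viewer can fetch without hitting the preview screen.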

From a UDL perspective, iBeacons could be really useful for embedding instructions and other context-aware supports that are available to learners when and where they are needed. Paul Hamilton has a nice demonstration of how he is using iBeacons to create learning centers in his classroom. iBeacons would be a great way to embed virtual coaches in these learning centers or stations to aid students with autism or executive functioning difficulties (think visual schedules, social stories and other supports that are triggered only when the user is at the location where they would be useful). I am also interested in using QR codes, Augmented Reality and iBeacons to create layered learning environments where users have multiple pathways through the content, triggering on-demand access to background information or more advanced topics as they see fit.

 

 

Ten Chrome Extensions for Accessibility and Universal Design for Learning

Although I am primarily a Safari user, I have been very impressed with the variety of extensions you can add to customize the Chrome web browser from Google. I have been experimenting with a number of these extensions, and here are the ones I have found helpful and currently have installed:

  • ChromeVox: Google’s screen reader that is built into Chrome OS on Chromebooks or can be installed as an extension for the Chrome browser on Windows or Mac. A really nice interactive tutorial is available from Google to help new users get started with ChromeVox.
  • ChromeSpeak: This extension provides functionality similar to the Speech service on the Mac and should be especially helpful to Chrome OS users. You can select text on any web page, right-click and choose Speak to have the text read aloud using the text to speech engine built into the OS.
  • ChromeVis: This extension for low vision users allows you to select text and press a keyboard shortcut (the number 0) to have the text appear magnified in a small window. You can customize this window (or lens) to have the text rendered in different color combinations, to change the text size, or to have the magnified text appear right next to the selection rather than at the top of the page. Navigation is performed through a series of keyboard shortcuts.
  • High Contrast: This extension allows you to apply a high contrast theme on a site-by-site basis. Options include high contrast, grayscale, inverted, inverted grayscale and yellow on black.
  • Zoom: This extension gives you more control over how zoom works in your Chrome browser. You can use a slider or type in a value to set a custom zoom level.
  • Readability and Clearly: Both of these extensions clean up the clutter on a web page and present a simplified version that is perfect for using text to speech and reading without the distractions of ads and other irrelevant content. Both provide options for customizing the appearance of the text (text size, background, etc.). With Clearly, you can also highlight right on the page, and if you sign into your Evernote account these highlights are saved into Evernote automatically.
  • Read&Write for Google: Read&Write is a premium extension for Chrome from TextHelp. It provides a number of supports that are helpful to students with learning difficulties such as dyslexia: text to speech with word highlighting, the ability to highlight right on the page in a number of colors, a picture dictionary to make concepts more concrete, and an option for summarizing a page. A 30-day free trial is available, and even after the trial is over some of the features continue to work, including text to speech with word highlighting and the translation features.
  • Evernote Web Clipper and Diigo Web Collector: Both of these extensions are great supports for classroom research and reading. With Evernote Web Clipper, you can save an entire web page to your Evernote account, or you can choose to clip a selection, take a screenshot, or just save a bookmark to the page. An option to save a simplified version is also available, which saves the current page without the ads and other distractions. For a screenshot, you can add text annotations and stamps (great for highlighting critical features in diagrams, charts and other graphics). Diigo does a lot of the same things as Evernote Web Clipper: you can save bookmarks to pages or screenshots with annotations. What I really like about Diigo is the ability to add highlights and sticky notes to any web page, which has made it my tool of choice for taking notes while I am doing research online.

Bonus: ChromeCast. This extension allows you to show any Chrome tab on your TV using the Chromecast HDMI dongle available from Google. This can be useful for showing websites and documents from Google Docs on a larger screen.

There are a number of extensions that I use for testing web pages for accessibility as well, including:

  • Web Developer Toolbar adds a number of handy tools for checking accessibility, such as options for disabling styles, displaying alt text for images and more.
  • Accessibility Developer Tools from Google adds an accessibility audit section to the Chrome Developer tools.
  • Color Contrast Checker can help you check an entire page or a portion of it for compliance with WCAG 2.0 requirements for color contrast.
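For anyone curious about what a contrast checker actually computes, WCAG 2.0 defines contrast as a ratio of relative luminances, with 4.5:1 as the AA threshold for normal text. Here is a small sketch of that calculation for sRGB colors, following the WCAG 2.0 formula for relative luminance:

```python
def _channel(c):
    """Linearize one sRGB channel given as an integer 0-255."""
    c /= 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(rgb):
    """Relative luminance of an (r, g, b) color per WCAG 2.0."""
    r, g, b = (_channel(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio: (L1 + 0.05) / (L2 + 0.05), lighter over darker."""
    l1, l2 = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)
```

Black text on a white background gives the maximum possible ratio of 21:1, which is why high contrast themes like yellow on black or inverted colors score so well with these checkers.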

You’re More Powerful Than You Think

As I watched the Apple commercial "Powerful" one night, I got an idea: what if there were a series of commercials with the same theme, but focusing on how Apple technology has empowered me and many others with disabilities to pursue our dreams and "be powerful" in our own way?

I initially threw this idea out on Facebook and a few weeks later, Christopher Hills posted a great video of iOS 8 Switch Control that featured this theme in the intro. I loved the way Christopher presented the idea and this inspired me to create my own video.

I went down to the waterfront in St. Petersburg, Florida (where I live) to watch the sunrise on a Sunday morning. Unfortunately, it had been raining most of the weekend and the view of the sunrise was not that great. That gave me an idea: let’s not waste the trip, let’s make a quick video. Armed with my iPhone, I recorded a series of 4-5 clips while the light was still good. I then used iMovie right on the phone to edit the clips into a short video that is about the length of a typical commercial. Here is the result:

Do you know anyone else whose story demonstrates how they are “more powerful” with technology? Post it online and then use the hashtag Christopher created: #iAmMorePowerfulThanYouThink

Third Party Keyboards

One of the most anticipated features in iOS 8 was the redesigned on-screen keyboard. Recently, I did a video on my YouTube channel on the new QuickType feature that provides smart word prediction with the iOS onscreen keyboard. That video also discussed two other additions: the Dictation feature now has almost real-time feedback, and you can customize the on-screen keyboard by adding a number of third-party keyboards to your iOS device. In the video I featured two of my favorite third-party keyboards: Swype and Fleksy (both $0.99 on the App Store).

http://youtu.be/gcHwBWURo7A

With Fleksy, I like the extra feedback I get as I type (the letters appear to jump out) and the fact that you can customize the keyboard by choosing large keys and adjusting the colors to a combination that works well for you. Typing is also very quick with this keyboard: whenever you need to enter a space, just do a quick swipe to the right, and deleting is as simple as a quick swipe to the left. Word prediction is included, but I have not found the suggestions to be as good as with the built-in keyboard.

Swype allows me to type very quickly by dragging my finger over the letters that make up each word in one continuous motion. It really makes more sense when you see it in action in the video. While you can switch to a Dark theme that I find helpful, I wish this keyboard had a few more themes to choose from.

Lastly, that brings me to Keedogo and Keedogo Plus from AssistiveWare, a well known name in the field of assistive technology thanks to their Proloquo2Go app. Keedogo is designed for beginning writers, with a simplified layout, lower-case letters, ability to use either a QWERTY or ABC layout, and vowel and special key highlighting. It does not include features that could distract an early writer, such as word prediction and auto-correction. I also like the high contrast and large keys. Keedogo Plus adds word prediction and automatic capitalization to the feature set, and is intended for beginning writers who are ready to move on from Keedogo to something more advanced.

http://youtu.be/OrdXnwWubDg

AssistiveWare has a third keyboard called Keeble in development that is intended for people with vision and motor difficulties. This keyboard will include options such as color themes, Speak Keys for auditory feedback as you type, select on release, and more.

One quick tip before I end: apparently there is a bug that creates problems with third-party keyboards when Guided Access is enabled. If you are having problems such as the keyboards disappearing, head over to Settings > General > Accessibility > Guided Access (found under Learning) and turn that feature off if you are not using it.

Quick Tip: Siri and Speak Screen

New in iOS 8, Speak Screen allows you to hear not only text but also interface elements such as buttons and other controls read aloud. Speak Screen is a handy feature for websites, ebooks and anywhere you need text read aloud to you, or if you need to familiarize yourself with the layout of an app due to a vision difficulty. The nice thing about this feature is that you don't have to make a selection first: Speak Screen begins reading at the top of the page automatically after you activate it, but you can use onscreen controls to advance or rewind, and you can also adjust the speaking speed. Speak Screen also works with any of the voices installed on your device, including the advanced Alex voice Apple has now brought from the Mac to iOS with iOS 8.

Speak screen popover menu with options for navigation and controlling the speaking rate.

To activate Speak Screen, you use a special gesture (drag from the top of the screen with two fingers). Here is today’s quick tip: instead of using a gesture to start Speak Screen, use your voice. Siri works great for this purpose. Just bring up Siri and say “Speak Screen” and your device will start reading the current screen.

Even better, with iOS 8, you can put Siri in an always listening mode by going into Settings > Siri and enabling Hey Siri. You can then just say “Hey Siri” to start Siri (but remember that this only works when you have your device plugged in to a power source, as it drains the battery some). A great use of “Hey Siri” is to use it in combination with Speak Screen. Just say “Hey Siri, Speak Screen” and it should start reading the text and describing the onscreen controls.

I hope you find that tip useful.