A new Apple TV Remote app is now available for download from the App Store. The main difference between this new app and the existing Remote app (which you can still use to control your Apple TV) is the addition of Siri functionality. With the 4th generation Apple TV, you can press and hold an onscreen Siri button in the app to speak Siri requests on your iOS device that will be understood by your Apple TV. This works just like it does when you press and hold the physical button on the 4th generation Apple TV Siri remote.
Setup was a pretty simple process. Upon launching the app, it quickly recognized all of the Apple TVs on my Wi-Fi network (I have one of each generation) and showed them as a list. After I tapped on the device I wanted to control, I was prompted to enter a four-digit code shown on the Apple TV (and automatically read aloud by VoiceOver), and that was it: my iPhone was paired to control my Apple TV.
The App Layout
The app has a dark theme, with great contrast, throughout. As someone with low vision, I can say that the options in the app are much easier for me to see than the dimly labeled buttons on the physical Apple TV remote.
The screen is divided into two sections: the top two thirds make up a gesture area that simulates the touch pad on the physical remote, while the bottom third includes onscreen options for the buttons. If you can see the screen on your device, right away you will notice the Menu button is much bigger than the other buttons. This is actually a welcome design touch, as the Menu button is one of the most frequently used options for controlling the Apple TV. Below the Menu button, you will find options for Play/Pause, Home, and Siri from left to right.
I tried to test the app with Dynamic Text (large text) enabled. This only made the text in the devices list (which lists all of your Apple TVs) bigger. It would be nice if Dynamic Text worked on the label for the Menu button as well, but with the bigger button and high contrast look, this is just a minor point.
You control the Apple TV by performing touch gestures in the gesture area at the top of the screen. When you come across a text entry field, the onscreen keyboard will come up automatically to let you enter the text (same as on the older Remote app). If you tap Done to dismiss the onscreen keyboard, you can bring it back by tapping the keyboard icon at the top of the screen.
With games, you can tap a game controller icon at the top of the screen to change the layout of the app for game play. With the iPhone in the landscape orientation and the Home button to the right, the left two thirds of the screen will be a gesture area and the right one third will include Select (A) and Play/Pause (X) buttons – surprisingly these are not labeled for VoiceOver. Tapping Close in the upper right corner will exit out of the game controller mode to the standard layout.
From the one game I tried with the app, Crossy Road, I don’t think it will be a good replacement for a dedicated controller. There was just too much lag, probably due to the Wi-Fi connection the app uses to communicate with the Apple TV. It may work with some games where timing is not as crucial, but definitely not Crossy Road.
Zoom will work just like it does when using the physical remote: a quick triple tap on the gesture area will zoom in and out. The one issue is that the gesture area on the app does not accept two finger gestures. As a result, you will not be able to:
turn panning on/off: this requires a two finger tap.
change the zoom level: this requires you to double-tap and hold with two fingers then slide up or down to adjust the zoom level.
The same limitations hold for VoiceOver. You will not be able to access the Rotor gesture on the Apple TV Remote app. Furthermore, the following gestures will not be available:
pause/resume speech: this requires a two finger tap.
read all from the top/current location: this requires a two finger swipe up/down.
If you have used VoiceOver with the older Remote app, then you will be familiar with how navigation works in this new app. With VoiceOver turned on in both the iOS app and the Apple TV, select the gesture area on the iOS app. As you flick or explore by touch in the gesture area, VoiceOver will announce the item in the VoiceOver cursor on the TV. You can then double-tap anywhere on the gesture area to make a selection.
For Siri, you will have to perform a standard gesture (double-tap and hold) so that you can speak your Siri request.
One interesting thing about using VoiceOver with the new app is how you access the Accessibility Menu. When you select the Menu button it will announce “actions available.” With a one finger flick up or down you can access the two actions: the default, which is “activate item” or “accessibility menu.” Depending on how you have your Accessibility Shortcut set up in the Apple TV settings, selecting the “accessibility menu” option will either toggle on/off one of the features or bring up the accessibility menu to allow you to choose.
I was not able to use the new app to control my Apple TV with Switch Control. The problem is that when Switch Control goes into the gesture area it does not recognize my input as I try to select one of the direction arrows to move the cursor on the Apple TV. This could very well be a bug that is fixed in a future update. In the meantime, you can continue to use the older Remote app if you need Switch Control to use your Apple TV.
In any case, Apple has promised to include Switch Control when tvOS is updated in the fall. This will be different from the current implementation in that the scanning cursor will actually show up on the TV and the iOS device will act as a switch source (at least as I understand it from my online reading, I have not been able to update my Apple TV to the latest beta).
To be honest, I don’t use the included physical remote for my Apple TV all that much. It is just too small and easy to misplace for me. I actually have my existing TV remote (which I am very familiar with) set up to control my Apple TV, and I also often use the older Remote app on my iPhone for the same purpose. With those two methods I was not able to use Siri, but now that has changed. I see myself using Siri a lot more with this new app, especially for searching on the Apple TV.
There are a few limitations that keep this app from being a full time replacement for the physical remote if you use Zoom and VoiceOver, but I anticipate that those will be addressed in future updates.
Are you using the new app? Let me know your experience in the comments, especially if you are using it with Zoom or VoiceOver. I would love to hear how it has worked out for you.
On the occasion of the Book Creator Chat (#BookCreator) focusing on accessibility, this post shares five easy-to-implement accessibility tips for Book Creator authors. By taking the time to consider the variability of readers from the start, you can ensure your books work for more of your potential audience.
1. Choose Text Size and Fonts Wisely
While Book Creator exports to the industry standard ePub format, the kind of ePub document it creates is of the fixed layout variety. This means that readers are not able to resize the text or change its appearance when they open the book in iBooks (yes they can use the Zoom feature to magnify what is shown on the screen and Invert Colors to enable a high contrast view, but not everyone is familiar with these iOS accessibility features). At a minimum, I would recommend a text size of 24px as a good starting point to ensure the text is large enough to be easily read without too much effort.
When it comes to processing the text, some readers may have dyslexia or other reading difficulties. While there are special fonts for dyslexic readers that can be installed on the iPad, there is limited research on their impact on reading speed and comprehension.
Instead, the consensus appears to be that clean, sans-serif fonts, which are good for all readers, can also help readers who have dyslexia. In Book Creator, you can choose from a number of sans-serif fonts such as Cabin, Lato and Noto Sans, or you can use system fonts installed on your device such as Arial, Helvetica and Verdana. You should definitely avoid fonts in the Handwriting and Fun categories, as these are more difficult to decode even for people who do not have dyslexia.
Other tips for improving legibility include:
Left justify text. Fully justified text can result in large gaps in the text that can be distracting to readers who have dyslexia.
Use bolding (instead of italics or ALL CAPS) to highlight text. The latter are more difficult to decode.
Use shorter sentences and paragraphs.
Use visual aids to reinforce information in the text (but make sure to include accessibility descriptions as noted later in this post).
Use an off-white background. For some readers, an overly bright (all white) background can result in significant visual stress. To reduce this stress, you can choose a dimmer background color in Book Creator. With no item on the page selected, tap the Inspector (i) button and choose a page under Background, then tap More under Color. A color toward the bottom of the color picker should work well.
2. Add Descriptions to Images
Readers who are blind will rely on assistive technology (screen readers) to access the content in your books. Screen readers are only able to describe images to readers who are blind when they include a text alternative. Adding a text alternative is straightforward in Book Creator:
With the image selected, tap the Inspector (i) button in the toolbar.
3. Use Descriptive Links
Some of your readers will be listening to the content because they are not able to see the display. They will be using a screen reader (VoiceOver on the iPad) to hear the text read aloud. When the screen reader comes across a link that reads as “click here” or “learn more,” the person listening to the content will not have sufficient information to determine if the link is worth following or not. Instead of using “click here” or “learn more” as the link text, select a descriptive phrase (“Learn more about adding accessibility descriptions”) and make that the link text – as with the following example:
4. Add a Recording of the Text
While the iPad has built-in text-to-speech features (Speak Selection and Speak Screen) and the quality of the voices continues to improve, some readers will still prefer to hear an actual human voice reading the text. Fortunately, adding a recording of the text is an easy task in Book Creator:
Tap the Add (+) button in the toolbar.
Choose Add Sound.
Tap the Start Recording button (the red disk).
Read the text and tap the Stop Recording button when finished.
Tap Yes to use the recording.
Move the Speaker icon to the desired location on the page (it should be right below the corresponding text).
5. Remember Bits are Free!
The only limitation to the length of your book is the amount of storage on your device. Feel free to spread it out! Too much content on a single page can be overwhelming for some readers. A better approach is to use white space to present a clean layout with information organized into easy to digest chunks. This may require you to create more pages, but that’s ok – remember bits are free!
One limitation of Book Creator, from an accessibility perspective, is that it removes the closed caption track when it recompresses videos to be included in a book. This means the content in those videos is not accessible to those who are Deaf or hard of hearing (or other readers, such as English Language Learners, who can also benefit from captions). My current workaround is to upload the videos to my YouTube channel and then edit the auto captions created by YouTube so that they are accurate. This is not an ideal solution, as it requires the reader to exit iBooks to view the video in another app (Safari or YouTube), but it is the best workaround I have for now.
At this year’s ISTE conference, I was on a panel focusing on accessible educational materials (AEM). The panel was one of the activities sponsored by the ISTE Inclusive Learning Network, of which I am the Professional Learning Chair. I only had about 10 minutes to share some tips with our attendees so I tried to convey them with an easy to remember mnemonic: SLIDE.
As a follow up to that panel, I created this blog post. I hope you find it helpful and look forward to your feedback.
Note: Document accessibility is a complex topic. This is by no means a comprehensive guide, just a few tips to help educators get started by taking some easy steps toward making their content more accessible.
When it comes to making documents more accessible and useful for all learners, small changes can have a significant impact!
By following these tips, you will ensure all learners can enjoy access to information as a first step toward creating more inclusive learning environments.
Styles are used to reveal the structure of the information
Links are descriptive
Images include alternative text
Design is clear and predictable
Empathy and an ethic of care are a key focus
Properly marked up headings are important for screen reader users, who can use a shortcut to quickly access a list of these headings and navigate to any section in the document (saving valuable time). For other readers, headings reveal the structure of the information and make the document easier to scan.
Select the desired heading text and choose from the styles menu in your authoring tool.
Avoid simply choosing formatting options such as making the text bigger and bold: the text will look like a heading but will lack the proper markup.
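Since ePub and web documents are HTML under the hood, the heading list a screen reader builds can be sketched in a few lines of Python. The sample document below is made up for illustration; note that the styled-but-unmarked paragraph never appears in the outline:

```python
# Sketch: the navigable heading outline a screen reader builds from
# properly marked up headings (h1-h6). Text that merely *looks* like a
# heading is invisible in this view.
from html.parser import HTMLParser

class HeadingLister(HTMLParser):
    def __init__(self):
        super().__init__()
        self.headings = []     # (level, text) pairs
        self._level = None     # set while inside an <h1>-<h6> element

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self._level = int(tag[1])

    def handle_data(self, data):
        if self._level is not None and data.strip():
            self.headings.append((self._level, data.strip()))

    def handle_endtag(self, tag):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self._level = None

doc = """
<h1>Making Documents Accessible</h1>
<p><b>Text styled to merely look big is skipped.</b></p>
<h2>Use Styles</h2>
<h2>Use Descriptive Links</h2>
"""
parser = HeadingLister()
parser.feed(doc)
for level, text in parser.headings:
    print("  " * (level - 1) + text)
```

A screen reader user can jump straight to any entry in this outline, which is exactly what is lost when a "heading" is only bold, enlarged body text.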
As with headings, screen reader users will often use a shortcut to bring up a list of the links in a document. Links need to be descriptive in order for them to make sense when they are accessed in this way, without the context of the surrounding text on the page.
Select some descriptive text and make that the link (see examples on this document).
Avoid using “click here” and “learn more” as the link text.
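To see why this matters, here is a small Python sketch of the out-of-context link list a screen reader can present. The HTML snippet is made up, and a crude regex stands in for a real HTML parser:

```python
# Sketch: simulate the link list a screen reader user hears, stripped of
# the surrounding sentence. Vague phrases carry no information here.
import re

VAGUE = {"click here", "learn more", "here", "read more"}

def link_texts(html):
    """Pull the visible text out of each <a>...</a> element."""
    return [re.sub(r"<[^>]+>", "", m).strip()
            for m in re.findall(r"<a\b[^>]*>(.*?)</a>", html, re.S)]

doc = ('<p><a href="/tips">Learn more about adding accessibility '
       'descriptions</a> or <a href="/other">click here</a>.</p>')

for text in link_texts(doc):
    if text.lower() in VAGUE:
        print(f"{text!r}  <- meaningless without surrounding context")
    else:
        print(f"{text!r}  <- makes sense on its own")
```

Run against a real page, a check like this quickly surfaces every link that fails the "does it make sense on its own?" test.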
Through good design, you can reduce the amount of effort it takes your readers to process the information in a document, allowing them to focus on the meaning conveyed by the content rather than its presentation.
Some helpful design tips include:
Ensure sufficient contrast between the text and its background.
Use proximity and white space to make relationships clear: items that belong together should be close to each other and separated from other items by sufficient white space.
Use repetition to highlight patterns and build a cohesive whole.
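On the contrast point, WCAG defines a measurable contrast ratio between any two colors, so "sufficient contrast" need not be a judgment call. This Python sketch implements that formula; the off-white sample color is just an illustration:

```python
# Sketch: the WCAG 2.x contrast-ratio formula. Common thresholds are
# 4.5:1 for normal text and 3:1 for large text.

def relative_luminance(rgb):
    """Relative luminance of an sRGB color given as 0-255 integers."""
    def channel(c):
        c = c / 255
        # Linearize the sRGB-encoded channel value.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Ratio of the lighter luminance to the darker, offset by 0.05."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)),
                    reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black text on white is the maximum possible contrast (21:1); black on
# a softer off-white still passes the 4.5:1 threshold comfortably.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
print(round(contrast_ratio((0, 0, 0), (250, 246, 228)), 1))
```

This is also a handy way to confirm that an off-white background chosen to reduce visual stress has not dropped the text below the readability threshold.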
Even more important than implementing these tips is changing your approach to design so that it reflects an ethic of care. Remember that not everyone reading your content can see, hear or process information as well as you. As you approach your work, try to think about the diversity in your potential audience. Doing so will allow your content to reach more readers and have a greater impact!
According to the U.S. Census:
1 in 5 Americans Reports Having a Disability
For Americans over 65, that figure is 40%
Accessible content will not only benefit other people. As you age, your ability to see, hear and process content may be affected. When you create accessible content, you are also designing for your “future self.”
Global Accessibility Awareness Day (GAAD) is a very special day for me. Without the many advances in digital access, there is so much that I would not have been able to accomplish: getting a doctorate; writing a book; doing my work as an inclusive learning consultant (which involves travel, accessing the Web for research, creating presentations and more); being an advocate through blog posts like these, my YouTube videos and ebooks; the list is long.
I’m far from being an accessibility expert, but I try my best to continue learning and doing what I can to make things more accessible, not only for other people but ultimately for myself when the day comes that I have lost all of my eyesight. And that’s the point of GAAD to me. You don’t have to be perfect, you just have to take the first step!
I wanted to create this blog post as a one-stop shop for the resources I have created for GAAD:
Along with these resources, I had the pleasure of moderating the #ATChat discussion on accessibility in education on the eve of GAAD. A transcript of our discussion is available on Storify. A big thank you to Karen Janowski and Mike Marotta for allowing me to do that.
As you can see, there are many ways you can contribute to the conversation and the work that is ongoing to make the world a better, more accessible place for all learners. The key is to take the first step. As I did during our ATChat, I want to leave you with the following challenge: what is one small thing you can do or try today to improve accessibility where you work?
On the occasion of Global Accessibility Awareness Day (GAAD) this week (May 19th), I created this post to highlight some of the iOS accessibility features that can benefit a wide range of diverse learners, not just those who have been labeled as having a disability.
It’s built in.
Every iOS device comes with a standard set of accessibility features that are ready to use as soon as you take the device out of the box. Let’s take a look at a few of these features that can benefit all users in the spirit of Universal Design.
Get started by going to Settings > General > Accessibility!
#1: Closed Captions
Closed captions were originally developed for those with hearing difficulties, but they can help you if you speak English as a second language or just need them as a support for improved processing. Captions can also help if your speakers are not working, or the sound in the video is of poor quality.
In one UK study, 80% of caption users did not have any kind of hearing loss.
#2: Text to Speech
All iOS devices support built-in text to speech with the option to turn on word highlighting. Starting with iOS 8, it is possible to use the more natural Alex voice formerly available only on the Mac. TTS supports decoding, which frees you, the reader, to focus on the meaning of the text.
Breathe!: Alex takes a breath every once in a while to simulate the way we speak!
Bonus tip!: Don’t want to make a selection first? No problem. Just bring up Siri and say “Speak Screen.” This will read everything on the screen!
#3: Safari Reader
Safari’s Reader is not really an accessibility feature (you will not find it in Settings) but it can help you if you find that you get distracted by all the ads when you are reading or doing research online. It is also a nice complement to the Speech features mentioned above. With iOS 9, you can now customize the appearance of the text and even change the background and font to make it easier to read when you surf the Web.
Left my heart in…San Francisco is a new system font available in iOS 9. It is designed to be easier to read, and is one of the font options available for Reader.
#4: Dictation
Whenever you see the iOS keyboard, you can tap the microphone icon to the left of the space bar to start entering text using just your voice. This can help you get your words down on the page (or is it the screen?) more efficiently.
Try It!: Dictation can handle complex words. Try this: Supercalifragilisticexpialidocious.
#5: Word Prediction
QuickType is Apple’s name for the word prediction feature now built into the iOS keyboard. Word prediction can help you if you struggle with spelling, and it can speed up your text entry as well. Starting with iOS 8, it is also possible to customize the built-in keyboard by installing a 3rd party app. These 3rd party keyboards add improved word prediction, themes for changing the appearance of the keys and more.
Struggling to see the screen? Make sure to check out the Vision section in the Accessibility Settings. You can use Zoom to magnify what is shown on the screen, Invert Colors to enable a high contrast mode, make the text larger with Dynamic Text, and much more.
As I finish out this series of tutorials on the 4th Generation Apple TV, I want to focus on the options for customizing the playback of media and the appearance of the interface. As shown in the video, the captions and subtitles feature makes great use of the Siri remote capabilities on the 4th Generation Apple TV: you can either enable the captions for the rest of the program (“Turn on closed captions”), or you can enable them for a short time if you have missed something (just say “What did he/she say?” and after the video rewinds a few seconds the captions will come on for a short time to help you catch what you missed).
Of course this feature will only work if the content creator(s) have made the captions available. Ted Talks is one channel that does, making the great presentations on their channel accessible to a wider audience (yay for Ted Talks!).
Apple TV also supports audio descriptions. Audio descriptions provide a description of the action in a video for those who are unable to see. The audio descriptions can be enabled in the same Media section of the Accessibility Settings where Captions and Subtitles are found. As with captions, these audio descriptions will only work if the content creator has made them available.
In addition to the ways in which viewers can customize media playback, the Apple TV's tvOS includes a number of options for customizing the interface: bold text, reduce motion, reduce transparency and focus style, which adds an outline around the currently selected item.
Zoom on the Apple TV provides up to 15X magnification for those who have low vision, but it can benefit anyone who has difficulty seeing the Apple TV interface on their TV. This accessibility feature should be familiar to low vision users of other Apple products. It has been available for some time on the Mac and on iOS devices, and it is also supported on the Apple Watch. With the release of the 4th Generation Apple TV, every Apple product that supports a display now also supports magnification for low vision users.
This video provides an overview of the Zoom accessibility feature. You will learn how to enable/disable Zoom in Settings, how to add Zoom to the Accessibility Shortcut for quick access, and some of the gestures supported by Zoom:
a light tap near any edge on the Siri remote will move the zoomed in area by one screen
dragging on the touch area of the Siri remote will allow you to pan in any direction (a two finger tap will stop/resume panning).
double-tapping and holding with two fingers, then dragging up/down without letting go will allow you to adjust the zoom level.
A nice feature built into Zoom is that you can double-tap the Siri remote at any time to hear the currently selected item read aloud. This works even if you are not currently zoomed in (Zoom just has to be enabled).
VoiceOver was already available on older Apple TV models, but the touchpad on the new Siri remote allows it to be an even more robust accessibility solution on the new 4th generation model. This video provides an overview of the various gestures VoiceOver supports on the new Apple TV, including the Rotor gesture that can be used to change VoiceOver settings such as the speech rate.
I was already happy with my third-generation Apple TV, but when I read that Apple was expanding the support for accessibility in the fourth generation model I knew I was going to pre-order the device as soon as it became available. Today, my 4th-generation Apple TV finally arrived, and it does not disappoint with regard to its accessibility. This post is not an in-depth review of the new Apple TV (there are plenty of those online already including a really nice one from iMore), but rather my first impressions of the set top box as someone with a visual impairment and a personal interest in accessibility. I will also just focus on the built-in features of the new Apple TV, rather than the apps that can now be installed on the device (that will make for a separate post as I explore the App Store further in the next few weeks and even more apps become available).
Nicely rounds out the support for accessibility across the Apple ecosystem by expanding on the support for VoiceOver in the previous model, adding Zoom and providing many of the same options for customizing the interface that are available on other Apple devices.
Major accessibility features such as VoiceOver and Zoom are responsive and perform well, with little lag.
The interface is cleaner and works better across the room: for example, it is now much easier for me to tell when an item has focus, something I struggle with on my third-generation Apple TV (especially on my smaller TV).
Other than the new Siri remote, there are currently no other options for controlling the new Apple TV, which does have an impact on accessibility for some users. I hope this situation is addressed soon through a software update.
Setup and Interface
Setup for the new Apple TV couldn’t be easier. Once you have your power and HDMI cables connected and your new device has come on, you can triple-click the Menu button to turn on VoiceOver so that it can guide you through the rest of the setup. After you have selected your language and country/region, a brand new feature even allows you to place your iPhone (running iOS 9.1 or later) near the Apple TV to provide it with your network and Apple ID information.
The rest of the setup goes as expected, with selections for enabling location services and Siri, sending diagnostics data to Apple and developers, agreeing to the terms of service no one reads and so on.
Once the setup is complete, you will notice that the new interface is much brighter than the old one, with light gray backgrounds rather than black throughout.
Some people have complained about this, and I can see where it can be a problem if you have an Apple TV in your bedroom and want to use it while the other person (roommate or significant other) is trying to sleep. It would be nice to have the option of a dark theme like Invert Colors on iOS devices for those who prefer it.
Overall, I found the interface to be much easier for me to use. The item that has the focus pops out a bit, a more pronounced focus indicator than in the older interface. Whether on the apps grid or in the menus, I found this change made it easier for me to quickly know which item I had selected. The interface supports greater customization than any previous Apple TV, thanks to an entire section labeled Interface in the Settings.
When you go into Settings, the first thing you will notice is that the Accessibility options are now near the top of the General pane. In fact, they are one of the first things you see, right after the options for the screen reader. On the previous Apple TV model, you had to scroll quite a bit to locate Accessibility toward the bottom of the General pane.
Of course, you can still use the Accessibility shortcut to quickly enable and disable accessibility features without going into Settings. Whereas on the old Apple TV you invoked this Accessibility Shortcut (it was actually called the Accessibility Menu) by pressing and holding the Menu button, on the new one you do it by triple-clicking that same button (much like you triple-click the Home button on iOS devices to do the same thing). A nice touch is that VoiceOver will read the options shown by the Accessibility Shortcut even if you have it disabled in Settings.
In addition to the Accessibility Shortcut, the new Interface section of the Accessibility pane includes a number of options for customizing the appearance of the display (similar to options already found on iOS and the Apple Watch), including:
Bold Text: a simple toggle that provides more weight to the text labels. Enabling this feature will require a quick restart just as it does on other Apple devices.
Increase Contrast: there are two options. The first reduces the transparency, while the second one changes the focus style by adding a thick outline around the currently selected item.
Reduce Motion: another toggle that removes animations throughout the interface for those who are sensitive to the extra motion.
Along with adjusting the appearance of the interface, the new Apple TV has retained the options for customizing closed captions that were available before. These are found in the Media section of the Accessibility pane, where you can also enable audio descriptions for programs that include them. In addition to turning on the captions, you can still customize the style by selecting Large Text and Classic options or creating your own style with many options for both the text and the background.
Updated 11/5/15: Siri is one of the major selling points of the new Apple TV, and I’ve finally had a chance to play around with it as I have started to interact with content on the device. Apple TV’s Siri allows you to do a number of things using speech: search for movies (“show me movies with Penelope Cruz”), refine your search (“only her dramas from 2012”), navigate (“open Photos” or my favorite – “home screen”), and control playback (“pause this,” “skip forward 30 seconds,” etc.). From an accessibility perspective, it allows you to enable/disable VoiceOver and to say “Turn on closed captions” while you are watching content, and if you miss something you can just say “What did he/she say?” and the playback will rewind 15 seconds and temporarily turn on the captions. I love this feature because it highlights the usefulness of captions not just as an accessibility feature but as an example of design that benefits everyone (universal design). My only concern with Siri is that you have to hold down the button the entire time you are speaking your request. That could be an issue for some people with motor difficulties, especially as you start to use Siri all the time. I am hoping that eventually there is an always-on option like “Hey Siri” on the iPhone.
VoiceOver and Zoom
These two features in the Vision section of the Accessibility pane are the biggest changes to the accessibility of the Apple TV in the new model. Zoom is brand new, and supports magnification up to 15X (the default is 5X). Once Zoom is enabled, you will zoom in and out by triple-clicking the touchpad on the new remote. While you are zoomed in, you can interact with Zoom in a variety of ways:
drag one finger over the touchpad to pan in any direction. As you pan, an overlay will give you an idea of what area of the interface you are zoomed in on (very similar to the indicator you get with Apple Watch when you use the Digital Crown to zoom by row).
stop panning by tapping the touchpad with two fingers. At that point, you will be able to use the usual flicking gestures to move from one item to the next without panning, but you can resume panning at any time with a second two-finger tap on the touchpad.
adjust the zoom level by double-tapping with two fingers, holding, and then swiping up or down with the two fingers without letting go. The maximum amount you can zoom will be determined by the value selected in Settings.
Update 11/5/15: In a previous version of this post I noted that I could get the labels read aloud each time I double-clicked the Siri button. The next day, I could not get my Apple TV to do it again and couldn’t figure out why. It turns out that this is a feature of Zoom. If Zoom is enabled, you can double-click the Siri button to hear an item read aloud.
VoiceOver was already available on the older model, but the touchpad allows it to be an even more robust solution on the new one. If you have used VoiceOver on an iOS device (or on a Mac laptop) you will already be somewhat familiar with how to interact with VoiceOver on the new Apple TV. However, if you do need some help, just know that you now have a VoiceOver Practice that is only shown when you have VoiceOver turned on (sound familiar, iOS users?).
VoiceOver supports the following gestures on the new Apple TV remote (all gestures are performed on the touchpad area of the new remote):
Move your finger around on the touchpad: move the focus to have VoiceOver speak the currently selected item aloud.
Flick in any direction with one finger: move the focus in a given direction.
Click on the touchpad: make a selection.
Flick down with two fingers: read from the current location to the bottom.
Flick up with two fingers: start reading from the top of the screen.
Two-finger tap: pause/resume speaking.
Again, these gestures should be familiar if you have used an iOS device or a Mac laptop with the Trackpad Commander turned on. Speaking of the Trackpad Commander, the rotor is also supported and, you guessed it, you turn the virtual dial clockwise or counter-clockwise with two fingers to select a rotor option and then flick up or down with one finger to adjust its value.
The rotor can be used to adjust the speech rate with finer control (as opposed to the option in Settings, which only lets you choose from a few preset values such as “Very Slow” or “Very Fast”). It also allows you to hear items read by character or word, to enable or disable Direct Touch (where, instead of flicking to navigate in a linear way, you can simply move your finger on the touchpad to move around the interface more freely), and more (I’m still trying to figure out a few of the options, such as Navigate and Explore).
You can use Siri to turn on VoiceOver (just say “Turn VoiceOver on”), but for some reason you can’t do the same for Zoom and other settings. When I tried, all Siri did was open Settings; it didn’t take me to Zoom or turn on the feature as requested.
Overall, from my limited exposure in the few hours since it arrived at my home, I like the new Apple TV. I like the updated interface, which is more cleanly laid out and designed for better visibility from across the room. From an accessibility perspective, I think Apple TV is the best game in town. None of the other set-top boxes I have tried have the accessibility support Apple TV had even before the new model came out.
The new model ups the ante with more options for customizing the appearance of the interface, the addition of Zoom for those who have low vision, and an enhanced VoiceOver that is more than ready for use with apps (though how well that works will depend as always on how well developers incorporate accessibility support in their apps). Performance is a lot better too. I almost forgot just how much time I spent waiting on my older Apple TV until I switched back to compare some of the features. The new model is a lot more responsive and just performs better all around.
Having said all that, whether I end up liking this Apple TV as much as I have the previous model will depend on what happens in the next few weeks and months as updates to tvOS are released. As good as the accessibility features and performance of this new version are, there are still a number of issues that need to be addressed:
No Podcasts app: The company that practically brought us the podcast has launched a set-top box without a dedicated podcast app (and as I write this, there are no Apple TV versions of Downcast or Overcast in the App Store). Aside from renting movies, podcasts are what I consume the most on Apple TV. I can set them to play in the background while I do other things around the house, and I have a number of favorites I listen to on a regular basis. I’m hoping Apple is just taking a little more time to make sure the podcast app is done right when it is finally released.
No Remote app support: The current Remote app for iOS is not compatible with the new Apple TV. This means someone with a motor difficulty cannot use Switch Control on an iOS device to navigate the Apple TV interface through the Remote app. While the built-in accessibility features of the new Apple TV do an excellent job of accommodating the needs of those with vision and hearing difficulties, it is important to address this omission to make sure switch users can enjoy the Apple TV along with the rest of us.
No support for external Bluetooth keyboards: Probably my biggest annoyance was having to go back to typing user names and passwords with the onscreen keyboard. I have always used either the Remote app for iOS or an external keyboard connected over Bluetooth for this purpose, but neither option is possible at launch. Especially when entering complicated passwords, typing on an external keyboard is much faster and easier.
The remote: I generally like the new remote. It is lightweight and feels good in the hand. My issue is that there is a good likelihood I will lose the thing, and it will cost me $79 to replace it (the previous remote was only $19, for comparison). I’m thinking I may buy a $25 Tile and find a way to attach it to the remote, just in case. I’m surprised Apple did not build in the same Ping feature that is available between the Apple Watch and the iPhone, which lets you quickly find a misplaced device by having it emit a loud ping sound. For now, Tile may be my best bet ($25 is much better than $79). In the meantime, I have set up my existing TV remote to work with the Apple TV.
Recently I had the pleasure of meeting Logan Prickett, a second year student at Auburn University at Montgomery. Logan is an academically gifted STEM student and the inspiration behind The Logan Project at AUM, an initiative to develop software that will enable students who are blind or who have low vision to fully participate in all college-level math courses.
At age 13, Logan suffered an anaphylactic reaction to the contrast dye in an MRI. His heart stopped beating on its own which left him without oxygen for 45 minutes. Logan believes that “a prayer chain that reached around the world was active during those 45 minutes and I credit God and those prayers for the heartbeat that brought me back to life.”
His time without oxygen left Logan blind, a wheelchair user with fine motor control difficulties, and unable to speak above a whisper due to damage to his vocal cords that occurred during life-saving measures. Logan has the cognitive ability to do the work in his courses; he just needs a few technology supports in place to ensure his vision and motor challenges do not get in the way and prevent him from tapping his full potential. The goal of the Logan Project is thus to eliminate barriers for students with complex needs like Logan’s so that they can not only complete required math coursework but also pursue a career in a STEM field if they desire. This is a worthy goal given the underrepresentation of people with disabilities in STEM fields. You can learn more about it by typing The Logan Project into the search bar on the AUM website (aum.edu).
The Goal: Independent Communication
When I met with Logan and his team, the expressed goal was to get Logan started on the journey to independent communication, beginning with the ability to send and receive short messages with his family and those close to him. Logan had just acquired an iPhone 6 Plus, and we considered the use of Switch Control since Logan has enough motor control to press a switch. To accommodate his visual impairment, we decided that Logan would use Switch Control with manual scanning and speech support turned on. This way he can hear the items on the screen as he presses the switches to scan through them at a pace that works for him. The one problem with this setup is the possibility of fatigue from repeated switch presses. Siri seemed like a way around this issue, but unfortunately Siri is not able to recognize Logan’s low whisper to allow him to quickly send a text message or initiate a FaceTime call. Surprisingly, FaceTime picks up Logan’s whisper well enough that it can be understood on the other end of the call. Although he can be heard on an ordinary phone call as well, the audio on a FaceTime call is much better. Thus, if we could find a way to activate FaceTime with a minimum of effort, we would go a long way toward giving Logan an option for communication while he develops his Switch Control skills. That’s where the Workflow app comes in.
Workflow to the Rescue
I knew about the Workflow app because it made history as the first app to receive an Apple Design Award for its attention to accessibility. In fact, at the Design Awards, members of Apple’s engineering team who are blind were the ones who actually demoed the app to show how well it works with the VoiceOver screen reader built into Apple’s mobile devices. You can watch the demo on Apple’s WWDC 2015 site (the Workflow demo starts at the 35-minute mark and runs through the 42-minute mark).
As the name suggests, Workflow is a utility for creating workflows that allow the user to chain together a series of actions to complete a given task. For example, as I often do tutorials with screenshots from my Apple Watch, I have created a workflow that automatically takes the latest Apple Watch screenshot saved to my Camera Roll on the iPhone and shares it to my computer using Airdrop so that I can quickly add it to a blog post or a presentation. This kind of workflow can save a lot of time and effort for tasks that you perform several times over the course of a day.
Workflow already includes many actions for built-in iOS apps such as Contacts, FaceTime and Messages. These actions can be chained together to create a workflow, with the output from one action used as the input for the next one in the chain. Thus, a workflow can consist of selecting an entry in the Contacts app and feeding its information into the FaceTime app to start a new call with the selected contact. In much the same way, the entry from the Contacts app can be combined with a Text action to start Messages, pre-fill the message body and automatically address the message. For Logan this kind of workflow would reduce the amount of work he would have to perform and allow him to send quick messages to his team, such as “I’m ready for pick up” or “class is running late.” There is even the possibility of sharing his location so that other team members can get an idea of where Logan is at different points in the day.
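For readers who think in code, the chaining idea Workflow is built on can be sketched as a simple data-flow pattern: each action is a function from an input to an output, and chaining feeds one action’s output into the next. This is only a conceptual illustration in Swift; the types and names here are hypothetical and are not Workflow’s actual API.

```swift
// A hypothetical model of a Workflow-style action: a transformation
// from some input type to some output type.
struct Action<Input, Output> {
    let run: (Input) -> Output
}

// Chain two actions so the first action's output becomes the
// second action's input -- the core idea behind a workflow.
func chain<A, B, C>(_ first: Action<A, B>, _ second: Action<B, C>) -> Action<A, C> {
    Action { input in second.run(first.run(input)) }
}

// Illustrative example: select a contact, then compose a pre-filled
// quick message addressed to that contact (sample data only).
struct Contact {
    let name: String
    let phone: String
}

let selectContact = Action<Void, Contact> { _ in
    Contact(name: "Team Lead", phone: "555-0100")
}

let composeMessage = Action<Contact, String> { contact in
    "To \(contact.phone): I'm ready for pick up"
}

let quickMessage = chain(selectContact, composeMessage)
print(quickMessage.run(()))  // To 555-0100: I'm ready for pick up
```

In the real app, of course, the user assembles these steps visually rather than in code, but the underlying model is the same: each block in a workflow consumes the output of the block above it.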
Once a workflow has been created it is possible to add it as a shortcut on the Home Screen, with its own descriptive name, icon and color. By organizing these shortcuts on the Home Screen it is possible to create a simple communication system for Logan, giving him the ability to use Switch Control to independently start FaceTime calls, send quick messages and more.
The ultimate goal is to develop Logan’s ability to communicate independently, and this will require building up his skills as a new switch user. With time and practice, I have no doubt after getting to know Logan that he will become a proficient user of Switch Control. In the meantime, Workflow is a good option for building his confidence and giving him some good reasons to use those skills: communicating with those who are important to him with a minimum of effort. When he is ready, he could then add an augmentative and alternative communication (AAC) app such as Proloquo4Text to his arsenal of communication tools, as well as keyboards such as Keeble and Phraseboard that make it easier for switch users to enter text. Logan has demonstrated that he has the ability to do well in higher education; now we just have to figure out how to eliminate a few barriers that are standing in his way and preventing him from letting his ability shine.