Designed for (fill in the blank)

On the occasion of Global Accessibility Awareness Day (GAAD), Apple has created a series of videos highlighting the many ways its iOS devices empower individuals with disabilities to accomplish a variety of goals, from parenting to releasing a new album for a rock band. Each of the videos ends with the tagline “Designed for” followed by the name of the person starring in the video, closing with “Designed for Everyone.” In this brief post, I want to highlight some of the ways in which this is in fact true. Beyond the more specialized features highlighted in the videos (a speech generating app, the VoiceOver screen reader, Made for iPhone hearing aids and Switch Control), there are many other Apple accessibility features that can help everyone, not just people with disabilities:

  • Invert Colors: found under Accessibility > Display Accommodations, this feature was originally intended for people with low vision who need a higher contrast display. However, the higher contrast Invert Colors provides can be helpful in a variety of other situations. One that comes to mind is trying to read on a touch screen while outdoors in bright lighting. The increased contrast provided by Invert Colors can make the text stand out more from the washed out display in that kind of scenario.
  • Zoom: this is another feature that was originally designed for people with low vision, but it can also be a great tool for teaching. You can use Zoom not only to make the content easier to read for the person “in the last row” in any kind of large space, but also to highlight important information. I often will Zoom In (see what I did there, it’s the title of one of my books) on a specific app or control while delivering technology instruction live or on a video tutorial or webinar. Another use is for hide and reveal activities, where you first zoom into the prompt, give students some “thinking time” and then slide to reveal the part of the screen with the answer.
  • Magnifier: need to read the microscopic serial number on a new device, or the expiration date on that medicine you bought years ago and are not sure is still safe to take? No problem, Magnifier (new in iOS 10) to the rescue. A triple-click of the Home button will bring up an interface familiar to anyone who has taken a photo on an iOS device. Using the full resolution of the camera, you can not only zoom into the desired text, but also apply a color filter and even freeze the image for a better look.
  • Closed Captions: although originally developed to support the Deaf and hard of hearing communities, closed captions are probably the best example of universal design on iOS. Closed captions can also help individuals who speak English as a second language, as well as those who are learning how to read (by providing the reinforcement of hearing as well as seeing the words for true multimodal learning). They can also help make the information accessible in any kind of loud environment (a busy lobby, airport, bar or restaurant) where consuming the content has to be done without the benefit of the audio. Finally, closed captions can help when the audio quality is low due to the age of the film, or when the speaker has a thick accent. On Apple TV, there is an option to automatically rewind the video a few seconds and temporarily turn on the closed captions for the audio you just missed. Just say “what did he/she say?” into the Apple TV remote.
  • Speak Screen: this feature, found under Accessibility > Speech, is meant to help people with vision or reading difficulties, but the convenience it provides can help in any situation where looking at the screen is not possible – one good example is while driving. You can open up a news article in your favorite app that supports Speak Screen while at a stop light, then perform the special gesture (a two finger swipe from the top of the screen) to hear that story read aloud while you drive. At the next stop light, you can perform the gesture again and in this way catch up with all the news while on your way to work! On the Mac, you can even save the output from the text to speech feature as an audio file. One way you could use this audio is to record instructions for any activity that requires you to perform steps in sequence – your own coach in your pocket, if you will!
  • AssistiveTouch: you don’t need to have a motor difficulty to use AssistiveTouch. Just having your device locked into a protective case can pose a problem this feature can solve. With AssistiveTouch, you can bring up onscreen options for buttons that are difficult to reach due to the design of the case or stand. With a case I use for video capture (the iOgrapher), AssistiveTouch is actually required by design. To ensure light doesn’t leak into the lens, the designers of this great case covered up the sleep/wake button. The only way to lock the iPad screen after you are done filming is to select the “lock screen” option in AssistiveTouch. Finally, AssistiveTouch can be helpful on older phones with a failing Home button.

While all of these features live in the Accessibility area of Settings, they are really “designed for everyone.” Sometimes the problem is not your own physical or cognitive limitations, but constraints imposed by the environment or the situation in which the technology use takes place.

How about you? Are there any other ways you are using the accessibility features to make your life easier even if you don’t have a disability?

3 Ways the iPhone Has Disrupted My Life for the Better

The 10th anniversary of the iPhone announcement in 2007 was mentioned on a number of podcasts I listen to this past week, and this got me into a reflective mood. I can remember vividly where I was when the announcement took place. At the time I was a graduate student at the University of South Florida, and I watched the announcement on the big screen in the iTeach Lounge where I worked as a graduate assistant.

I must admit that at first I was a bit skeptical. The first version of the iPhone was pretty expensive, and it took me a year after the launch to decide that I wanted to get in on the fun. If I remember correctly, it cost me $399 for 8GB of storage when I bought my first iPhone from Cingular Wireless (remember them?). As cool as that first iPhone was, it took two important developments to make me a true believer. The first one was the release of the App Store in 2008, which opened up a world of possibilities limited only by developers’ imaginations. The second was the accessibility support announced with the release of the iPhone 3GS. After my first iPhone contract with Cingular was up, I actually returned to a traditional flip phone for a little while for my next phone. Once the accessibility support was announced, though, I was locked in. I have been an iPhone owner ever since.

In addition to the App Store and the built-in accessibility support, there are three other important ways in which the iPhone has disrupted my life in significant ways that go beyond just being able to have access to information and communication on the go.

A Better Set of Eyes

The iPhone couldn’t have come at a better time for me. At the time, my vision loss was getting to the point where using a traditional DSLR camera was becoming harder and harder. As I detailed in an article for the National Federation of the Blind’s Future Reflections magazine, the built-in accessibility features of the iPhone have allowed me to continue with my passion for capturing the beauty in the world around me. The way I see it, the iPhone is now “a better set of eyes” for me. Most of the time, I can’t be sure that I have actually captured a decent image when I aim the phone at a scene. It is not until later, when I am reviewing the images more carefully at home, that I notice small details I didn’t even know were in the frame. You can see some examples of my photography on my Instagram page.

Instagram collage showing best nine images of 2016.

Going forward, this idea of the iPhone as my “better set of eyes” is going to be important to me beyond photography. As my vision loss progresses, I will be able to rely on the iPhone’s ever improving camera to recognize currency, capture and read aloud the text in menus, business cards and more, and tell me if my clothes are exactly the color I intended. I have no doubt that “computer vision” will continue to get better, and this gives me hope for the future. Already, the VoiceOver screen reader can recognize some objects in your images and describe them aloud. This technology was developed to make searching through large image libraries more efficient, but it will be helpful to people with visual impairments like me as well.

Independence at the Touch of a Button

The second major way the iPhone has disrupted my life for the better is by giving me back my independence in a big way, through apps such as Uber and Lyft. Now, I know you can use these apps on other smartphones, so they are not exclusive to the iPhone. However, when you really think about it, no iPhone means no App Store. No App Store means there is no incentive for other companies to copy what Apple did.

Uber has replaced the many frustrations I had with public transportation and taxis (lateness, high fares) with a much more convenient and less expensive solution. Yes, I know some of my blind friends have had a number of issues with Uber (such as outright discrimination from drivers who are not comfortable with a guide dog in their vehicles), but this would probably happen with taxicabs too.

My own experience with Uber has been mostly positive, and the service allows me to easily get to doctor’s appointments, and provides me with a reliable way to get to the airport so that I can do my work of spreading the message of accessibility and inclusive design for education to a broader audience beyond my local area. Uber and Lyft, and the iPhone as the platform that made them possible, have really opened up the world to me.

Can You Hear Me Now?

One of the big trends at the Consumer Electronics Show (CES) this year was the presence of Alexa, Amazon’s voice assistant, on all kinds of consumer appliances. Alexa joins Apple’s Siri, Microsoft’s Cortana and Google’s Assistant in heralding a future where voice and speech recognition replace the mouse and the touch screen as the primary input methods for our computing devices. We are not quite there yet, but the accuracy of these services will continue to improve and I am already seeing the potential with some of the home automation functions that are possible with the existing implementations (having my lights be automatically turned on when I arrive at home, for example).

Here, again, the iPhone deserves quite a bit of credit. The release of Siri as part of the iPhone 4S in 2011 brought the idea of speech recognition and voice control to the mainstream. Previously, its use was limited mostly to individuals with motor difficulties or niche markets like the medical and legal transcription fields. Siri helped popularize this method of voice interaction and made it more user friendly (remember when you had to sit for several minutes training speech recognition software to recognize just your voice?).

Looking Ahead

The smartphone is a mature technology, and some have questioned whether it has reached its apex and will soon give way to other options for content consumption and communication. One possibility would involve virtual, augmented or even mixed reality. Given the visual nature of AR and VR, this gives me some cause for concern, just as I had concerns at the release of the iPhone back in 2007. However, just like Apple took a slab of glass and made it accessible when few people thought it could, with some creativity we can make AR and VR accessible too.

We have come a long way in just 10 years (sometimes I find it hard to remember that it has only been that long). In that time, Apple has shown that “inclusion promotes innovation.” Accessible touch screens, voice controlled assistants and ride sharing services are just a few of the innovations that have developed within an accessible ecosystem that started with the iPhone. Thank you Apple, and congrats on the 10th anniversary of the iPhone. Here’s to the next 10, 20 or 30 years of innovation and inclusion.


Read to Me in Book Creator 5

Read to Me Screen in Book Creator

Book Creator for iPad recently added a new Read to Me text to speech feature that allows learners to hear their books read aloud within the app (without having to transfer the book to iBooks first). The feature also supports accessibility in two other ways:

  • all embedded media can be played automatically. This is great for those with motor difficulties, and it also creates a better flow during reading (no need to stop and start the text to speech to hear the embedded media).
  • automatic page flips: again this is a great feature for those who can’t perform the page flip gesture to turn the pages in a book.

These options can be configured through a Settings pane where it is possible to change the voice (you can choose any voice available for Speech in iOS), slow it down, or remove the word by word highlighting that goes along with it. For better focus, it is also possible to show one page at a time by unchecking the “Side by side pages” option under Display.

I created a short video to show how the new feature works (with a bonus at the end: how to fix the pronunciations with the new pronunciation editor built into the Speech feature in iOS 10).


4 New Accessibility Features of iOS 10 You Should Know

Apple today released iOS 10, the latest version of its operating system for mobile devices such as the iPad and iPhone. This post is a quick review of some of the most significant enhancements to the accessibility support in iOS 10, starting with a brand new feature called Magnifier.

Magnifier

With Magnifier, users who have low vision can use the great camera on their devices to enlarge the text in menus, pill bottles, and other items where they might need a little support for their vision to read the content. Magnifier is found alongside Zoom (which enlarges onscreen content) in the Accessibility Settings. Once it is enabled, you can activate the Magnifier by triple-clicking the Home button.

While a number of existing apps such as Vision Assist and Better Vision provide similar functionality, having this functionality built into the OS should improve performance (through faster focusing, better clarity made possible by accessing the camera’s full native resolution, etc.). Magnifier has the following options:

  • a slider for adjusting the zoom level (or you can pinch in and out on the screen)
  • a shutter button that freezes the image for closer inspection – you can then pinch to zoom in on the captured image and drag on the screen with one finger to inspect a different part of it
  • a button for activating the device flash (on devices that have one) in torch mode  so that you get a bit more light in a dark environment
  • a button for locking the focus at a given focal length
  • a button for accessing a variety of filters or overlays

The available filters include: white/blue, yellow/blue, grayscale, yellow/black, and red/black. For each of these, you can press the Invert button to reverse the colors, and you can do this while in the live view or with a frozen image. Each filter also provides a set of sliders for adjusting the brightness and contrast as needed.

Display Accommodations

Display Accommodations is a new section in the Accessibility Settings that brings together a few existing display options (Invert Colors, Grayscale, Reduce White Point) with a couple of new ones (Color Tint and options for three different types of color blindness).

Color filters pane has options for Grayscale and color blindness filters.

For those who have Irlen Syndrome (Visual Stress) there is a new option in iOS for adding a color tint over the entire display. Once you choose this option, you will be able to use a slider to specify the intensity and hue of the filter.

Color Filters with sliders for intensity and hue

Speech Enhancements

In addition to word by word highlighting, the text to speech options in iOS 10 (Speak Selection and Speak Screen) will now provide sentence by sentence highlighting as well. By choosing Highlight Content in the Speech Settings you can configure how the highlighting takes place: you can have only the words highlighted, only the sentences, or both, and you can choose whether the sentence highlight will be an underline or a background color (though you still can’t choose your own color).

A new Typing Feedback setting can help you if you find you are often entering the wrong text. You can choose to hear the last character or word you typed (or both). For the character feedback, you can specify a delay after which the character will be spoken and even whether a hint (“t, tango”) is provided. An additional setting allows you to hear the QuickType suggestions read aloud as you hover over them, to make sure you are choosing the right prediction.

The entire Speech system also can take advantage of some additional high quality voices: Allison, Ava, Fred, Susan, Tom and Victoria for U.S. English. Some of the voices (such as Allison) have both a default and an enhanced version, as has been the case with previously introduced voices, and you can preview each voice before downloading it by tapping a play button. An edit button allows you to remove voices you are not using if you are running low on space (you can always download them again).

VoiceOver Pronunciation Editor and New Voices

I’m sure the team at AppleVis will do a complete rundown of VoiceOver in iOS 10, so here I will just highlight one feature that I am really happy about: the new Pronunciation Editor. After all this time, I can finally get VoiceOver to get a little bit closer to the correct pronunciation for my name (the accent in Pérez still throws it off a little).

The Pronunciation Editor is found under VoiceOver > Speech > Pronunciations. Once there, tap the Add (+) button, enter the phrase to be recognized, and then either dictate or spell out the correct pronunciation. You can restrict the new pronunciation to specific Languages, Voices and apps, or choose All for each option for more global availability.

In addition to the pronunciation editor, VoiceOver can take advantage of all the new voices for the Speech system in iOS 10: Allison, Ava, Fred, Susan, Tom and Victoria for U.S. English (each with an enhanced version). Like the Alex voice, you will have to download each of these new voices before you can use them, but you can preview each voice before downloading it.

These are just a few of the new accessibility features in iOS 10. Others include:

  • the ability to auto-select the speaker for a call when you are not holding the iPhone to your ear.
  • an option for routing the audio for VoiceOver: you can hear the speech on one channel and the sound effects in the other when wearing headphones.
  • Switch access for Apple TV which will allow you to navigate the interface of the Apple TV using the Switch Control feature on your iOS device.
  • a new option for Switch Control Recipes that will allow you to create a hold on point action right from the scanner menu. Previously, you could only create a tap action in this way.

And of course, there are other enhancements to the rest of the Apple ecosystem which I will cover in their own blog posts as they become available: Siri for the Mac, Taptic Time for Apple Watch, new activity settings on Apple Watch for wheelchair users, and more.

Finally, there is the new hardware Apple just announced last week, which will soon be shipping. Apple Watch has a faster processor and a better display (helpful for those with low vision), and the iPhone 7 and 7 Plus come with even better cameras (12 MP, with two cameras for 2X zooming on the larger model). As both a visually impaired photographer and as someone who focuses on accessibility features that use the camera (Magnifier, Optical Character Recognition apps to convert print into digital content) this is very exciting.

What are your thoughts on Apple’s recent announcements? Are you upgrading to the new devices? Which features have you most excited?


A small action with a big impact (Recipes for Switch Control)

Sometimes accessibility is about making small changes that bring about a big impact in people’s lives. Take the act of flipping the pages in a book. This is probably an action most of us take for granted. For some people with motor challenges, though, the ability to flip the pages of a book is the difference between being able to enjoy a favorite book and being locked out of that experience.

In the past, the only way to accomplish this action (flipping the page of a book) was through the use of a cumbersome mechanical device. My friend and colleague Christopher Hills illustrates the use of such a device in a short YouTube video.


Description: As dramatic music plays, the video begins with the words “In the not too distant past…” then cuts to Christopher sitting in his powered wheelchair while a relative reads a book next to him. As Christopher looks on, his dad Garry brings in a large, industrial looking device that needs to be wheeled into the room. Garry proceeds to plug in the device and place a book on it. An external switch box has options for the various page flip actions. Christopher flips the pages of the book with this device, which uses a roller to turn the pages each time Christopher presses a head mounted switch that is connected to the external switch box. The video then cuts to “Now…”  and we see the same relative as in the opening scene sitting down at the kitchen table with his iPad, ready to read a book. With an over the shoulder shot, we see the relative turn the pages on his iPad as Christopher performs the same action next to him  by pressing a head mounted switch that is connected to his iPad via Switch Control.

With digital content and assistive technology, the cumbersome, mechanical device shown in Christopher’s video is no longer needed. Devices like the iPad now include built-in switch access (Switch Control) that can be combined with external switches to make flipping the pages of a book a much simpler task. In the embedded video, I demonstrate the use of Recipes to flip the pages of a book I created with the Book Creator app on my iPad. The book is I Am More Powerful Than You Think.


Want to learn more about Recipes and Switch Control? You should check out a free book I co-authored with Christopher Hills – Handsfree: Mastering Switch Control on iOS. This interactive book has more than 20 closed captioned videos that go over every aspect of using Switch Control – from how to connect a switch interface to your iOS device, to how to control your Apple TV with a switch.


Apple TV Remote App: Accessibility Quick Take

A new Apple TV Remote app is now available for download from the App Store. The main difference between this new app and the existing Remote app (which you can still use to control your Apple TV) is the addition of Siri functionality.  With the 4th generation Apple TV, you can press and hold an onscreen Siri button in the app to speak Siri requests on your iOS device that will be understood by your Apple TV. This works just like it does when you press and hold the physical button on the 4th generation Apple TV Siri remote.

Setup

Setup was a pretty simple process. Upon launching the app, it quickly recognized all of the Apple TVs on my Wi-Fi network (I have one of each generation) and showed them as a list. After I tapped on the device I wanted to control, I was prompted to enter a four-digit code shown on the Apple TV (and automatically read aloud by VoiceOver), and that was it: my iPhone was paired to control my Apple TV.

The App Layout

The app has a dark theme, with great contrast, throughout. As someone with low vision, I can say that the options in the app are much easier for me to see than the dimly labeled buttons on the physical Apple TV remote.

Apple TV Remote app layout in standard mode

The screen is divided into two sections: the top two thirds make up a gesture area that simulates the touch pad on the physical remote, while the bottom third includes onscreen options for the buttons. If you can see the screen on your device, right away you will notice the Menu button is much bigger than the other buttons. This is actually a welcome design touch, as the Menu button is one of the most frequently used options for controlling the Apple TV. Below the Menu button, you will find options for Play/Pause, Home, and Siri from left to right.

I tried to test the app with Dynamic Type (large text) enabled. This only made the text in the devices list (which lists all of your Apple TVs) bigger. It would be nice if Dynamic Type worked on the label for the Menu button as well, but with the bigger button and high contrast look, this is just a minor point.

You control the Apple TV by performing touch gestures in the gesture area at the top of the screen. When you come across a text entry field, the onscreen keyboard will come up automatically to let you enter the text (same as on the older Remote app). If you tap Done to dismiss the onscreen keyboard, you can bring it back by tapping the keyboard icon at the top of the screen.

With games, you can tap a game controller icon at the top of the screen to change the layout of the app for game play. With the iPhone in the landscape orientation and the Home button to the right, the left two thirds of the screen will be a gesture area and the right one third will include Select (A) and Play/Pause (X) buttons – surprisingly, these are not labeled for VoiceOver. Tapping Close in the upper right corner will exit out of the game controller mode to the standard layout.

Apple TV Remote app layout in game controller mode

From the one game I tried with the app, Crossy Road, I don’t think it will be a good replacement for a dedicated controller. There was just too much lag, probably due to the Wi-Fi connection the app uses to communicate with the Apple TV. It may work with some games where timing is not as crucial, but definitely not Crossy Road.

Zoom

Zoom will work just like it does when using the physical remote: a quick triple tap on the gesture area will zoom in and out. The one issue is that the gesture area on the app does not accept two finger gestures. As a result, you will not be able to:

  • turn panning on/off: this requires a two finger tap.
  • change the zoom level: this requires you to double-tap and hold with two fingers then slide up or down to adjust the zoom level.

VoiceOver

The same limitations hold for VoiceOver. You will not be able to access the Rotor gesture on the Apple TV Remote app. Furthermore, the following gestures will not be available:

  • pause/resume speech: this requires a two finger tap.
  • read all from the top/current location: this requires a two finger swipe up/down.

If you have used VoiceOver with the older Remote app, then you will be familiar with how navigation works in this new app. With VoiceOver turned on in both the iOS app and the Apple TV, select the gesture area on the iOS app. As you flick or explore by touch in the gesture area, VoiceOver will announce the item in the VoiceOver cursor on the TV. You can then double-tap anywhere on the gesture area to make a selection.

For Siri, you will have to perform a standard gesture (double-tap and hold) so that you can speak your Siri request.

One interesting thing about using VoiceOver with the new app is how you access the Accessibility Menu. When you select the Menu button it will announce “actions available.” With a one finger flick up or down you can access the two actions: the default, which is “activate item” or “accessibility menu.” Depending on how you have your Accessibility Shortcut set up in the Apple TV settings, selecting the “accessibility menu” option will either toggle on/off one of the features or bring up the accessibility menu to allow you to choose.

Switch Control

I was not able to use the new app to control my Apple TV with Switch Control. The problem is that when Switch Control enters the gesture area, it does not recognize my input as I try to select one of the direction arrows to move the cursor on the Apple TV. This could very well be a bug that will be fixed in a future update. In the meantime, you can continue to use the older Remote app if you need Switch Control to use your Apple TV.

In any case, Apple has promised to include Switch Control when tvOS is updated in the fall. This will be different from the current implementation in that the scanning cursor will actually show up on the TV and the iOS device will act as a switch source (at least as I understand it from my online reading, I have not been able to update my Apple TV to the latest beta).

Apple TV Remote app with Switch Control turned on, showing direction arrows in gesture area.

Conclusion

To be honest, I don’t use the included physical remote for my Apple TV all that much. It is just too small and easy to misplace for me. I actually have my existing TV remote (which I am very familiar with) set up to control my Apple TV, and I also often use the older Remote app on my iPhone for the same purpose. With those two methods I was not able to use Siri, but now that has changed. I see myself using Siri a lot more with this new app, especially for searching on the Apple TV.

There are a few limitations that keep this app from being a full time replacement for the physical remote if you use Zoom and VoiceOver, but I anticipate that those will be addressed in future updates.

Are you using the new app? Let me know your experience in the comments, especially if you are using it with Zoom or VoiceOver. I would love to hear how it has worked out for you.

5 Easy Accessibility Tips for Book Creator Authors

On the occasion of the Book Creator Chat (#BookCreator) focusing on accessibility, this post shares five easy to implement accessibility tips for Book Creator authors. By taking the time to consider the variability of readers from the start, you can ensure your books work for more of your potential audience.

1: Choose Text Size and Fonts Wisely

While Book Creator exports to the industry standard ePub format, the kind of ePub document it creates is of the fixed layout variety. This means that readers are not able to resize the text or change its appearance when they open the book in iBooks (yes, they can use the Zoom feature to magnify what is shown on the screen and Invert Colors to enable a high contrast view, but not everyone is familiar with these iOS accessibility features). At a minimum, I would recommend a text size of 24px as a good starting point to ensure the text is large enough to be easily read without too much effort.

When it comes to processing the text, some readers may have dyslexia or other reading difficulties. While there are special fonts for dyslexic readers that can be installed on the iPad, there is limited research on their impact on reading speed and comprehension.

Instead, the consensus appears to be that clean, sans-serif fonts, which are good for all readers, can also help readers who have dyslexia. In Book Creator, you can choose from a number of sans-serif fonts such as Cabin, Lato and Noto Sans, or you can use system fonts installed on your device such as Arial, Helvetica and Verdana. You should definitely avoid fonts in the Handwriting and Fun categories, as these are more difficult to decode even for people who do not have dyslexia.

Other tips for improving legibility include:

  • Left justify text. Fully justified text can result in large gaps in the text that can be distracting to readers who have dyslexia.
  • Use bolding (instead of italics or ALL CAPS) to highlight text. The latter are more difficult to decode.
  • Use shorter sentences and paragraphs.
  • Use visual aids to reinforce information in the text (but make sure to include accessibility descriptions as noted later in this post).
  • Use an off-white background. For some readers, an overly bright (all white) background can result in significant visual stress. To reduce this stress, you can choose a dimmer background color in Book Creator. With no item on the page selected, tap the Inspector (i) button and choose a page under Background, then tap More under Color. A color toward the bottom of the color picker should work well.

    Custom color picker in Book Creator with light yellow color selected.
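Since fixed-layout ePub pages are ultimately rendered from markup and styles, the tips above map onto familiar CSS concepts. The snippet below is purely illustrative — it is not Book Creator's actual output, and the font stack and background color are example values — but it shows what these legibility choices amount to:

```css
/* Illustrative only: how the legibility tips above translate to CSS.
   Book Creator generates its own styling; these values are examples. */
body {
  font-family: "Lato", "Helvetica", sans-serif; /* clean sans-serif fonts */
  font-size: 24px;          /* large enough to read without zooming */
  text-align: left;         /* left-justified; avoids distracting gaps */
  background-color: #fdf6e3; /* off-white to reduce visual stress */
}
strong { font-weight: bold; } /* bolding, rather than italics or ALL CAPS */
```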

2. Add Descriptions to Images

Readers who are blind rely on assistive technology (screen readers) to access the content in your books, but a screen reader can only describe an image when that image includes a text alternative. Adding one is straightforward in Book Creator:

  1. With the image selected, tap the Inspector (i) button in the toolbar.
  2. Tap in the Accessibility field.
  3. Enter text that describes what the image represents rather than its appearance. WebAIM has an excellent article on how to create more effective alternative text for images.

    Accessibility field in the Book Creator Inspector.

    This video shows you how to add accessibility descriptions (alternative text) to images in Book Creator. 
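Under the hood, the accessibility description plays the same role as the alt attribute on an image in (X)HTML. The example below is a generic illustration (the file names and description are hypothetical, and Book Creator's exported markup may differ), showing the kind of text alternative that works well:

```html
<!-- Illustrative: a text alternative describes what the image
     represents, not its appearance. File names are hypothetical. -->
<img src="images/field-trip.jpg"
     alt="Students collecting water samples at the riverbank" />

<!-- A purely decorative image gets an empty alt so screen readers skip it -->
<img src="images/divider.png" alt="" />
```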

3. Create Descriptive Links

Some of your readers will be listening to the content because they are not able to see the display. They will be using a screen reader (VoiceOver on the iPad) to hear the text read aloud. When the screen reader comes across a link that reads as “click here” or “learn more,” the person listening will not have enough information to decide whether the link is worth following. Instead of using “click here” or “learn more” as the link text, select a descriptive phrase (“Learn more about adding accessibility descriptions”) and make that the link text – as with the following example:

How to add a hyperlink in Book Creator.
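In markup terms, the difference is simply what text sits inside the anchor element. This is a generic HTML illustration (the URL is a placeholder), not Book Creator output:

```html
<!-- Avoid: gives a screen reader user no context out of sequence -->
<a href="https://example.com/alt-text-guide">Click here</a>

<!-- Better: the link text itself describes the destination -->
<a href="https://example.com/alt-text-guide">
  Learn more about adding accessibility descriptions
</a>
```

Screen reader users often navigate by pulling up a list of all the links on a page, so each link needs to make sense on its own, out of context.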


4. Supplement Text with Audio

While the iPad has built-in text-to-speech features (Speak Selection and Speak Screen) and the quality of the voices continues to improve, some readers will still prefer to hear an actual human voice reading the text. Fortunately, adding a recording of the text is an easy task in Book Creator:

  1. Tap the Add (+) button in the toolbar.
  2. Choose Add Sound.
  3. Tap the Start Recording button (the red disk).
  4. Read the text and tap the Stop Recording button when finished.
  5. Tap Yes to use the recording.
  6. Move the Speaker icon to the desired location on the page (it should be right below the corresponding text).

5. Remember Bits are Free!

The only limit on the length of your book is the amount of storage on your device, so feel free to spread the content out. Too much content on a single page can be overwhelming for some readers. A better approach is to use white space to present a clean layout, with information organized into easy-to-digest chunks. This may require you to create more pages, but that’s ok – remember, bits are free!

One limitation of Book Creator, from an accessibility perspective, is that it removes the closed caption track when it recompresses videos to be included in a book. This means the content in those videos is not accessible to readers who are Deaf or hard of hearing (or to other readers, such as English Language Learners, who can also benefit from captions). My current workaround is to upload the videos to my YouTube channel and then edit the auto-captions YouTube creates so that they are accurate. This is not an ideal solution, since it requires the reader to exit iBooks to view the video in another app (Safari or YouTube), but it is the best workaround I have for now.