How I do my photography in 2022 (A photo essay)

I continue to enjoy photography as much as I did when I last posted to this blog back in 2018. However, the way I go about doing my photography has changed dramatically in the last few years. In this post, I will summarize what I have learned over that time as someone pursuing photography while experiencing the world with a significant visual impairment.

I would estimate that 80-90% of my photos used to be taken with an iPhone. The biggest reason for choosing to shoot with an iPhone is the excellent support for accessibility features that continue to get better with each iteration of iOS. Also, “the best camera is the one you have with you,” and my iPhone is never too far away, ready to capture the moment, even when it’s just my pup Bruno taking a nap.

A small poodle mix dog lying on top of a red blanket taking a nap

I still rely on my iPhone (now an iPhone 14 Pro) to take many of my photos, but the pandemic made me want more out of my photography. I’m happy to continue taking candid photos as well as landscapes that benefit from the wide lens of the iPhone, but one thing the iPhone lacks is reach (even with the 3X telephoto lens on my model).

A red bellied woodpecker grabbing on to the trunk of a moss covered tree in an upside down position.
Sometimes you get lucky, as with this photo I took with my iPhone when a red bellied woodpecker landed near us at the park.

I would look at the wildlife photographers, with their huge lenses, and wish I could do what they were doing. The pandemic gave me the push I needed to not just wish I could do that kind of photography, but to actually make it happen. As someone with a progressive visual impairment, I have always felt a sense of urgency when it comes to experiencing the world with my eyes while I can. It’s complicated. I’m not saying that my experience will be lesser if I completely lose my vision, but let’s be honest for a second, it will be different. And besides my worsening vision, there is the fact that entering my 50s, no aspect of my health is guaranteed – the pandemic was a reminder of that. I want to do more photography while I can still move about with good legs and a good back, not just a serviceable set of eyes.

With those complicated feelings in the background, I set out to take my photography to the next level over the last four years. In this post, I will discuss every aspect of what I’m doing, from the gear that I’m now using to the workflow I use for capturing and editing the photos I share on social media.

Before I get to the details, I want to take a minute to share my gratitude for my partner Cindy. Without her, none of this would be possible. Fortunately for me, she shares my passion for photography and the outdoors, and we make a good team. She makes it easier for me to get to the places I want to photograph, and she’s also my spotter. Without her help, there is little chance I would notice some of the wildlife that moves quickly and blends so well into the surroundings of the wonderful parks where we live. A supportive partner is key, and I’m so grateful to have that.

Ok, now on to the details…

The Gear

The biggest change I’ve made is in my choice of gear. I still like my Nikon D3100. It’s great for flower photos and portraits, but it doesn’t have the reach or speed I need for any kind of wildlife photography. The reach is important for me for a variety of reasons. One, the Florida wildlife can be unforgiving, and I don’t want to be gator bait. Two, I can’t go too far off road if I want to be safe – I just can’t see tree roots and other obstacles – so it’s best that I stay on the trail or boardwalk if one is available and use the long reach of a telephoto lens to capture the wildlife from a safe distance.

Although it was a big expense, the first thing I did during the pandemic was not take up gardening or learn how to make sourdough bread. No, I purchased my first full frame camera and a long telephoto lens. My current setup consists of:

  • Sony A7 III full frame mirrorless camera
  • Sigma 100-400mm telephoto zoom lens
  • Sigma 150-600mm telephoto zoom lens
  • iFootage Cobra carbon fiber monopod
  • PGYTECH OneMo Camera Backpack 25L

It was quite an investment to put together this kit, but I can say without any doubt that it was worth it!

A group of white pelicans resting on a sandbar.
This photo was only possible with a long lens. These white pelicans are protected, and you have to keep a minimum distance from their resting area on the sandbar at Fort De Soto.

The A7 III has a number of features that make it ideal for me:

  • Amazing autofocus: my old DSLR only had a few focus points, but the Sony A7 III has almost 700! It also has face and eye detection, and the latter can be set to find the eyes of animals. It’s not perfect, but for larger birds (which are easier for me to photograph anyway) it works well enough, and I need all the help I can get when it comes to focusing.
  • Burst rate of 10 fps (frames per second). This is important because I often don’t really “see” what I’m photographing. It’s challenging enough to capture a fast-moving bird even if you have good eyesight, and even more challenging when you can’t track it due to a visual impairment.
  • 24 megapixels. This is key because I often have to shoot a little wider than I would like to in order to make sure I don’t cut off an essential element of the wildlife and its surrounding environment. With more megapixels and a full frame sensor, I can do some serious cropping and still retain pretty good image quality. I already did this quite a bit with the iPhone, but with the full frame camera I have even more leeway in what I can do in post processing to “make the image.”
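
To put that cropping headroom in rough numbers, here’s a quick sketch (the 6000 × 4000 pixel dimensions are typical for a 24 megapixel full frame image, and the function is purely illustrative, not part of any camera software):

```python
# Rough arithmetic for cropping headroom on a 24 MP sensor.
# 6000 x 4000 pixels is a typical 24 MP full frame image size (an assumption
# for illustration, not a value read from any camera).

def megapixels_after_crop(width: int, height: int, keep_fraction: float) -> float:
    """Megapixels left when each dimension is cropped down to keep_fraction."""
    return (width * keep_fraction) * (height * keep_fraction) / 1_000_000

print(megapixels_after_crop(6000, 4000, 1.0))  # 24.0 - the full frame
print(megapixels_after_crop(6000, 4000, 0.5))  # 6.0 - keeping half of each side
```

Notice that cropping to half of each dimension keeps only a quarter of the pixels, which is exactly why starting with more megapixels gives me so much leeway in post.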

Not long after I purchased the new camera, Sony released a new model, the A7 IV, and guess what? That camera has a screen reader! How helpful would that be! I could sell my current camera and upgrade (which is a pain because you lose so much money). It would be nice if Sony would just add the screen reader functionality to the older camera with a firmware update. Not holding my breath on that one (and there may be technical reasons why it’s not possible).

If anyone from Sony is reading this, please reach out – I have some suggestions for how to make the experience better for people with low vision, starting with adding an option for changing the colors and the thickness of some of the visual indicators. If I can spend less time making out what my camera is trying to tell me, that’s more time I can spend shooting.

As for the lens, the 100-400mm Sigma lens was a definite improvement over my older setup, but once I experienced the clear photos I was getting from my new camera and lens combo, I was hooked! I needed more! That brings me to my latest toy – the beast! I’ve nicknamed my Sigma 150-600mm that because it weighs almost five pounds. Walking around with it for a few hours provides all the exercise I need on the weekends. I balance it all on an iFootage Cobra carbon fiber monopod. I use a monopod because it allows me to be fairly agile with the big lens. I carry everything in a PGYTECH OneMo Camera Backpack 25L. I like that it came with a separate shoulder bag I can use when I just want to take the 100-400mm lens for a lighter setup.

Luis posing with his camera and a long lens resting on a monopod. A lake and woods appear in the background.

I just added a Wimberley MH-100 mono gimbal head to my kit to make it easier to pan up when I want to shoot birds perched high in the trees, or to follow a bird in flight (that last one is a real stretch for me, but a man can dream).

As for settings, during the pandemic I finally mustered the courage to take the camera out of the automatic modes. Inspired by a webinar from Matt Kloskowski that I found through a Facebook link (glad I clicked on that one), I shoot in manual – ok, manual-ish. Let me explain:

  • Aperture is set to the lowest f-number my lens will allow at its maximum reach of 600mm. I never change this setting because long telephoto lenses like the one I use need all the light they can get.
  • ISO – this is set to auto so that I don’t have to worry about it as I go through changing light conditions while out in the field. I let the camera do its thing when it comes to this setting.
  • Shutter speed – this is the one setting I play with. I typically follow the rule of using a shutter speed of at least 1 over the focal length of the lens (at 600mm that means 1/600 or faster; I use 1/800 for most slow-moving birds such as herons and egrets).

It’s manual-ish because I’m only really controlling one variable, leaving another set at the same value most of the time, and letting the camera handle the third side of the exposure triangle. Exposure is the area where I have lots of room for growth.
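
For the curious, the shutter speed rule of thumb above can be sketched in a few lines of Python (the safety factor is just an illustrative knob I added for this example, not a camera setting):

```python
# The reciprocal rule: the shutter speed denominator should be at least
# the focal length in mm. The safety_factor parameter is an illustrative
# addition for padding the rule when subjects move faster.

def min_shutter_speed(focal_length_mm: float, safety_factor: float = 1.0) -> str:
    """Return the slowest recommended shutter speed as a fraction of a second."""
    denominator = round(focal_length_mm * safety_factor)
    return f"1/{denominator}s or faster"

print(min_shutter_speed(600))       # 1/600s or faster - bare rule at full reach
print(min_shutter_speed(600, 4/3))  # 1/800s or faster - with a bit of margin
```

Padding the rule a little, as in the second call, is how I end up at the 1/800 I use for slow-moving wading birds.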

For focusing, I shoot continuous (I don’t think I’ve ever changed this setting since I got the camera), with the drive mode set to burst high to take multiple shots each time I press the shutter. My focus mode is often Expand Flexible Spot, and I have the focus point indicator set just above center because I have a hard time seeing it out in the field. I used to do back button focus (where you focus with one button and take the photo with another), but I now keep things simple by using the shutter button for both focusing and taking the photo.

Another great tip from Matt Kloskowski (a fellow Tampanian by the way) – get low! How low? As low as you can go.

Dunlin, a small wading bird, captured moving along the tide pool at low tide. It is a white and tan bird with a long, pointed beak.
I got real low to capture this dunlin as it moved about looking for food in the tide pool at low tide.

You get beautiful bokeh (the dreamy, out-of-focus background) by increasing the distance between your subject and the background. Combined with a long telephoto lens, this will obliterate the background so that it can’t be made out and distract from your subject. If you can’t get low due to bad knees, here’s a trick: use a tripod camping chair. I do this a lot, especially if the ground is sketchy (I’ve been bitten by random bugs a few times, no fun).

Another trick I’ll often use is to hold the camera low with the monopod and flip the LCD screen up so that I’m looking down on it. It works if it’s not too sunny out, especially when shooting shore birds like sandpipers and plovers. The LCD screen on the A7 III is actually pretty good in terms of brightness, and I find that sometimes it helps me find the subject I want to capture. I’ll move the screen around a bit until I see a change in light on the LCD – not sure how to explain it well, but it works for me.

The Workflow

Any kind of photography I do, whether with an iPhone or a traditional camera, is only made possible by digital media. A typical outing for me involves taking around 500 photos. Of those, I may choose 4-5 that are in focus and where I did not completely swing and miss with the composition and cut something off. It’s a numbers game for me. Now imagine going to the local CVS and trying to develop 500 photos at a time. That would make this an even more expensive hobby, and likely not one I could sustain. With digital, pixels are somewhat free (cost of the camera and lenses aside). I can take as many shots as I need and play the numbers game like a really bad hitter in baseball, if you go by my average number of keepers.

American Kestrel perched at the end of a branch. It has orange and black banded back feathers and a mottled brown front. The head is light blue on top with black, orange and white vertical stripes on the sides.
This American Kestrel, captured at Fort De Soto, is a keeper. It was one of the birds on my life list. The photo is a bit backlit, but you can’t give a bird any instructions as to where it should pose.

The first step in my process is to move the photos from my camera to a mobile device with a better screen where I can pinch and zoom to check focus. It used to be an iPhone, but these days it’s also an iPad Air.

I will perform most of my basic edits in the stock Photos app: cropping, straightening, adjusting highlights/shadows, etc. At this point, I’ll mark the five or so photos I want to work on some more (the “keepers”) as favorites. I will transfer those to my computer over AirDrop.

On the computer, the only thing I do to each photo is run it through Topaz DeNoise AI to remove some of the noise (grain). That software is like magic! Another Matt Kloskowski recommendation that panned out really well.

Female cardinal perched at the end of a branch with berries on it.
Another keeper: this female cardinal having some berries for a snack. You can see the grain in the background because this was taken in a shady area. I still like it though, and I cleaned it up a bit with Topaz later.

The rest

The last step is to share the photo on Facebook and Vero (no more Instagram for me – it’s too much like TikTok now). And that’s how 500 photos become 5 or 6. It’s not pretty, and it takes time, but I really enjoy the entire process (ok, maybe not the file management part – I’m constantly playing a game with my iPhone and iPad to clear up space for more photos).

The gear and the settings are only part of what goes into taking decent wildlife photos. The other part involves research. I spend a lot of time on the eBird app checking out what other birders have spotted in the area. I also follow a number of Facebook groups for specific areas (Friends of Fort De Soto) or types of wildlife (mostly local birding groups). That helps me narrow down where I go on a given outing, in hopes of increasing my chances of capturing a bird on my life list. If you’re curious, here’s what’s still on it:

  • Barred owl – my nemesis…lol. I’ve spent more hours on this one than any other. The closest I’ve gotten is hearing its call a few times, but with no luck finding the actual birds in the woods.
  • Belted kingfisher – this one will be tough. It’s a really small and fast bird that feeds by taking quick dives into the water.
  • Merlin (a small falcon) – I found the kestrel, I’m pretty confident I will find this one too at Fort De Soto.
  • Cooper’s hawk – each time I’ve thought I had this bird, it turned out to be a different type of hawk (broad shouldered or red shouldered).
  • Red tailed hawk – I see lots of red shouldered ones, but not this one.

One bird that’s missing from the list now: the bald eagle! I found a nest and finally got to see one in the wild, not at a zoo or rescue. The nest is on a cell phone tower that is well out of reach, so the photos I got are a little blurry, possibly due to the heat coming off the water.

Bald eagle in flight with wings spread out.

The other app I use is Merlin (like the falcon). It has really nice descriptions of each species along with all of their calls. Merlin also has an excellent AI-based feature that allows you to upload a photo and get an ID almost immediately with 90% accuracy. This helps me provide better descriptions when I share the photo online.

While out in the field, I will also use the app’s audio recognition feature to identify potential subjects. With many birds, such as owls, you are more likely to hear them before you see them (especially in my case). I can’t recommend these two free apps enough. They are an essential part of my kit, just like my camera and lenses.


In the end, you really have to like what you’re doing if you choose to do wildlife photography, because it takes a lot of patience. There’s lots of waiting involved.

Eastern bluebird perched on a branch. It has a brownish band across the chest with a white underside and blue wings that are barely visible.
We waited more than an hour for one of these birds to finally land on a perch where I could capture it without any branches in front of it. Worth it! It’s a beautiful female Eastern bluebird.

Then just when you’re getting ready to go home, a bird you’ve never seen before shows up and you almost miss the shot. It’s a lot like fishing, and just as expensive.

Osprey that just landed on top of a wooden post with a big fish on its talons. The osprey has its wings spread out.
One of those “nothing to see here, let’s go home” photos.

But the waiting also makes wildlife photography such a great activity for mental health. When taking the photos, you can’t rush. You have to settle in and wait for the bird (or other wildlife) to show its best side, and when you take the photo, you have to slow your breath and really focus so that you don’t introduce unnecessary shake that can result in a blurry photo. You have to be present and forget about everything that’s worrying you in that moment. That, along with the fact that it gets you out in nature where you can enjoy some sunlight and fresh air, makes it a great activity for addressing stress.

I think that covers everything I want to share in this post. Feel free to reach out if you have any questions (@eyeonaxs on Twitter).

Update: I just found out I can share my entire Vero gallery outside the app. I have also reached out to the Vero team to inquire about alternative text and how we can make that app more accessible. I have been using the comments to identify each bird as I post, but alternative text (along with the comments) is the ideal solution. I’ll update this post based on what I hear back from them.


I presented this poem at the 2018 ISTE Conference during my TED Ed talk:

Neither here nor there
Neither blind nor sighted
I see you, but not all of you
You see me, but not all of me

Ni aquí, ni allá (Neither here nor there)
The islands, the city, the country
Español, Spanglish, English
Y hoy, ¿quién soy? (And today, who am I?)
Hay, ¿quién sabe? (Well, who knows?)

So I learn to live in between
In and out of the shadows
And as the light turns to dark
And the darkness comes to life
I’ve learned to just dance
Just dance in those shadows

Designed for (fill in the blank)

On the occasion of Global Accessibility Awareness Day (GAAD), Apple has created a series of videos highlighting the many ways its iOS devices empower individuals with disabilities to accomplish a variety of goals, from parenting to releasing a new album for a rock band. Each of the videos ends with the tagline “Designed for” followed by the name of the person starring in the video, closing with “Designed for Everyone.” In this brief post, I want to highlight some of the ways in which this is in fact true. Beyond the more specialized features highlighted in the videos (a speech generating app, the VoiceOver screen reader, Made for iPhone hearing aids and Switch Control), there are many other Apple accessibility features that can help everyone, not just people with disabilities:

  • Invert Colors: found under Accessibility > Display Accommodations, this feature was originally intended for people with low vision who need a higher contrast display. However, the higher contrast Invert Colors provides can be helpful in a variety of other situations. One that comes to mind is trying to read on a touch screen while outdoors in bright lighting. The increased contrast provided by Invert Colors can make the text stand out more from the washed out display in that kind of scenario.
  • Zoom: this is another feature that was originally designed for people with low vision, but it can also be a great tool for teaching. You can use Zoom to not only make the content easier to read for the person “in the last row” in any kind of large space, but also to highlight important information. I often will Zoom In (see what I did there, it’s the title of one of my books) on a specific app or control while delivering technology instruction live or on a video tutorial or webinar. Another use is for hide and reveal activities, where you first zoom into the prompt, give students some “thinking time” and then slide to reveal the part of the screen with the answer.
  • Magnifier: need to read the microscopic serial number on a new device, or the expiration date on that medicine you bought years ago and are not sure is still safe to take? No problem, Magnifier (new in iOS 10) to the rescue. A triple-click of the Home button will bring up an interface familiar to anyone who has taken a photo on an iOS device. Using the full resolution of the camera, you can not only zoom in on the desired text, but also apply a color filter and even freeze the image for a better look.
  • Closed Captions: although originally developed to support the Deaf and hard of hearing communities, closed captions are probably the best example of universal design on iOS. Closed captions can also help individuals who speak English as a second language, as well as those who are learning how to read (by providing the reinforcement of hearing as well as seeing the words for true multimodal learning). They can also help make the information accessible in any kind of loud environment (a busy lobby, airport, bar or restaurant) where consuming the content has to be done without the benefit of the audio. Finally, closed captions can help when the audio quality is low due to the age of the film, or when the speaker has a thick accent. On Apple TV, there is an option to automatically rewind the video a few seconds and temporarily turn on the closed captions for the audio you just missed. Just say “what did he/she say?” into the Apple TV remote.
  • Speak Screen: this feature, found under Accessibility > Speech, is meant to help people with vision or reading difficulties, but the convenience it provides can help in any situation where looking at the screen is not possible – one good example is while driving. You can open up a news article in your favorite app that supports Speak Screen while at a stop light, then perform the special gesture (a two-finger swipe down from the top of the screen) to hear that story read aloud while you drive. At the next stop light, you can perform the gesture again and in this way catch up with all the news on your way to work! On the Mac, you can even save the output from the text to speech feature as an audio file. One way you could use this audio is to record instructions for any activity that requires you to perform steps in sequence – your own coach in your pocket, if you will!
  • AssistiveTouch: you don’t need to have a motor difficulty to use AssistiveTouch. Just having your device locked into a protective case can pose a problem this feature can solve. With AssistiveTouch, you can bring up onscreen options for buttons that are difficult to reach due to the design of the case or stand. With a case I use for video capture (the iOgrapher) AssistiveTouch is actually required by design. To ensure light doesn’t leak into the lens the designers of this great case covered up the sleep/wake button. The only way to lock the iPad screen after you are done filming is to select the “lock screen” option in AssistiveTouch. Finally, AssistiveTouch can be helpful with older phones with a failing Home button.

While all of these features live in the Accessibility area of Settings, they are really “designed for everyone.” Sometimes the problem is not your own physical or cognitive limitations, but the constraints imposed by the environment or the situation in which the technology is used.

How about you? Are there any other ways you are using the accessibility features to make your life easier even if you don’t have a disability?

3 Ways the iPhone Has Disrupted My Life for the Better

The 10th anniversary of the iPhone announcement in 2007 was mentioned on a number of podcasts I listen to this past week, and this got me into a reflective mood. I can remember vividly where I was when the announcement took place. At the time I was a graduate student at the University of South Florida, and I watched the announcement on the big screen in the iTeach Lounge where I worked as a graduate assistant.

I must admit that at first I was a bit skeptical. The first version of the iPhone was pretty expensive, and it took me a year after the launch to decide that I wanted to get in on the fun. If I remember correctly, it cost me $399 for 8GB of storage when I bought my first iPhone from Cingular Wireless (remember them?). As cool as that first iPhone was, it took two important developments to make me a true believer. The first one was the release of the App Store in 2008, which opened up a world of possibilities limited only by developers’ imaginations. The second was the accessibility support announced with the release of the iPhone 3GS. After my first iPhone contract with Cingular was up, I actually returned to a traditional flip phone for a little while. Once the accessibility support was announced, though, I was locked in. I have been an iPhone owner ever since.

In addition to the App Store and the built-in accessibility support, there are three other important ways in which the iPhone has disrupted my life in significant ways that go beyond just being able to have access to information and communication on the go.

A Better Set of Eyes

The iPhone couldn’t have come at a better time for me. At the time, my vision loss was getting to the point where using a traditional DSLR camera was becoming harder and harder. As I detailed in an article for the National Federation of the Blind’s Future Reflections magazine, the built-in accessibility features of the iPhone have allowed me to continue with my passion for capturing the beauty in the world around me. The way I see it, the iPhone is now “a better set of eyes” for me. Most of the time, I can’t be sure that I have actually captured a decent image when I aim the phone at a scene. It is not until later, when I am reviewing the images more carefully at home, that I notice small details I didn’t even know were in the frame. You can see some examples of my photography on my Instagram page.

Instagram collage showing best nine images of 2016.

Going forward, this idea of the iPhone as my “best set of eyes” is going to be important to me beyond photography. As my vision loss progresses, I will be able to rely on the iPhone’s ever improving camera to recognize currency, capture and read aloud the text in menus, business cards and more, and tell me if my clothes are exactly the color I intended. I have no doubt that “computer vision” will continue to get better and this gives me hope for the future. Already, the VoiceOver screen reader can recognize some objects in your images and describe them aloud. This technology was developed to make searching through large image libraries more efficient, but it will be helpful to people with visual impairments like me as well.

Independence at the Touch of a Button

The second major way the iPhone has disrupted my life for the better is by giving me back my independence in a big way, through apps such as Uber and Lyft. Now, I know you can use these apps on other smartphones, so they are not exclusive to the iPhone. However, when you really think about it, no iPhone means no App Store. No App Store means there is no incentive for other companies to copy what Apple did.

Uber has replaced the many frustrations I had with public transportation (lateness, high taxi fares) with a much more convenient and less expensive solution. Yes, I know some of my blind friends have had a number of issues with Uber (such as outright discrimination from drivers who are not comfortable with a guide dog in their vehicles), but this would probably happen with taxicabs too.

My own experience with Uber has been mostly positive, and the service allows me to easily get to doctor’s appointments, and provides me with a reliable way to get to the airport so that I can do my work of spreading the message of accessibility and inclusive design for education to a broader audience beyond my local area. Uber and Lyft, and the iPhone as the platform that made them possible, have really opened up the world to me.

Can You Hear Me Now?

One of the big trends at the Consumer Electronics Show (CES) this year was the presence of Alexa, Amazon’s voice assistant, on all kinds of consumer appliances. Alexa joins Apple’s Siri, Microsoft’s Cortana and Google’s Assistant in heralding a future where voice and speech recognition replace the mouse and the touch screen as the primary input methods for our computing devices. We are not quite there yet, but the accuracy of these services will continue to improve and I am already seeing the potential with some of the home automation functions that are possible with the existing implementations (having my lights be automatically turned on when I arrive at home, for example).

Here, again, the iPhone deserves quite a bit of credit. The release of Siri as part of the iPhone 4S in 2011 brought the idea of speech recognition and voice control to the mainstream. Previously, its use was limited mostly to individuals with motor difficulties or niche markets like the medical and legal transcription fields. Siri helped popularize this method of voice interaction and made it more user friendly (remember when you had to sit for several minutes training speech recognition software to recognize just your voice?).

Looking Ahead

The smartphone is a mature technology and some have questioned whether it has reached its apex and will soon give way to other options for content consumption and communication. One possibility would involve virtual, augmented or even mixed reality. Given the visual nature of AR and VR this gives me some cause for concern just like I had at the release of the iPhone back in 2007. However, just like Apple took a slab of glass and made it accessible when few people thought it could, with some creativity we can make AR and VR accessible too.

We have come a long way in just 10 years (sometimes I find it hard to remember that it has only been that long). In that time, Apple has shown that “inclusion promotes innovation.” Accessible touch screens, voice controlled assistants and ride sharing services are just a few of the innovations that have developed within the accessible ecosystem that started with the iPhone. Thank you Apple, and congrats on the 10th anniversary of the iPhone. Here’s to the next 10, 20 or 30 years of innovation and inclusion.



Zoom In Back on the iBooks Store

My multi-touch book on low vision supports has been updated to reflect the releases of iOS 10, watchOS 3 and tvOS 10. You will notice that I did not mention macOS Sierra, which was also recently updated. I have decided to streamline the book a bit in order to make the file smaller and allow me to focus on a couple of sections I have wanted to add for a while: videos showcasing the impact of the technology on the lives of people with vision impairments, and a series of accessibility challenges to help readers test their skills with the various features covered in the book. Focusing on iOS (and its varieties) will also allow me to more closely follow the release schedule from Apple (macOS is usually released separately from iOS and its related hardware).

Cover of Zoom In: Vision Supports on iOS Devices, Apple Watch and Apple TV

In addition to the content being completely updated to reflect the latest version of each OS, I have added four new videos covering Magnifier (my favorite new feature in iOS 10), Display Accommodations, Typing Feedback (including the new highlight options for text to speech), and the Pronunciation Editor for VoiceOver. I have also done a thorough update of the Apps and Accessories sections based on the ones I have tried and found helpful since the last update of the book.

If you like the book (and want to show your support for my work in general), reviews and ratings on the iBooks Store are always appreciated. The update is free if you have purchased the book earlier – $4.99 (one trip to Starbucks) otherwise.

5 Easy Accessibility Tips for Book Creator Authors

On the occasion of the Book Creator Chat (#BookCreator) focusing on accessibility, this post covers five easy-to-implement accessibility tips for Book Creator authors. By taking the time to consider the variability of readers from the start, you can ensure your books work for more of your potential audience.

1: Choose Text Size and Fonts Wisely

While Book Creator exports to the industry-standard ePub format, the kind of ePub document it creates is of the fixed-layout variety. This means that readers are not able to resize the text or change its appearance when they open the book in iBooks (yes, they can use the Zoom feature to magnify what is shown on the screen and Invert Colors to enable a high-contrast view, but not everyone is familiar with these iOS accessibility features). At a minimum, I would recommend a text size of 24px as a good starting point to ensure the text can be read without too much effort.

When it comes to processing the text, some readers may have dyslexia or other reading difficulties. While there are special fonts for dyslexic readers that can be installed on the iPad, there is limited research on their impact on reading speed and comprehension.

Instead, the consensus appears to be that clean sans-serif fonts, which are good for all readers, can also help readers who have dyslexia. In Book Creator, you can choose from a number of sans-serif fonts such as Cabin, Lato and Noto Sans, or you can use system fonts installed on your device such as Arial, Helvetica and Verdana. You should definitely avoid fonts in the Handwriting and Fun categories, as these are more difficult to decode even for people who do not have dyslexia.

Other tips for improving legibility include:

  • Left-justify text. Fully justified text can result in large gaps that can be distracting to readers who have dyslexia.
  • Use bolding (instead of italics or ALL CAPS) to highlight text. The latter are more difficult to decode.
  • Use shorter sentences and paragraphs.
  • Use visual aids to reinforce the information in the text (but make sure to include accessibility descriptions, as noted later in this post).
  • Use an off-white background. For some readers, an overly bright (all-white) background can result in significant visual stress. To reduce this stress, you can choose a dimmer background color in Book Creator. With no item on the page selected, tap the Inspector (i) button and choose a page under Background, then tap More under Color. A color toward the bottom of the color picker should work well.

    Custom color picker in Book Creator with light yellow color selected.

2. Add Descriptions to Images

Readers who are blind will rely on assistive technology (screen readers) to access the content in your books. Screen readers are only able to describe images to readers who are blind when they include a text alternative. Adding a text alternative is straightforward in Book Creator:

  1. With the image selected, tap the Inspector (i) button in the toolbar.
  2. Tap in the Accessibility field.
  3. Enter text that describes what the image represents rather than its appearance. WebAIM has an excellent article on how to create more effective alternative text for images.

    Accessibility field in the Book Creator Inspector.

    This video shows you how to add accessibility descriptions (alternative text) to images in Book Creator. 
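Book Creator handles the underlying markup for you, but if you ever want to audit an exported book (or any HTML content) for images that are still missing descriptions, a short script can flag them. Below is a minimal sketch using Python's standard library; the sample markup and file names are hypothetical, for illustration only.

```python
from html.parser import HTMLParser

class AltTextAudit(HTMLParser):
    """Collect the sources of img tags with missing or empty alt text."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            alt = (attrs.get("alt") or "").strip()
            if not alt:
                self.missing.append(attrs.get("src", "(unknown source)"))

# Hypothetical page markup from an exported book
sample = """
<img src="woodpecker.jpg" alt="Red bellied woodpecker on a mossy trunk">
<img src="cover.jpg" alt="">
<img src="bruno.jpg">
"""

audit = AltTextAudit()
audit.feed(sample)
print("Images missing alt text:", audit.missing)
```

A check like this only tells you that a description exists, not that it is a good one – writing alt text that conveys what the image represents is still up to the author.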

3. Create Descriptive Links

Some of your readers will be listening to the content because they are not able to see the display. They will be using a screen reader (VoiceOver on the iPad) to hear the text read aloud. When the screen reader comes across a link that reads as “click here” or “learn more,” the person listening to the content will not have sufficient information to determine if the link is worth following or not. Instead of using “click here” or “learn more” as the link text, select a descriptive phrase (“Learn more about adding accessibility descriptions”) and make that the link text – as with the following example:

How to add a hyperlink in Book Creator.


4. Supplement Text with Audio

While the iPad has built-in text to speech features (Speak Selection and Speak Screen) and the quality of the voice continues to improve, some readers will still prefer to hear an actual human voice reading the text. Fortunately, adding a recording of the text is an easy task in Book Creator:

  1. Tap the Add (+) button in the toolbar.
  2. Choose Add Sound.
  3. Tap the Start Recording button (the red disk).
  4. Read the text and tap the Stop Recording button when finished.
  5. Tap Yes to use the recording.
  6. Move the Speaker icon to the desired location on the page (it should be right below the corresponding text).

5. Remember Bits are Free!

The only limitation to the length of your book is the amount of storage on your device. Feel free to spread it out! Too much content on a single page can be overwhelming for some readers. A better approach is to use white space to present a clean layout with information organized into easy-to-digest chunks. This may require you to create more pages, but that’s ok – remember, bits are free!

One limitation of Book Creator, from an accessibility perspective, is that it removes the closed caption track when it recompresses videos to be included in a book. This means the content in those videos is not accessible to those who are Deaf or hard of hearing (or other readers, such as English Language Learners, who can also benefit from captions). My current workaround is to upload the videos to my YouTube channel and then edit the auto captions created by YouTube so that they are accurate. This is not an ideal solution, as it requires the reader to exit iBooks to view the video in another app (Safari or YouTube), but it is the best workaround I have for now.



SLIDE into Accessibility: 5 Tips for Making Learning Materials Work for Everyone

At this year’s ISTE conference, I was on a panel focusing on accessible educational materials (AEM). The panel was one of the activities sponsored by the ISTE Inclusive Learning Network, of which I am the Professional Learning Chair. I only had about 10 minutes to share some tips with our attendees so I tried to convey them with an easy to remember mnemonic: SLIDE.

As a follow-up to that panel, I created this blog post. I hope you find it helpful, and I look forward to your feedback.

Note: Document accessibility is a complex topic. This is by no means a comprehensive guide, just a few tips to help educators get started by taking some easy steps toward making their content more accessible.

When it comes to making documents more accessible and useful for all learners, small changes can have a significant impact!

By following these tips, you will ensure all learners can enjoy access to information as a first step toward creating more inclusive learning environments.


  • Styles are used to reveal the structure of the information
  • Links are descriptive
  • Images include alternative text
  • Design is clear and predictable
  • Empathy and an ethic of care are a key focus


Styles

Properly marked-up headings are important for screen reader users, who can use a shortcut to quickly access a list of these headings and navigate to any section in the document (saving valuable time). For other readers, headings reveal the structure of the information and make the document easier to scan.

Thumbs Up
Select the desired heading text and choose from the styles menu in your authoring tool.

Thumbs Down
Choose formatting options such as making the text bigger and bold. The text will look like a heading but lack the proper markup.


Links

As with headings, screen reader users will often use a shortcut to bring up a list of the links in a document. Links need to be descriptive in order for them to make sense when they are accessed in this way, without the context of the surrounding text on the page.

Thumbs Up
Select some descriptive text and make that the link (see examples on this document).

Thumbs Down
Use generic phrases such as “click here” and “learn more” as the link text.
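This check lends itself to automation as well: a small script can scan a page's links and flag any whose visible text would be meaningless on its own. The sketch below uses Python's standard library; the list of generic phrases and the sample markup are my own assumptions, not an exhaustive rule.

```python
from html.parser import HTMLParser

# Phrases that carry no meaning when read out of context (assumed list)
GENERIC = {"click here", "here", "learn more", "read more", "more"}

class LinkTextCheck(HTMLParser):
    """Flag links whose visible text is too generic to stand alone."""
    def __init__(self):
        super().__init__()
        self.in_link = False
        self.text = ""
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link = True
            self.text = ""

    def handle_data(self, data):
        if self.in_link:
            self.text += data

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_link = False
            if self.text.strip().lower() in GENERIC:
                self.flagged.append(self.text.strip())

# Hypothetical markup: one descriptive link, one generic link
sample = ('<a href="/alt-text">How to add accessibility descriptions</a> '
          '<a href="/tips">Click here</a>')

check = LinkTextCheck()
check.feed(sample)
print("Generic link text found:", check.flagged)
```

Only the second link is flagged, since “Click here” gives a screen reader user no clue about the destination.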



Images

Alternative text allows a screen reader to provide a description of an image to someone who is not able to see the screen.

Thumbs Up
Create a short description that focuses on the information conveyed by the image: e.g. “smiley face with thumbs up.”

Thumbs Down
Focus on the appearance of the image: e.g. “white circle with eyes and frown drawn inside.”

Note: Creating helpful alternative text is as much an art as it is a science. Much will depend on the context in which an image is used. WebAIM has some great resources that discuss the considerations for creating effective alternative text in more detail. 


Design

Through good design, you can reduce the amount of effort it takes your readers to process the information in a document, allowing them to focus on the meaning conveyed by the content rather than its presentation.

Some helpful design tips include:

  • Ensure sufficient contrast between the text and its background.
  • Use proximity and white space to make relationships clear: items that belong together should be close to each other and separated from other items by sufficient white space.
  • Use repetition to highlight patterns and build a cohesive whole.
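“Sufficient contrast” can be checked objectively: WCAG defines a contrast ratio based on the relative luminance of the text and background colors, with 4.5:1 as the minimum for normal-size text at the AA level. The sketch below computes that ratio in Python; the sample colors are arbitrary illustrations.

```python
def relative_luminance(rgb):
    """WCAG relative luminance for an (R, G, B) color with 0-255 channels."""
    def linearize(channel):
        c = channel / 255
        # Piecewise sRGB linearization from the WCAG definition
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(color_a, color_b):
    """Ratio of the lighter luminance to the darker, per WCAG."""
    lum_a, lum_b = relative_luminance(color_a), relative_luminance(color_b)
    lighter, darker = max(lum_a, lum_b), min(lum_a, lum_b)
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white page: the maximum possible contrast, 21:1
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))
# Mid-gray text on white falls below the 4.5:1 AA minimum
print(contrast_ratio((150, 150, 150), (255, 255, 255)) >= 4.5)
```

Online contrast checkers perform this same calculation, but knowing the formula makes it clear why light gray text on a white background is so hard for many readers.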


Empathy

Even more important than implementing these tips is changing your approach to design so that it reflects an ethic of care. Remember that not everyone reading your content can see, hear or process information as well as you. As you approach your work, try to think about the diversity in your potential audience. Doing so will allow your content to reach more readers and have a greater impact!

According to the U.S. Census:

  • 1 in 5 Americans reports having a disability
  • For Americans over 65, that figure is 40%

Accessible content will not only benefit other people. As you age, your own ability to see, hear and process content may be affected. When you create accessible content, you are also designing for your “future self.”



Global Accessibility Awareness Day Resources

Global Accessibility Awareness Day (GAAD) is a very special day for me. Without the many advances in digital access, there is so much that I would not have been able to accomplish: getting a doctorate; writing a book; doing my work as an inclusive learning consultant (which involves travel, accessing the Web for research, creating presentations and more); being an advocate through blog posts like these, my YouTube videos and ebooks…the list is long.

Digital accessibility is personal to me, and I’m grateful that people like Jennison Asuncion and Joe Devon took the initiative to not just create a special day, but start a movement. Even big companies like Apple are now getting in on the action. We are making progress!

I’m far from being an accessibility expert, but I try my best to continue learning and doing what I can to make things more accessible, not only for other people but ultimately for myself when the day comes that I have lost all of my eyesight. And that’s the point of GAAD to me. You don’t have to be perfect, you just have to take the first step!

I wanted to create this blog post as a one-stop shop for the resources I have created for GAAD:

Along with these resources, I had the pleasure of moderating the #ATChat discussion on accessibility in education on the eve of GAAD. A transcript of our discussion is available on Storify. A big thank you to Karen Janowski and Mike Marotta for allowing me to do that.

As you can see, there are many ways you can contribute to the conversation and the work that is ongoing to make the world a better, more accessible place for all learners. The key is to take the first step. As I did during our ATChat, I want to leave you with the following challenge: what is one small thing you can do or try today to improve accessibility where you work?

4th Gen Apple TV is Now Accessible with Switch Control

At launch, the 4th generation Apple TV lacked support for the iOS Remote app. The only way to navigate the interface and enter passwords and other information was through the new Siri remote. An unintended consequence of the lack of support for the Remote app was that Switch Control users were, for the most part, locked out of using the device. Yes, you can do quite a bit with Siri on the new Apple TV using only your voice, but Siri doesn’t work for everyone (it does not always recognize voice commands if the person using the remote has a speech difficulty or a strong accent, and it can’t be used to enter passwords).

Fortunately, this limitation was addressed with the latest update for the 4th Generation Apple TV (tvOS 9.1). Now it is possible to use the Remote app on an iOS device to control the Apple TV (you must turn on Home Sharing and be on the same Wi-Fi network for this to work). Since the Remote app is optimized to work with Switch Control, this means that switch users can now control the Apple TV just like any other user. In Item Mode, Switch Control will recognize hotspots in the touch area located in the middle of the Remote app interface. By selecting these hotspots, the switch user can navigate in any direction and launch apps or interact with any of the controls on the Apple TV.

Screenshot of iOS Remote app interface showing a hotspot for moving the focus on Apple TV.

The addition of Remote app support is not only a great convenience for all users of the Apple TV; it also addresses a major omission in the otherwise excellent support for accessibility on the new Apple TV, which I have covered in a number of recent posts.

Media and Interface Options on 4th Generation Apple TV

As I finish out this series of tutorials on the 4th Generation Apple TV, I want to focus on the options for customizing the playback of media and the appearance of the interface. As shown in the video, the captions and subtitles feature makes great use of the Siri remote capabilities on the 4th Generation Apple TV: you can either enable captions for the rest of the program (“Turn on closed captions”), or you can enable them for a short time if you have missed something (just say “What did he/she say?” and, after the video rewinds a few seconds, the captions will come on briefly to help you catch what you missed).

Of course, this feature will only work if the content creators have made captions available. TED Talks is one channel that does, making the great presentations on their channel accessible to a wider audience (yay for TED Talks!).

Apple TV also supports audio descriptions, which narrate the action in a video for viewers who are unable to see it. Audio descriptions can be enabled in the same Media section of the Accessibility settings where Captions and Subtitles are found. As with captions, audio descriptions will only work if the content creator has made them available.

In addition to the ways in which viewers can customize media playback, tvOS includes a number of options for customizing the interface: bold text, reduce motion, reduce transparency, and focus style, which adds an outline around the currently selected item.