Yes, I Am Disabled (and Proud)!

Note: This post is in response to a recent tweet by Katie Novak. For context, make sure to read Katie’s original tweet and her blog post, as well as the many excellent responses that have followed.

For many people, the title of this post would not make sense. They would have a difficult time understanding how I could be proud of something (my disability) that is often seen through such a negative lens in our society. When we think of disability, we often focus on the challenges it presents to people. Rarely do we consider its positive aspects. Yes, there are challenges, and in no way am I implying that we don’t live in an ableist society where disabled people still face a range of obstacles in their efforts to secure civil rights and the opportunity to live a good life. But at the same time, disability as a natural part of the human experience changes our lives in a number of positive ways. For me, the most positive aspect of having a disability is the community, the many wonderful people I’ve met who continue the fight and speak up for themselves with dignity and grace in ways big and small. I am grateful for those who have come before me and made civil rights for disabled people possible. Without them, I would not be here today.

In school we are taught about the civil rights movement as if it were ancient history – something that took place in the now distant past, a time when people of color and women fought for their rights and secured important victories that have shaped our society to this day. Rarely do we think about the civil rights movement for disabled people, or what some have called the “civil rights struggle of the 21st century.” This is not ancient history – it’s going on today as we speak, and in many different ways: from our efforts to get access to an education, to those aimed at securing good jobs, accessible transportation options and the ability to participate in the digital world on an equal basis.

It might seem like discussing language in that context is focusing on the wrong thing. Why focus on language (disabled vs. person with a disability) when there are bigger problems that deserve our attention? Well, language is how we define who is in and who is out. It explains why, of all places, I would feel the most excluded at an event meant to celebrate inclusive education. When a speaker says something like “I don’t see disability” (yes, this actually happens) it sends people like me a powerful message – I don’t see you; you are invisible to me. Implying that I don’t have a disability is essentially saying that the last 20 years of my life did not take place. My disability is not just something I have, it is who I am in the world. I see the failure to fully recognize people’s identities as a blind spot within the Universal Design for Learning (UDL) movement (if you’ll pardon the pun coming from a person with a visual impairment). That is why the conversation Katie’s tweet and blog post started is so important.

Having grown out of the universal design movement in architecture, UDL approaches disability from a social model perspective. Sure, this is a better approach than the medical model that dominated the conversation on disability for most of our history. That model is the one championed by the medical profession and by charities. Disabled people were (and often still are) seen as broken beings who need to be made whole through surgery, medication and other interventions. In order to raise the funds needed to research “cures” for disabilities such as autism, charities grounded in the medical model rely on skewed portrayals of disabled people. You are either a helpless victim who needs saving or a “superman/superwoman” who can climb Mount Everest, run an ultra marathon or perform some other exceptional achievement. There is not much of an in-between. Mostly ignored are the majority of people with a given condition who just want to be able to do the same things as everyone else: get up in the morning, have a good cup of coffee, go to a job they love that pays them a fair wage, get home to take care of their kids, and sit down to watch a good movie.

The social model of disability, in contrast, does not focus on “fixing” the person, but rather on addressing the barriers that exist in the environment and keep people from accomplishing their goals, whether it is boarding a bus or learning how to code. Thanks to the social model, we have accessible entrances to buildings that benefit not just disabled people, but those making deliveries, parents pushing strollers and more. Angela Glover Blackwell calls that the “curb cut effect.” When we design for the needs of one, we actually find solutions for the many. Or as Microsoft puts it in its Inclusive Design Toolkit, “solve for one, extend to many.” I am incredibly grateful for the many innovations the social model of disability has brought about that have made my life easier: from the technology I now use to access the Web and compose this blog post to the UDL principles that have improved educational opportunity for me and many other learners.

The success of the social model of disability is actually why disabled folks like me are now in a position to critique it and point out where it falls short. One way in which it does is by failing to fully account for people’s lived experiences. Even if you were to create an accessibility utopia where few barriers existed, that would not erase the last twenty or so years of the life that I have lived as a disabled person. And neither would I want that to happen, as my life would not be as rich and as meaningful without the experiences I have had (both good and bad) from the time I was first diagnosed with my visual impairment to where I am today. And yet that is what often happens within the small community of UDL. I notice that there is a discomfort around the use of the word “disability” (I joke that it is the four-letter word of UDL). I think that tension arises in part from the need to “sell” UDL to the broader education community, in the process forgetting that the movement originated out of a need to do what was right and just (including learners in the margins), not what was popular.

I think we can find a balance where we make UDL welcoming to both the general public and the disability community. The change has to start with the language we use, and I have a few suggestions:

  • Be comfortable with including the terms “disability” and “disabled” in our conversations. Doing so is incredibly validating to those of us who have a lived experience with disability. Whenever you avoid something, you are actually saying something about it. You are saying “this is so bad I don’t even want to talk about it.” Silence is not neutral – it always “says” something. We can talk about learner variability and disability. The two are not mutually exclusive.
  • As a movement that is built on the importance of providing choice and flexibility, we should be comfortable using both person-first and identity-first language as appropriate. Person-first language (e.g. person with a disability, person with autism) is well intentioned in that it seeks to emphasize the worth of disabled people by putting the person in front of the disability (thus the term person-first language). But we don’t need other people to give us self-worth. We can gain our own self-worth if given the opportunity to live life on our own terms.
  • We need to be clear with our language to avoid misunderstandings. What I propose is that we start by sharing our own labels in relation to disability. There is precedent for this: you have probably seen speakers who share their pronouns with the audience in order to be clear about their gender identity (mine are he/him). We can do the same thing for disability if at the start of our presentations or professional development sessions we share what we call ourselves: person with a disability, disabled person or ally are just a few of the options. We should then explain why we have chosen the labels we use for ourselves, and give other people permission to use theirs. I know this suggestion will be controversial. There is a resistance to labels within the UDL movement (you’ve probably heard expressions such as “labels only belong on clothes”). However, I think labels need to be considered in context. It always depends on who is doing the labeling and why. If an outsider is using a label to demean or lessen the worth of a person, then yes, that is problematic. But if insiders (members of a community) choose to use a label for themselves, then it can be empowering and a way to build community.

I personally consider myself a disabled person. While disability is not all of who I am, it is a big part of my identity. My disability is not something separate from who I am (which the term “person with a disability” implies). It shapes my world from the moment I open my eyes in the morning to the time I close them at night. It is not like a suitcase that I get to put down whenever I want to.

There are times when I can “pass.” When I am sitting in front of the computer or at a meeting, most people can’t tell that I have a disability. After all, “I don’t look blind.” My eyes don’t look or act any differently than those of most sighted people. But the moment it is time to go and I have to take out my cane, the jig is up. I don’t always have the option of hiding my disability, and neither do I want to. It took me some time to come to terms with my disability and to get comfortable in my own skin. I know that is a journey without a fixed timetable. As with most experiences in life, we all process the events in our lives differently, and besides, it is not a race. That is why I take no offense if someone says he or she is a “person with a disability.” That’s fine with me because that’s where they are in their journey, and I was once at that point too.

When I was first diagnosed with my visual impairment, I spent countless hours researching my condition and reading research articles (which by the way were way beyond my understanding as someone who does not have a medical background). It was my hope at that time that there would be some kind of treatment or innovative surgery I could get that would “fix” everything and return me to my former life as a sighted person. I followed the medical model because that is how I was first exposed to disability, through my medical diagnosis. At that point I saw my disability as the taking of something – the taking of my independence (as I could no longer drive). I am quite open in sharing that it was not a great time in my life.

It was only when I started to embrace my disability, when I started to see that I could work with it instead of around it, that I started to thrive in my life, both professionally and personally. A turning point was learning about the work of John Swain and Sally French and their Affirmative Model of Disability, which completely changed my perspective and the way I saw myself in relation to my disability. It was no longer something I had to overcome, but rather something that I drew strength from, even with the occasional frustrations that come from living in a world that doesn’t always accommodate my needs.

I don’t have all the answers, and this post is just a reflection of my own always evolving way of thinking about disability and what it means to be disabled. I cannot speak for someone who has another kind of disability, or even a person with a visual impairment for that matter. There is great variation even within my own community. Some of us need large print, but in my case large print is actually a barrier since it places more of the text outside my limited peripheral vision.

I welcome your feedback on this post, both good and bad. By engaging in difficult conversations, we continue to grow and evolve. I am so grateful for everyone who has shared their experiences on Katie’s blog, especially Joni Degner and Eric Moore. You should really read their posts. By sharing our experiences we give others permission to do the same, and in doing so we build community.

In closing, I want to leave you with a Maya Angelou quote my colleague Mindy Johnson shared with me that now informs how I approach all of my work: “Do the best you can until you know better. Then when you know better, do better.” Let’s commit to always doing better and becoming more inclusive. And when we get it wrong (as we all will at some point), let’s commit to continuing these kinds of courageous conversations. We will be better for it.


I presented this poem at the 2018 ISTE Conference during my TED Ed talk:

Neither here nor there
Neither blind nor sighted
I see you, but not all of you
You see me, but not all of me

Ni aquí, ni allá (Neither here nor there)
The islands, the city, the country
Español, Spanglish, English
Y hoy, ¿quién soy? (And today, who am I?)
Hay, ¿quién sabe? (Well, who knows?)

So I learn to live in between
In and out of the shadows
And as the light turns to dark
And the darkness comes to life
I’ve learned to just dance
Just dance in those shadows

Designed for (fill in the blank)

On the occasion of Global Accessibility Awareness Day (GAAD), Apple has created a series of videos highlighting the many ways its iOS devices empower individuals with disabilities to accomplish a variety of goals, from parenting to releasing a new album for a rock band. Each of the videos ends with the tagline “Designed for” followed by the name of the person starring in the video, closing with “Designed for Everyone.” In this brief post, I want to highlight some of the ways in which this is in fact true. Beyond the more specialized features highlighted in the videos (a speech generating app, the VoiceOver screen reader, Made for iPhone hearing aids and Switch Control), there are many other Apple accessibility features that can help everyone, not just people with disabilities:

  • Invert Colors: found under Accessibility > Display Accommodations, this feature was originally intended for people with low vision who need a higher contrast display. However, the higher contrast Invert Colors provides can be helpful in a variety of other situations. One that comes to mind is trying to read on a touch screen while outdoors in bright lighting. The increased contrast provided by Invert Colors can make the text stand out more from the washed out display in that kind of scenario.
  • Zoom: this is another feature that was originally designed for people with low vision, but it can also be a great tool for teaching. You can use Zoom to not only make the content easier to read for the person “in the last row” in any kind of large space, but also to highlight important information. I often will Zoom In (see what I did there, it’s the title of one of my books) on a specific app or control while delivering technology instruction live or on a video tutorial or webinar. Another use is for hide and reveal activities, where you first zoom into the prompt, give students some “thinking time” and then slide to reveal the part of the screen with the answer.
  • Magnifier: need to read the microscopic serial number on a new device, or the expiration date on that medicine you bought years ago and are not sure is still safe to take? No problem, Magnifier (new in iOS 10) to the rescue. A triple-click of the Home button will bring up an interface familiar to anyone who has taken a photo on an iOS device. Using the full resolution of the camera, you can not only zoom into the desired text, but also apply a color filter and even freeze the image for a better look.
  • Closed Captions: although originally developed to support the Deaf and hard of hearing communities, closed captions are probably the best example of universal design on iOS. Closed captions can also help individuals who speak English as a second language, as well as those who are learning how to read (by providing the reinforcement of hearing as well as seeing the words for true multimodal learning). They can also help make the information accessible in any kind of loud environment (a busy lobby, airport, bar or restaurant) where consuming the content has to be done without the benefit of the audio. Finally, closed captions can help when the audio quality is low due to the age of the film, or when the speaker has a thick accent. On Apple TV, there is an option to automatically rewind the video a few seconds and temporarily turn on the closed captions for the audio you just missed. Just say “what did he/she say?” into the Apple TV remote.
  • Speak Screen: this feature, found under Accessibility > Speech, is meant to help people with vision or reading difficulties, but the convenience it provides can help in any situation where looking at the screen is not possible – one good example is while driving. You can open up a news article in your favorite app that supports Speak Screen while at a stop light, then perform the special gesture (a two-finger swipe from the top of the screen) to hear that story read aloud while you drive. At the next stop light, you can perform the gesture again and in this way catch up with all the news on your way to work! On the Mac, you can even save the output of the text to speech feature as an audio file. One way you could use this audio is to record instructions for any activity that requires you to perform steps in sequence – your own coach in your pocket, if you will!
  • AssistiveTouch: you don’t need to have a motor difficulty to use AssistiveTouch. Just having your device locked into a protective case can pose a problem this feature can solve. With AssistiveTouch, you can bring up onscreen options for buttons that are difficult to reach due to the design of the case or stand. With a case I use for video capture (the iOgrapher) AssistiveTouch is actually required by design. To ensure light doesn’t leak into the lens the designers of this great case covered up the sleep/wake button. The only way to lock the iPad screen after you are done filming is to select the “lock screen” option in AssistiveTouch. Finally, AssistiveTouch can be helpful with older phones with a failing Home button.

While all of these features live in the Accessibility area of Settings, they are really “designed for everyone.” Sometimes the problem is not your own physical or cognitive limitations, but the constraints imposed by the environment or the situation in which the technology use takes place.

How about you? Are there any other ways you are using the accessibility features to make your life easier even if you don’t have a disability?

3 Ways the iPhone Has Disrupted My Life for the Better

The 10th anniversary of the iPhone announcement in 2007 was mentioned on a number of podcasts I listen to this past week, and this got me into a reflective mood. I can remember vividly where I was when the announcement took place. At the time I was a graduate student at the University of South Florida, and I watched the announcement on the big screen in the iTeach Lounge where I worked as a graduate assistant.

I must admit that at first I was a bit skeptical. The first version of the iPhone was pretty expensive, and it took me a year after the launch to decide that I wanted to get in on the fun. If I remember correctly, it cost me $399 for 8GB of storage when I bought my first iPhone from Cingular Wireless (remember them?). As cool as that first iPhone was, it took two important developments to make me a true believer. The first one was the release of the App Store in 2008, which opened up a world of possibilities limited only by developers’ imaginations. The second was the accessibility support announced with the release of the iPhone 3GS. After my first iPhone contract with Cingular was up, I actually returned to a traditional flip phone for a little while for my next phone. Once the accessibility support was announced, though, I was locked in. I have been an iPhone owner ever since.

In addition to the App Store and the built-in accessibility support, there are three other important ways in which the iPhone has disrupted my life in significant ways that go beyond just being able to have access to information and communication on the go.

A Better Set of Eyes

The iPhone couldn’t have come at a better time for me. At the time, my vision loss was getting to the point where using a traditional DSLR camera was becoming harder and harder. As I detailed in an article for the National Federation of the Blind’s Future Reflections magazine, the built-in accessibility features of the iPhone have allowed me to continue with my passion for capturing the beauty in the world around me. The way I see it, the iPhone is now “a better set of eyes” for me. Most of the time, I can’t be sure that I have actually captured a decent image when I aim the phone at a scene. It is not until later, when I am reviewing the images more carefully at home, that I notice small details I didn’t even know were in the frame. You can see some examples of my photography on my Instagram page.

Instagram collage showing best nine images of 2016.

Going forward, this idea of the iPhone as a “better set of eyes” is going to be important to me beyond photography. As my vision loss progresses, I will be able to rely on the iPhone’s ever improving camera to recognize currency, capture and read aloud the text in menus, business cards and more, and tell me if my clothes are exactly the color I intended. I have no doubt that “computer vision” will continue to get better, and this gives me hope for the future. Already, the VoiceOver screen reader can recognize some objects in your images and describe them aloud. This technology was developed to make searching through large image libraries more efficient, but it will be helpful to people with visual impairments like me as well.

Independence at the Touch of a Button

The second major way the iPhone has disrupted my life for the better is by giving me back my independence in a big way, through apps such as Uber and Lyft. Now, I know you can use these apps on other smartphones, so they are not exclusive to the iPhone. However, when you really think about it, no iPhone means no App Store. No App Store means there is no incentive for other companies to copy what Apple did.

Uber has replaced the many frustrations I had with public transportation and taxis (lateness, high fares) with a much more convenient and less expensive solution. Yes, I know some of my blind friends have had a number of issues with Uber (such as outright discrimination from drivers who are not comfortable with a guide dog in their vehicles), but this would probably happen with taxicabs too.

My own experience with Uber has been mostly positive, and the service allows me to easily get to doctor’s appointments, and provides me with a reliable way to get to the airport so that I can do my work of spreading the message of accessibility and inclusive design for education to a broader audience beyond my local area. Uber and Lyft, and the iPhone as the platform that made them possible, have really opened up the world to me.

Can You Hear Me Now?

One of the big trends at the Consumer Electronics Show (CES) this year was the presence of Alexa, Amazon’s voice assistant, on all kinds of consumer appliances. Alexa joins Apple’s Siri, Microsoft’s Cortana and Google’s Assistant in heralding a future where voice and speech recognition replace the mouse and the touch screen as the primary input methods for our computing devices. We are not quite there yet, but the accuracy of these services will continue to improve and I am already seeing the potential with some of the home automation functions that are possible with the existing implementations (having my lights be automatically turned on when I arrive at home, for example).

Here, again, the iPhone deserves quite a bit of credit. The release of Siri as part of the iPhone 4S in 2011 brought the idea of speech recognition and voice control to the mainstream. Previously, its use was limited mostly to individuals with motor difficulties or niche markets like the medical and legal transcription fields. Siri helped popularize this method of voice interaction and made it more user friendly (remember when you had to sit for several minutes training speech recognition software to recognize just your voice?).

Looking Ahead

The smartphone is a mature technology, and some have questioned whether it has reached its apex and will soon give way to other options for content consumption and communication. One possibility would involve virtual, augmented or even mixed reality. Given the visual nature of AR and VR, this gives me some cause for concern, just like I had at the release of the iPhone back in 2007. However, just like Apple took a slab of glass and made it accessible when few people thought it could, with some creativity we can make AR and VR accessible too.

We have come a long way in just 10 years (sometimes I find it hard to remember that it has only been that long). In that time, Apple has shown that “inclusion promotes innovation.” Accessible touch screens, voice controlled assistants and ride sharing services are just a few of the innovations that have developed within an accessible ecosystem that started with the iPhone. Thank you Apple, and congrats on the 10th anniversary of the iPhone. Here’s to the next 10, 20 or 30 years of innovation and inclusion.



Zoom In Is Back on the iBooks Store

My multi-touch book on low vision supports has been updated to reflect the releases of iOS 10, watchOS 3 and tvOS 10. You will notice that I did not mention macOS Sierra, which was also recently updated. I have decided to streamline the book a bit in order to make the file smaller and allow me to focus on a couple of sections I have wanted to add for a while: videos showcasing the impact of the technology on the lives of people with vision impairments, and a series of accessibility challenges to help readers test their skills with the various features covered in the book. Focusing on iOS (and its varieties) will also allow me to more closely follow the release schedule from Apple (macOS is usually released separately from iOS and its related hardware).

Cover of Zoom In: Vision Supports on iOS Devices, Apple Watch and Apple TV

In addition to the content being completely updated to reflect the latest version of each OS, I have added four new videos covering Magnifier (my favorite new feature in iOS 10), Display Accommodations, Typing Feedback (and the new highlight options for text to speech), and the Pronunciation Editor for VoiceOver. I have also done a thorough update of the Apps and Accessories sections based on the ones I have tried and found helpful since the last update of the book.

If you like the book (and want to show your support for my work in general), reviews and ratings on the iBooks Store are always appreciated. The update is free if you have purchased the book earlier – $4.99 (one trip to Starbucks) otherwise.

5 Easy Accessibility Tips for Book Creator Authors

On the occasion of the Book Creator Chat (#BookCreator) focusing on accessibility, this post presents five easy-to-implement accessibility tips for Book Creator authors. By taking the time to consider the variability of readers from the start, you can ensure your books work for more of your potential audience.

1: Choose Text Size and Fonts Wisely

While Book Creator exports to the industry standard ePub format, the kind of ePub document it creates is of the fixed-layout variety. This means that readers are not able to resize the text or change its appearance when they open the book in iBooks (yes, they can use the Zoom feature to magnify what is shown on the screen and Invert Colors to enable a high contrast view, but not everyone is familiar with these iOS accessibility features). At a minimum, I would recommend a text size of 24px as a good starting point to ensure the text is large enough to be easily read without too much effort.

When it comes to processing the text, some readers may have dyslexia or other reading difficulties. While there are special fonts for dyslexic readers that can be installed on the iPad, there is limited research on their impact on reading speed and comprehension.

Instead, the consensus appears to be that clean sans-serif fonts, which are good for all readers, can also help readers who have dyslexia. In Book Creator, you can choose from a number of sans-serif fonts such as Cabin, Lato and Noto Sans, or you can use system fonts installed on your device such as Arial, Helvetica and Verdana. You should definitely avoid fonts in the Handwriting and Fun categories, as these are more difficult to decode even for people who do not have dyslexia.

Other tips for improving legibility include:

  • Left justify text. Fully justified text can result in large gaps in the text that can be distracting to readers who have dyslexia.
  • Use bolding (instead of italics or ALL CAPS) to highlight text. The latter are more difficult to decode.
  • Use shorter sentences and paragraphs.
  • Use visual aids to reinforce information in the text (but make sure to include accessibility descriptions as noted later in this post).
  • Use an off-white background. For some readers, an overly bright (all white) background can result in significant visual stress. To reduce this stress, you can choose a dimmer background color in Book Creator. With no item on the page selected, tap the Inspector (i) button and choose a page under Background, then tap More under Color. A color toward the bottom of the color picker should work well.

    Custom color picker in Book Creator with light yellow color selected.
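If you ever open the exported ePub and touch up its pages by hand, the recommendations in this tip translate into a few lines of CSS. A rough sketch (the selector and exact values here are illustrative of the guidance above, not Book Creator’s actual output):

```css
/* Legible defaults: large sans-serif text, left-aligned, on an off-white page */
body {
  background-color: #faf4e8; /* off-white reduces visual stress vs. pure white */
}
p {
  font-family: "Lato", Helvetica, sans-serif;
  font-size: 24px;     /* the minimum size recommended above */
  text-align: left;    /* avoid full justification and its distracting gaps */
}
```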

2. Add Descriptions to Images

Readers who are blind will rely on assistive technology (screen readers) to access the content in your books. Screen readers are only able to describe images to readers who are blind when they include a text alternative. Adding a text alternative is straightforward in Book Creator:

  1. With the image selected, tap the Inspector (i) button in the toolbar.
  2. Tap in the Accessibility field.
  3. Enter text that describes what the image represents rather than its appearance. WebAIM has an excellent article on how to create more effective alternative text for images.

    Accessibility field in the Book Creator Inspector.

    This video shows you how to add accessibility descriptions (alternative text) to images in Book Creator. 
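Under the hood, an ePub page is XHTML, and the text you enter in the Accessibility field becomes the image’s alternative text. Conceptually it works like this (the file name and description are made up for illustration):

```html
<!-- Good: describes what the image represents, not its appearance -->
<img src="chart.png"
     alt="Bar chart showing reading scores improving each year from 2014 to 2017" />

<!-- Bad: with no alternative text, a screen reader may announce
     only the file name, which tells the reader nothing -->
<img src="chart.png" />
```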

3. Create Descriptive Links

Some of your readers will be listening to the content because they are not able to see the display. They will be using a screen reader (VoiceOver on the iPad) to hear the text read aloud. When the screen reader comes across a link that reads as “click here” or “learn more,” the person listening to the content will not have sufficient information to determine if the link is worth following or not. Instead of using “click here” or “learn more” as the link text, select a descriptive phrase (“Learn more about adding accessibility descriptions”) and make that the link text – as with the following example:

How to add a hyperlink in Book Creator.
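In markup terms, the difference between the two approaches looks like this (the URL is a placeholder):

```html
<!-- Heard on its own by a screen reader user, this link text conveys nothing: -->
<a href="https://example.com/alt-text">Click here</a> to learn about image descriptions.

<!-- Descriptive link text makes sense even out of context: -->
<a href="https://example.com/alt-text">Learn more about adding accessibility descriptions</a>
```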


4. Supplement Text with Audio

While the iPad has built-in text to speech features (Speak Selection and Speak Screen) and the quality of the voice continues to improve, some readers will still prefer to hear an actual human voice reading the text. Fortunately, adding a recording of the text is an easy task in Book Creator:

  1. Tap the Add (+) button in the toolbar.
  2. Choose Add Sound.
  3. Tap the Start Recording button (the red disk).
  4. Read the text and tap the Stop Recording button when finished.
  5. Tap Yes to use the recording.
  6. Move the Speaker icon to the desired location on the page (it should be right below the corresponding text).

5. Remember Bits are Free!

The only limitation to the length of your book is the amount of storage on your device. Feel free to spread it out! Too much content on a single page can be overwhelming for some readers. A better approach is to use white space to present a clean layout with information organized into easy-to-digest chunks. This may require you to create more pages, but that’s ok – remember, bits are free!

One limitation of Book Creator, from an accessibility perspective, is that it removes the closed caption track when it recompresses videos to be included in a book. This means the content in those videos is not accessible to those who are Deaf or hard of hearing (or other readers, such as English Language Learners, who can also benefit from captions). My current workaround is to upload the videos to my YouTube channel and then edit the auto captions created by YouTube so that they are accurate. This is not an ideal solution, as it requires the reader to exit iBooks to view the video in another app (Safari or YouTube), but it is the best workaround I have for now.



SLIDE into Accessibility: 5 Tips for Making Learning Materials Work for Everyone

At this year’s ISTE conference, I was on a panel focusing on accessible educational materials (AEM). The panel was one of the activities sponsored by the ISTE Inclusive Learning Network, of which I am the Professional Learning Chair. I only had about 10 minutes to share some tips with our attendees, so I tried to convey them with an easy-to-remember mnemonic: SLIDE.

As a follow-up to that panel, I created this blog post. I hope you find it helpful and look forward to your feedback.

Note: Document accessibility is a complex topic. This is by no means a comprehensive guide, just a few tips to help educators get started by taking some easy steps toward making their content more accessible.

When it comes to making documents more accessible and useful for all learners, small changes can have a significant impact!

By following these tips, you will ensure all learners can enjoy access to information as a first step toward creating more inclusive learning environments.


SLIDE stands for:

  • Styles are used to reveal the structure of the information
  • Links are descriptive
  • Images include alternative text
  • Design is clear and predictable
  • Empathy and an ethic of care are a key focus


Styles

Properly marked-up headings are important for screen reader users, who can use a shortcut to quickly access a list of these headings and navigate to any section in the document (saving valuable time). For other readers, headings reveal the structure of the information and make the document easier to scan.

Thumbs Up
Select the desired heading text and choose from the styles menu in your authoring tool.

Thumbs Down
Choose formatting options such as making the text bigger and bold. The text will look like a heading but lack the proper markup.
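For readers who also author in HTML, the difference is easy to demonstrate. The following is an illustrative sketch (Python standard library, with made-up sample markup; it is not from the original post): it builds the same kind of heading list a screen reader user navigates by, and shows that text merely styled to look like a heading never appears in that list.

```python
from html.parser import HTMLParser

class HeadingOutline(HTMLParser):
    """Collects h1-h6 headings, roughly the way a screen reader's
    heading list lets a user jump from section to section."""
    def __init__(self):
        super().__init__()
        self.outline = []
        self._level = None  # heading level we are currently inside, if any

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self._level = int(tag[1])

    def handle_endtag(self, tag):
        if self._level and tag == f"h{self._level}":
            self._level = None

    def handle_data(self, data):
        if self._level:
            self.outline.append((self._level, data.strip()))

doc = """
<h1>SLIDE into Accessibility</h1>
<h2>Styles</h2>
<b>Looks like a heading</b>  <!-- styled only: never reaches the outline -->
<h2>Links</h2>
"""
parser = HeadingOutline()
parser.feed(doc)
for level, text in parser.outline:
    print("  " * (level - 1) + text)
```

The bold text is skipped entirely: only real heading markup makes it into the navigation list.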


Links

As with headings, screen reader users will often use a shortcut to bring up a list of the links in a document. Links need to be descriptive in order for them to make sense when they are accessed in this way, without the context of the surrounding text on the page.

Thumbs Up
Select some descriptive text and make that the link (see examples on this document).

Thumbs Down
Use generic phrases such as “click here” and “learn more” as the link text.
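To see why this matters, here is a small sketch (Python standard library, with invented sample links; not part of the original post) that collects link text the way a screen reader's links list does – each link announced by its text alone, out of context:

```python
from html.parser import HTMLParser

class LinkList(HTMLParser):
    """Collects link text the way a screen reader's links list
    presents it: stripped of all surrounding context."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._in_link = False

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._in_link = True
            self.links.append("")

    def handle_endtag(self, tag):
        if tag == "a":
            self._in_link = False

    def handle_data(self, data):
        if self._in_link:
            self.links[-1] += data

page = (
    '<p>For captions, <a href="/captions">click here</a>.</p>'
    '<p>For alt text, <a href="/alt">click here</a>.</p>'
    '<p><a href="/links">How to write descriptive link text</a></p>'
)
parser = LinkList()
parser.feed(page)
print(parser.links)
```

The two “click here” entries are indistinguishable in the list; only the descriptive link tells the listener where it leads.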



Images

Alternative text allows a screen reader to provide a description of an image to someone who is not able to see the screen.

Thumbs Up
Create a short description that focuses on the information conveyed by the image: e.g. “smiley face with thumbs up.”

Thumbs Down
Focus on the appearance of the image: e.g. “white circle with eyes and frown drawn inside.”

Note: Creating helpful alternative text is as much an art as it is a science. Much will depend on the context in which an image is used. WebAIM has some great resources that discuss the considerations for creating effective alternative text in more detail. 
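Authoring tools vary in how they expose missing alternative text, but in HTML the check itself is simple. This is an illustrative sketch (Python standard library, hypothetical image names; not from the original post) that flags images a screen reader cannot describe:

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Flags <img> tags with no alt attribute at all. Note that an
    explicitly empty alt="" is fine - it marks a decorative image -
    but a missing alt leaves the screen reader nothing to announce
    except, unhelpfully, the file name."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.missing.append(dict(attrs).get("src", "(no src)"))

page = (
    '<img src="thumbs-up.png" alt="smiley face with thumbs up">'
    '<img src="thumbs-down.png">'
)
audit = AltAudit()
audit.feed(page)
print(audit.missing)
```

Only the second image is flagged: the first already carries the short, information-focused description recommended above.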


Design

Through good design, you can reduce the amount of effort it takes your readers to process the information in a document, allowing them to focus on the meaning conveyed by the content rather than its presentation.

Some helpful design tips include:

  • Ensure sufficient contrast between the text and its background.
  • Use proximity and white space to make relationships clear: items that belong together should be close to each other and separated from other items by sufficient white space.
  • Use repetition to highlight patterns and build a cohesive whole.
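Of these tips, “sufficient contrast” is the one with a precise definition: WCAG 2.x defines a contrast ratio running from 1:1 (identical colors) to 21:1 (black on white) and asks for at least 4.5:1 for normal body text. As a rough illustrative sketch (Python, using the WCAG relative-luminance formula; the sample colors are my own), a color pair can be checked like this:

```python
def _linear(channel: int) -> float:
    """sRGB channel value (0-255) -> linear value, per WCAG 2.x."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb) -> float:
    """Relative luminance of an (r, g, b) color, 0.0 to 1.0."""
    r, g, b = (_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg) -> float:
    """WCAG contrast ratio between two colors, 1.0 to 21.0."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background: the maximum, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))        # 21.0
# Mid-grey (#777) on white: about 4.48:1, just short of the 4.5:1 AA minimum.
print(round(contrast_ratio((119, 119, 119), (255, 255, 255)), 2))
```

Free online checkers (such as WebAIM’s) do the same calculation; the point is simply that “sufficient contrast” is measurable, not a matter of taste.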


Empathy

Even more important than implementing these tips is changing your approach to design so that it reflects an ethic of care. Remember that not everyone reading your content can see, hear or process information as well as you. As you approach your work, try to think about the diversity in your potential audience. Doing so will allow your content to reach more readers and have a greater impact!

According to the U.S. Census:

  • 1 in 5 Americans reports having a disability.
  • For Americans over 65, that figure is 40%.

Accessible content will not only benefit other people. As you age, your own ability to see, hear and process content may be affected. When you create accessible content, you are also designing for your “future self.”