How I do my photography in 2022 (A photo essay)

I continue to enjoy photography as much as I did when I last posted to this blog back in 2018. However, the way I go about doing my photography has changed dramatically in the last few years. In this post, I will summarize what I have learned over that time as someone pursuing photography while experiencing the world with a significant visual impairment.

I would estimate that 80-90% of my photos used to be taken with an iPhone. The biggest reason for choosing to shoot with an iPhone is the excellent support for accessibility features that continue to get better with each iteration of iOS. Also, “the best camera is the one you have with you,” and my iPhone is never too far away, ready to capture the moment, even when it’s just my pup Bruno taking a nap.

A small poodle mix dog lying on top of a red blanket taking a nap

I still rely on my iPhone (now an iPhone 14 Pro) to take many of my photos, but the pandemic made me want more out of my photography. I’m happy to continue taking candid photos as well as landscapes that benefit from the wide lens of the iPhone, but one thing the iPhone lacks is reach (even with the 3X telephoto lens on my model).

A red bellied woodpecker grabbing on to the trunk of a moss covered tree in an upside down position.
Sometimes you get lucky, as with this photo I took with my iPhone when a red bellied woodpecker landed near us at the park.

I would look at the wildlife photographers, with their huge lenses, and wish I could do what they were doing. The pandemic gave me the push I needed to not just wish I could do that kind of photography, but to actually make it happen. As someone with a progressive visual impairment, I have always felt a sense of urgency when it comes to experiencing the world with my eyes while I can. It’s complicated. I’m not saying that my experience will be lesser if I completely lose my vision, but let’s be honest for a second, it will be different. And besides my worsening vision, there is the fact that entering my 50s, no aspect of my health is guaranteed – the pandemic was a reminder of that. I want to do more photography while I can still move about with good legs and a good back, not just a serviceable set of eyes.

With those complicated feelings in the background, I set out to take my photography to the next level over the last four years. In this post, I will discuss every aspect of what I’m doing, from the gear that I’m now using to the workflow I use for capturing and editing the photos I share on social media.

Before I get to the details, I want to take a minute to share my gratitude for my partner Cindy. Without her, none of this would be possible. Fortunately for me, she shares my passion for photography and the outdoors, and we make a good team. She makes it easier for me to get to the places I want to photograph, and she’s also my spotter. Without her help, there is little chance I would notice much of the wildlife, which moves quickly and blends in so well with the surroundings in the wonderful parks where we live. A supportive partner is key, and I’m so grateful to have that.

Ok, now on to the details…

The Gear

The biggest change I’ve made is in my choice of gear. I still like my Nikon D3100. It’s great for flower photos and portraits, but it doesn’t have the reach or speed I need for any kind of wildlife photography. The reach is important for me for a variety of reasons. One, the Florida wildlife can be unforgiving, and I don’t want to be gator bait. Two, I can’t go too far off road if I want to be safe – I just can’t see tree roots and other obstacles – so it’s best that I stay on the trail or boardwalk when one is available and use the long reach of a telephoto lens to capture the wildlife from a safe distance.

Although it was a big expense, the first thing I did during the pandemic was not take up gardening or learn how to make sourdough bread. No, I purchased my first full frame camera and a long telephoto lens. My current setup consists of:

  • Sony A7 III full frame mirrorless camera
  • Sigma 100-400mm telephoto zoom lens
  • Sigma 150-600mm telephoto zoom lens (“the beast” – more on that below)
  • iFootage Cobra carbon fiber monopod
  • PGYTECH OneMo Camera Backpack 25L

It was quite an investment to put together this kit, but I can say without any doubt that it was worth it!

A group of white pelicans resting on a sandbar.
This photo is only possible with a long lens. These white pelicans are protected, and you have to keep a minimum distance from their resting area on the sandbar at Fort De Soto.

The A7 III has a number of features that make it ideal for me:

  • Amazing autofocus: my old DSLR only had a few focus points, but the Sony A7 III has almost 700! It also has face and eye detection, and the latter can be set to find the eyes of animals. It’s not perfect, but for larger birds (which are easier for me to photograph anyway) it works well enough, and I need all the help I can get when it comes to focusing.
  • Burst rate of 10 fps (frames per second). This is important because I often don’t really “see” what I’m photographing. Capturing a fast-moving bird is challenging enough with good eyesight, and even more so when you can’t track it due to a visual impairment.
  • 24 megapixels. This is key because I often have to shoot a little wider than I would like to in order to make sure I don’t cut off an essential element of the wildlife and its surrounding environment. With more megapixels and a full frame sensor, I can do some serious cropping and still retain pretty good image quality. I already did this quite a bit with the iPhone, but with the full frame camera I have even more leeway in what I can do in post processing to “make the image.”
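To give a rough sense of that leeway, here is a quick back-of-the-envelope sketch (my own illustration; the 6000 x 4000 pixel figure is the approximate size of an A7 III file):

```python
# Rough illustration of cropping leeway on a ~24 MP full frame file.
# Numbers are approximate and for illustration only.
full_width, full_height = 6000, 4000  # ~24 MP output from the Sony A7 III

for keep in (1.0, 0.75, 0.5, 0.33):   # fraction of each linear dimension kept after the crop
    w, h = int(full_width * keep), int(full_height * keep)
    print(f"keep {keep:.0%} per side -> {w} x {h} pixels ({w * h / 1e6:.1f} MP)")
```

Even a crop that keeps only half of each dimension still leaves about 6 megapixels, which is plenty for sharing online.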

Not long after I purchased the new camera, Sony released a new model, the A7 IV, and guess what? That camera has a screen reader! How helpful would that be! I could sell my current camera and upgrade (which is a pain because you lose so much money). It would be nice if Sony would just add the screen reader functionality to the older camera with a firmware update. I’m not holding my breath on that one (and there may be technical reasons why it’s not possible).

If anyone from Sony is reading this, please reach out – I have some suggestions for how to make the experience better for people with low vision, starting with adding an option for changing the colors and the thickness of some of the visual indicators. If I can spend less time making out what my camera is trying to tell me, that’s more time I can spend shooting.

As for the lens, the Sigma 100-400 was a definite improvement over my older setup, but once I experienced the clear photos I was getting from my new camera and lens combo, I was hooked! I needed more! That brings me to my latest toy – the beast! I’ve nicknamed my Sigma 150-600 that because it weighs almost five pounds. Walking around with it for a few hours provides all the exercise I need on the weekends. I balance it all on an iFootage Cobra carbon fiber monopod. I use a monopod because it allows me to be fairly agile with the big lens. I carry everything in a PGYTECH OneMo Camera Backpack 25L. I like that it came with a separate shoulder bag I can use when I just want to take the 100-400mm lens for a lighter setup.

Luis posing with his camera and a long lens resting on a monopod. A lake and woods appear in the background.

I just added a Wimberley MH-100 mono gimbal head to my kit to make it easier to pan up when I want to shoot birds perched high in the trees, or to follow a bird in flight (that last one is a real stretch for me, but a man can dream).

As for settings, during the pandemic I finally mustered the courage to take the camera out of the automatic modes. Inspired by a webinar from Matt Kloskowski I found through a Facebook link (glad I clicked on that one), I shoot in manual – ok, manual-ish. Let me explain:

  • Aperture is set to the lowest F number my lens will allow when it is at its maximum reach of 600mm. I never change this setting because long telephoto lenses like the one I use need all the light they can get.
  • ISO – this is set to auto so that I don’t have to worry about it as I go through changing light conditions while out in the field. I let the camera do its thing when it comes to this setting.
  • Shutter speed – this is the one setting I play with. I typically follow the rule of using a shutter speed of at least 1 over the focal length of the lens (so 1/600 or faster at 600mm; I usually use 1/800 for most slow-moving birds such as herons and egrets).

It’s manual-ish because I’m only really controlling one variable, leaving another set at the same value most of the time, and letting the camera handle the third leg of the exposure triangle. Exposure is the area where I have lots of room for growth.
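For anyone who likes to see that rule written out, here is a tiny sketch of the arithmetic (illustrative numbers only, encoding nothing more than the reciprocal rule I follow):

```python
# Reciprocal rule of thumb: shutter speed should be at least 1 / (focal length in mm).
def minimum_shutter_speed(focal_length_mm: int) -> float:
    """Return the slowest shutter speed (in seconds) the rule of thumb suggests."""
    return 1.0 / focal_length_mm

focal_length = 600                           # Sigma 150-600 zoomed all the way in
floor = minimum_shutter_speed(focal_length)  # 1/600 of a second
print(f"Rule-of-thumb floor at {focal_length}mm: 1/{focal_length} s ({floor:.4f} s)")
print("My usual choice for slow waders: 1/800 s, a bit faster than that floor")
```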

For the focusing, I shoot continuous (I don’t think I’ve ever changed this setting since I got the camera), with the drive mode set to high-speed burst to take multiple shots each time I press the shutter. My focus mode is usually Expand Flexible Spot, and I have the focus point indicator set just above center because I have a hard time seeing it out in the field. I used to do back button focus (where you focus with one button and take the photo with another), but I now keep things simple by using the shutter button for both focusing and taking the photo.

Another great tip from Matt Kloskowski (a fellow Tampanian by the way) – get low! How low? As low as you can go.

Dunlin, a small wading bird captured moving along the tide pool at low tide. It is a white and tan bird with a long, pointed beak.
I got real low to capture this dunlin as it moved about looking for food in the tide pool at low tide.

You get beautiful bokeh (the dreamy, out of focus background) by increasing the distance between your subject and the background. Combined with the long telephoto lens, this will obliterate the background so that it can’t be made out and distract from your subject. If you can’t get low due to bad knees, here’s a trick: use a tripod camping chair. I do this a lot, especially if the ground is sketchy (I’ve been bitten by random bugs a few times, no fun). Another trick I’ll often use is to hold the camera low with the LCD screen flipped up so that I’m looking down on it while I steady the camera with the monopod. It works if it’s not too sunny out, especially when shooting shore birds like sandpipers and plovers. The LCD screen on the A7 III is actually pretty good in terms of brightness, and I find that sometimes it helps me find the subject I want to capture. I’ll move the camera around a bit until I see a change in light on the LCD – not sure how to explain it well, but it works for me.

The Workflow

Any kind of photography I do, whether with an iPhone or a traditional camera, is only made possible by digital media. A typical outing for me involves taking around 500 photos. Of those, I may choose 4-5 that are in focus and where I did not completely swing and miss with the composition and cut something off. It’s a numbers game for me. Now imagine going to the local CVS and trying to develop 500 photos at a time. That would be an even more expensive hobby and likely not possible for me to sustain. With digital, pixels are somewhat free (cost of the camera and lenses aside). I can take as many shots as I need and play the numbers game, even if my keeper rate would make for a really bad batting average.

American Kestrel perched at the end of a branch. It has orange and black banded back feathers and a mottled brown front. The head is light blue on top with black, orange and white vertical stripes on the sides.
This American Kestrel, captured at Fort De Soto, is a keeper. It was one of the birds on my life list. The photo is a bit backlit, but you can’t exactly give a bird instructions on where to pose.

The first step in my process is to move the photos from my camera to a mobile device with a better screen where I can pinch and zoom to check focus. It used to be an iPhone, but these days it’s also an iPad Air.

I will perform most of my basic edits in the stock Photos app: cropping, straightening, highlights/shadows, etc. At this point, I’ll mark the five or so photos I want to work on some more (the “keepers”) as favorites. I will then transfer those to my computer over AirDrop.

On the computer, the only thing I do on each photo is run it through Topaz Denoise AI to remove some of the noise (grain). That software is like magic! Another Matt Kloskowski recommendation that panned out really well.

Female cardinal perched at the end of a branch with berries on it.
Another keeper with this female cardinal having some berries for a snack. You can see the grain in the background because this was taken in a shady area. Still like it though, and I cleaned it up a bit with Topaz later.

The rest

The last step is to share the photo on Facebook and Vero (no more Instagram for me – it’s too much like TikTok now). And that’s how 500 photos become 5 or 6. It’s not pretty, and it takes time, but I really enjoy the entire process (ok, maybe not the file management part – I’m now constantly playing a game with my iPhone and iPad to clear up space for more photos).

The gear, the settings, that’s only part of what goes into taking decent wildlife photos. The other part involves research. I spend a lot of time on the eBird app checking out what other birders have spotted in the area. I also follow a number of groups on Facebook for specific areas (Friends of Fort De Soto) or types of wildlife (mostly local birding groups). That helps me narrow down where I go on a given outing in hopes of increasing my chances of capturing a bird on my life list. If you’re curious, they are:

  • Barred owl – my nemesis, lol. I’ve spent more hours on this one than any other. The closest I’ve gotten is hearing the call a few times, but with no luck in finding the actual birds in the woods.
  • Belted kingfisher – this one will be tough. It’s a really small and fast bird that feeds by taking quick dives into the water.
  • Merlin (a small falcon) – I found the kestrel, so I’m pretty confident I will find this one at Fort De Soto too.
  • Cooper’s hawk – each time I’ve thought I had this bird, it turned out to be a different type of hawk (broad winged or red shouldered).
  • Red tailed hawk – I see lots of red shouldered ones, but not this one.

One bird I can now cross off the list: the bald eagle! I found a nest and finally got to see one in the wild, not at a zoo or rescue. The nest is on a cell phone tower that is well out of reach, so the photos I got are a little blurry, possibly due to the heat coming off the water.

Bald eagle in flight with wings spread out.

The other app I use is Merlin (like the falcon). It has really nice descriptions of each species along with all of their calls. Merlin also has an excellent AI-based feature that allows you to upload a photo and get an ID almost immediately with 90% accuracy. This helps me provide better descriptions when I share the photo online.

While out in the field, I will also use the app’s audio recognition feature to identify potential subjects. With many birds, such as owls, you are more likely to hear them before you see them (especially in my case). I can’t recommend these two free apps enough. They are an essential part of my kit, just like my camera and lenses.

Conclusion

In the end, you really have to like what you’re doing if you choose to do wildlife photography, because it takes a lot of patience. There’s lots of waiting involved.

Eastern bluebird perched on a branch. It has a brownish band across the chest with a white underside and blue wings that are barely visible.
We waited more than an hour for one of these birds to finally land on a perch where I could capture it without any branches in front of it. Worth it! It’s a beautiful female Eastern bluebird.

Then just when you’re getting ready to go home, a bird you’ve never seen before shows up and you almost miss the shot. It’s a lot like fishing, and just as expensive.

Osprey that just landed on top of a wooden post with a big fish on its talons. The osprey has its wings spread out.
One of those “nothing to see here, let’s go home” photos.

But the waiting also makes wildlife photography such a great activity for mental health. When taking the photos, you can’t rush. You have to not only settle in and wait until you get the bird (or other wildlife) to show its best side, but when you take the photo, you have to slow your breath and really focus so that you don’t introduce unnecessary shake that can result in a blurry photo. You have to be present and forget about everything that’s worrying you in that moment. That, along with the fact that it gets you out in nature where you can enjoy some sunlight and fresh air, makes it a great activity for addressing stress.

I think that covers everything I want to share in this post. Feel free to reach out if you have any questions (@eyeonaxs on Twitter).

Update: I just found out I can share my entire Vero gallery outside the app. I have also reached out to the Vero team to inquire about alternative text and how we can make that app more accessible. I have been using the comments to identify each bird as I post, but alternative text (along with the comments) is the ideal solution. I’ll update this post based on what I hear back from them.

Entre/Between

I presented this poem at the 2018 ISTE Conference during my TED Ed talk:

Neither here nor there
Neither blind nor sighted
I see you, but not all of you
You see me, but not all of me

Ni aquí, ni allá (Neither here nor there)
The islands, the city, the country
Español, Spanglish, English
Y hoy, ¿quién soy? (And today, who am I?)
Hay, ¿quién sabe? (Well, who knows?)

So I learn to live in between
In and out of the shadows
And as the light turns to dark
And the darkness comes to life
I’ve learned to just dance
Just dance in those shadows

Designed for (fill in the blank)

On the occasion of Global Accessibility Awareness Day (GAAD), Apple has created a series of videos highlighting the many ways its iOS devices empower individuals with disabilities to accomplish a variety of goals, from parenting to releasing a new album for a rock band. Each of the videos ends with the tagline “Designed for” followed by the name of the person starring in the video, closing with “Designed for Everyone.” In this brief post, I want to highlight some of the ways in which this is in fact true. Beyond the more specialized features highlighted in the videos (a speech generating app, the VoiceOver screen reader, Made for iPhone hearing aids and Switch Control), there are many other Apple accessibility features that can help everyone, not just people with disabilities:

  • Invert Colors: found under Accessibility > Display Accommodations, this feature was originally intended for people with low vision who need a higher contrast display. However, the higher contrast Invert Colors provides can be helpful in a variety of other situations. One that comes to mind is trying to read on a touch screen while outdoors in bright lighting. The increased contrast provided by Invert Colors can make the text stand out more from the washed out display in that kind of scenario.
  • Zoom: this is another feature that was originally designed for people with low vision, but it can also be a great tool for teaching. You can use Zoom to not only make the content easier to read for the person “in the last row” in any kind of large space, but also to highlight important information. I often will Zoom In (see what I did there, it’s the title of one of my books) on a specific app or control while delivering technology instruction live or on a video tutorial or webinar. Another use is for hide and reveal activities, where you first zoom into the prompt, give students some “thinking time” and then slide to reveal the part of the screen with the answer.
  • Magnifier: need to read the microscopic serial number on a new device, or the expiration date on that medicine you bought years ago and are not sure is still safe to take? No problem, Magnifier (new in iOS 10) to the rescue. A triple-click of the Home button will bring up an interface familiar to anyone who has taken a photo on an iOS device. Using the full resolution of the camera, you can not only zoom into the desired text, but also apply a color filter and even freeze the image for a better look.
  • Closed Captions: although originally developed to support the Deaf and hard of hearing communities, closed captions are probably the best example of universal design on iOS. Closed captions can also help individuals who speak English as a second language, as well as those who are learning how to read (by providing the reinforcement of hearing as well as seeing the words for true multimodal learning). They can also help make the information accessible in any kind of loud environment (a busy lobby, airport, bar or restaurant) where consuming the content has to be done without the benefit of the audio. Finally, closed captions can help when the audio quality is low due to the age of the film, or when the speaker has a thick accent. On Apple TV, there is an option to automatically rewind the video a few seconds and temporarily turn on the closed captions for the audio you just missed. Just say “what did he/she say?” into the Apple TV remote.
  • Speak Screen: this feature, found under Accessibility > Speech, is meant to help people with vision or reading difficulties, but the convenience it provides can help in any situation where looking at the screen is not possible – one good example is while driving. You can open up a news article in your favorite app that supports Speak Screen while at a stop light, then perform the special gesture (a two-finger swipe down from the top of the screen) to hear that story read aloud while you drive. At the next stop light, you can perform the gesture again and in this way catch up with all the news on your way to work! On the Mac, you can even save the output from the text to speech feature as an audio file. One way you could use this audio is to record instructions for any activity that requires you to perform steps in sequence – your own coach in your pocket, if you will! (I’ve included a small sketch of that idea right after this list.)
  • AssistiveTouch: you don’t need to have a motor difficulty to use AssistiveTouch. Just having your device locked into a protective case can pose a problem this feature can solve. With AssistiveTouch, you can bring up onscreen options for buttons that are difficult to reach due to the design of the case or stand. With a case I use for video capture (the iOgrapher), AssistiveTouch is actually required by design. To ensure light doesn’t leak into the lens, the designers of this great case covered up the sleep/wake button. The only way to lock the iPad screen after you are done filming is to select the “lock screen” option in AssistiveTouch. Finally, AssistiveTouch can be helpful on older phones with a failing Home button.
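As promised above, here is a rough sketch of the “coach in your pocket” idea using the Mac’s built-in say command (the voice name, file name and instructions below are just examples, not something from Apple’s documentation):

```python
# Render a set of spoken instructions to an audio file using macOS's built-in `say` command.
import subprocess

def speak_to_file(text: str, outfile: str = "instructions.aiff", voice: str = "Samantha") -> None:
    """Save `text` as a spoken audio file; available voices vary by system."""
    subprocess.run(["say", "-v", voice, "-o", outfile, text], check=True)

speak_to_file("Step one: lay out your gear. Step two: check the battery. Step three: pack the bag.")
```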

While all of these features live in the Accessibility area of Settings, they are really “designed for everyone.” Sometimes the problem is not your own physical or cognitive limitations, but the constraints imposed by the environment or the situation in which the technology is used.

How about you? Are there any other ways you are using the accessibility features to make your life easier even if you don’t have a disability?

3 Ways the iPhone Has Disrupted My Life for the Better

The 10th anniversary of the iPhone announcement in 2007 was mentioned on a number of podcasts I listen to this past week, and this got me into a reflective mood. I can remember vividly where I was when the announcement took place. At the time I was a graduate student at the University of South Florida, and I watched the announcement on the big screen in the iTeach Lounge where I worked as a graduate assistant.

I must admit that at first I was a bit skeptical. The first version of the iPhone was pretty expensive, and it took me a year after the launch to decide that I wanted to get in on the fun. If I remember correctly, it cost me $399 for 8GB of storage when I bought my first iPhone from Cingular Wireless (remember them?). As cool as that first iPhone was, it took two important developments to make me a true believer. The first was the release of the App Store in 2008, which opened up a world of possibilities limited only by developers’ imaginations. The second was the accessibility support announced with the release of the iPhone 3GS. After my first iPhone contract with Cingular was up, I actually returned to a traditional flip phone for a little while. Once the accessibility support was announced, though, I was locked in. I have been an iPhone owner ever since.

In addition to the App Store and the built-in accessibility support, there are three other important ways in which the iPhone has disrupted my life in significant ways that go beyond just being able to have access to information and communication on the go.

A Better Set of Eyes

The iPhone couldn’t have come at a better time for me. At the time, my vision loss was getting to the point where using a traditional DSLR camera was becoming harder and harder. As I detailed in an article for the National Federation of the Blind’s Future Reflections magazine, the built-in accessibility features of the iPhone have allowed me to continue with my passion for capturing the beauty in the world around me. The way I see it, the iPhone is now “a better set of eyes” for me. Most of the time, I can’t be sure that I have actually captured a decent image when I aim the phone at a scene. It is not until later, when I am reviewing the images more carefully at home, that I notice small details I didn’t even know were in the frame. You can see some examples of my photography on my Instagram page.

Instagram collage showing best nine images of 2016.

Going forward, this idea of the iPhone as my “best set of eyes” is going to be important to me beyond photography. As my vision loss progresses, I will be able to rely on the iPhone’s ever improving camera to recognize currency, capture and read aloud the text in menus, business cards and more, and tell me if my clothes are exactly the color I intended. I have no doubt that “computer vision” will continue to get better and this gives me hope for the future. Already, the VoiceOver screen reader can recognize some objects in your images and describe them aloud. This technology was developed to make searching through large image libraries more efficient, but it will be helpful to people with visual impairments like me as well.

Independence at the Touch of a Button

The second major way the iPhone has disrupted my life for the better is by giving me back my independence in a big way, through apps such as Uber and Lyft. Now, I know you can use these apps on other smartphones, so they are not exclusive to the iPhone. However, when you really think about it, no iPhone means no App Store. No App Store means there is no incentive for other companies to copy what Apple did.

Uber has replaced the many frustrations I had with public transportation and taxis (lateness, high fares) with a much more convenient and less expensive solution. Yes, I know some of my blind friends have had a number of issues with Uber (such as outright discrimination from drivers who are not comfortable with a guide dog in their vehicles), but this would probably happen with taxicabs too.

My own experience with Uber has been mostly positive, and the service allows me to easily get to doctor’s appointments, and provides me with a reliable way to get to the airport so that I can do my work of spreading the message of accessibility and inclusive design for education to a broader audience beyond my local area. Uber and Lyft, and the iPhone as the platform that made them possible, have really opened up the world to me.

Can You Hear Me Now?

One of the big trends at the Consumer Electronics Show (CES) this year was the presence of Alexa, Amazon’s voice assistant, on all kinds of consumer appliances. Alexa joins Apple’s Siri, Microsoft’s Cortana and Google’s Assistant in heralding a future where voice and speech recognition replace the mouse and the touch screen as the primary input methods for our computing devices. We are not quite there yet, but the accuracy of these services will continue to improve and I am already seeing the potential with some of the home automation functions that are possible with the existing implementations (having my lights be automatically turned on when I arrive at home, for example).

Here, again, the iPhone deserves quite a bit of credit. The release of Siri as part of the iPhone 4S in 2011 brought the idea of speech recognition and voice control to the mainstream. Previously, its use was limited mostly to individuals with motor difficulties or niche markets like the medical and legal transcription fields. Siri helped popularize this method of voice interaction and made it more user friendly (remember when you had to sit for several minutes training speech recognition software to recognize just your voice?).

Looking Ahead

The smartphone is a mature technology, and some have questioned whether it has reached its apex and will soon give way to other options for content consumption and communication. One possibility would involve virtual, augmented or even mixed reality. Given the visual nature of AR and VR, this gives me some cause for concern, just as the iPhone did back in 2007. However, just as Apple took a slab of glass and made it accessible when few people thought it could be done, with some creativity we can make AR and VR accessible too.

We have come a long way in just 10 years (sometimes I find it hard to remember that it has only been that long). In that time, Apple has shown that “inclusion promotes innovation.” Accessible touch screens, voice-controlled assistants, and ride sharing services are just a few of the innovations that have developed within an accessible ecosystem that started with the iPhone. Thank you Apple, and congrats on the 10th anniversary of the iPhone. Here’s to the next 10, 20 or 30 years of innovation and inclusion.

 

 

HazeOver as a low vision aid

HazeOver is a $4.99 Mac app marketed as a distraction aid. The idea is that it dims all other windows so you can focus on the content in the foreground window (a blog post like this one, a paper you are drafting for school, etc.). The developers have prepared a short demo video that shows how the app works.

 

While that may be a good way to use this utility, for me it has become a helpful low vision aid as well. I often have a difficult time finding the mouse cursor and popup windows if they are out of my field of view (currently about 7 or 8 degrees, depending on the day). I have been using Mousepose to help with the mouse cursor problem. Even with the mouse cursor set to the largest size Mac OS allows, I still have a difficult time locating it on the screen, especially with a dual monitor setup. I have found that the spotlight Mousepose puts around the mouse cursor when I press a special key (I have it set to F1) makes this task much easier.

HazeOver does pretty much the same thing, but for popup windows. When one of these windows pops up on the screen, the focus is assigned to it and all other windows are dimmed. In the HazeOver preferences, you can determine whether you want just one window to be highlighted or all front windows within the active app. I find the one window setting to be the most helpful with popups. You can adjust the level of dimming at any time using a slider that can be accessed by clicking the Menu Bar icon. For the best performance, HazeOver asks for access to Mac OS as an assistive device.

A free trial of HazeOver is available from the developer’s site if you want to try it out first before you buy it on the Mac App Store.

 

Read to Me in Book Creator 5

Read to Me Screen in Book Creator

Book Creator for iPad recently added a new Read to Me text to speech feature that allows learners to hear their books read aloud within the app (without having to transfer the book to iBooks first). The feature also supports accessibility in two other ways:

  • all embedded media can be played automatically. This is great for those with motor difficulties, and it also creates a better flow during reading (no need to stop and start the text to speech to hear the embedded media).
  • automatic page flips: again this is a great feature for those who can’t perform the page flip gesture to turn the pages in a book.

These options can be configured through a Settings pane where it is possible to change the voice (you can choose any voice available for Speech in iOS), slow it down, or remove the word by word highlighting that goes along with it. For better focus, it is also possible to show one page at a time by unchecking the “Side by side pages” option under Display.

I created a short video to show how the new feature works (with a bonus at the end: how to fix the pronunciations with the new pronunciation editor built into the Speech feature in iOS 10).

 

Amazon Echo as an Accessibility Support

Amazon describes the Echo as a hands-free, voice-controlled device that uses Alexa (Amazon’s answer to Siri, Cortana and other voice assistants) to play music, control smart home devices, provide information, read the news, set alarms, and more. I had been wanting to try the Echo since its launch, but I was just not willing to pay the $180 for the original version of this device. 

When Amazon announced a smaller version of the Echo, the Echo Dot, for $50 in the spring of this year, I saw this as a perfect opportunity to try it. The smaller version includes a lower quality speaker than its larger cousin, but since I already have a number of Bluetooth speakers this is not a major issue. Other than the speaker, the rest of the device performs similarly whether you are using the $180 model or the $50 one. Unfortunately, the first Echo Dot was released in limited quantities and quickly sold out before I could get my hands on one.

I had to wait until this fall, when Amazon released a second generation Echo Dot, at the same $50 price point. I quickly ordered one to see how I could use it as a person with a visual impairment. I am intrigued by the use of speech as an interface. I am excited by the prospect of a future where my interactions with my computing devices and even my home become even more seamless – with no buttons to find and press, no specific commands to memorize.  We are not there yet (the speech recognition still has some limitations), but devices like the Echo make me hopeful about the future.

What Is It?

The Amazon Echo

The Echo Dot is shaped like a large hockey puck. It is basically the equivalent of taking the top inch and a half or so from the cylinder-shaped original Echo (the part above the speaker). Around the top edge of this hockey puck are the seven microphones it uses to recognize your voice commands, and a ring light used to provide visual feedback when a command has been recognized. On top of the hockey puck are the few buttons you can use:

  • On/Off button (3 o’clock): the only indication the device has turned on/off is the ring light around the top edge coming on. A tone or other audio feedback would have been helpful.
  • Volume buttons (12 o’clock and 6 o’clock): As you press these buttons, the ring light around the top of the device will let you know the volume level (and you will also get some audio feedback in the form of a tone that will become louder or softer was you press the buttons).
  • Mute button (9 o’clock): Pressing this button will mute the Echo’s 7 microphones so that it temporarily stops recognizing your commands. The ring light on top of the device will turn red to let you know it is muted. This may come in handy if you plan on watching TV for a while and don’t want the Echo to be triggered by the series of Amazon commercials featuring the trigger word.

Basically, you have to say a trigger word before the Echo will recognize a command. By default, this trigger word is “Alexa” but you can change it by going into the Alexa app on your mobile device. I have mine set to “Echo” (to avoid my device being triggered by Amazon’s commercials)  but “Amazon” is also an option.

The Alexa app is how you first set up your Echo Dot and adjust its settings. It is also how you download and install Skills (the Echo equivalent of apps). These Skills basically expand the range of commands you can use with your Echo.  Overall, Amazon has done a nice job of making the Alexa app for iOS VoiceOver compatible. I had no major issues with unlabeled buttons and the like as I interacted with it.

Ask and You Should Receive (An Answer)

The most basic use of the Echo is to ask it questions it can answer by searching on the Web. This ranges from simple math (“Alexa, what is 125 times 33?”), to unit conversion (“Alexa, how many pounds are in 40 kilograms?”), to spelling and definitions (“Alexa, what is the definition of agoraphobia?”, “Alexa, how do you spell pneumonia?”).

My favorite use of this feature is to ask for updates about my favorite sports teams: “Alexa,  how are the Giants doing?” or “Alexa, when do the Giants play next?” To help Echo provide more accurate responses, I have specified my favorite teams in the Alexa app for iOS (Settings > Sports Update).  In case you are wondering, I love the New York Giants and Mets!

Rather than going over everything you can ask Alexa, I will point you to Amazon’s own extensive list of Alexa commands you can use on the Echo devices.

Get The Day Off to a Good Start

I have set my Echo as my primary alarm to help me get up in the morning (“Alexa, wake me up at 7 am” or “Alexa, set an alarm for 7 am”). Once I have set an alarm with my voice, I can open the Alexa app and use it to change the alarm sound (Nimble is currently my favorite) or delete the alarm if I no longer need it (I can also do this with just my voice by saying “Alexa, cancel my alarm for 7 am”). I can just say “Alexa, snooze” if I want to get a few more minutes of sleep before I start my day.

Following my alarm, I have set up a number of Skills that provide me with a nice news summary to start the day (“Alexa, give me my Flash Briefing.”). Right now, I have the following Skills set up for my Flash Briefing: CNET (for the latest tech news), NPR (for a nice summary of national and international news) and Amazon’s Weather Skill (for a nice summary of current weather conditions). Some of these skills (CNET, NPR) play a recording of the content, while others (Amazon’s Weather) use synthesized speech (which is quite pleasant on the Echo, if I may add).

To install a Skill, you will open the hamburger menu (located on the left side of the Alexa app if you are using it on iOS), then choose Skills. You can browse or search until you find the Skills that match your needs. Tapping Your Skills in the upper right corner will show you all of your installed skills. You can tap the entry for any of the listed skills to disable (delete) it. If you just want to temporarily disable the skill, you can go to Settings > Account > Flash Briefing and use the on/off toggles to disable or enable a skill (again, you will first have to tap the hamburger menu in the Alexa app to access Settings).

Manage Your Life

In addition to alarms, the Echo supports timers, which can be helpful for cooking (we don’t want that casserole to be overcooked, do we?). To set a timer, just say “Alexa, set a timer for 10 minutes.”

Timers can also be helpful for individuals who have executive functioning challenges. Executive functioning is the ability to self-regulate, which includes the ability to stay on task and manage and keep time. For someone with this kind of challenge, you can set multiple timers with your Echo. For example, you can set a timer for an activity that lasts one hour (“Alexa, set a timer for one hour”), then set a second timer for each separate step that needs to be completed to accomplish the assigned task during that hour. For example, I can say “Alexa, set a second timer for 25 minutes” to have someone read for 25 minutes as part of a larger one hour block of study time. When that 25 minute timer ends, the person can take a five minute break, and then I can repeat the steps to set up another timer for 25 more minutes of work.

You can also manage your to do list with Alexa: just say “Alexa, add (name of to do item) to my to do list” or “Alexa, remind me to (name of task).” You can review your to do list with Alexa (“Alexa, what’s on my to do list?”) but you can’t remove or edit to do items with your voice – for this you have to go into the Alexa app on your mobile device. Personally, I prefer to use other tools to manage my to do list (Reminders for iOS, Google Keep) but the Echo to do list feature can be helpful for to do lists that are more relevant for the home (cleaning supplies, groceries, etc.).

In the Alexa app, you can also set up any calendar in your Google account as the destination where any events created with the Echo will be added. For example, I can say “Alexa, add (event name) to my calendar,” respond to a few prompts, and that event will be created in the Google calendar I have specified. I can then check what I have scheduled for a given day by saying “Alexa, what’s on my calendar for (today, tomorrow, Friday, etc.).” Again, the ability to stay organized and follow up on appointments and due dates is something most of us take for granted but is a skill that is not as well developed in some people. Any kind of environmental support for these skills, such as what the Echo can provide, is helpful.

A New Way to Read

The Echo is a great way to listen to your books as they are read aloud with either human narration or synthesized speech. This can be a great way to take advantage of the Echo in a classroom setting. Since Amazon owns Audible, you can access any audiobook on your Audible account through the Echo. Just say “Alexa, play (book title) on Audible” and the Echo will fetch the book and start reading it. You can then use the commands “Alexa, stop” and “Alexa, resume my book” to control playback. You can also navigate the book’s chapters by saying “Alexa, next (or previous) chapter.” Finally, you can set a sleep timer for the current book with the commands “Alexa, set a sleep timer for (x) minutes” or “Alexa, stop playing in (x) minutes.”

Many Kindle books can also be read aloud. To see a list of the books you have purchased that support reading on the Echo, visit Music & Books in the Alexa app, then choose Kindle Books. To start listening to a book, just say “Alexa, read (title of the book).” The expected playback commands, “Alexa, stop,” “Alexa, resume my Kindle book” and so on, are supported for books that can be read aloud.

Let There Be Light

Echo can be a great way to control lights and other appliances using just your voice. This can be especially helpful for those who have motor difficulties that make interacting with these features of the home a challenge. As a person with a visual impairment, I use my smart lights to ensure my home is well lit when I get home. I set this up as a “Coming Home” routine in the app for my Hue lights. Using geofencing, the app determines when I am close to my home and automatically turns on the lights and sets them to a specified scene (a preset brightness and color). No more fumbling to find my way around a dark home when I arrive! Similarly, I can set up a “Leaving Home” routine to make sure the lights automatically turn off if I leave them on by mistake. How-To Geek has a nice article detailing how to set up and configure Hue lights.

By installing the Hue Skill, you can get basic voice control of your lights through the Echo. This Skill gets information about the rooms and scenes (presets for sets of lights with predetermined brightness levels/colors) you have set up from the Hue app installed on your mobile device. The first step in getting your Echo to control your lights, then, is to get all of your Hue rooms and scenes recognized. You will do this by going to the Smart Home section in the Alexa app, then scrolling down to Your Devices and selecting “Discover Devices.” You may have to tap the circular button on your Hue bridge to get everything recognized. If everything is recognized correctly, you should see every scene and room you have set up in the Hue app listed as an individual device in the Alexa app. Although I only have three Hue lights (two white and one color), I have 30 devices recognized by my Alexa app (one device for each individual light, room and scene).

The next step is to set up your Groups in the Alexa app. This is done by choosing “Create group” in the Smart Home section. To give you an idea of my setup, I have the following groups set up: All Lights, Living Room and Office. For each group, I have then enabled the lights, scenes and rooms I want it to include. For example, for my Living Room group I have the following items enabled: Living Room Color and Bookshelf (the names I assigned to the two individual lights I own), Living Room (the room containing the two lights together), and the different Scenes (presets) I have created. These presets are assigned to the room and are currently “Bright in Living Room,” “Dimmed in Living Room” and my favorite “Florida Sunset in Living Room.” For this last one, I was able to choose a nice photo of a sunset I took at the beach and the Hue app automatically picked sunset colors for the scene!

With my current configuration, I can use the following commands to control my lights:

  • “Alexa, turn on (or off) all the lights”: As expected, this turns on/off all of my connected lights using the All Lights group I set up in the Alexa app, which includes a single device called All Hue Lights.
  • “Alexa, turn on the Living Room (or Office) lights”: this command turns all of the lights assigned to a specific room on or off at once.
  • “Alexa, turn on (or off) the Bookshelf light”: this command turns on or off the individual light called Bookshelf, a single soft white bulb I have set up near my bookshelf.
  • “Alexa, set the Bookshelf light to 50%” or “Alexa, set the Living Room (or Office) lights to 50%”: I can control the light level of any individual light or room.
  • “Alexa, turn on Florida Sunset (or any of my named scenes)”: this will turn on my Florida Sunset scene which configures the main living room light to a nice red/orange shade selected from a photo in the Hue app.
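If you like to tinker, the Hue bridge also exposes a local REST API, so the same lights can be scripted directly without going through Alexa at all. Here is a rough sketch of the idea (the bridge address, app key and light id below are placeholders, not my actual setup):

```python
# Minimal sketch: toggle one Hue light over the bridge's local (v1) REST API.
import requests

BRIDGE_IP = "192.168.1.2"   # placeholder: your bridge's address on the local network
APP_KEY = "your-app-key"    # placeholder: key created after pressing the bridge's link button
LIGHT_ID = "1"              # placeholder: the light's numeric id on the bridge

def set_light(on: bool, brightness: int = 254) -> None:
    """Turn the light on or off and set its brightness (1-254)."""
    url = f"http://{BRIDGE_IP}/api/{APP_KEY}/lights/{LIGHT_ID}/state"
    requests.put(url, json={"on": on, "bri": brightness}, timeout=5)

set_light(True, brightness=127)  # roughly the same as "set the Bookshelf light to 50%"
```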

Echo is not the only way I can control my lights. Because I have a version of the Hue lights that is HomeKit compatible, I can also use Siri on my iOS devices. In fact, I find the voice control provided by Siri to be not only more intuitive and easier to set up, but also quicker to respond. If you have an old iPhone 6s just lying around, you could set it up with “Hey Siri” so that it works pretty much like an Echo as far as light control goes. Another thing I like about the Siri control is that I can use my voice to change the color of my lights by saying “set the (light name) to blue” (or any of the basic colors).

Finally, I have a Wemo switch I am using to control my Christmas tree lights over the holidays. I have set up this Wemo switch with a rule to automatically turn on the Christmas tree every day at 5:30 pm (around the time when sunset takes place for us in Florida) and then turn it off at 11 pm. I can also just say “Alexa, turn the Christmas tree on (or off)” at any time for more manual control. Unfortunately, the Wemo does not work with Siri like the Hue lights do. It is limited to the Echo for voice control.

There is a lot more you can do with your lights with the help of the online automation service IFTTT, which has an entire channel dedicated to Hue lights. For example, you can say “Alexa, trigger party time” to have your lights cycle through a color loop. I am still looking for an IFTTT trigger that turns my lights blue each time the Giants win.

Are You Entertained?

Ok, so you are not impressed by voice-controlled lights? Well, there is more the Echo can do. By far, the most common way I use this device is as a music player. What can I say, whether studying or working out, music is a big part of my life. I have my Echo paired with a nice Bluetooth speaker for better sound than what the built-in speaker can produce. If Bluetooth is not reliable enough for you, you can connect the Echo directly to any speaker that accepts a 3.5 mm audio cable.

Echo supports a number of music services, including Prime Music (included with Amazon Prime), Spotify (my favorite), Pandora, iHeartRadio and TuneIn. The following commands are supported for playback:

  • “Alexa, play (playlist name) on Spotify”: play songs from any playlist you have set up on Spotify. My favorite is the Discover Weekly playlist released each Monday. This is a collection of songs curated by the Spotify team and a great way to discover new music.
  • “Alexa, play (radio station name) on Pandora (or TuneIn or iHeartRadio)”: if you have any of these services set up in the Alexa app, the Echo will start playing the selected station.
  • “Alexa, like this song (or thumbs up/down)”: assign a rating to a song playing on Pandora or iHeartRadio.
  • “Alexa, next”: skip to the next song. Saying “Alexa, previous” will work as expected (at least on Spotify).
  • “Alexa, stop” or “Alexa, shut up”: stop music playback. Saying “Alexa, resume (or play)” will get the music going again.
  • “Alexa, what’s playing?”: get the name of the song and artist currently playing.
  • “Alexa, set the volume to (a number between 1 and 10)”: control the volume during playback.

Update: In the first version of this post, I forgot to mention podcasts. The Echo Dot supports podcasts through the TuneIn service, which does not require an account. The Echo could be an excellent podcast receiver, but it is limited by the fact that podcast discovery is not that great on TuneIn. The first thing you need to do is check whether your favorite podcast is available on TuneIn.

You do this through the Alexa app, by going into Music and Books and selecting TuneIn, then Podcasts. If your podcast is available on TuneIn, make a note of the name it is listed under. You can then say “Alexa, play the (name of podcast) podcast on TuneIn” and you should be able to listen to the most recent episode, provided you get the name right. This was hit or miss in my experience. For podcasts with straightforward names (Radiolab, the Vergecast) I was able to get my Echo to play the latest episode with no problems, but for others it got confused and instead played a song that closely matched my request.

Spotify also supports podcasts now, but I was not able to access them through my Echo. I hope Spotify adds better support for this type of content in a future update. I really enjoy podcasts because they allow me to access content without having to look at a screen, which is tiring to my eyes.

While the Echo does not control playback on a TV (it is limited to music), it can at least help with information about the program you are watching. For example, you can ask “Alexa, who plays (character name) in (movie or TV show)?” or “Alexa, who plays in (movie or TV show)?” to get a full cast list.

Out and About

While there has been some valid criticism of ride sharing services for refusing rides to people who use guide dogs, these services are an improvement over the taxi services many of us have had to rely on due to our disabilities. This is the case with me. My visual impairment prevents me from safely driving a car, so I have to rely on other people to drive me or I have to use public transportation (which is not very reliable where I live). Uber and Lyft have been a Godsend for me: I use them to get to the airport and to any meetings or appointments. Most of the time I will request a ride through an iPhone app, but with Echo I can do it with a simple command as well: “Alexa, ask Uber to request a ride” or “Alexa, ask Lyft for a ride.”

Uber and Lyft are both Skills you have to install on your Echo. Once you have them installed, you will also have to set up a default pickup location the first time you launch the skill. After requesting a ride, you will be prompted a couple of times to make sure you really want to order a ride. Once your ride is on its way, you can say “Alexa, ask Uber (or Lyft) where’s my ride” to get a status.

Before you go out, why not make sure you are dressed for the weather – whether that be snow in more northern parts of the country or rainstorms where I live in Florida. You can just ask “Alexa, what’s the weather like?” or “Alexa, is it going to rain today?” or even “Alexa, will I need an umbrella today?” You can get an idea of the traffic to your destination by saying “Alexa, what’s the traffic like?” This requires you to enter your home address and a destination you visit frequently in the Alexa app (this can be your work address or, in my case, my local airport).

There is a lot more you can do with Echo. I have just scratched the surface with some of the things I myself have been able to try out. For example, I would love to install a Nest thermostat so that I can use my voice to control the temperature (“Alexa, set the temperature to 75 degrees.” – hey, I am from the Caribbean, you know). Other smart home applications include controlling locks and even your garage door. I am not quite ready to trust my home security to my Echo, but it’s nice to know these options exist for those who need them as a way to make their homes more universally designed and capable of meeting their accessibility needs.

If you are a person with a disability (or even if you are not), how are you using your Amazon Echo? If you don’t have one, is this something you are considering?

Bonus: Can’t speak the commands needed to interact with the Echo? No problem. Speech generating devices to the rescue. I have been using the Proloquo4Text app on my iOS device to send commands to my Echo with no problems. I created an Echo folder in Proloquo4Text that has the commands I would use most frequently. Here is a quick demo:

 

 

 

 

VoiceOver on New MacBook Pro with Touch Bar: First Impressions

I finally had a chance to stop by an Apple Store to give the new MacBook Pro with the Touch Bar a try with VoiceOver. What follows is a summary of my initial experience, rather than a comprehensive review. If you do want to read a comprehensive review of these new Touch Bar MacBook Pros from a non-accessibility perspective, there are several of those around, including this excellent one by Jason Snell at Six Colors.

Your first question when you try out this new laptop for the first time is probably going to be: how do I perform the Command F5 shortcut to turn VoiceOver on without the hardware function keys? Well, if you have been using an iOS device, the answer will sound familiar. It involves a triple-click of the Touch ID button located on the right side of the Touch Bar (this button doubles as the power button for the laptop as well). This is similar to how you use the Home button on iOS devices for the Accessibility Shortcut. The only difference on the Mac is that you have to hold down the Command key as you perform the triple-click on the Touch ID button. The Touch ID/power button is the only part of the Touch Bar that physically clicks when you press it. It is separated from the rest of the Touch Bar by a small gap that feels like a notch. I tried to take a photo of it in the bright lighting of the Apple Store.

Closeup of right side of MacBook Pro Touch Bar showing how Touch ID/power button is separated from rest of Touch Bar.

By default, the Touch Bar will display a set of five buttons on the right side. This is known as the Control Strip, a set of the most frequently used items that is similar in function to the Dock on an iOS device. From right to left, the buttons shown by default are: Siri, Mute, Volume, and Screen Brightness. A fifth narrower button expands the Control Strip and shows more options. When the Control Strip is expanded, it pretty much mirrors the media keys previously available on a laptop with physical keys –  with options such as keyboard brightness, Mission Control, Exposé, and media playback (Play/Pause, Previous and Next). The Close (X) button found on the left edge of the Touch Bar will collapse the Control Strip from its expanded state. The Control Strip is user-configurable, meaning you can swap out the default buttons for other options you use more often.

Closeup of right side of the Touch Bar showing Siri, Mute, Volume, Screen Brightness and More buttons.

Closeup of the Touch Bar with More Options expanded.

If you are a fan of the Escape key, you will be happy to know it is still around, just in a different form. You will usually find it on the left side of the Touch Bar (at times it may be replaced by a Close (X) button).

Closeup of left side of the Touch Bar showing a software Escape key

Interacting with the Touch Bar’s software buttons while VoiceOver is turned on will again seem familiar to iOS users. Just like on an iPhone or iPad, you can move your finger over different areas of the Touch Bar to hear each key or button spoken aloud as you go over it, or you can use flick gestures to move the VoiceOver cursor from item to item. Once the desired item has focus, you can double-tap anywhere on the Touch Bar (or even split-tap) to make a selection.

With many of the buttons on the Touch Bar, selecting them will open a slider for adjusting the value of a given setting (volume, screen brightness, and so on). You will need a special gesture to interact with that slider: a double-tap and hold, followed by sliding your finger over the Touch Bar without letting go, which adjusts the value of the slider. When you let go, the slider may close automatically, or you can use the Close (X) button to its right. This special gesture is required because of the limited vertical space on the Touch Bar; on an iOS device, you would typically move the VoiceOver cursor to the slider and then flick up or down with one finger to adjust its value.

Brightness slider, with Close button on the right.

As with the Escape key, the Function keys are still around as well, but they are only accessible when you hold down the Function key on the keyboard. I recorded a short clip to show this in action.

https://youtu.be/LyrYI_hq9sc

Any of the VoiceOver keyboard shortcuts that use the Function keys still work; you just have to add one more key (Function) to the shortcut and then select the desired function key on the Touch Bar using an iOS-style double-tap. For example, to bring up the VoiceOver Utility, the keyboard shortcut is VO (Control + Option) + F8. With the Touch Bar, you press and hold VO (Control + Option) along with the Function key, then select F8 on the Touch Bar as you would on an iOS device (by double-tapping once it has focus). It took me a few minutes to get the hang of this, but I’m sure it will become more ingrained with practice if I ever get one of these machines and use it day in, day out.

  • Note: As pointed out by @IAmr1A2 on Twitter, you can also use the number keys to perform a VoiceOver command that involves the function keys. For example, the command mentioned above would be VO + Function + 8.

The real power of the Touch Bar lies in the fact that it can morph into a variety of controls depending on the app that is open. Due to time constraints, I was not able to try the Touch Bar with as many apps as I would have liked during my visit. That will have to wait for another time. I did open up GarageBand and had no problems accessing any of the items on the Touch Bar with VoiceOver. With Photos, the only item I could not access was the slider for scrubbing through the photos collection.

Apple has made available a knowledge base article with additional information on using not only VoiceOver but also Zoom and Switch Control with the Touch Bar. I especially look forward to trying out Zoom on a future visit to the Apple Store, as I already know I will probably need to use this feature quite often due to the small size and dim appearance of the Touch Bar (especially when options are dimmed).

For the first few minutes using the Touch Bar, it felt like I was using two devices side by side as I interacted with the new MacBook Pro with VoiceOver, each with its own already familiar interaction model: the keyboard input method laptops have used for decades, and the touch input method more recently introduced with iOS devices such as the iPhone. While each of these input methods was already familiar to me on its own, putting them together into a seamless interaction with the new laptop took a little while. As with any new interaction method, I know it will take me some time to build the same kind of muscle memory I have developed with the now familiar Trackpad Commander feature (which allows me to use iOS-style gestures performed on the Trackpad as alternatives to many VoiceOver keyboard shortcuts). For now, I am happy to see that the Touch Bar is as accessible as Apple’s other interfaces, but I will need more time experimenting with it on a variety of apps before I can decide whether it is an essential tool that justifies the higher price of the models that include it.


Quick Tip: New Visual Supports for Chrome OS Users

I was pleasantly surprised when I recently updated my Chromebook to the latest version of Chrome OS (version 54 at the time of writing). Whenever I do an update, one of the first things I do is go into the accessibility settings to see if any new options have been added. In the latest version of Chrome OS, Google has provided a number of visual supports that I am finding helpful as a person with low vision. For example, there is now an option to enable additional highlighting (a red circle) when the mouse cursor moves. This kind of additional visual cue makes it much easier for me to use the interface.

To enable the new highlight options, go to Settings > Show Advanced Settings > Accessibility. The new options are as follows:

  • Highlight the mouse cursor when it’s moving: the cursor is surrounded by a red circle whenever it moves. There is already an option to enable a large cursor, but that can cause problems whenever you are trying to check a small box (as often happens in dialog boxes). With this additional highlighting added to the mouse cursor, I can still find it on the screen even if I need to temporarily set it to its default size.
    Mouse cursor with a red circle around it to indicate movement.
  • Highlight the object with keyboard focus when it changes: this is really helpful when interacting with form fields. Whenever a text field or other form element gets focus, it is surrounded by a thick yellow border.
    Chrome's Search settings text field with a yellow border around it to indicate it has focus.
  • Highlight the text caret when it appears or moves: adds a blue circle around the text caret. I did not find this setting as useful, maybe because there is not much space between the text caret and the highlight.
    Blue circle around the text caret to draw attention to it as it moves.
  • New animation for auto-click: the circles get smaller to indicate how much time is left before the auto-click takes place.
    New auto-click animation: the circles get smaller to indicate how close it is to the auto-click.

There is some room for improvement with these visual supports (for example, an option to change the highlight colors), but overall I think this is a good addition to Chrome OS. I plan to keep the options for highlighting the moving cursor and keyboard focus turned on at all times on my Chromebook.

How to Personalize Learning: New Book Available

I am excited to announce that How to Personalize Learning, the book from Corwin Press by my friends Barbara Bray and Kathleen MacClaskey, is finally out and available for purchase. I was honored when Barbara and Kathleen asked me to write the foreword for this book, and I am sharing that foreword below. I highly recommend getting the book, which will guide you step by step through the process of creating a more learner-driven environment.

Dr. David Rose, one of the originators of Universal Design for Learning, often says that “teaching is emotional work.” By that I take him to mean that teaching is not a purely technocratic endeavor. It is more than delivering the right content at the right time, though that is certainly important. It is also more than assessing how well students have mastered said content, though that is important as well. Rather, at the heart of teaching are the relationships we remember from our best learning experiences. If you were to close your eyes right now and think back to a time when you were most engaged with learning, you would probably see a teacher who was invested in your success, who encouraged you and helped you gain confidence in your abilities, and who balanced the right mix of support with freedom and trust. In short, you were in the presence of someone who, perhaps without realizing it, already understood what it means to be an expert learner: one who, driven by his or her passion, can take ownership of learning and do the hard work that is needed for success. What if you could be that teacher for every learner who walks into your classroom?

Helping all of our learners develop their learning expertise is the focus of this book. It is also the ultimate goal of Universal Design for Learning, the framework the authors have chosen to ground their discussion of learning. Notice that I am using the term learners instead of students. This change in my thinking and vocabulary has been influenced by my reading of Barbara and Kathleen’s work. As they state, students are passive recipients of content and have little choice in how they participate in education. Learners are empowered, and as a result take on a more active role in the design of their education. If, as some people suggest, language shapes our actions, then right away, with chapter one of this book, you will be on your way to reshaping your teaching practice. Starting with the language you use, you will be challenged to rethink the traditional teacher-student role in order to close the emotional distance it creates and develop a more equitable relationship with your learners. Thus, right from the start of this book, you will be engaged in the “emotional work” of teaching as you seek to build a different kind of learning environment, one where strong relationships based on trust and shared responsibility are the norm.

With a common language, vision, and understanding of what personalization really means as a strong foundation, the rest of the book seeks to translate the latest research about learning into actionable strategies you can immediately implement in your classroom. In this way, the more abstract concept of “the learner” is translated into the more concrete one of “your learners.” This is accomplished through a number of activities (creating a Learner Profile and a Personal Learning Plan, to name just two) that help you get to know who your learners really are, what drives and motivates them, and what they need to do their best work and reach their full potential. I have a feeling that as you help your learners with their Learner Profiles, Personal Learning Backpacks, and Personal Learning Plans, you yourself will rediscover who you are as a learner. In doing so, you will also rediscover your own passion for teaching and the values that caused you to go into this profession in the first place. At the end of the book, you will be asked to create a 60-second pitch that will serve as a reminder of your core values and hopefully become your compass as you seek to align your practice with those values.

While I agree with Dr. Rose that “teaching is emotional work,” I would add that it is also “civic work.” As educators, we can play a role in ensuring that everyone can enjoy life in a fair and equitable society, but only to the extent that we dedicate ourselves to developing citizens who are actively engaged in the life of their communities. This requires a commitment to providing all citizens with the skills they will need to be active participants in conversations about the future, including the ability to be critical thinkers and to appreciate and value diversity. We can do this work in each one of our classrooms as we develop each learner’s agency and ability to live a self-determined life, which is a major focus of this book starting in chapter 3.

One of my favorite quotes, attributed to former House Speaker Tip O’Neill, is that “all politics is local.” Similarly, all learning is “local” in the sense that it is not removed from the life of the community where a school is located and the issues that impact the lives of individual learners. In this way, learning is once again more than a technocratic exercise of delivering content and information. It is also about helping learners make connections: not only connections between the topics and ideas discussed in the classroom, but more importantly between those topics and ideas and the learners themselves. This is what “deeper learning,” as discussed in chapter 8, is all about: going beyond surface-level, isolated facts that have little relevance to learners to focus on the big ideas that move and inspire them to be the innovative thinkers and agents for change we will need to solve the complex problems of our shared future.

If you have picked up this book, you probably agree with me that the technocratic approach to education has not worked, and you are looking for a new direction. If that is the case, then I invite you to not just read this book, but use it as a blueprint for rethinking every aspect of your approach to teaching, from the questions that guide your lessons to the tools you use to engage learners and make education more accessible to them. This book asks a lot of you, but it gives you even more in return. By that I mean that it asks you to consider some of the tough questions that are often glossed over in most education books: what does it mean to be a teacher, and more importantly, what does it mean to be a learner? As you ask those tough questions, however, you will also be provided with the tools you need to formulate some good responses and take meaningful action. The many activities and resources found in each chapter will be invaluable as you rethink your role and begin to engage in the “emotional” and “civic” work of teaching needed to create a better society for future generations.