Tag Archives: iPhone

10+ Accessibility Features Coming with iOS 11

 

Slide from Apple's keynote at WWDC showing a big list of upcoming iOS 11 features that were not discussed on stage.

At its recent Worldwide Developers Conference (WWDC), Apple gave developers and the public a preview of iOS 11, the next version of its operating system for mobile devices such as the iPad, iPhone and iPod touch. This post will not provide a full overview of all the changes coming in iOS 11; instead, it focuses on a few key ones that will have the most impact for people like me who rely on the built-in accessibility of these devices.

A public beta will be coming later this summer, and I can’t wait to get my hands on it to start testing some of these new features. For now, this overview is based on everything I have read on the Web (shout out to AppleVis for a great overview of the new features for those with visual impairments), what I saw in the videos available through the WWDC app after each day of the conference, and updates from people I follow on social media who were at WWDC (you should definitely follow Steven Aquino, who provided excellent coverage of all things iOS accessibility from WWDC).

Without further delay, here are ten new or enhanced iOS accessibility features coming soon to an iPad or iPhone near you:

  1. Smart Invert Colors: In iOS 11, Invert Colors will no longer be an all-or-nothing affair. A new Smart Invert option leaves images and video alone while inverting the rest of the interface. This fixes a problem I have always had with Invert Colors: sometimes there is text in a graphic or video that is essential for understanding, and with Invert Colors as it currently exists that text can be difficult to read. This will no longer be the case in iOS 11. (Developers can also opt specific views out of inversion; see the sketch after this list.)
  2. Enhanced Dynamic Type: Dynamic Type has been enhanced to reduce clipping and overlapping at larger text sizes, and Dynamic Type will work in more of the UI for apps that support it. In some areas of the UI where text is not resized dynamically, such as tab bars, a tap and hold on the selected control will show it at a larger size in the middle of the screen.
  3. VoiceOver descriptions for images: VoiceOver will be able to detect text that’s embedded in an image, even if the image lacks alternative text (or, as Apple calls it, an accessibility description). VoiceOver will also announce some of the items in a photo that has not been described (tree, dog, sunset, etc.), much like the Camera app already does when you take a photo with VoiceOver turned on. For the developer side of providing these descriptions, see the sketch after this list.
  4. A more customizable Speech feature: you can now customize the colors for the word and sentence highlighting that is available for Speech features such as Speak Selection and Speak Screen. These features are helpful for learners who struggle with decoding print and need the content read aloud. The highlighting can also help with attention and focus while reading, and it’s nice to see we can now change the color for a more personalized reading experience.
  5. Type for Siri: In addition to using your voice, iOS 11 also allows you to interact with Siri by typing your requests. This is not only an accessibility feature (it can help those with speech difficulties who are not easily understood by Siri) but also a privacy convenience for everyone else. Sometimes you are in a public place and don’t want those around you to know what you are asking Siri.
  6. More options for captions: for videos that include closed captions, you can now enable an additional style that makes the text larger and adds an outline to make it stand out from the background content. Along with this new style, you can now turn on spoken captions or convert the captions to Braille. This last option could make the content more accessible to individuals with multiple disabilities.
  7. Switch Control enhancements: typing can take a lot of time and effort when using switch access technologies. With iOS 11, Apple hopes to make this process easier by providing better word prediction in Switch Control as well as a “scan same key after tap” option (this will repeat the same key without requiring the user to scan to it again, which can take some time). Other Switch Control enhancements for better overall usability include:
    • Point Mode has an additional setting for more precise selections: this will add a third scan to refine the selection at an even slower pace, and early reports are that it selects the actual point rather than the surrounding accessibility element (button, etc.).
    • Scanner Menu option for Media Controls: recognizing that media playback is a popular activity for switch users (just as it is for everybody else), a new category for Media Controls has been added to the scanner menu. I assume this feature will work in any app with playback controls, which would make it a great option for Voice Dream Reader or any other app whose playback controls sit at the bottom of the screen (where they require a lot of scanning to reach).
  8. Improved PDF accessibility support: while I am not a big fan of PDF as a format, there are still a lot of legacy PDF documents out there, so it is nice to see improved support for PDF accessibility in iOS 11. One of the common uses of PDFs is to collect information through forms, and with iOS 11 Apple promises better support for forms as well as for tagged (properly marked up) PDF documents.
  9. Better Braille support: as reported by AppleVis, the Braille improvements in iOS 11 include better text editing and more customizable actions that can be performed through shortcuts on a Braille display.
  10. A redesigned Control Center: you will have more ways to get to the features you use the most with the new Control Center, which will now allow you to add widgets for the Accessibility Shortcut, the Magnifier and text resizing.
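
If you develop apps, the first and third items above also have a developer-facing side. What follows is my own minimal Swift sketch, not anything Apple showed at WWDC: accessibilityIgnoresInvertColors is the new iOS 11 property that opts a view out of Smart Invert, and accessibilityLabel is the long-standing way to supply an accessibility description. The view controller and label text are hypothetical.

```swift
import UIKit

// A minimal sketch, assuming a hypothetical photo view controller.
class PhotoViewController: UIViewController {
    let photoView = UIImageView()

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(photoView)

        // Smart Invert (item 1): let the photo keep its original colors
        // while the rest of the interface inverts. New in iOS 11.
        photoView.accessibilityIgnoresInvertColors = true

        // Accessibility description (item 3): give VoiceOver real alternative
        // text so it does not have to fall back on image recognition.
        photoView.isAccessibilityElement = true
        photoView.accessibilityLabel = "A dog catching a frisbee at the beach"
    }
}
```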

As with most releases of iOS, there are a number of features that can be beneficial to users with disabilities even if they are not found in the Accessibility area of Settings. These include:

  1. An improved Siri voice: we got a taste of how much more natural the new Siri voice will sound, but there was not a lot of detail provided. It is not clear if this voice will be available to VoiceOver users, or if it will be incorporated into the rest of the Speech features such as Speak Selection and Speak Screen when iOS 11 finally ships to the general public in the fall.
  2. Siri translation: Siri can now translate from English into a few languages – Chinese, French, German, Italian and Spanish.
  3. Persistent Reader View in Safari: when you long-press the Reader icon, you can choose to have Reader activate on all websites or just on the current one. Reader View is helpful for removing ads and other distractions that can compete for attention, especially for people with learning difficulties such as ADHD.
  4. Apple TV remote in Control Center: It is also possible to add an onscreen Apple TV remote to the Control Center. This will be helpful for Switch Control or VoiceOver users who may prefer this option to using the physical Apple TV remote.
  5. One handed keyboard: this option is intended to help anyone who is trying to enter text while one hand is busy with groceries, etc. but it can also be helpful to someone who is missing a limb. Tapping on the Globe icon that provides access to third party keyboards will now show an option for moving the keyboard to either side of the screen where it can be easier to reach with one hand.
  6. One handed Zoom in Maps: this feature is intended for drivers to have better access to Maps while on the road, but as with the one-handed keyboard, others will benefit from this change as well. As someone who often has one hand busy with a white cane, I welcome all of these features that make the iPhone easier to use with just one hand.
  7. Redesigned onscreen keyboard: Letters, numbers, symbols, and punctuation marks are now all on the same keyboard. Switching between them is as easy as a simple flicking gesture on the desired key.
  8. Easier setup of new devices: anything that reduces the amount of time entering settings is helpful to switch and screen reader users. When setting up a new iOS device, there’s now an option in iOS 11 to hold it near an existing device to automatically copy over settings, preferences, and iCloud Keychain.
  9. More customizable AirPod tap controls: AirPods can now be customized with separate double-tap gestures for the left and right AirPod. One can be set to access Siri, for example, while the other can be set to play the next track. Previously, double-tap settings applied to both AirPods. This will be helpful for individuals who rely on AirPods to access Siri as an accessibility option.
  10. Less restrictive HomeKit development: it will now be possible to develop a HomeKit device without being part of Apple’s HomeKit licensing program. All that will be required is a developer account. The catch is that any HomeKit device developed this way cannot go up for sale. That should be fine for assistive tech makers who just want to experiment and prototype solutions for their clients without the investment required to be part of the official HomeKit program. As The Verge suggests, this could also encourage more developers to dip their toes into HomeKit development, which will hopefully lead to more options for those of us who depend on smart homes for improved accessibility.
  11. QR Code support in the Camera: QR codes can be a helpful way to provide access to online resources without requiring a lot of typing of URLs and the like. They are frequently used in the classroom for this purpose, so I know teachers will find having this as a built-in option a welcome addition. (A sketch of how apps scan QR codes themselves follows this list.)
  12. SOS: There’s an Emergency SOS option in the Settings app that allows users to turn on an “Auto Call” feature. This will immediately dial 911 when the Sleep/Wake button is pressed five times. A similar feature has been available on the Apple Watch since the introduction of watchOS 3, and it’s nice to see it on the iPhone as well.
  13. Core Bluetooth support for Apple Watch: while this is an Apple Watch feature, I’m mentioning it here because the Apple Watch is still very closely tied to its paired iPhone. With watchOS 4, which was also previewed at WWDC, the Apple Watch Series 2 is getting support for connecting directly to Bluetooth low energy accessories like those used for glucose tracking and delivery. Furthermore, the Health app will have better support for diabetes management in conjunction with Core Bluetooth, including new metrics related to blood glucose tracking and insulin delivery. (See the Core Bluetooth sketch after this list.)
  14. Indoor navigation in Maps: Maps has been a big help for me whenever I find myself in an area I don’t know. I love the walking directions and how well they integrate with the Apple Watch so that I can leave my phone in my pocket as I navigate with haptic feedback and don’t give off that lost tourist look. With iOS 11, these features will be extended to indoor spaces such as major malls and airports.
  15. A redesigned App Store: the screenshots I have seen point to a bigger, bolder design for the App Store, which will be welcome news to those of us with low vision. If you like how Apple News looks now, you will be pleased with the redesigned App Store.
  16. Built-in screen recording: I rely on ScreenFlow to record my video tutorials, but having screen recording as a built-in feature will be convenient for quick videos. This will be great for providing tech support to parents, or for documenting accessibility bugs for developers.
  17. Person to Person payments in Messages: anything that allows payments without the need to use inaccessible currency is A-OK with me.
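
A side note on the QR code item above: apps have been able to scan QR codes on their own for a while using AVFoundation, which is presumably similar machinery to what the Camera app now exposes. The sketch below is a generic example of that approach, not Apple's implementation; the class name and the choice to simply print the decoded string are mine.

```swift
import AVFoundation
import UIKit

// A minimal sketch of app-level QR scanning with AVFoundation.
class QRScannerViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {
    let session = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        let output = AVCaptureMetadataOutput()
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)
        output.setMetadataObjectsDelegate(self, queue: .main)
        output.metadataObjectTypes = [.qr] // only look for QR codes

        // Show the camera feed so the user can aim at the code.
        let preview = AVCaptureVideoPreviewLayer(session: session)
        preview.frame = view.bounds
        view.layer.addSublayer(preview)
        session.startRunning()
    }

    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        if let code = metadataObjects.first as? AVMetadataMachineReadableCodeObject,
           let value = code.stringValue {
            print("Scanned QR code: \(value)") // e.g. a URL to open in Safari
        }
    }
}
```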
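
And on the Core Bluetooth item: connecting directly to an accessory goes through the same central role iPhone apps already use. Here is a rough, hypothetical sketch of scanning for a sensor advertising the standard Bluetooth SIG Glucose service (UUID 0x1808); it is not Apple's Health integration, just an illustration of what a direct Bluetooth low energy connection looks like in code.

```swift
import CoreBluetooth

// A minimal sketch: scan for and connect to a BLE glucose sensor.
class GlucoseSensorScanner: NSObject, CBCentralManagerDelegate {
    let glucoseService = CBUUID(string: "1808") // Bluetooth SIG Glucose service
    var central: CBCentralManager!
    var sensor: CBPeripheral?

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        // Scanning is only allowed once the radio reports it is powered on.
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: [glucoseService], options: nil)
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        sensor = peripheral // keep a strong reference while connecting
        central.stopScan()
        central.connect(peripheral, options: nil)
    }
}
```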

The Notes app has been greatly enhanced in iOS 11 to make it an even better tool for diverse learners who need options for how they capture, store and retrieve information:

  • Instant Markup: adding an annotation to a PDF document or screenshot will be as easy as picking up the Apple Pencil and touching the screen to start drawing/annotating.
  • Instant Notes: tapping the lock screen with Apple Pencil will create a quick handwritten note that will appear in Notes once the device is unlocked.
  • Inline Drawing: when you begin to draw or annotate in Notes, the text around the annotation will move out of the way. You can add inline drawings in Mail as well.
  • Searchable annotations in Notes: everything you write with the Apple Pencil in Notes will now be searchable, making it much easier to find the highlights in long notes taken during a lecture or long presentation.
  • Document Scanner: the new Document Scanner in Notes will detect the edges of a document to automatically scan it, crop it, and remove glare and tilt to produce a cleaner image. You could then pass the scanned document off to a different app to perform optical character recognition (OCR). I am hoping this feature is just a start, and that eventually we will get built-in OCR in iOS.

A major focus with iOS 11 is improved support for iPad productivity. This includes support for Drag and Drop in an enhanced Split View, as well as a new multi-tasking view with what appears to be the equivalent of Spaces on the Mac. With Apple’s excellent track record of accessibility, I’m confident these features will have the same level of accessibility as the rest of iOS.

I can’t wait to try out iOS 11 when the public beta becomes available to start enjoying some of these features on at least one of my devices (not the one I use to get work done, of course – at least not until late in the beta cycle when everything is much more stable).

How about you – which of these iOS 11 features have you most excited? Which ones do you think you will use the most?

 


3 Ways the iPhone Has Disrupted My Life for the Better

The 10th anniversary of the iPhone announcement in 2007 was mentioned on a number of podcasts I listen to this past week, and this got me into a reflective mood. I can remember vividly where I was when the announcement took place. At the time I was a graduate student at the University of South Florida, and I watched the announcement on the big screen in the iTeach Lounge where I worked as a graduate assistant.

I must admit that at first I was a bit skeptical. The first version of the iPhone was pretty expensive, and it took me a year after the launch to decide that I wanted to get in on the fun. If I remember correctly, it cost me $399 for 8GB of storage when I bought my first iPhone from Cingular Wireless (remember them?). As cool as that first iPhone was, it took two important developments to make me a true believer. The first was the release of the App Store in 2008, which opened up a world of possibilities limited only by developers’ imaginations. The second was the accessibility support announced with the release of the iPhone 3GS. After my first iPhone contract with Cingular was up, I actually returned to a traditional flip phone for a little while. Once the accessibility support was announced, though, I was locked in. I have been an iPhone owner ever since.

In addition to the App Store and the built-in accessibility support, there are three other important ways in which the iPhone has disrupted my life in significant ways that go beyond just being able to have access to information and communication on the go.

A Better Set of Eyes

The iPhone couldn’t have come at a better time for me. At the time, my vision loss was getting to the point where using a traditional DSLR camera was becoming harder and harder. As I detailed in an article for the National Federation of the Blind’s Future Reflections magazine, the built-in accessibility features of the iPhone have allowed me to continue with my passion for capturing the beauty in the world around me. The way I see it, the iPhone is now “a better set of eyes” for me. Most of the time, I can’t be sure that I have actually captured a decent image when I aim the phone at a scene. It is not until later, when I am reviewing the images more carefully at home, that I notice small details I didn’t even know were in the frame. You can see some examples of my photography on my Instagram page.

Instagram collage showing best nine images of 2016.

Going forward, this idea of the iPhone as “a better set of eyes” is going to be important to me beyond photography. As my vision loss progresses, I will be able to rely on the iPhone’s ever-improving camera to recognize currency, capture and read aloud the text in menus, business cards and more, and tell me if my clothes are exactly the color I intended. I have no doubt that computer vision will continue to get better, and this gives me hope for the future. Already, the VoiceOver screen reader can recognize some objects in your images and describe them aloud. This technology was developed to make searching through large image libraries more efficient, but it will be helpful to people with visual impairments like me as well.

Independence at the Touch of a Button

The second major way the iPhone has disrupted my life for the better is by giving me back my independence in a big way, through apps such as Uber and Lyft. Now, I know you can use these apps on other smartphones, so they are not exclusive to the iPhone. However, when you really think about it, no iPhone means no App Store. No App Store means there is no incentive for other companies to copy what Apple did.

Uber has replaced the many frustrations I had with public transportation and taxis (lateness, high fares) with a much more convenient and less expensive solution. Yes, I know some of my blind friends have had a number of issues with Uber (such as outright discrimination from drivers who are not comfortable with a guide dog in their vehicles), but this would probably happen with taxicabs too.

My own experience with Uber has been mostly positive. The service allows me to easily get to doctor’s appointments, and it provides a reliable way to get to the airport so that I can do my work of spreading the message of accessibility and inclusive design for education to a broader audience beyond my local area. Uber and Lyft, and the iPhone as the platform that made them possible, have really opened up the world to me.

Can You Hear Me Now?

One of the big trends at the Consumer Electronics Show (CES) this year was the presence of Alexa, Amazon’s voice assistant, on all kinds of consumer appliances. Alexa joins Apple’s Siri, Microsoft’s Cortana and Google’s Assistant in heralding a future where voice and speech recognition replace the mouse and the touch screen as the primary input methods for our computing devices. We are not quite there yet, but the accuracy of these services will continue to improve, and I am already seeing the potential in some of the home automation functions that are possible with the existing implementations (having my lights turn on automatically when I arrive home, for example).

Here, again, the iPhone deserves quite a bit of credit. The release of Siri as part of the iPhone 4S in 2011 brought the idea of speech recognition and voice control to the mainstream. Previously, its use was limited mostly to individuals with motor difficulties or niche markets like the medical and legal transcription fields. Siri helped popularize this method of voice interaction and made it more user friendly (remember when you had to sit for several minutes training speech recognition software to recognize just your voice?).

Looking Ahead

The smartphone is a mature technology, and some have questioned whether it has reached its apex and will soon give way to other options for content consumption and communication. One possibility involves virtual, augmented or even mixed reality. Given the visual nature of AR and VR, this gives me some cause for concern, just as I had concerns at the release of the iPhone back in 2007. However, just as Apple took a slab of glass and made it accessible when few people thought it could be done, with some creativity we can make AR and VR accessible too.

We have come a long way in just 10 years (sometimes I find it hard to remember that it has only been that long). In that time, Apple has shown that “inclusion promotes innovation.” Accessible touch screens, voice-controlled assistants, and ride-sharing services are just a few of the innovations that have developed within the accessible ecosystem the iPhone started. Thank you, Apple, and congrats on the 10th anniversary of the iPhone. Here’s to the next 10, 20 or 30 years of innovation and inclusion.

 

 

iOS 6 Accessibility Features Overview

At today’s Worldwide Developers Conference (WWDC), Apple announced iOS 6 with a number of accessibility enhancements. I am not a developer (yet!) and don’t have a copy of the OS to check out, so this post is based primarily on what I read on the Apple website and on social media. A few of these features (word highlighting for Speak Selection, dictionary enhancements, custom alerts) were tucked away in a single slide Scott Forstall showed, with little additional information on the Apple website. So far, these are the big features announced today:

  • Guided Access: for children with autism, this feature will make it easier to stay on task. Guided Access enables a single app mode where the home button can be disabled, so an app is not closed by mistake. In addition, this feature will make it possible to disable touch in certain areas of an app’s interface (navigation, settings button, etc.). This feature could be used to remove some distractions, and to simplify the interface and make an app easier to learn and use for people with cognitive disabilities. Disabling an area of the interface is pretty easy: draw around it with a finger and it will figure out which controls you mean. I loved how Scott Forstall pointed out the other applications of this technology for museums and other education settings (testing), a great example of how inclusive design is for more than just people with disabilities.
  • VoiceOver integrated with AssistiveTouch: many people have multiple disabilities, and having this integration between two already excellent accessibility features will make it easier for these individuals to work with their devices by providing an option that addresses multiple needs at once. I work with a wounded veteran who is missing most of one hand, has limited use of the other, and is completely blind. I can’t wait to try out these features together with him.
  • VoiceOver integrated with Zoom: people with low vision have had to choose between Zoom and VoiceOver. With iOS 6, we won’t have to make that choice. We will have two features to help us make the most of the vision we have: Zoom to magnify and VoiceOver to hear content read aloud and rest our vision.
  • VoiceOver integrated with Maps: the VoiceOver integration with Maps should provide yet another tool for greater independence for people who are blind, by making it easier for us to navigate our environment.
  • Siri’s ability to launch apps: this feature makes Siri even more useful for VoiceOver users, who now have two ways to open an app: by touch or by voice.
  • Custom vibration patterns for alerts: brings the same feature that has been available on the iPhone for phone calls to other alerts. Great for keeping people with hearing disabilities informed of what’s happening on their devices (Twitter and Facebook notifications, etc.).
  • FaceTime over 3G: this will make video chat even more available to people with hearing disabilities.
  • New Made for iPhone hearing aids: Apple will work with hearing aid manufacturers to introduce new hearing aids with high-quality audio and long battery life.
  • Dictionary improvements: for those of us who work with English language learners, iOS 6 will support Spanish, French and German dictionaries. There will also be an option to create a personal dictionary in iCloud to store your own vocabulary words.
  • Word highlights in Speak Selection: the ability to highlight words as they are spoken aloud by text to speech benefits many students with learning disabilities. Speak Selection (introduced in iOS 5) now has the same capabilities as many third-party apps in iOS 6.

These are the big features that were announced, but there were some small touches that are just as important. One of these is the deep integration of Facebook into iOS. Facebook is one of those apps I love and hate at the same time. I love the amount of social integration it provides for me and other people with disabilities, but I hate how often the interface changes and how difficult it is to figure out with VoiceOver each time an update takes place. My hope is that Apple’s excellent support for accessibility in built-in apps will extend to the new Facebook integration, providing a more accessible alternative to the Facebook app and continuing to support our social inclusion in mainstream society. You can even use Siri to post a Facebook update.

Aside from the new features I mentioned above, I believe the most important accessibility feature shown today is not a built-in feature or an app, but the entire app ecosystem. It is that app ecosystem that has resulted in apps such as AriadneGPS and Toca Boca, both featured in today’s keynote. The built-in features, while great, can only go so far in meeting the diverse needs of people with disabilities, so apps are essential to ensure that accessibility is implemented in a way that is flexible and customized as much as possible to each person. My hope is that Apple’s focus on accessibility apps today will encourage even more developers to focus on this market.

Another great accessibility feature that often gets ignored is the ease with which iOS can be updated to take advantage of new features such as Guided Access and the new VoiceOver integration. As Scott Forstall showed on a chart during the keynote, only about 7% of Android users have upgraded to version 4.0, compared to 80% of iOS users on iOS 5. What that means is that almost every iOS user out there is taking advantage of AssistiveTouch and Speak Selection, while only a very small group of Android users are taking advantage of the accessibility features in the latest version of Android.

Big props to Apple for all the work they have done to include accessibility in their products, but more importantly for continuing to show people with disabilities in a positive light. I loved seeing a blind person in the last keynote video for Siri. At this keynote, Apple showed another blind person “taking on an adventure” by navigating the woods near his house independently. As a person with a visual disability myself, I found that inspiring. I salute the team at Apple for continuing to make people with disabilities more visible to the mainstream tech world, and for continuing to support innovation through inclusive design (both internally and through its developer community).

iPhoto App and Accessibility

This weekend I finally had a chance to try out the new iPhoto app Apple released along with the iPad 3 (or, as they are calling it, “the new iPad”). As an aspiring photographer, I was impressed with the many options for organizing, editing, and sharing photos Apple has packed into this app, which costs only $4.99 in the App Store. There have been many reviews of the new app posted online already, so I will not add another one here. However, I do have a unique perspective on the new app that I would like to share. Not only do I like to take photos (calling myself a photographer might be a stretch, but it’s a hobby I enjoy and continue to try to get better at every day), but I also have a visual disability, so I am part of a small community of blind photographers.

When I opened the iPhoto app on my iPhone, the first thing I did was turn on the built-in VoiceOver screen reader to hear how it would do with the new photo editing app. Frankly, I was not surprised that the new iPhoto app would be as accessible with VoiceOver as it is. I have come to expect accessible products from Apple over the last few years, and I’m proud to be associated with it as an Apple Distinguished Educator. However, as I dug deeper into the iPhoto app with VoiceOver, the level of attention to detail in providing accessibility was still pretty impressive. For example, the brushes used to retouch photos (repair, lighten, darken, etc.) are all accessible through VoiceOver gestures, as are the exposure and color correction controls and the various effects. When I selected the crop tool, VoiceOver told me to pinch to resize the photo, and as I did so it told me how much I was zooming in as well as how far the image was offset (“image scaled to 15x, image offset by 15% x and 48% y”).

On the iPad, there is a dedicated help button that opens up a series of overlays indicating what each button does. Not only is every part of the overlay accessible, but so is the entire help system built into the iPad version of the app. The attention to detail is all the more impressive to me because there are so few blind photographers who would take advantage of an app such as iPhoto. What it does show is the level of commitment Apple has to accessibility: it will go to great lengths to add accessibility even when only a few people will benefit from it.

In a recent blog post, accessibility advocate Joe Clark called out a number of hot new apps (Readability, Clear, Path, and Flipboard) for what he called irresponsible web development that results in accessibility barriers. Well, to me this new iPhoto app shows that you can design an app that is not only visually appealing, feature-packed, and easy to use and learn, but also accessible to people with visual disabilities. I hope more developers start to realize that accessibility does not have to compete with good design; the two complement each other.

When I first loaded the iPhoto app on my iPhone (the first device I installed the app on), I was too impatient to go on the Web and read about the app before I started to work with it. That’s just the kind of user I am: I like to get right in and try things out. Well, in the iPhone app the Help button from the iPad version is missing. Most of the icons make sense, but in some cases I was unsure, so I turned on VoiceOver and moved my finger around the screen to have it announce what each button was for (or at least give me a better idea). In that case, compatibility with VoiceOver helped me learn the app much faster without having to consult the help, and that got me thinking. As these devices (phones, tablets, and whatever comes next) continue to get smaller and their interfaces start to use more visuals (tiles, buttons, etc.) and less text, the ability to hear the help may become an essential aspect of learning how to use an interface. In this way, features like VoiceOver can actually enhance the usability of an app for everyone – which is what universal design is all about.

 

2012: The Year I Quit Photography?

Well, not quite. But it will definitely be the year I make a major transition in my photography. As I will explain below, 2012 will be the year that I begin to take most of my photos with my iPhone. Since I purchased my iPhone 4S this fall, I’ve been using it more and more as a replacement for my Nikon D3100 DSLR camera. The improved camera specs of the iPhone 4S (8 megapixels at f/2.4), along with the new features in iOS 5 (such as quick access to the Camera app from the lock screen, the ability to use the volume up button to take a photo, and VoiceOver compatibility), make the iPhone the ideal device to “capture the moment” for someone like me. As Chase Jarvis has said, it is the camera that’s always with you, always at the ready to document those fleeting moments in life.

However, it’s not only the convenience and ease of use of the iPhone that’s drawing me away from using a traditional camera to capture images. As most of you reading this know, I have a visual impairment and I’m slowly losing my vision to a condition called retinitis pigmentosa, or RP for short. At the moment, I have less than 10 degrees of vision left (less than 20 degrees qualifies you as legally blind). RP leads to progressive vision loss, starting with peripheral and low-light vision. In my case, my low-light vision is what has been most affected, but the usual closing in of the visual field is there as well.

I’ve been lucky that my progression with vision loss has been pretty slow, but the last few times I’ve gone out to shoot with my camera, I’ve noticed some changes in my remaining eyesight. It’s ironic that it is photography that is helping me judge these changes in my vision. I’m not sure if these changes are really there or if it’s just my mind playing tricks on me. Much of what I’ve read about RP states that people with the condition lose most of their peripheral vision around the age of 40, and guess what, I turn 40 in a few days. So, maybe it’s all in my mind, but the last few times I’ve gone out with my camera I’ve ended up with some major eye fatigue and pain afterwards. I think what’s happening is that since I can’t see that much of the frame through the viewfinder, I’m having to move my eyes a lot to make sure I have framed the shot properly. All of this eye movement is probably fatiguing my eye muscles, so that when I get home I have pain in my eyes and the area around them. It usually takes a few doses of pain relief medicine and some warm compresses for the eye pain to subside, and I would rather avoid it if at all possible.

I love photography, and I would hate to give it up. However, when I got into this hobby I knew that the day would eventually come when my vision loss would make photography really difficult. I have no regrets for having spent a considerable amount of money on my DSLR and my lenses and other accessories over the last couple of years. I would not give up the joy that the hobby has brought me over that time. My photography has allowed me to experience a lot of beauty around me that I would normally miss with my own eyes (the camera has a far better range of vision than my own eyes). I also saw photography as a challenge, not only for myself but also for all of us who have visual impairments. I have always enjoyed the expression on people’s faces (when I can see them) when I step up to a spot with my white cane and pull out a camera to take a photo. I know they look, and I know they probably ask themselves “wait, isn’t he blind, why is he taking a photo?” If I have forced anybody to confront their preconceived ideas of the meaning of blindness and disability, then it has been all worth it to me. I can continue to make a similar statement through my use of the iPhone as a video and still camera.

So the thought that has been on my mind for the last few days of 2011 and the first few of 2012 is: where do I go from here? Well, I would say that 95-99% of the time I will be using the iPhone to take photos. The large, bright, sharp display on the device will make it easier for me to frame shots without straining my eyes as much. I also plan to use a trick I recently learned that makes it easier to take a photo by pressing the center button on the Apple headphones. I’ve looked at other options, but for now the iPhone appears to be the best one for me. The wide selection of apps with filters also means that even if I don’t quite get a picture right, I can apply a few filters and turn my failures into “creative experiments.” I find not having to know so much about my camera sort of freeing, in that I can now focus more on getting the best composition and less on what my camera is doing. In some ways, that’s exciting.

My DSLR camera does have a Live View mode that allows you to use the LCD screen to frame a shot, but that mode is very slow (defeating the purpose of having a DSLR), and it is difficult to get sharp photos if you’re not using a tripod. Having said that, I have no plans to sell my camera and lenses. I could still use Live View for recording the videos I use in my tutorials on mobilelearning4specialneeds (after all, video is the reason that mode is in the camera in the first place). I could also use the camera for some brief shoots in favorable lighting conditions. Limiting my time at the viewfinder will be key, as will making sure I take frequent breaks to let my eyes rest between shots. At the very least, I will keep my camera and lenses as a nice present for my daughter when she gets older (though I’m sure there will be much better technology for her to choose from by then).

I’m so grateful to Apple for taking the iPhone in the direction that it has by making it such a great portable camera (it is now surpassing traditional point-and-shoot cameras in the number of uploads on Flickr, one of the most popular photo sharing sites). Without the iPhone 4S, I think 2012 really would be the year I end my journey as a photographer. The way I see it, without digital I would never have gotten into photography in the first place (too costly, considering the number of photos I have to take for a few good ones to turn out), and without the iPhone I would not be able to continue in the hobby now. It has been a beautiful journey with its usual ups and downs (times when I got really frustrated because I couldn’t take the photos I wanted to, whether due to my lack of technical expertise or the limitations of my eyesight), but I wouldn’t change a thing. There is a saying well known to those who follow Apple: “here’s to the crazy ones.” Well, I guess photography helped me see myself as one of those crazy ones who can change the world one small step at a time. It is crazy for someone with my kind of visual impairment to invest the money and time I have in pursuing a hobby like photography, but I hope that my craziness has inspired somebody else to take on their own crazy adventure into whatever hobby fills them with joy and passion.

This long blog post is really the inspiration for the video I submitted with my application to the 2012 ADE Global Institute in Cork, Ireland, which is available below:

Overview of new accessibility features in iOS 5

With iOS 5, Apple has introduced a number of features to make its mobile devices even more accessible to people with disabilities:

  • VoiceOver enhancements: iOS 5 includes an updated voice for VoiceOver, the built-in screen reader for people who have visual disabilities. I have found the new voice to be a great improvement over the old one, especially when reading long passages of text in apps such as iBooks. Another improvement is that the triple-click home option is set to toggle VoiceOver by default. Along with the PC-free setup introduced with iOS 5, this small change has made it possible for someone with a visual disability to independently configure his or her iOS device out of the box, without any help from a sighted person. The Mac-cessibility website has an excellent overview of the many changes in VoiceOver that I highly recommend reading.
  • Camera app compatibility with VoiceOver: this is a neat feature that will make photography more accessible to people with low vision and those who are blind. With VoiceOver on, if you launch the Camera app it will announce how many faces are in the frame. In my testing this worked pretty well, and I’ve used it successfully on the iPad and the iPod touch. It should work even better on the iPhone, which has a better sensor and optics. Combined with the ability to turn on the camera app from the lock screen on some devices (iPhone and iPod touch) by double-tapping the home button and the fact that you can use the volume up button as a shutter release, Apple has done a lot to make photography more accessible to people with visual disabilities.
  • Speak Selection (text to speech): this is one of my favorite features introduced with iOS 5. It provides another modality for students with learning disabilities who can benefit from hearing text read aloud to them. To use it, go into Settings, General, Accessibility, tap Speak Selection and choose On. Once you’ve enabled this feature, selecting text will show a popup with the option to Speak it using the VoiceOver voice. Note that you can control the speaking rate for Speak Selection independently from VoiceOver.
  • Balance controls for audio: in addition to mono audio, which combines both channels of stereo audio into a single mono channel, there is now an option for controlling the left/right balance for stereo sound. On the iPhone, there is also a special Hearing Aid mode that is supposed to make the device more compatible with hearing aids.
  • Handling of incoming calls: you can choose to automatically route incoming calls to the iPhone’s speakerphone, or to a headset.
  • New alert types: on the iPhone, you can use one of five unique vibration patterns to identify who is calling if you have a hearing disability, or you can create your own pattern by tapping it on the screen. These custom vibration patterns can be assigned in the Contacts app by opening a contact’s information, choosing Edit, Vibration and then Create New Vibration. There is also an option to have the LED flash go off when you get a notification, a new message, and so on.
  • AssistiveTouch: this was one of the most anticipated accessibility features in iOS 5. AssistiveTouch was designed to make iOS devices easier to use for people with motor difficulties. For example, someone who is not able to tap the Home button to exit an app can now bring up an overlay menu with icons for many of the hardware functions of the device, including the Home button. AssistiveTouch also includes options allowing for single-finger use of many of the multi-touch gestures (including the new four-finger gestures available only on the iPad and the pinch gesture used for zooming). To use AssistiveTouch, choose Settings, General, Accessibility and turn on AssistiveTouch. You will know it is enabled when you see a floating circular icon on the screen. Tapping this icon will open the overlay menu with the AssistiveTouch options. Note that you can move the AssistiveTouch icon to another area of the screen if it gets in the way. Please note that AssistiveTouch is not compatible with VoiceOver. I really wish the two features could work in tandem, as this would be helpful to users with multiple disabilities.
  • Custom gestures: AssistiveTouch includes an option to create your own gestures. Update: I was able to create a few useful gestures after watching this video from Cult of Mac. I created one for scrolling up on a page and one for scrolling down. Now when I’m reading a long web page, instead of having to swipe up or down to scroll, I can bring up the AssistiveTouch overlay menu, select the new gesture from the Favorites group and tap once on the screen to scroll.
  • Typing shortcuts: under Settings, General, Keyboard you can create shortcuts for common phrases. For example, you could create a shortcut that would enable you to enter an email signature by simply typing the letters “sig” and pressing the space bar. This feature should provide a big productivity boost to anyone who has difficulty entering text on their mobile device.
  • Siri and dictation (iPhone 4S only): the new personal assistant uses voice recognition and artificial intelligence to respond to a range of user queries that can be made using everyday language rather than preset commands. The Apple website has a video that demos some of the capabilities of Siri. One of the amazing things about Siri is that it works without any training from the user. Along with Siri, the iPhone 4S also includes an option to dictate text by tapping a microphone button on the keyboard. The ability to use your voice to control the device can be helpful to people with many different types of disabilities, including those that make it difficult to input text. One of the things I have found especially frustrating when using VoiceOver on iOS devices is inputting text, so I hope this new dictation feature makes that easier. I will have a chance to test it out more thoroughly once I get my own iPhone 4S (currently out of stock in my area). Update: I finally got my hands on an iPhone 4S and I tried using the dictation feature with VoiceOver. It is working really well for me. I find the microphone button on the onscreen keyboard by moving my finger over it, double-tap to start dictation (as indicated by a tone), and then double-tap with two fingers to stop it. Even better, after I’m done dictating, if I move the phone away from my mouth it automatically stops listening! I love this feature.
  • Dictionary: while it is not listed as an accessibility feature, having a system dictionary is a new feature that is great for providing additional language support to students with learning disabilities. To use this feature, select a word and a popup will show the Define option, which allows you to look it up using the same dictionary that was previously available only in iBooks.
  • iMessage: a new feature of the Messages app makes it possible to send free messages over the Internet to any owner of an iOS device. Many people with hearing disabilities rely on text messaging as a convenient means of communication. iMessage will be especially helpful to those who are on a limited text messaging plan.
  • Reminders app: The new Reminders app has a simple interface that will make it a nice app for people who need help with keeping track of assignments and other tasks. On the iPhone 4 or iPhone 4S, tasks can be tied to a location using the phone’s GPS capabilities. One use of this feature could be to set up a reminder for a person to take their medication when they get to a specific location, for example.
  • AirPlay Mirroring (iPad 2, requires an Apple TV): along with iOS 5, a recent firmware update for the Apple TV enables mirroring to a projector or TV using AirPlay. I can see this option being helpful in a class where there are students in wheelchairs who have difficulty moving around the room. Using AirPlay Mirroring, the teacher could bring the iPad 2 to the student and the rest of the class could still see what is displayed on the projector or TV.
The new accessibility features make iOS 5 a must-have update for anyone who has a disability, as well as for those who work with individuals with disabilities. For schools and other educational institutions, the accessibility features of iOS make Apple mobile devices an ideal choice for implementing mobile learning while complying with legal requirements such as Section 504, Section 508 and the Americans with Disabilities Act.
Disclosure: I am an Apple Distinguished Educator.