Category Archives: ed tech

New webinar setup with Reflector, iPhone and iPevo Stand

I have had great success using Reflector on my Mac to mirror the screen from my iPad when I do webinars. However, after some feedback I received from a recent webinar on switch access, I decided to look into improving my setup. One of the viewers suggested that I show my interaction with the switch interface (the tapping of the buttons, etc.) along with the mirrored iPad screen. I agree that this would be helpful when showing off not only Switch Control but also VoiceOver. With VoiceOver, there are many gestures (flicks, swipes and the like) that don't translate well during a webinar if you are only mirroring the device screen. I had a chance to try a new setup when I did a webinar on VoiceOver and Zoom this past week, and I was very pleased with the results.

I took advantage of Reflector’s ability to mirror multiple devices as follows:

  • Device 1: iPad mini mirroring the screen to Reflector as usual.
  • Device 2: iPhone mounted on an iPevo iPhone stand ($69) and running the iPevo Presenter app.

The iPevo Presenter app is a free app designed for use with iPevo's iPhone stand. It has the option to hide all controls and show a very minimal interface so that there are no distractions. Below is a photo of my setup, where you can see the split-screen effect I got on my computer display, which I then shared with my webinar participants using the screen sharing feature of our webinar platform.

Webinar setup: iPad mini and iPhone mounted on iPevo stand on the left, Mac showing mirrored devices on the right.

I tried a similar setup with iPevo's Ziggi HD document camera, but I found it could not keep up with the motion whenever I performed a gesture on the iPad with VoiceOver. In the end, the iPhone camera did much better at keeping up with the motion of my hands during the VoiceOver demos.

My one concern is that having the two screens up could be distracting, so we’ll see what the feedback says on that point. For now I plan to use this setup for any of my upcoming webinars that involve VoiceOver or Switch Control.

Update: iPevo suggested lowering the resolution while using the Ziggi HD camera to see if that would work better for capturing the motion. I found that a resolution of 1024×768 worked well on my 11-inch MacBook Air. I also made sure to let the camera focus on my iPad screen and then selected Focus Lock in the Presenter app on my Mac (pressing the letter M will also lock focus). I will probably use that setup when doing a Switch Control webinar, where it is nice for people to see the hardware and the iPad at the same time. Thanks for the suggestion, iPevo.


Accessibility Options in Voice Dream Writer App

This week, Winston Chen and the Voice Dream team released the new Voice Dream Writer app. I am highlighting it here not only because Voice Dream Reader is one of my favorite apps for students who need reading supports such as text to speech, word highlighting and customized text, but also because of the attention to accessibility the Voice Dream team has paid in this new app. Not only are the controls and the interface nicely labeled for VoiceOver users, but there are even a few features specially designed to make things easier for them.

Screenshot of Voice Dream Writer interface on iPad with VoiceOver controls highlighted.

When VoiceOver is turned on, the app recognizes this and adds three buttons for editing text to the interface (they appear in the toolbar located just above the onscreen keyboard, on the left side). These buttons are:

  • Cursor: allows the user to move the cursor by flicking up or down with one finger.
  • Cursor movement unit: changes how the cursor movement takes place by allowing the user to choose from characters, words or sentences.
  • Select text: selects text based on the cursor movement unit. For example, flicking up with sentences as the cursor movement unit will select the text one sentence at a time.

All of these controls are adjustable. A flick up or down with one finger will change the value (for the cursor movement unit) or navigate to/select the next item (for the cursor and select text buttons).
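
For developers curious how a control like this behaves, UIKit's "adjustable" accessibility trait is the standard way to get the flick-up/flick-down behavior described above. Below is a minimal sketch of such a control; it is my own illustration of the technique, not Voice Dream's actual code:

```swift
import UIKit

// A minimal "adjustable" VoiceOver element: focus it, then flick up or
// down with one finger to change its value rather than move focus.
final class MovementUnitControl: UIView {
    private let units = ["Characters", "Words", "Sentences"]
    private var index = 0 {
        didSet { accessibilityValue = units[index] }  // spoken after each flick
    }

    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true
        accessibilityLabel = "Cursor movement unit"
        accessibilityValue = units[index]
        accessibilityTraits = .adjustable  // enables the flick gestures
    }

    required init?(coder: NSCoder) { fatalError("not used in this sketch") }

    // VoiceOver calls these in response to one-finger flicks up and down.
    override func accessibilityIncrement() { index = min(index + 1, units.count - 1) }
    override func accessibilityDecrement() { index = max(index - 1, 0) }
}
```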

A three-finger swipe gesture is also supported for cursor movement and selection: a three-finger swipe up will move the cursor to the beginning of the document and a three-finger swipe down to the end, and three-finger swipes up or down will select the text from the cursor position to the beginning or end of the document.

Another nice feature of the app is the way it makes it easy to find misspelled words: open Tools in the upper right and choose Find Misspelled Words. You can then flick down with one finger to navigate the misspelled words in your document. When you get to a word you want to fix, you have two options: you can double-tap with one finger to edit it with the onscreen keyboard, or you can swipe from the right with three fingers to use the Word Finder with a phonetic search. The phonetic search will bring up a list of words that closely match the misspelled one in your document. You can then choose the correctly spelled word from the list and double-tap with one finger to make the correction.
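
For the curious, iOS ships a built-in spell checker that can drive exactly this kind of feature. Here is a small sketch using UITextChecker to collect the misspelled ranges in a document; this is my assumption about one way such a feature could be built, not necessarily how Voice Dream Writer does it:

```swift
import UIKit

// Collect the ranges of all misspelled words in a string so an app can
// let the user step through them one at a time.
func misspelledRanges(in text: String, language: String = "en_US") -> [NSRange] {
    let checker = UITextChecker()
    let length = (text as NSString).length
    var ranges: [NSRange] = []
    var offset = 0
    while offset < length {
        let range = checker.rangeOfMisspelledWord(
            in: text,
            range: NSRange(location: 0, length: length),
            startingAt: offset,
            wrap: false,
            language: language)
        if range.location == NSNotFound { break }
        ranges.append(range)
        offset = range.location + range.length
    }
    return ranges
}

// UITextChecker can also suggest close matches, a bit like Word Finder:
// checker.guesses(forWordRange: range, in: text, language: "en_US")
```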

I did a short video to demonstrate some of these options in the Voice Dream Writer app. I hope you find it helpful. For more information about the app, make sure to check out the Voice Dream website.

Two New Options for Recording Your iPad Screen

When recording my iOS tutorials, my setup has consisted of mirroring my iPad to my Mac with the Reflector app, then using the ScreenFlow app to record the mirrored iPad display. For audio I use a Blue Snowflake mic. This setup works well, and I get the added benefit of an iPad frame around the mirrored display for a nice aesthetic.

With Yosemite, I have two more options for recording my iPad screen. First, I can select the iPad as a camera source in QuickTime Player. To create a new recording of your iPad screen with QuickTime:

  1. Make sure you have your iPad connected to your Mac with a Lightning cable.
  2. Launch QuickTime Player.
  3. Select File > New Movie Recording.
  4. Select your iPad as the camera from the pulldown menu to the right of the Record button.
    iPad selected as camera source in QuickTime Player.
  5. Perform the actions you wish to record on the iPad.
  6. Press Stop in QuickTime Player on your Mac.
  7. Choose File > Export and select your desired resolution. Another option is to choose File > Share (or the Share icon to the right of the QuickTime controls) to upload your iPad recording directly to a site such as YouTube or Vimeo.
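
Out of curiosity I looked into how this works: a Lightning-connected iOS device only shows up as a camera after an app opts in to screen-capture devices through CoreMediaIO, which is presumably what QuickTime does when it lists the iPad. A rough Swift sketch of the opt-in and discovery using today's APIs (my own illustration, not Apple's actual code):

```swift
import AVFoundation
import CoreMediaIO

// Opt in to iOS screen-capture ("DAL") devices on the Mac.
var address = CMIOObjectPropertyAddress(
    mSelector: CMIOObjectPropertySelector(kCMIOHardwarePropertyAllowScreenCaptureDevices),
    mScope: CMIOObjectPropertyScope(kCMIOObjectPropertyScopeGlobal),
    mElement: CMIOObjectPropertyElement(kCMIOObjectPropertyElementMaster))
var allow: UInt32 = 1
CMIOObjectSetPropertyData(CMIOObjectID(kCMIOObjectSystemObject), &address,
                          0, nil, UInt32(MemoryLayout<UInt32>.size), &allow)

// A connected iPad now appears as an external, muxed (video + audio) device.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.externalUnknown], mediaType: .muxed, position: .unspecified)
for device in discovery.devices {
    print(device.localizedName)  // e.g. the iPad's name
}
```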

This workflow will work well in situations where you are not able to use AirPlay to connect to Reflector or another mirroring app (I also use AirServer on occasion).

With the release of ScreenFlow 5, Telestream has built on this support for recording the iPad screen in Yosemite. As with QuickTime Player, you can now choose the iPad as a camera source when configuring a new ScreenFlow recording session.

iPad selected as camera source in the ScreenFlow new recording pane.

ScreenFlow adds a nice touch (literally): you can add touch callouts that show many of the iOS gestures (taps, swipes, zoom gestures) at specific points in your video recording. This is helpful for pointing out where a user should tap or perform a gesture while you demo apps and built-in features on the iPad.

Touch callouts menu in ScreenFlow.

Along with the other editing features included with ScreenFlow ($99, or $34 to upgrade from a previous version), I think this makes it the ideal solution for those who need to record the iPad screen on a regular basis (educators, app developers who need to demo new apps, or anyone with a passion for teaching on a public forum like YouTube or Vimeo).


A Visually Impaired Photographer Experiments with the GoPro

As many of you who follow me online know, I am very passionate about photography and the possibilities it presents to us as people with disabilities to tell our own stories and exercise our creativity. I love to use my iPhone to take photos because it incorporates so many accessibility features that help me with my photography, such as Zoom, VoiceOver and Invert Colors. However, I am always looking for new options to expand my photographic horizons, and the GoPro action camera is one such option that has fascinated me for some time. I believe that having a disability doesn't mean you should not be able to ski, skydive, or do anything else you set your mind to doing, and the GoPro has become the go-to camera when it comes to action sports and an active lifestyle.

I started out with the least expensive option in the GoPro lineup, the new entry-level Hero, which retails for $129. However, after about a week with the camera, I returned it and opted for the older Hero 3 White model, which I think is a better fit for my needs. The new entry-level Hero has a number of shortcomings due to its low price and limited feature set. However, if you're an educator looking for an inexpensive camera for recording classroom activities (science experiments, plays and performances, etc.), this is a nice camera and there are ways to get around its limitations:

  • it does not have an LCD screen for framing shots and adjusting camera settings like the more expensive GoPro 4 Silver. I don't think this is a significant drawback, since using the LCD screen outdoors would be difficult anyway due to glare. The camera's wide field of view makes it likely that you will capture the shot you want even when you can't frame it with a viewfinder. As someone who has tunnel vision, the wide FOV is actually one of the things that made the GoPro so attractive to me. GoPro does sell an add-on LCD screen, but I'm not sure if it is supported on the new Hero. Regardless, using the add-on screen will probably reduce battery life.
  • it does not support Wi-Fi connectivity. With other GoPro cameras (like the Hero 3 White I eventually traded up to), you can set up a Wi-Fi connection between the camera and a smartphone to control the camera and see what you are capturing. However, as with an add-on LCD screen, a drawback to Wi-Fi connectivity is that it drains the battery much faster.
  • it has a built-in battery that cannot be replaced or swapped out to extend the length of time the camera can be used in the field. A workaround is to use any of a number of smartphone or tablet external batteries that match the needs of the GoPro (5V and 1-1.5 amps). An external battery will allow you to capture longer time lapses, where the camera has to be turned on for extended periods of time. I was also fortunate to find an old power adapter for a Kodak Zi8 camera that allows me to plug the GoPro into a wall outlet to charge it much faster than through the USB port on a computer.
  • it does not support the higher resolutions of more expensive GoPro cameras. The Hero tops out at 1080p (30 fps) and also supports 720p at 30 or 60 fps. It does not support 4K, which is fine by me, as the higher resolutions result in huge files I can't possibly store or process on my MacBook Air.

Despite its limitations, I still think the new GoPro Hero is a nice entry-level camera for use in educational settings. It provides access to the many possibilities for using this type of camera to support learning (examples of which are featured on this website by Lisa Tossey) at a very reasonable price. From an accessibility perspective, however, the biggest problem is not the lack of an LCD viewfinder with features such as large text or a high contrast mode. Rather, it is the fact that there is no option for spoken feedback other than a series of beeps as you advance through the various menu screens, which are displayed in a small window on the front of the camera. If the camera had Wi-Fi connectivity, I could probably use VoiceOver and other accessibility features on my iPhone to get better access to the camera menus and settings.

This possibility convinced me to exchange the Hero for the older Hero 3 White, which does support Wi-Fi. I was able to download the free GoPro app from the App Store, and it has some VoiceOver compatibility. I'm convinced that with a few tweaks this app could be made very accessible to the blind. For the most part the buttons have VoiceOver labels that can be spoken aloud to a blind user, but these labels could be improved so that they are clearer and easier to understand when read aloud. For example, I don't need to hear the following when I choose the option for reviewing the clips on my camera: "GoPro app, list view icon, cam roll." Just describing it as Camera Roll would be sufficient. Surprisingly, the shutter release button is the only button or control with no label at all (it just says "button"). In any case, through the Wi-Fi connection I will still be able to use Zoom and other accessibility features on my iPhone even if the app does not have great VoiceOver support.
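
For any developers reading along, labels like these are a one-line fix per control. A hypothetical sketch of what clearer labeling could look like (the variable names are mine, not the GoPro app's):

```swift
import UIKit

// Hypothetical controls standing in for the app's unlabeled/verbose ones.
let shutterButton = UIButton(type: .custom)
shutterButton.accessibilityLabel = "Shutter"          // instead of just "button"
shutterButton.accessibilityHint = "Takes a photo or starts recording."

let cameraRollButton = UIButton(type: .custom)
cameraRollButton.accessibilityLabel = "Camera Roll"   // instead of "list view icon, cam roll"
```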

With the Hero 3 White I lose the following features, which are only available in the current-generation cameras: Quick Capture, Super Wide Capture and Auto Low Light. Quick Capture allows the capture of video with a single tap of the top button and time lapse with an extended press, while Super Wide extends the FOV slightly and Auto Low Light gives the camera better dynamic range in low-light situations. Of these three features, only Super Wide would be significantly helpful to me. I don't shoot in the kinds of environments where Auto Low Light would come in handy (due to my difficulties with navigating low-light environments), and Quick Capture is a nice-to-have but not an essential feature.

The Hero 3 White also has a slightly lower top frame rate for photos, topping out at 3 fps compared to 5 fps for the new Hero, as well as a smaller-capacity battery. However, I can compensate for the smaller battery by purchasing a number of inexpensive add-on batteries (retailing for $15-20 each), which the Hero 3 White supports but the new Hero does not. The swappable batteries will make up somewhat for the battery drain that results from using the camera with a Wi-Fi connection to my iPhone for accessibility support.

Along with Wi-Fi connectivity, the Hero 3 White has an HDMI port (for connecting the camera to an HD TV), the ability to connect an external microphone for improved audio (using an adapter), support for higher-capacity memory cards (topping out at 64GB as opposed to 32GB with the new Hero) and, again, swappable batteries. The Hero 3 White also has more customizable time-lapse settings, allowing for intervals from half a second to a full 60 seconds; the new Hero, on the other hand, is fixed at a single interval of half a second. Both cameras are very similar in terms of mounting options and underwater performance (with a top depth of 131 feet in each case).

I have had great fun with the GoPro cameras during the short time I have owned them, and I really think the Hero 3 White will be a better action camera for me than the entry-level Hero (at least until I can get one of the higher-priced models like the GoPro 4 Silver). I end this post with a photo I took with my new GoPro during a recent visit to the beach.

GoPro selfie taken at the beach in St. Petersburg, Florida.


iBeacons Experiment with Beacondo

As detailed in a blog post on AT Mac, iBeacon is a new technology that has a lot of potential for people with disabilities. iBeacons are small devices that emit a low-power Bluetooth signal which an iOS device can recognize and use to trigger an action such as opening a website, playing a video or sound, and more. One use case that is already being implemented is using iBeacons to provide environmental cues that help people who are blind navigate an airport or other place of public accommodation.
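
On the developer side, the "recognized by an iOS device" part is Core Location region monitoring. Here is a minimal sketch of an app reacting when the device comes within range of a beacon (the UUID and identifier are placeholders I made up; each beacon is configured with its own):

```swift
import CoreLocation

final class BeaconWatcher: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    // Placeholder UUID; use the one configured on your actual beacon.
    private let region = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!,
        identifier: "classroom-door")

    func start() {
        manager.delegate = self
        // Requires a location-usage description in the app's Info.plist.
        manager.requestAlwaysAuthorization()
        manager.startMonitoring(for: region)
    }

    // Called when the device crosses into the beacon's radio range;
    // this is where an app would open a page, play a sound, and so on.
    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        print("Entered beacon region: \(region.identifier)")
    }
}
```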

I had been curious about iBeacons for a while and even purchased a single iBeacon from Radius Networks to try the new technology out. The one I got was only $29, and it works while plugged into a USB port for power. Other iBeacons have their own battery and don't have to be plugged in, providing more flexibility in where they can be installed. In the future, I will probably buy the $99 3-pack from Estimote for this reason.

In preparation for a session at Closing the Gap focusing on how Augmented Reality and iBeacons can be used to provide UDL supports, I finally took the plunge and started experimenting with my new iBeacon. I created a simple content delivery app using the Beacondo software, which is available as a free download along with an SDK. I followed along with the tutorials on the Beacondo site, and a couple of hours later I had a working iBeacon app inspired by a similar one I saw demoed at this year's Apple Distinguished Educator Institute in San Diego. At Closing the Gap, I will use this app to introduce iBeacons to the participants as they walk around the room and learn what an iBeacon is, the different types of iBeacons available for purchase, and how they are being implemented in education (with links to the websites of my ADE colleagues Paul Hamilton and Jonathan Nalder, who are the true experts in this area).

I couldn’t believe how easy it was to create the app with Beacondo. I just followed these steps:

  • Downloaded the free Xcode software from Apple.
  • Downloaded the Beacondo Designer software and the SDK.
  • After watching the tutorials, opened Beacondo Designer and got started customizing the various screens in the template included with the SDK. I had to place any videos and images I wanted in the app inside my project directory so that they would be available in the various pulldown menus inside Beacondo Designer.
  • Clicked on Build and chose Xcode to preview the app using the iPhone simulator.
  • Rinsed and repeated as needed to get my content looking the way I wanted.
  • When I had the app looking just the way I wanted, it was time to add the iBeacon and assign an action, as demonstrated in this video.
  • Did a final build for Beacondo Viewer, an iOS app that allows you to open your app for testing on your device. Building for Beacondo Viewer exports the app as a zip file that can be easily shared online.
  • Uploaded the app as a zip file to Dropbox and created a QR code using Kaywa QR Generator, my favorite tool for creating QR codes (see the sketch after this list for a way to generate one in code).
  • Opened Beacondo Viewer and chose the Scan from QR Code option, then scanned the QR code I had created earlier.
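
As promised in the list above, here is a small sketch of generating a QR code programmatically with Core Image's built-in CIQRCodeGenerator filter, for anyone who prefers not to rely on a web tool:

```swift
import CoreImage

// Encode any string (such as a Dropbox link) as a QR code image.
func qrCode(for string: String, scale: CGFloat = 8) -> CIImage? {
    guard let filter = CIFilter(name: "CIQRCodeGenerator") else { return nil }
    filter.setValue(Data(string.utf8), forKey: "inputMessage")
    filter.setValue("M", forKey: "inputCorrectionLevel")  // medium error correction
    // The raw output is tiny, so scale it up for display or printing.
    return filter.outputImage?.transformed(by: CGAffineTransform(scaleX: scale, y: scale))
}
```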

The first few times I did this I could not get the app to open in Beacondo Viewer. A quick email to Beacondo and they informed me that I had to change the ending of my Dropbox link from "dl=0" to "dl=1". Beacondo will not be able to download the app's zip file if it encounters a "Download now" screen, and changing the end of the URL gets around that. With that small change I was able to download the app to Beacondo Viewer, and the next time I walked into my room I was greeted with an audio message I had recorded and the app opened directly to a page explaining what an iBeacon is, just as I would want it to do for participants at our Closing the Gap session.
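
If you want to apply the same fix programmatically in your own workflow, it is just a query-string rewrite. A quick Foundation sketch (the example link is made up):

```swift
import Foundation

// Rewrite a Dropbox share link so it serves the file directly ("dl=1")
// instead of showing the "Download now" preview page ("dl=0").
func directDownloadURL(from shareLink: String) -> URL? {
    guard var components = URLComponents(string: shareLink) else { return nil }
    var items = components.queryItems ?? []
    items.removeAll { $0.name == "dl" }
    items.append(URLQueryItem(name: "dl", value: "1"))
    components.queryItems = items
    return components.url
}

// Hypothetical example:
// directDownloadURL(from: "https://www.dropbox.com/s/abc123/app.zip?dl=0")
// -> https://www.dropbox.com/s/abc123/app.zip?dl=1
```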

From a UDL perspective, iBeacons could be really useful for embedding instructions and other context-aware supports that are available to learners when and where they are needed. Paul Hamilton has a nice demonstration of how he is using iBeacons to create learning centers in his classroom. iBeacons would be a great way to embed virtual coaches in these learning centers or stations to aid students with autism or executive functioning difficulties (think visual schedules, social stories and other supports that are triggered only when the user is at the location where they would be useful). I am also interested in the use of QR codes, Augmented Reality and iBeacons to create layered learning environments where users have multiple pathways through the content, triggering on-demand access to background information or more advanced topics as they see fit.


Recording Setups for iPad

While the iPad has a nice microphone that records decent-quality audio, I have been wanting to explore the possibility of getting even better audio using some of the external microphones I already own, such as the Snowball from Blue. The Snowball works great with GarageBand for iOS, provided you have the proper adapters to connect the mic to the iPad. To connect the mic to my iPad mini with Retina display, I needed an Apple Camera Connection Kit as well as a Lightning to 30-pin adapter. I already owned both, but if you don't have them they can be purchased at any Apple Store. Some electronics stores such as Best Buy also carry them.

When you launch GarageBand with the Snowball plugged in and select the Audio Recorder as your instrument, GarageBand will actually let you know if you are using the external USB mic to record (just look in the text box to the right of the VU meter).
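
If you ever want the same confirmation outside of GarageBand, iOS reports its active audio inputs through AVAudioSession. A minimal sketch (assumes an app that already has microphone permission):

```swift
import AVFoundation

// Print every audio input iOS can currently see; a USB mic connected
// through the Camera Connection Kit appears alongside the built-in mic.
func printAvailableInputs() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playAndRecord, mode: .default, options: [])
        try session.setActive(true)
    } catch {
        print("Audio session error: \(error)")
        return
    }
    for input in session.availableInputs ?? [] {
        print("\(input.portName) [\(input.portType.rawValue)]")
    }
}
```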

As much as I love the Snowball, I wanted a more portable solution that I could take with me to record while on the road. After doing some research online, I think I found a really nice setup built around two parts I purchased on Amazon: the IK Multimedia iRig PRE mic interface and a Behringer XLR microphone.

The iRig PRE is the key to the setup. It provides power to the XLR microphone by way of a 9-volt battery and connects to the iPad through the headphone jack. I can then connect my headphones to the iRig PRE if I want to listen in while I record.

This setup also works well if you are blind and want to create recordings with GarageBand while using VoiceOver, as you can hear the VoiceOver speech through your headphones and still use the microphone to record. GarageBand has excellent support for VoiceOver, and it allows recordings to be posted directly to SoundCloud from within the app. I created the following recordings to show the difference in quality between the three recording options: the built-in mic, the Blue Snowball, and the Behringer mic connected with the iRig PRE. All of the recordings were made using VoiceOver and GarageBand on the iPad.

First, the built-in microphone on the iPad:

Next, we have the Blue Snowball connected to the iPad through the Camera Connection Kit.

Finally, we have the Behringer XLR mic connected to the iPad with the iRig PRE using the headphone jack.

To my ear, the Snowball sounded the best, but it is not very portable due to its odd shape and weight.


A SAMR and UDL Framework

As I was traveling to Macworld 2013, where I presented a session on iBooks Author, I had some time when I was trapped on a plane without Wi-Fi (the horror!). Rather than reading the magazine in front of me, I gave in to my urge to combine two frameworks I am really passionate about: the SAMR model developed by Dr. Ruben Puentedura and the UDL framework developed by CAST. Below is an image showing the framework I developed and some apps that address each level. This was just a quick brainstorm on a long plane ride, but I would appreciate your feedback.

Slides: the SAMR and UDL framework.


Update: Here is a text version that should be more accessible with a screen reader (with app and feature matching):

N: needs assessment and profile
Determine current level of performance and desired outcomes.

A: access to content and tools
The technology eliminates barriers that prevent access to information.

  • Proloquo2Go
  • FaceTime
  • VoiceOver
  • AssistiveTouch
  • Closed Captioning Support
  • Dictation (built-in with iOS)
  • Dragon Dictation

B: building supports and scaffolds for learner variability
The technology includes scaffolds and supports that account for learner differences.

  • iBooks
  • AppWriter US
  • Speak It!
  • Typ-O HD
  • Evernote
  • Notability

L: leveraging multimedia
The technology provides multiple means of expression.

  • Book Creator
  • Creative Book Builder
  • StoryKit
  • SonicPics
  • StoryRobe
  • Pictello

E: expression and creativity
The technology unleashes creative potential and disrupts perceptions of disability.

  • Camera
  • iMovie
  • Garageband
  • iPhoto
  • Instagram