Tech Tips: An Accessible Smart Phone
VoiceOver is Apple's built-in screen reader, providing blind and low-vision users access to the OS X operating system as well as the iPhone and iPod touch. A sighted user sees application icons arranged in a grid on the iPhone's screen, taps the desired application with one finger to open it, and then taps the various buttons specific to that application and reads the text, using a finger-flicking gesture to scroll up and down to view more. When more than one screen contains application icons, a sighted user flicks with one finger to page left or right to those other screens. Note that the flick gesture mentioned throughout this article is one in which your finger touches the touch screen, drags across the surface, and then lifts off quickly.
A user who is blind can activate VoiceOver by connecting the iPhone to a screen-reader-enabled computer running iTunes; otherwise, a sighted user must turn it on manually on the phone with the following steps:
- Tap the Settings icon.
- Tap on the General choice within Settings.
- Scroll down to Accessibility and tap on the VoiceOver category.
- Tap on the on/off switch for VoiceOver.
You'll then hear the iPhone speak via VoiceOver. Once it is enabled, interacting with the iPhone changes from what a sighted user expects.
Key flick and tap gestures used with the iPhone:
- Flick Right = Move to next item
- Flick Left = Move to previous item
- Flick Down = Move to next item using rotor setting
- Flick Up = Move to previous item using rotor setting
- Three finger flick down = Scroll up one page
- Three finger flick up = Scroll down one page
- Three finger double tap = Toggle speech on/off
- Two finger double tap = Start and stop the current action
- Double tap = Activate or open the selected item
- Tap = Speak item
- Two finger flick up = Read all text from top of screen
- Two finger flick down = Read all text from current location downward
- Three finger triple tap = Turn screen curtain [privacy] off/on
With VoiceOver active, you can explore your iPhone by moving your finger around the screen. As your finger touches an icon or control, you will hear a click sound followed by the name of that icon or control. Alternatively, you can flick left or right to jump to the next or previous icon or control. Flicking downward reads the name of the highlighted application icon one character at a time.
To open an application, double-tap its icon, or hold one finger on the icon and simultaneously tap elsewhere on the screen with another finger. Once the application is open, tap in various places on the screen to find the buttons unique to that application, or flick left and right to move through the available buttons. To page through screens that require scrolling, use three fingers to flick forward and back between pages.
Another control gesture is called the rotor. It is performed with two fingers touching the screen and pivoting, as if turning a knob clockwise or counter-clockwise. Performing this gesture while application icons are showing on the screen toggles between two verbosity choices, words or characters. Once one is set, the up/down flick gesture reads one word or one character at a time.
The VoiceOver speaking rate can be changed within the VoiceOver settings area by scrolling down to the Speaking Rate slider. With the slider selected, a single-finger flick up increases the speaking rate and a single-finger flick down decreases it.
Limitations to Accessibility
Just as poorly designed webpages can be inaccessible when web authors don't label images with alternative text, poorly designed iPhone applications, or apps, can have similar problems. Following are a few examples of apps that handle VoiceOver well and one that does not.
The Weather app that ships with the iPhone presents the weather using graphic symbols familiar to sighted users, such as bright glowing suns and dark clouds with lightning bolts. Even so, it does a great job of conveying the same information to people without sight. Each tap or flick causes VoiceOver to read meaningful information, such as "Tuesday rain, high 63 degrees Fahrenheit, low 50 degrees Fahrenheit" (a typical Seattle forecast).
Similarly, the KOMO TV news app, a local Seattle news source, is highly accessible. When you tap on a headline, VoiceOver reads it aloud. If you double-tap the article, it opens. A single tap on a paragraph reads that entire paragraph aloud. If this or another application uses italicized text, however, VoiceOver fails to read that text.
In contrast, the New York Times app isn't designed to read back article headlines, nor does it enable the user to choose an article. Double-tapping doesn't bring up an article; instead, it refreshes the list of articles, leaving them unavailable to the person using VoiceOver.
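What usually separates the accessible apps from the inaccessible ones is whether the developer exposed meaningful labels to VoiceOver, much as a web author adds alternative text to an image. The following is a minimal sketch in Swift using Apple's UIKit accessibility properties; the headline cell and its wording are hypothetical, not code from the KOMO or New York Times apps, but they illustrate how a developer might make a list of articles readable and openable with VoiceOver.

```swift
import UIKit

// Hypothetical headline cell for a news list, shown only to illustrate
// UIKit's standard accessibility properties.
final class HeadlineCell: UITableViewCell {

    func configure(headline: String, summary: String) {
        textLabel?.text = headline

        // Present the whole cell to VoiceOver as a single element so a
        // right/left flick lands on one meaningful stop per article.
        isAccessibilityElement = true

        // What VoiceOver speaks when the cell is touched or flicked to.
        accessibilityLabel = "\(headline). \(summary)"

        // Tell VoiceOver the element behaves like a button and can be
        // activated with a double-tap.
        accessibilityTraits = .button
        accessibilityHint = "Double-tap to open the article"
    }
}
```

An app whose buttons and list items lack labels like these behaves the way the New York Times app is described above: VoiceOver can land on the elements but has nothing meaningful to say about them.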
One of the biggest barriers for iPhone users with visual impairments is typing. The iPhone has no physical keyboard for entering keystrokes; instead, an on-screen keyboard appears whenever the user enters a mode that requires typed characters. When VoiceOver is enabled, the keyboard behaves much as icons do: tapping a character echoes it back, and double-tapping the same tiny target enters that keystroke into the active application. Doing this accurately can be a challenge because the on-screen keys are small and close together, even in landscape orientation, where they are spaced farther apart. Fortunately, there is an alternative entry method: hold one finger on the desired key (you'll hear it read aloud) and tap elsewhere on the screen with another finger to enter that keystroke. You can also use right and left flick gestures to move through the on-screen keyboard characters while the keyboard is active. However, this can be a time-consuming way to type.
Privacy can also be a problem for VoiceOver users. For example, when entering a password on a desktop or laptop computer, screen reader software typically reads aloud only "asterisk, asterisk, asterisk" rather than the characters you're typing, so anyone listening as you type won't hear your password. Unfortunately, the iPhone doesn't work this way: every character is echoed as you type it. To get privacy similar to a desktop or laptop, an iPhone user should wear headphones so that no one else can overhear the password characters.
VoiceOver can also read any email account the phone can access. A single tap reveals the folders in the account or the key information in the summary view of an individual message. A double-tap opens a message; single taps then read individual sentences or paragraphs.
There are thousands of applications available through the Apple App Store, and each can contain a number of buttons and controls. Keeping track of all these controls, remembering where they're located on the screen, and accessing them by moving your fingers requires a bit of cognitive overhead.
Summary
The iPhone with VoiceOver enabled provides a great deal of access for users who are blind, not only to phone and messaging functions, but also to a variety of Internet-based applications. However, just as poorly designed websites can be difficult or impossible to navigate if the user utilizes a screen reader, poorly designed iPhone apps can prove to be equally limiting.
The on-screen keyboard does not come close to the input capacity of an external keyboard. If a user could connect an external keyboard via Bluetooth, large-scale input into iPhone applications would be much easier for both sighted users and users who are blind. One clip-on mini keyboard is due to be released in November of this year and may increase the ease of input for users who are blind as well as for sighted users.