A conceptual introduction for sighted readers on how VoiceOver works

On January 9th, 2007, Steve Jobs first announced the iPhone. Many people, including my friend Dr. Nathan Klapoetke, were very excited about it and couldn’t wait to buy one. That day, Nathan was reinstalling Windows for me, as there was no practical way to do that yet with a screen reader. While Nathan ranted about how he couldn’t wait to get one, and how all future phones would be all touch screen with no buttons, I screamed inside because I couldn’t think of any way to make that kind of interface accessible to blind people like myself. This was the common thought in the blind world, and it stayed that way for two years.

We were all extremely surprised, then, when Apple announced at WWDC in June 2009 that VoiceOver was coming to the iPhone. VoiceOver is Apple’s screen reader, the software that makes Apple products talk and makes them accessible to blind people. I bought an iPhone 3GS a month later and never looked back.

I am often asked, “How do you use an iPhone when you can’t see what you’re touching?” since for sighted users, touching an icon activates it immediately. Apple has always liked to say “think different,” and here is one of those ways: interact different.

When a sighted person looks at their phone, they can see the current screen and read all the text without even touching it. I call these passive interactions, as nothing is changed. VoiceOver has gestures to do all of these things.

Swiping with one finger, the rotor gesture, and exploring the screen by sliding one finger around are how VoiceOver users find what’s on their screen without changing anything. These are the passive gestures, the equivalent of what a sighted person does just by looking at their device.
With VoiceOver turned on, swiping right or left with one finger moves the VO cursor to the next or previous interface element, while what swiping up or down does depends on the currently selected rotor option.

The rotor gesture is super awesome, and for years it gave VoiceOver a huge advantage over Android’s screen reader, TalkBack. Recently, TalkBack has been rewritten and has improved on some of its more annoying past interfaces. Many suggest thinking of the rotor as turning a radio dial on your screen: place your thumb and a finger on the screen and turn them either right or left, and VoiceOver changes what swiping vertically up and down will do. In the VoiceOver settings, there is a list of options you can add to or delete from the features that show up in the rotor. The rotor is, in effect, a menu, and it can make using VoiceOver, or VO as most VO users call it, much more capable and efficient.
Some of the most used options in the rotor menu are words, characters, lines, VoiceOver volume, language, speaking rate, and braille screen input.

Words, characters, and lines change what the VO cursor moves by when swiping up or down. We VO users use this to change how VO moves through documents.

Many people can read more than one language, and when configured, VoiceOver can too.
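For developers, there is a complementary hook on the app side. Here is a minimal sketch, assuming a UIKit app; the label and its text are hypothetical examples, but accessibilityLanguage is the standard UIKit property for hinting an element’s language to VoiceOver:

    import UIKit

    // Hypothetical label whose content is French.
    func makeGreetingLabel() -> UILabel {
        let label = UILabel()
        label.text = "Bonjour tout le monde"
        // Standard UIKit per-element language hint; VoiceOver uses it
        // to choose pronunciation and, when available, a matching voice.
        label.accessibilityLanguage = "fr-FR"
        return label
    }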

Braille screen input allows VoiceOver users to calibrate the screen so that they can input text using the six-dot matrix of braille. For those using VoiceOver, it is far faster to type with than an on-screen QWERTY keyboard.
There is another option that can dynamically appear in the VO rotor, called rotor actions. It only appears when the currently open app has rotor actions available in the current UI view. They must be provided by the app’s developer in code, and when they are, the currently active interface can be many times easier and more efficient to use. When I’m using an app and find it has rotor actions, it really makes me smile.
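For developers reading this, here is a minimal sketch of how those actions get there, assuming a UIKit app. The cell, the action names, and the handlers are hypothetical, but accessibilityCustomActions and UIAccessibilityCustomAction are the standard UIKit hooks:

    import UIKit

    // Hypothetical cell in a mail-style list.
    final class MessageCell: UITableViewCell {

        // Called by the app when the cell is configured. Exposing custom
        // actions makes them show up under the VoiceOver rotor's actions
        // option whenever the VO cursor is on this cell.
        func configureRotorActions() {
            let reply = UIAccessibilityCustomAction(name: "Reply") { _ in
                // A real app would open a compose view here.
                return true // true tells VoiceOver the action succeeded
            }
            let delete = UIAccessibilityCustomAction(name: "Delete") { _ in
                return true
            }
            accessibilityCustomActions = [reply, delete]
        }
    }

With something like that in place, a VoiceOver user can land on a message, turn the rotor to actions, and swipe up or down to choose Reply or Delete without hunting for buttons.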

Once, when I spoke at a CocoaConf conference, my friend Eric Knapp suggested that I always mention the rotor when demonstrating VoiceOver during my talks. I guess even when sighted people know about VoiceOver, and might even know how to use it at some level, some of them haven’t yet learned about the rotor. Taking the time to understand the rotor will make your day much easier.

Typing, opening apps, changing something: these I call active interactions. This is when a sighted person actually touches the screen in some way.
I have been asked, “So if you can touch the screen to find out where things are, how do you open an app?” There are two ways to activate an icon using VoiceOver. Well, actually more if using a physical keyboard, but that’s another article for another day.

When you swipe or explore to an interface element and then lift your finger from the screen, the entire screen becomes the button for the currently selected element; unlike for a sighted person, the touch point is not a tiny icon. Double-tapping anywhere will activate that element. Alternatively, when exploring the screen, if you keep your finger on the element you want and tap anywhere with a second finger, that element will be activated.

Eric said this was another thing that forced sighted audience members to think in a different way: before activating an icon with VoiceOver, the whole screen is the button.
Other active gestures, like scrolling or reading the entire screen (which will scroll through long documents), use multiple fingers. Typing on iOS touchscreens is also doable, with VoiceOver announcing which letter is being touched, but braille screen input or a Bluetooth keyboard is much faster.

So a sighted person has active gestures only, while someone using VoiceOver, or TalkBack on Android, has both passive and active gestures. For a sighted person, passive interactions are all accomplished by looking at the screen without touching anything. Some might argue that scrolling is a passive gesture, but I would say it changes the screen layout, even if it is just moving down a page of text.

VoiceOver first appeared in Mac OS X 10.4, Tiger, and when Mac OS X 10.6, Snow Leopard, came out, VoiceOver gained a new feature: Trackpad Commander.

When Trackpad Commander is turned off, the trackpad behaves as it does for sighted users; when it is on, the trackpad with VoiceOver behaves much like an iOS screen. All of the same VO gestures found in iOS also work on macOS. This, in some ways, makes macOS and iOS very similar for beginning VoiceOver users and easier to learn. So much so that Tim Sniffen wrote a book called “Mastering the Mac with VoiceOver,” in which he primarily taught his clients to use VoiceOver on the Mac in this way.

I have even argued that VoiceOver users have had touch-screen Macs since Trackpad Commander came out in the summer of 2009, something I still hear many sighted Mac users lament not having. Maybe they should just use VoiceOver.
