My realization that blind users of VoiceOver have had touch screen Macs since 2009

In the early 1990s, Neal Stephenson released his now well-known book “Snow Crash”. Then in 1999 he wrote the even more famous Cryptonomicon. He also wrote a lesser-known and much shorter essay entitled “In the Beginning was the Command Line”. In this essay Stephenson talks about interfaces: not just computer interfaces, but how every object we use has one, beginning with his best friend’s dad’s old car. He traces how, from the first mainframe terminals up to Microsoft Windows and Apple’s Macintosh, the way humans first interacted with the computer was through the command line. The command line is still great: it takes few resources, and even today it can offer many more options at once than any graphical interface, often called a GUI.

The GUI was invented, though, for the same reason the command line replaced punch cards: the command line was far more efficient than punch cards for everyone, and later the GUI was more convenient and easier to use than the command line, at least for sighted people. Graphical interfaces meant people didn’t have to remember tons of commands and could become familiar with a system faster. A mind with sight available to it is great at connecting spatially presented, intersecting pieces of information, and the GUI is great at displaying information in two or three dimensions to that visually enabled mind, instead of the single dimension the command line presents. It was a great match, except for the abstractions we still have today. The arrival of Apple’s first Macintosh in 1984 blew the world away with its amazing graphics for the time, and the mouse? I’m sure many wondered why they would ever want a small furry rodent on their desk.

Along with computer mice, we also saw trackballs and trackpads, but they all still share the same problem: the pointer follows a dynamic reference (how far your hand moved) rather than a static one (where your hand actually is).
When using a trackpad, if the mouse pointer is in the center of the screen but the user places a finger on the lower left corner of the trackpad and slides it to the right, the pointer will move from the center of the screen to the center of the right edge; and depending on the settings, the finger may have moved only half an inch, or six inches, still along the bottom of the trackpad. The mouse is even more removed by abstraction. I played with all three of these input devices during my years on Microsoft Windows, but was never productive with any of them.
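To make that abstraction concrete, here is a minimal sketch in Swift, purely illustrative, of what relative, delta-based pointing amounts to: the pointer’s new position depends only on how far the finger moved and on an acceleration setting, never on where the finger physically sits on the pad. The function name and numbers are mine for illustration, not Apple’s code.

```swift
import CoreGraphics

// Illustrative sketch only (not Apple's pointer implementation):
// with a mouse or an ordinary trackpad, the pointer is moved by the
// finger's *delta*, scaled by an acceleration setting, so the same
// spot on the pad can correspond to any point on the screen.
func relativeMove(pointer: CGPoint, fingerDelta: CGVector,
                  acceleration: CGFloat, screen: CGSize) -> CGPoint {
    let x = min(max(pointer.x + fingerDelta.dx * acceleration, 0), screen.width)
    let y = min(max(pointer.y + fingerDelta.dy * acceleration, 0), screen.height)
    return CGPoint(x: x, y: y)
}
```

A half-inch swipe and a six-inch swipe can land the pointer in exactly the same place, depending entirely on the acceleration value; where the finger sits on the pad never enters into it.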


In early January 2007 I had dinner with my friend Nathan Klapoetke, who was ecstatic about the new iPhone that had just been announced. At the time I cringed in fear, knowing that soon no cell phone would have buttons, and I had no idea how a blind person would use one.

Two years later, at WWDC 2009, Apple announced that VoiceOver was coming to the iPhone 3GS, and the blind community was completely blown away; no one saw that coming. Late in June 2009 I went to the Apple Store and played with VoiceOver for the first time. I’d already read the iPhone manual’s chapter on VoiceOver, so I had a bit of an idea what to do, or at least how to turn VO on. I only had an hour to play, but except for typing, reading text and getting around basic apps didn’t seem too bad; nine days later I bought one. The first text message I tried to send was a complete disaster, but I still knew my world had changed for the better.

The idea that when you touched some part of the screen you were directly interacting with that icon or word made a lot of sense to people, blind and sighted alike. Even young children, before they can read, understand tapping on icons to start games they already know how to play. In some ways, the touch screen is the command line equivalent of visual interfaces. Being able to directly touch and manipulate screen elements is efficient on such a basic level that I wouldn’t be surprised at all if using touch screen interfaces activated the same parts of the brain as making something out of play dough or clay. There’s an interesting discussion going on right now about how Microsoft tried to make Windows 8 a touch-first interface and failed, and how Windows 10 now offers touch-based interfaces for those who want them while still behaving like a traditional desktop. Apple, on the other hand, never tried to bring touch screens to macOS at all until the 2016 MacBook Pro line with the new Touch Bar, which really isn’t a screen at all and remains an optional extra, since many Macs still don’t have it.

And now, as Paul Harvey used to say, “the rest of the story.” As most people would tell you, and as Google searches would reply, there are no Apple computers with a touch screen. Except, that is, if you’re a totally blind person using VoiceOver. The gestures VoiceOver users learn on their iPhones have been available on their Macs as well, starting with Snow Leopard; with Trackpad Commander on, VoiceOver behaves very much like it does on iOS. With Trackpad Commander on, if I touch the exact center of the trackpad, the mouse pointer is also at the exact center of the screen, and if VoiceOver announces an icon I want, I just double tap to activate it. All of the abstraction I struggled with when trying to use a mouse or trackpad without the commander mode is gone; but here’s a rare moment where sight actually gets in the way. It is so instinctive for someone who can see to visually follow where their hand is going that even if most of them turned VoiceOver and Trackpad Commander on, turned speech off, and kept looking at the screen, they would still find it quite difficult to use; the screen being visually separate from the trackpad is too abstract for many of them. The trackpad is obviously much smaller than the actual screen, though since I can’t see it that doesn’t really matter anyway; but beyond that, as a VoiceOver user I’ve had a touch screen on my Mac for seven years. I, and probably most other blind users, still don’t use it as much as we probably should, or for many of us hardly at all, though I have found some ways in which it is far more efficient than the more traditional VoiceOver keyboard commands.
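For contrast with the relative mapping sketched earlier, here is the same kind of purely illustrative Swift sketch of the absolute mapping Trackpad Commander provides (again, not VoiceOver’s real code): the touch position itself, scaled from the pad’s size to the screen’s, is the answer.

```swift
import CoreGraphics

// Illustrative sketch only (not VoiceOver's implementation): with
// Trackpad Commander the pad is treated like a miniature of the whole
// screen, so a touch position maps directly to a screen position, much
// as a finger on an iPhone touches the element directly underneath it.
func absoluteMap(touch: CGPoint, trackpad: CGSize, screen: CGSize) -> CGPoint {
    CGPoint(x: touch.x / trackpad.width  * screen.width,
            y: touch.y / trackpad.height * screen.height)
}

// Touching the exact center of a hypothetical 5 x 3 inch pad always lands
// on the exact center of the screen, wherever the pointer was before:
// absoluteMap(touch: CGPoint(x: 2.5, y: 1.5),
//             trackpad: CGSize(width: 5, height: 3),
//             screen: CGSize(width: 1440, height: 900))   // -> (720, 450)
```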


If I’m told that an interface control I want is in the lower left corner of the screen, with Trackpad Commander I can go there directly. If I’m using an interface with a list of items in the center and buttons around the edges, I can get to the list far faster than by navigating there with the keyboard.

Tim Sniffen has published a book entitled “Mastering the Mac with VoiceOver,” in which he for the most part ignores keyboard commands altogether and teaches with Trackpad Commander mode instead. He trains many veterans who lost their sight while deployed, and says that after they become comfortable with VoiceOver on iOS it’s an easy transition for them to their Macs. We VoiceOver users should probably listen more to Tim and learn from his experiential wisdom. And for the sighted and proud: if your vision ever degrades so far that in the end you have to use VoiceOver, at least you’ll have a touch screen on your Mac.
