My Address to the National Council of State Agencies for the Blind
Using Apple hardware and apps as an effective tool in the toolbox
Address delivered to the National Council of State Agencies for the Blind at its conference in Greenville, South Carolina, 17 November 2017
It’s an honour to have been asked to speak with you today. By way of background, I have a foot in two distinct camps that handily co-exist on this occasion. I’ve been a senior manager in, and ultimately Chairman of, New Zealand’s blindness agency. Our situation is a little different, in that it’s a charitable organisation receiving some Government funding, but also with many services dependent on public giving. So I know what it’s like to identify enormous need, while having to live within budgetary constraints. I know what it’s like to have to make those difficult calls about programmes, resources and staffing.
In the other camp, I’m also a shameless geek. I share the enthusiasm your tech people have for the power this technology has to change the lives of blind people for the better. I live it, and through Mosen Consulting, I write about it, train in it, talk to mainstream developers about it, and advocate for it. I suspect I’m also here because, while I own a lot of technology and enjoy geeking out, I have a reputation for not mincing my words, and am no one’s fanboy. I’ve been a consumer leader and advocate, and I know that the benefits we now enjoy didn’t fall out of the sky by magic. They’ve been hard-won, and they must be safeguarded with vigilance. So, here I am, a geek who can do a bit of public speaking and identify with some of your challenges. Today, I want to give you some straight talk about the many exciting, game-changing aspects of Apple technology, while making it clear that there are challenges. I may send you away with some homework, because I believe that those of you here today may be uniquely placed to assist the community to overcome some of those challenges.
As the old cliché goes, the only constant in the world is change. And there’s no better, more dramatic example of this than technology. For those who make decisions about resource allocation, it’s not just that the technology itself is being updated at a frenetic pace, it’s also that the rules of the game have changed.
I’ve been asked to address Apple hardware and apps as an effective tool in the toolbox. It’s an important topic, because a smartphone, based on the current state of accessibility, isn’t always the right tool. Just as you wouldn’t use a chainsaw to hammer in a nail, it’s important that we set realistic expectations about when a tool like an iPhone is the right one, and what risks may exist when using one.
I’m glad to be of an age where I’m old enough to remember when my needs were given scant consideration by mainstream technology companies, because I’ll never take for granted the thrill of being able to pick up a brand-new product on release day, and use it fully. New Zealand was the first country to get iPhone X due to time zones, and I was just as excited as anyone to put Apple’s new pride and joy through its paces. Apple clearly spent considerable time developing options to ensure the new Face ID technology took our needs into account. And they did that work in time for the initial release. Apple deserves enormous praise for doing the right thing.
With its extensive battery life and small size, iPhone is a phenomenal productivity tool. Apple has taken great care to ensure that all the apps built into iOS, the operating system that powers Apple mobile devices, are accessible. It means that a blind person has access to email, the web including any private intranet pages secured behind a virtual private network, tasks and reminders including having them trigger when in a specific location, multiple calendars, and turn-by-turn directions. Oh, and here’s the amazing thing, it even lets you make phone calls, with an accessible phone app and address book. That’s all without having to install anything in addition to what’s on the device, all in a device you can hold in your hand and carry in your pocket.
Apple came up with an ingenious user interface that made touch screens accessible to blind people, cleverly separating the process of exploration from that of confirmation. But a touch screen is just one of several interface choices available within VoiceOver, the screen reader built into Apple mobile products. When VoiceOver is running, an iPhone can be controlled by a Bluetooth qwerty keyboard, which offers a set of powerful screen-reading commands for navigation and text manipulation. VoiceOver offers Braille Screen Input, turning the touch screen into a virtual Braille keyboard for rapid input of text by Braille users. That’s a tool I use for writing on those occasions when I don’t have a physical Braille display connected. The phone can be controlled, and material written, via voice, thanks to Siri and dictation. That makes composing short texts and controlling various phone functions, such as checking your calendar, extremely efficient. The shapes of print letters can even be written on the touch screen, using handwriting mode.
Accessible book stores such as iBooks and Kindle are a boon for professional development. Often, a blind professional or student can purchase the same book from the same source as everyone else, and it’s readable on their iPhone, allowing people to carry around literally thousands of books in their pocket. Let’s not overlook the significance of that from a social integration perspective. It’s now possible, on a regular basis, for a blind person to join in on a water cooler discussion about a best seller, because we’re able to read it at the same time as our sighted work colleagues.
When paired with a Bluetooth Braille display, yet another way of controlling the device, you have instant transcription from print to Braille. Apps are available on this one device connecting the user with services such as Bard, Bookshare and Learning Ally. So in my view, for many capable users, iPhone can replace several proprietary blindness devices formerly used for content consumption.
Apple has done an excellent job of documenting the process for designing apps accessibly. This means that even where proprietary systems are being developed, such as internal sales-tracking and client-management tools, any developer can quickly learn how to make an accessible app, creating opportunities for a blind person to enter and examine data using an iPhone or iPad.
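To give you a flavour of what that documentation covers, here is a minimal sketch in Swift of the kind of change a developer makes so VoiceOver can describe a control. The “submit sales figures” button and its wording are invented for illustration; the accessibility properties themselves are the standard ones Apple documents.

```swift
import UIKit

// A control in a hypothetical internal sales-tracking app.
// Its visual title is just a glyph, so without further work
// VoiceOver would have nothing meaningful to announce.
let submitButton = UIButton(type: .system)
submitButton.setTitle("✓", for: .normal)  // visual-only glyph

// A few lines of configuration make the control meaningful to VoiceOver.
submitButton.isAccessibilityElement = true
submitButton.accessibilityLabel = "Submit sales figures"
submitButton.accessibilityHint = "Uploads this week's figures to the server"
submitButton.accessibilityTraits = .button

print(submitButton.accessibilityLabel ?? "")
```

When VoiceOver encounters this button, it speaks the label and trait, and after a pause, the hint. That is the whole point: a small, well-documented effort by the developer, and the proprietary tool becomes usable by a blind employee.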
One of the most remarkable things about the smartphone revolution from a blindness perspective is that it has turned many blind people into photographers. Many of us who’ve never seen at all have become familiar with the concept of distance as it pertains to the camera, and the things we need to do to increase our chances of getting an object fully in the photo. Despite the increasingly electronic world in which we live, there are many vocational situations where a blind person may encounter print they need to deal with on the spot. It’s liberating to be handed some print at a meeting, and know it doesn’t pose the difficulty it once did, unless it’s handwritten, of course. Apps like the extraordinarily effective and free Seeing AI app from Microsoft, or KNFB Reader, can snap the picture and let us read the material in speech or Braille. Currency identifier apps and object recognition technology help at home and in the office.
Apps such as Nearby Explorer and the Seeing Eye GPS app can give us blindness-specific turn-by-turn directions. Apps can give us a detailed summary of the public transport options for getting from A to B.
Back home in Wellington, New Zealand, our central business district is peppered with BlindSquare beacons, giving me information about businesses I am passing, and remarkably, when I enter a business, information about what’s inside, including how to find the counter or cafe.
Sometimes, it’s useful to be able to summon up some working eyeballs, whether it be to find out if your tie matches your shirt, to quickly skim a range of print more quickly than a machine can, or to find out what’s around. Apps like Be My Eyes, which is staffed by volunteers, and the subscription-based AIRA service staffed by trained professionals, are exciting developments.
I’m also mindful that most blind people are seniors, and that age-related blindness can often be a final straw that causes someone to question whether they can continue to function safely in their home. One-off extensive training in iPhone can be an investment that in some cases may make expensive care unnecessary. This will be increasingly viable as today’s baby boomers keep joining the seniors category. They are used to adopting technology, and will be more willing to adopt technology that mitigates their disability. Reminder functions can prompt people to take medication. Accessible barcode scanners using the camera on the iPhone can help differentiate that medication. Sighted assistance can look around the home at the touch of a button. An accessible thermometer, wirelessly connected to the iPhone, can give temperature information to assist a blind person to know when food is cooked correctly. HomeKit technology can control room temperature, lighting, and other appliances.
So in many cases, I believe iPhone should be considered an investment in greater productivity, independence, and inclusion. This one device is a veritable Swiss army knife of information and independence.
Gradually, after its release in 2007, iPhone disrupted everything. So many aspects of life have changed because of it, and the blindness system isn’t immune. There was once a time when it was easy to say that a blindness agency just doesn’t fund cell phones. It was simple then. If a blind person needed a cell phone for their job, their employer should pay for that, just as the employer would fund it for any other employee.
Then, cell phones started to become smarter. Today’s cell phones offer orders of magnitude more power in every respect than the huge, bulky desktop computers in our offices even 20 years ago. In 1997, I was using a 32-bit computer with just 15GB of storage. Now, I have a 64-bit iPhone with 256 GB of storage in my pocket. Making and receiving phone calls on my iPhone is now a feature I use infrequently. So it’s not as easy anymore to make a blanket statement like, “we don’t fund phones”. Because that statement really equates to, “we don’t fund powerful, portable computers in a phone-like form factor”.
In parallel with this change, it also used to be easier to find a demarcation point between assistive technology, and mainstream technology. As cell phones started to become smartphones, some agencies may have concluded that in certain specific cases, they would fund the cost of a third-party screen reader, and maybe a text-to-speech engine, to run on the mainstream smartphone that the client or employer would fund.
Things are very different now, because there is no separate assistive technology component to fund on these devices. The United States has been a leader in advocacy initiatives, both directly to operating system developers and to legislators, making the case that technology should be accessible out of the box. Does that mean that the days of specialised AT products are over? Not at all, at least in the case of Windows, which is still the operating system used in most workplaces. Let me draw a comparison. When you buy a new computer, or install a new copy of Windows, Microsoft includes a basic virus checker and a basic firewall. They do so knowing that many people with a security mindset will install a more comprehensive virus checker and firewall, and some computer manufacturers even do this for their customers. But Microsoft covers the basics. When sighted family members ask me to fix a computer problem for them, I’m glad Narrator exists. When I need to put food on the table, I couldn’t do that without JAWS. In Windows, it offers the efficiency and configurability necessary to make the difference between being able to do a job, and not being able to do a job.
As a New Zealander who has worked closely with US organisations for almost two decades, I admire the outcome-driven nature of the US rehabilitation process. Since success is largely defined by how many people you can get to a successful closure, your voc rehab system has some built-in protections against bean counters somewhere deciding that free options are adequate. Many free options simply don’t have the features necessary to allow a blind person to perform on the job. So, when it is proven that a commercial screen reader is necessary to help someone obtain or retain employment, a blind person can usually get it.
Things are different with iOS. It’s unique in one key respect, and it’s vital to understand the implications of this uniqueness. iOS is fundamentally a closed platform, using an approach called sandboxing. If you’ve ever uninstalled a Windows app, only to find it has had negative consequences for some other program you use that seems totally unrelated, you’ll have sympathy for Apple’s sandboxing approach. An iOS app plays in its own sandbox. It has no power to influence or cross-pollinate with any other app on your system. For stability and security, that’s a very good thing, and it’s one of the reasons why iOS is being widely adopted by federal Government agencies and corporations with highly sensitive commercial data.
For blind people, there are downsides. For example, a blind person might own two reading apps that wish to use voices not built into iOS. One app can’t see that the other app already has the voice installed, and that means that the second app must install a second copy of the voice. And here’s the big one. Sandboxing means that iOS is the only operating system in widespread use by consumers where it isn’t possible for a third-party screen reader to be developed. Apple develops VoiceOver, and that is the only screen reader available. That will remain the case unless Apple opens a tunnel into the sandbox, an application programming interface, or API for short, that would allow third-party screen readers. Since a screen reader needs to see many low-level things going on right across the system, that would be a big tunnel, potentially fraught with security and abuse risks, so I consider it unlikely that Apple will ever do this. Just in the last few days, for example, the technology press has reported that Google is cracking down on mainstream applications misusing its accessibility API which, among other things, makes third-party screen readers possible.
In principle, if the climate of engagement is healthy, it’s not necessarily a bad thing that Apple exercises such control over the screen reading experience in iOS. Unlike Android, where there’s a wide variety of largely third-party hardware to be considered, Apple controls the complete user experience. They manufacture the hardware, they develop the operating system, with the screen reader included. What’s not to like? I’m about to offer some views on that question. As an agency leader, you need to be aware of the risks, because the consequences of those risks have been evident of late.
First, market dynamics. I’ve held positions in the past at a senior level in assistive technology companies, where I’ve had meetings with agency directors. In those meetings, I knew how important it was to take your concerns seriously, to ensure you felt that you or members of your team had influence over product direction. It’s critical that dedicated assistive technology companies nurture those relationships. In different capacities, everyone is working towards the same goal, blind people participating fully in society, what NFB very accurately describes as living the life we want.
The dynamics are different with mainstream technology companies who are displaying a commendable commitment to accessibility. First, the market segment blind people represent is a subset of a tiny segment. Mainstream companies are thinking about the accessibility needs of customers with a wide range of disabilities. It can be hard to establish an ongoing, quality dialogue that makes you feel like you are being heard and taken seriously. Some mainstream companies are doing a better job of this than others, which brings me to my second point.
Apple is historically a secretive company. It seldom discusses future plans. It will talk to people of its choosing, under very strict nondisclosure agreements. So it can be difficult to influence its development processes, and even to have a conversation that you can take anything tangible away from.
Finally, if there are quality control issues pertaining to the screen reader, you can’t choose to use another screen reader on the platform, because no other screen reader for iOS exists. And there have been serious quality control issues for years. In 2016, the National Federation of the Blind passed what in my view was an appropriate, necessary and accurate resolution about problems with Apple’s quality control with respect to its VoiceOver screen reader on iOS. My view is that when a mainstream company chooses, or is required by legislation, to produce a screen reader, they then become an assistive technology company. It is vital that we hold these companies, with considerable resources at their disposal, to the same standards we rightly expect assistive technology companies to meet. Let me give you just a couple of examples of serious bugs in VoiceOver in recent years. In both cases, they had a significant impact on blind people’s ability to do their job, which is why I am raising them here. Also in both cases, Apple was advised very early in its testing processes of the bug, but saw fit to wait until after the first official release of the operating system concerned before it acted.
One of the catalysts for the 2016 NFB resolution was the case of iOS 9, where some blind people found themselves unable to answer phone calls when VoiceOver was running. You can immediately see the vocational consequences of this, I’m sure. If your employer, your customers, your clients, can’t reach you consistently because you can’t answer the phone, your job is seriously impacted. What’s frustrating about this is that if people in general, not just blind people, were having trouble answering their iPhones, you can be sure a fix would have been rushed out within 24 hours and it would have been headline news. Blind people had to wait for weeks to get that essential functionality restored.
Only this year, some positive changes have been made to Braille, after years of advocacy by me and many others. However, as is often the case with software, when you change things, you create unexpected bugs. Despite numerous reports early on in development, those bugs weren’t fixed before the official public release of iOS 11, meaning that people were unable to input text using a Braille display with any speed, without the system becoming unreliable. We know that the employment rate of Braille users is far higher than that of blind people who don’t use Braille. We know that Braille is a critical tool for many on the job. We also know that if Braille isn’t working reliably, it could literally put a deafblind person in danger if they’re unable to communicate via text, email and relay services. There have also been occasions when even minor updates break connectivity between iOS and certain types of Braille displays. All software has bugs, of course. But a company makes a judgment about which bugs are show stoppers, and which bugs they think can be lived with in a release. I submit that too often, Apple is making calls that don’t show adequate regard for blind people’s productivity and independence. When this happens, it’s important to understand that fixes cannot be quick, because of the way VoiceOver is baked so deeply into the operating system. It isn’t a separate app that can be updated overnight in the App Store. A new version of VoiceOver requires a new release of the entire operating system. Unless Apple changes this approach, and such a change would be architecturally difficult, it’s incumbent upon Apple to take much greater care when it comes to VoiceOver quality.
Again, let me draw a parallel. If Apple released a software update that rendered the screen useless or significantly less responsive, there would be a very quick fix. I have been saying this for years, and now I have proof. A week ago, I began reading reports of some iPhone X users experiencing performance issues with the touch screen of iPhone X in cold weather. Some also reported unpleasant lines randomly appearing on their displays. Those issues were addressed in a software update within six days of the first report hitting the technology press. Blind people, however, are such a tiny fraction of Apple’s user base that it’s hard to get on the radar for urgent matters. And perhaps, while we can hope for much improved quality control, we just have to accept that that’s just the way it is. The tiny size of this market segment will mean that even serious bugs affecting just us are going to take longer to fix. But if that’s the case, it’s something purchasers need to factor in when it comes to depending on these tools in a vocational context.
Some sighted people have abandoned laptops entirely for their phone or tablet. But if Apple leaves us high and dry after an iOS update, how do we explain to an employer that it may be weeks before their blind employee can be productive again? iOS 11 came out in September. The fix for Braille input is not yet publicly available, and may not be until December. That will be nearly three months of effective Braille downtime, and we can’t expect employers to put up with that.
State agencies have a lot of purchasing power. They may be able to bring some pressure to bear on Apple that individuals and consumer organisations cannot. If that’s the case, you would be doing blind people a tremendous service by taking up the cause.
Quality issues aside, VoiceOver on iOS continues to evolve. That evolution has been consistent and impressive. I don’t mind admitting, I was concerned that Apple may do just enough to satisfy people that they had a screen reader, then leave the product largely unchanged. VoiceOver has been a part of iOS since 2009, and its capabilities have expanded significantly every year.
Phones have been transitioning from devices primarily used for content consumption, to also being used for content creation. In this regard, I think VoiceOver has a lot of progress still to make when word processing large documents where formatting is important. Depending on the app in use, it’s possible to get some indication about the way a document is formatted, but it’s not easy in general to navigate quickly to common elements like headings, or get to a specific page in a large document. Spell checking is doable, but it might be argued that the user interface is somewhat convoluted. In a visual world, it’s critical that blind people can produce visually attractive, well-formatted documents, and that their screen reading technology allows them to confirm precisely how those documents look. For that kind of task, Windows is still considerably ahead.
Let me conclude by emphasising that I remain grateful and impressed every day by the extraordinary transformation in my life brought about by Apple and iPhone. But we can be grateful and objective at the same time, and indeed I consider that our duty. On the job, at school, at play and for safety, iPhone is a remarkable device. Our challenge is to encourage Apple to lift its game when it comes to quality control, and to continue with the work it is doing to develop iOS for content creation. Let’s also encourage Apple to reach out to us in good faith dialogue to make a remarkable product even better, because the significant progress we’ve made in all endeavours to date has occurred in the main due to partnerships where blind people have controlled the direction of travel. The phrase “nothing about us without us” is just as true today as ever.
One thing’s for sure though, it’s an incredibly exciting time to be in the technology field.