Apple’s Worldwide Developers Conference, WWDC, kicks off with the keynote at 10 AM US Pacific time on Monday 5 June. That’s 1 PM Eastern, 6 PM in the UK, and bright and early in New Zealand at 5 AM on 6 June. Apple will stream the event live via the Apple TV Events app and on its website.
Right after the WWDC keynote, I’ll be recording a special edition of Mosen Consulting’s podcast, The Blind Side, in which I’ll be joined by a group of people to help make sense of what promises to be some major announcements by Apple. Participants in the podcast include:
- David Woodbridge, Apple ambassador, podcaster, and a man who’s helped many of us get more from our Apple devices.
- Jeff Bishop, tech expert and Apple enthusiast.
- Heidi Mosen, Apple enthusiast, student electrical engineer, and someone who’ll be watching the screen closely during the WWDC keynote. She’ll be taking screenshots of any slides the on-stage team doesn’t have time to describe fully, and she’ll be ready to give thorough descriptions of any new devices, such as the rumoured Amazon Echo competitor.
The bottom line is that we’ll give you the most detailed coverage and analysis from a blindness perspective that you’ll find anywhere.
I can also confirm that Mosen Consulting will be releasing “iOS 11 Without the Eye”, our comprehensive guide to iOS 11 from a blindness perspective, on the same day that iOS 11 is released officially. This means that when you get your copy of iOS 11, you’ll have access to a thorough guide to help you upgrade and become familiar with all that’s new.
If Apple follows its usual tradition, the first beta of iOS 11 will be in the hands of developers immediately after the WWDC keynote next Monday. Be warned, usually the first developer beta is, perfectly understandably, rough around the edges, and probably not for use on a device that you depend on. It is likely to be available to those in the wider Apple beta program sometime later, when some of those rough edges are a little smoother.
For now though, I’m looking forward to our usual pre-WWDC discussion here on the blog. This post continues a tradition I started when I founded Mosen Consulting in 2013: just ahead of the big iOS reveal at Apple’s Worldwide Developers Conference, I write down my wish list for the next iOS release and open the discussion to hear what you’re hoping for. No doubt there’ll be some items on my list that aren’t important to you, maybe there’ll be some you agree with, and there’ll be other items that are must-haves for you that I haven’t included. So please feel free to share your opinion in the comments.
This post is written from my perspective as a blind person who uses VoiceOver, the screen reader that has made the iPhone a powerful productivity tool even when you can’t see the screen. Some of the features are blindness-specific, while some are not.
As I look back on previous years’ versions of this post, it’s gratifying to see how much progress Apple continues to make with VoiceOver. Since Apple first introduced VoiceOver to the iPhone in 2009, not a year has gone by where there haven’t been substantial changes to VoiceOver that have added real value, and boosted productivity. It’s exciting, and it explains why so many VoiceOver users really look forward to a new major iOS release.
Here are 10 changes I’d like to see in iOS 11 that, for me, would make a great user experience even better.
1. Fix the hardware keyboard bug
It might be argued that bug fix requests shouldn’t be part of a wish list, but some bugs are significant enough to spoil what is otherwise a great experience. Top of the list for me is a hardware keyboard issue that has been around for some time and which only occurs when VoiceOver is running.
Frequently, typing on either a Bluetooth keyboard paired with an iDevice, or a Smart Keyboard on an iPad Pro, results in duplicate letters. The work-around is to triple-click Home to toggle VoiceOver off, then triple-click Home again to toggle it back on. If, like me, you’re a fast typist and have keyboard echo completely disabled, you may have typed a lot before you realise that what you’ve typed is unusable.
2. Sort out notifications
Notifications in iOS 10 have changed markedly for the worse compared with iOS 9 in several respects.
First and most important, the old ability to scroll infinitely through an unlimited number of notifications when VoiceOver is running no longer exists. I often wake up in the morning to many notifications. After scrolling through around 20 or so, I’m unable to scroll anymore without deleting a notification, or opening an app to clear its notification, which I don’t always want to do. It’s frustrating and can cause me to start my day feeling grumpy, because it shouldn’t be this hard for me to interact with what has come in overnight.
I would also like to see available actions for notifications return to the Actions rotor, as was the behaviour in iOS 9. You can 3D Touch a notification to display its available options on the Lock Screen or in Notification Centre, or expose them by choosing the “More” item from the Actions rotor, but neither is as convenient.
Sadly, a classic bug from way back has returned in iOS 10: if a notification interrupts continuous reading, reading sometimes resumes with spurious characters spoken by text-to-speech interspersed with the text.
Finally on the subject of notifications, I continue to wish for the ability to toggle whether they interrupt continuous reading. I am still genuinely grateful on a regular basis for the fact that we now have access to so many books at the same time they’re released to sighted people. Reading via the Kindle and iBooks app is fantastic. There’s a “but” coming though. It makes for a distracting, disjointed experience to be reading a book, only to be interrupted on a regular basis with notifications. When a notification comes in, VoiceOver stops what it’s reading, speaks at least part of the notification, and then if you’re lucky, resumes reading.
I’d like a setting, available on the rotor and in VoiceOver settings, to be able to specify whether notifications interrupt what’s being read. This would be particularly useful to those of us who make extensive use of custom tones. All the important people in my life have unique ring tones and text alerts. I have a special sound assigned to VIP email. So all this means that when a notification comes in that I know I may want to look at right away, the tone will tell me I should stop reading my book and check my notification. When I hear other alert sounds, I know that they can keep until I want to take a break from my book. It sure would beat being interrupted all the time, and would lessen the need for some people to have a blindness-specific reading device.
3. Supercharge Siri
Talk to anyone who has either a Google Home or Amazon Echo product in addition to their iPhone, and you’ll find a consensus that Apple is far behind in the personal assistant space and just isn’t innovating. Apple made some excellent moves last year by opening Siri to some third-party developers, but they took a fatally conservative approach. Only certain classes of apps can use Siri. I still can’t open a book of my choice in iBooks, Kindle or Voice Dream Reader. I’m not allowed to ask TuneIn Radio or Ootunes to tune to Mushroom FM, and I can’t ask Spotify or Deezer, my favourite lossless music service, to play a song.
But further opening of the API is not enough. Siri just doesn’t know basic stuff. I still use Siri for control functions on my phone, but if I want to know a fact, locate a business, or get a phone number, it’s quicker for me to run the Google Search app, perform the magic tap, and ask the question. The quality and detail of the responses Google gives me is astonishing.
Not only that, I find Google’s speech recognition significantly more accurate. With the processing power and storage now available on these devices, offering voice training for those who want to improve recognition accuracy should now be viable.
Apple really must pull something significant out of the hat this year to even catch up with its competitors, let alone surpass them.
4. Fix contracted Braille input
I’m not going to stop going on about this one until it’s fixed, because Braille equals literacy, and literacy equals jobs. Apple’s Braille support is streets ahead of Android’s, no question, but I maintain that it’s not fit for purpose, particularly in the education market.
As any contracted Braille user knows, despite Apple’s efforts to improve Braille input, there are still situations where correcting a word results in a single letter being expanded mid-word. For example, inserting an “l” into a word can cause it to be expanded into the word “like”.
5. Braille Keyboard Manager
While we’re talking Braille, I continue to hope for this feature. I know Apple doesn’t like to have too many options in iOS, but Braille reading styles and preferences can vary. When using my braille display with my iDevices, I know I’d be a lot more efficient if I could reverse what the thumb keys do. My braille reading style suits having the left panning button advance the display, and the right one reverse. This is the opposite of Apple’s implementation, and there’s no way of changing it unless you jailbreak. A reverse panning function is standard in most screen readers and note taker products.
If it’s deemed appropriate to offer brightness and wallpaper settings, and a night shift mode, then giving us more flexibility over the functions each control performs on a braille display is a reasonable request, and it’s something already available in Mac OS. These issues become increasingly important as we start to think about using our devices for more serious content creation, and not just content consumption or a little bit of note taking.
6. VoiceOver creators update
Earlier this year, Microsoft released Windows 10 Creators Update. I’d welcome a similar emphasis on content creation when using VoiceOver, particularly given the work Apple, Microsoft and Google have done in making their document composition applications more powerful and accessible.
We need to be able to reliably query information about font and formatting in any application. Being able to navigate by elements such as headings and paragraphs is important not just on web pages, but within documents too.
Apple has tried to market the iPad as a laptop replacement. It won’t be, at least for a blind person, until we can make beautiful documents and verify what they look like.
While we’re focussing on content creation, I know I’m going to upset my Mac-using friends when I suggest that VoiceOver needs an option to make speaking what’s under the cursor work the way it does on Windows. Such a feature exists in VoiceOver’s preferences on Mac OS, and when I was a Mac user, it was one of the first things I changed. I’m not saying that the way Apple represents the position of the cursor in VoiceOver is wrong; it’s just different. And the reality is that most of us who own iPhones, by a sizeable margin, use Windows. So let’s at least have the choice, to help make composing documents feel more intuitive to many of us.
7. Text-to-speech API
Thankfully, Apple has added a lot of APIs (application programming interfaces) in recent years. It’s helped to make iOS more flexible and vibrant. It was a needed and pragmatic response that sought to address the criticism that Apple is too restrictive, while not opening the platform to malware.
But we don’t yet have an API which allows a third-party text-to-speech engine to be available to any application. If you install a third-party voice on Android, any application using text-to-speech can make use of that voice. That’s what I want for iOS. I have multiple copies of several voices on my phone because each app must use its own copy of the voice. That wastes precious storage, and it means VoiceOver can’t use any of the third-party voices I really like.
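To illustrate the gap: apps can already speak through Apple’s system voices via the AVSpeechSynthesis classes in AVFoundation, but the pool of voices they can draw from is limited to what Apple ships. A minimal sketch of today’s model (the utterance text is just an example):

```swift
import AVFoundation

// List the system voices currently available to any app.
for voice in AVSpeechSynthesisVoice.speechVoices() {
    print(voice.identifier, voice.language, voice.name)
}

// Speak an utterance using a built-in system voice.
let utterance = AVSpeechUtterance(string: "Hello from my wish list.")
utterance.voice = AVSpeechSynthesisVoice(language: "en-GB")
AVSpeechSynthesizer().speak(utterance)
```

The wish is for third-party engines to be able to register voices into that same pool, so `speechVoices()` would return them to every app, VoiceOver included, rather than each app bundling its own copy.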
Yes, haters are gonna hate and I know one’s text-to-speech engine choice is highly subjective, but if we could get to a point where a company such as Code Factory could release a version of Eloquence for iOS like their Android offering, I for one would be delighted.
8. More keyboard flexibility
One thing I’ve come to appreciate about my Android device is that the number row is part of the standard keyboard, along with a few more common punctuation symbols. On the larger iPhones, there’s plenty of room for this, and it would speed up touch screen entry.
On the hardware keyboard front, I’d like to see all the iPad keyboard shortcuts available on the iPhone as well, at least when VoiceOver is running. Long ago, it was possible to press CMD+Tab on an iPhone to switch apps; then Apple took it away. Later, it was brought back, but only on the iPad, along with several other keyboard shortcuts that are now iPad-exclusive.
When I train customers, some of whom have had quite a bit of Mac OS experience, I’m surprised by how many people don’t realise you can assign a keyboard shortcut to any menu function in any app. This is an absolutely marvellous feature and in the days when I was a Mac user, I’d use it when a developer hadn’t given me an easy way to access a function I use frequently.
I would love the ability to assign a keyboard shortcut, and possibly a Braille one, to activate items in an app. What a boost to efficiency that would be.
It would be a complex thing to get right, but hey, giving us access to this platform at all was a complex thing to get right. I have faith.
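For context, iOS already lets developers define their own hardware keyboard shortcuts via UIKeyCommand; what’s missing is the user-assignable layer that Mac OS provides. A minimal sketch of the developer side as it stands today (the class and selector names here are hypothetical):

```swift
import UIKit

class NotesViewController: UIViewController {

    // Shortcuts are fixed by the developer; the user cannot
    // add or remap them, which is the gap this wish addresses.
    override var keyCommands: [UIKeyCommand]? {
        return [
            UIKeyCommand(input: "n",
                         modifierFlags: .command,
                         action: #selector(newNote(_:)),
                         discoverabilityTitle: "New Note")
        ]
    }

    @objc func newNote(_ sender: UIKeyCommand) {
        // Hypothetical handler: create a new note.
    }
}
```

A user-assignable scheme would sit on top of this, letting VoiceOver users bind a key, or a Braille display chord, to any actionable element an app exposes.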
9. Continuous Reading Improvements
There are several enhancements to the iOS continuous reading function, the two-finger flick-down, that would further improve it.
I like to read my Twitter timeline in proper chronological order. Reading from the most recent tweet backwards doesn’t give you the sense of a story unfolding. When I read my Twitter timeline this way, I need to do it manually, flicking through each tweet, because there’s no way of reading continuously up the screen. This feature would make a real difference in apps that put the most recent item at the top of the screen.
The most significant source of energy consumption on the iPhone is the screen. We can improve battery life by turning down the brightness; mine is usually set at 0. Contrary to widespread belief, having screen curtain on has no positive impact on battery life. Since we don’t need the screen when continuously reading, it would be wonderful if technology could be developed allowing us to lock the screen while continuously reading. This would also eliminate the hassle of accidentally tapping the screen during continuous reading, causing the reading to stop and our place to be lost.
Finally, I’d like to be able to flick up, down, left and right to navigate by elements such as sentence and paragraph, without interrupting reading. You could still use a two-finger tap, or even perhaps the magic tap, to stop reading.
10. A more interactive home screen
I enjoy customising all the iOS widgets on their dedicated page, but I’d like to see them promoted to the home screen, so my iOS home screen can be as dynamic and engaging as my Windows and Android devices.
For example, the CNN app could show the top headline of the moment. My local New Zealand weather app’s icon on the home screen could show me the weather conditions at my current location, all without me having to go to a special page or open the app.
So there you have it. My sort of top ten. I realise some of the items in this list could easily have been broken down into several discrete items, but hey, I’ve got a lot to say.
The cool thing is that Apple always surprises. I wasn’t expecting Braille screen input when it arrived, and now it’s my primary input method on my iDevices. I wasn’t expecting Apple’s new threaded mail technology, and now it’s by far my favourite way of dealing with email efficiently. The combination of threaded mail and the unified inbox in Apple Mail is just epic. So I’ve no doubt there’ll be something in iOS 11 that’s unexpected and that makes a real, positive difference.
What are some of your wishes for iOS 11? Share your thoughts in the comments.