Podcast Transcript, Mosen At Large episode 149, iOS 15, new Apple hardware, and adventures in Android

This transcript is made possible thanks to funding from InternetNZ. You can read the full transcript below, download the transcript in Microsoft Word format, or download the transcript as an accessible PDF file.

Jonathan Mosen: I’m Jonathan Mosen, and this is Mosen At Large, the show that’s got the blind community talking. Today, iOS 15 is about to be released and I’ll show you some of my favorite features; there’s listener reaction, some more information following Apple’s big reveal last week, and part one of the adventures in Android.

Female Speaker: [singing] Mosen At Large podcast.

Jonathan: As always, it’s a pleasure to have you listening to the podcast. This is Episode 149, and it’s an Apple-flavored podcast once again today. That’s because in the coming week, those who have ordered their iPhone 13s will be receiving them. iOS 15 is going to be released on Monday the 20th of September, US time. That means that it will be on the 21st in Australia and New Zealand. Typically, this happens at about 10:00 Pacific, 1:00 Eastern. I have sometimes seen it delayed, but usually, it does happen at around about that time.

I thought that we would go through and have a look at some of the features that I think are particularly noteworthy in iOS 15. Now, this is not a substitute for iOS Without the i, sorry about that. I still get a lot of emails from people saying, why have you stopped iOS Without the i? It’s nice to have it missed, I suppose, but the reality is that because I have a full-time job, it’s not something I can commit to. I hope that this brief overview of some of the things that I think are particularly cool will be helpful for you.

I’ve also made an effort to be helpful by making extensive use of Chapter Marks in this demonstration. If you’re listening with a podcast app that supports Chapter Marks, then you will be able to navigate easily from section to section of this review. If you are using Castro, which is my favorite podcast app, you can go into the Now Playing screen, bring up a list of the chapters in this episode, and deselect those that you are not interested in hearing about.

Then what will happen is you will get a seamless review of some of the iOS features that I’m talking about, skipping over those things that are of no interest to you, and you won’t even know that they’ve been skipped. First of all, can you run iOS 15 on your device? Well, the answer is that if you have an iPhone that currently runs iOS 14, then yes, you can run iOS 15 on that device as well. Now, there is a caveat there, and that is that there are some newer features that might require an iPhone X or later because of the on-device processing that is used for those features.

It’s understandable that as devices get older, they won’t support some of the newer features that require a lot of processing power, but good on Apple for having devices that were purchased in 2015 still working in some form with the operating system. Are there any serious bugs at iOS 15 release time that you should be aware of? There are some annoyances. For many years now, in an effort to be constructive and help to resolve any bugs before release, particularly where VoiceOver is concerned, I have run an email list where people who are beta testing iOS can compare notes and talk about the bugs that they find.

We see if we can produce steps to replicate them, to help make Apple’s job easier. It’s a constructive way to try and help them. From that list, there are a few things. One thing I have heard is that sometimes when you hold down the side button for Siri, on some devices, it doesn’t always activate. Most of the time, it does, but sometimes, it doesn’t. It’s an annoyance, and I guess there is a workaround: you can use that magic phrase to get Siri working.

Speaking of that magic phrase, I’ve also seen some reports that if you are setting up Hey Siri for the first time, maybe on a new device, some people are experiencing VoiceOver muting on those screens, so that it requires a Braille display in order to complete the Hey Siri setup process. Now, I can actually verify this one, but I don’t know how widespread it is. I don’t know under what conditions it is true, but I have been able to confirm this. I deleted my Hey Siri feature.

I set it up from scratch, and the only way that I was able to complete it was with a Braille display. I hope that it is not too widespread for when everybody gets their new iPhones. I have also heard of issues with Wallet, so if you rely on the Wallet app and you have multiple passes, this could be a showstopper for you; and also with the Clock app and editing alarms. I guess that one is not so bad, because you could just delete the alarm, then create the alarm again.

One that is very interesting, and it could be very difficult for some people, is that I have heard that some iPhone users, certainly with the second-generation SE and possibly with other models, are seeing a scenario where, when you go into the App Switcher and you swipe through the list of apps, the names of those apps are not being spoken properly. I cannot duplicate this on the iPhone 12 Pro Max that I have, but I have heard several people reporting it. That may be something to be aware of.

If you make extensive use of the App Switcher, it could be a problem for you to get to the various apps. I suppose a workaround is that you can open apps using Siri or some other method, but if you want to close apps, it may make things a bit confusing. Now, I want to pay some attention to accessibility features. Just as I used to do in the iOS Without the i series, I’ll start with accessibility and we’ll go through to more mainstream features.

I’m going to focus, as I did in those books, on the VoiceOver features, but also take a look at a couple of features for people with hearing loss. The reason for that is that these podcasts are now transcribed, so for hearing-impaired people, this could be one of the few accessible ways of getting information about accessibility in iOS 15. There are a couple of really interesting features in that regard. For those of us who really like to customize our iPhones, there have been some gestures in past versions of iOS that VoiceOver was not using.

VoiceOver is using some of them now. If you want to make the most of the new features, what you may like to do is reset your gestures to default. Any customization that you’ve done will, of course, be lost, but you will then be able to take advantage of some of the new features that I’m talking about. I’ve had to do this myself for the purposes of recording this demo, because I had a two-finger swipe left and right set to navigate by heading. I use it so often that I didn’t want to use the Rotor for that.

I also had my Castro “clear this episode” shortcut set to a two-finger quadruple tap; that gesture has also now been used by VoiceOver for a very useful feature. That’s something you may want to consider if you’ve customized a few gestures. First, I want to introduce you to VoiceOver Quick Settings. One of the criticisms that people have had of VoiceOver is that it does so much that it’s made the Rotor very busy. When I did training, I noticed that a lot of people have difficulty with the Rotor gesture. I could usually teach it to people, but it doesn’t seem to be intuitive for everybody.

The way I would recommend you use Quick Settings is keep the Rotor for things that you use all the time. Maybe you change your speech rate or your volume and, of course, if you’re navigating by characters, words, and things of that nature, you definitely want that on the Rotor. Then there are some settings that you don’t change nearly as often, but you would like to have access to a little bit more quickly than going into VoiceOver settings and finding where they are. The first thing we’ll do then is configure Quick Settings to contain just the items that we personally like to use.

There’s no right or wrong way to do this because it’s all going to depend on the way you use your device. Go to VoiceOver settings using whatever method you feel comfortable with; probably the easiest way is to instruct Siri to take you there, but if you have a Bluetooth keyboard connected, you can also press VO with F8. When you’re in VoiceOver settings, find this new option.

Automated Voice: Quick Settings button.

Jonathan: And double-tap it.

Automated Voice: Selected, activities, actions available.

Jonathan: What we have now is a list of all of the things that you can include in VoiceOver Quick Settings. Those that are going to appear in your Quick Settings will have the word selected preceding them. Those that are not selected will not appear in the settings. You can deselect those items that you’re not interested in seeing in your Quick Settings. The first one is activities. If we navigate right–

Automated Voice: Reorder activities. Button draggable. Actions available.

Jonathan: You’ll note that you can not only deselect those things that are not interesting to you and select the ones that are, but you can also move the order around on the screen. If there are things that you use a lot, you probably want to put those at the top of this list. As well as being able to drag these around in the same way that we’ve been able to do since the beginning in iOS, you can use actions to move them up or move them down. Because this is the first item in the list, there’s only a move-down option at this point.

I’ll flick right and we’ll have a look at some of the items that you can choose to have enabled in this list.

Automated Voice: Selected. Always speak notifications.

Jonathan: Now, I am not interested in having this in my Quick Settings. I’m going to double-tap.

Automated Voice: Always speak notifications.

Jonathan: Now, it’s not selected.

Automated Voice: Reorder selected. Audio ducking.

Jonathan: Now, this is a good thing. I don’t often turn off audio ducking, but I do turn it off occasionally. This is one of those settings where I will be glad to take audio ducking off my Rotor, but keep it in Quick Settings for those few occasions where I want to make a change.

Automated Voice: Reorder audio ducking selected. Braille alert messages.

Jonathan: I will unselect this.

Automated Voice: Braille selected. Braille auto-advance.

Jonathan: I will unselect this as well because I have a keyboard command assigned to Braille auto-advance.

Automated Voice: Braille auto-reorder selected. Braille input.

Jonathan: Because I have a Mantis, this is not something I’m going to change, so I’ll double-tap, and it is now unselected. There is quite an extensive list of choices here. I won’t record myself going through all of this and customizing it the way that I like it, but I have done it now. Then what you can also do is go in and make sure that you’ve turned off any duplication that you don’t want with the Rotor.

If you’ve added something to Quick Settings that you no longer want in the Rotor, you can disable it in the Rotor to make it a bit less congested, but there is no reason why you can’t have both if you want. If you want to have it in the Quick Settings and in the Rotor, that is fine as well. When you’ve used this screen in VoiceOver settings to customize Quick Settings to your liking, you can invoke the Quick Settings from anywhere on your phone by performing a two-finger quadruple tap. You know the magic tap that we all use so often, where you double-tap with two fingers? This time, you tap four times.

Automated Voice: VoiceOver settings heading.

Jonathan: Now, when I navigate, I’ve just got the Quick Settings that I have left selected, and that list is quite a small one for me.

Automated Voice: Done button, filter, search field.

Jonathan: This is another way to use Quick Settings. If you want to leave them all enabled, then you can use the search field to type in a partial string for what it is that you want and VoiceOver will narrow down the list and show you just the items that your search string matches, but I’m going to navigate to the right.

Automated Voice: Activities off button adjustable. Audio ducking on adjustable. Direct touch on adjustable. Gesture direction automatic button adjustable.

Jonathan: I think the use of the word adjustable in this way is a little bit superfluous, but perhaps they will tidy that verbiage up. When I found this item gesture direction, I was really excited about it because I thought that it was the answer to one of the top 10 wish list items that has come up for me repeatedly in the years that I’ve posted my top 10 for what I was hoping for in this year’s version of iOS.

My scenario was that I use Twitter a lot, and I prefer, as I’ve said on this podcast before, to read my tweets in the order that they were received, in chronological order. Now, if I’m busy and I just want my tweets to play, as it were, I can’t do a continuous read with a two-finger flick down, because that goes from top to bottom, not from bottom to top. When I saw that you could change the gesture direction, I thought that I could set this up for Twitterrific.

I’d be good to go to have my tweets reading in chronological order continuously, but it appears not to work for me, and maybe it’s to do with the fact that English is not a right-to-left language. If I double-tap this button–

Automated Voice: Gesture direction heading.

Jonathan: I’ll flick right.

Automated Voice: Selected automatic.

Jonathan: That’s the default.

Automated Voice: Right to left, left to right.

Jonathan: If I try changing it-

Automated Voice: Right to left.

Jonathan: -to go right to left-

Automated Voice: Right to left.

Jonathan: -nothing happens.

Automated Voice: Selected automatic.

Jonathan: You’ll see that automatic is still selected. Either this is a bug, or perhaps it just doesn’t work for the English language. That’s a shame, because I thought that was a very long-standing feature request finally dealt with. I’ll perform a two-finger scrub to go back to the Quick Settings screen.

Automated Voice: Activities off button.

Jonathan: Now, we’re not having our place respected there, so I have to flick through.

Automated Voice: Direct gesture direction hints off adjustable.

Jonathan: I used to have this on my Rotor and now I just have it here because I don’t often toggle the hints.

Automated Voice: Language English UK button, screen recognition off. Sounds on at volume 100% button adjustable.

Jonathan: In this case, the adjustable does actually make sense because you can flick down here and the volume is getting quieter.

Automated Voice: 689, 100.

Jonathan: I think they will probably clear this up where the word adjustable is only spoken when it is in fact adjustable. That’s all I have on my Quick Settings. It’s a very small subset of all the features you can have and you can customize this for the way that you like to use your phone. I’ve returned to the VoiceOver setting screen to demonstrate a new feature that may seem a bit old to some of you if you’re a Mac user.

Automated Voice: Navigation style flat button.

Jonathan: Now, you have a new choice in terms of how you navigate your iPhone. Since iPhone became accessible with the iPhone 3GS, we’ve been used to the flat mode, and that remains the default. This is where you can drag your finger around the screen or flick left and right and hear each object on the screen in sequential order. If you’re a Mac user, you’ll be familiar with the concept of interacting with groups of items. This is an efficient way to work when you get used to it because if there are a series of related items that you’re not interested in, you can skip over them really quickly.

Now, this is an option in VoiceOver for iOS. If you’ve not used a Mac before, you may find this a little bit different to begin with, but you may like to play with this if you’re an efficiency nerd like me. To illustrate how this works, I’m going to give you a simple example by going into the new iOS Weather app, which is worth checking out if you’re a weather geek, because there are new features there, including notifications and other data that was not there before.

When I first go into this Weather app, I’m going to do so using the default mode that we are all used to. I’m in the Weather app and I’m going to touch a part of the screen that has quite a lot of useful data.

Automated Voice: UV index, one low, low levels all day. Minimum value, one maximum value 11 plus.

Jonathan: I flick right.

Automated Voice: Sunset 6:11 PM. Sunrise 6:19 AM.

Jonathan: I’ll flick right.

Automated Voice: Wind winds from the South Southeast at 41.8 kilometers per hour. A white line on a gray surface.

Jonathan: Good weather data here, and it’s navigable the same way that this app has always been navigable. Now, let’s go back: “Open VoiceOver settings.” Now that I’m at VoiceOver settings again, I want to locate–

Automated Voice: Navigation style flat button.

Jonathan: I’ll double tap-

Automated Voice: Selected flat-

Jonathan: -and flick right.

Automated Voice: -group.

Jonathan: Now, we are at navigation style group, and this is where things start to get interesting. I’ll double tap-

Automated Voice: Selected group.

Jonathan: -and group is now active. I’m going to go back to the Weather app. I’ll just invoke the App Switcher to do that.

Automated Voice: App Switcher, settings, weather, active.

Jonathan: And double-tap.

Automated Voice: Weather, Grenada Village.

Jonathan: Now, I’m going to tap around the same area of the screen that I did before when we had the default navigation mode selected, known as flat.

Automated Voice: Temperature, UV index, sunset, wind.

Jonathan: I’m flicking right, and you’re hearing that pop sound. Now, I choose to not have the normal click sound that VoiceOver emits when I flick right, because I just don’t find that it gives me any useful additional information. But I do need this sound active, because what that sound is telling me is that I’m skipping through grouped items, and that unless I expand a group, we’re not going to get all the information available.

Automated Voice: Rainfall.

Jonathan: I’ll flick left.

Automated Voice: Wind. Sunset.

Jonathan: Now, I’m going to perform a two-finger flick to the right.

Automated Voice: Sunset 6:11 PM, Sunrise, 6:19 AM.

Jonathan: Just by performing that two-finger flick to the right, it has expanded the sunrise and sunset group, and it’s speaking the data contained inside the group. Where this is useful is that if you have a busy app, where data is conglomerated together in this way, you can compress the data and make it much more efficient to navigate the screen by putting your phone into group mode for navigation. Now, as you know, I’m a major efficiency nut, I get a bit obsessed with efficiency, so I think this is a fantastic addition to iOS 15, but it falls short for me in one key respect.

That is, you cannot set the navigation style on an app-by-app basis through activities. If you’re familiar with activities, you will know that this is a handy way to customize the VoiceOver experience to some degree in the app that you are in. I use this, for example, when I go into Clubhouse, where they use an annoying number of emojis in titles that just slow me down; I have emojis turned off in Clubhouse, and it really makes a big difference.

What I would like to be able to do, is have an activity setting so that when I go into the Weather app, then my navigation style is set to group but by default, it’s set to flat. I can’t do that at the moment. The way to get around it is to add the navigation style either to your Rotor or to your Quick Settings and toggle it as required. The final big VoiceOver change I’d like to demonstrate is the ability to explore images in great depth.

This is a fantastic feature, particularly in conjunction with the new Live Text feature, which, while not an accessibility feature, does have accessibility benefits. We’ve really got a lot of enhancements in terms of the way that we can explore and engage with photos, and particularly text in photos. Here’s a tweet that I have liked from Sir Paul McCartney.

Automated Voice: Paul McCartney. Paul will appear at @southbankcentre in a world exclusive event with Paul Muldoon and Samira Ahmed, to discuss his new book #PaulMcCartneyTheLyrics. Members on sale starts Thursday 16 Sept, and general booking opens Friday 17 Sept: https://mpl.pm/LyricsEvent 12 hours ago. Attached photo, retweeted 153 times, liked 1,245 times, via Twitter Web App.

Jonathan: We all know that it’s good social media practice to attach a text description to a photo like this, but unfortunately, there is no text description. Even when there is one, though, it can be quite interesting to explore the photo in the way that I’m going to show you with iOS, because sometimes this gives you a description that’s actually a bit more descriptive in terms of what somebody looks like. I’m going to open this photo.

Automated Voice: At Southbank’s number Paul– MPL show media.

Jonathan: There’s show media. I’ll double-tap to go there.

Automated Voice: Image, adult. Possible text. The lyrics, Paul McCartney in conversation with Paul Muldoon. Adult, possible text. The lyrics, Paul McCartney in conversation with Paul Muldoon.

Jonathan: I did nothing at all there other than double-tap to open the photos. It gave me a bit of a description and it read the text in the photo. The actions Rotor is in focus at the moment. One of the actions is–

Automated Voice: Explore image features.

Jonathan: You will see this when you have a photo that is in focus now. Explore image features, I’ll double-tap.

Automated Voice: Image explore heading.

Jonathan: We’re in the new image explorer of VoiceOver in iOS 15. We’re at the top of the screen, and I’m going to flick right.

Automated Voice: Done button. Image heading.

Jonathan: The first thing you’ll notice is that when you’re in the image explorer, each of these key sections is navigable by heading. When you get familiar with that, you can quickly get to the information that you want, particularly if you’ve set up a VoiceOver gesture to navigate by heading. Even if you haven’t, presumably you’ve got headings on the Rotor. I’ll flick right through the image section now.

Automated Voice: Document. Cap X. Center Southbank. The lyrics, Paul McCartney in conversation with Paul Muldoon, Friday of November, Royal Festival Hall, 2% a person with straight gray hair, smiling near left edge, more content available.

Jonathan: Now, if we go to the more content available option and choose to view it, we will simply see what’s on the right edge of the photo, but if I navigate to the right, we’ll see that as well.

Automated Voice: Sign near right edge, more content available.

Jonathan: Similarly, if I choose the more content, we will get the left edge.

Automated Voice: Image description heading.

Jonathan: Now we go onto the image description.

Automated Voice: Adult.

Jonathan: All that we have in the image description is adult. Not terribly helpful. It’s not describing the adult in any way. I’ll flick right.

Automated Voice: Scenes heading. Adult.

Jonathan: Similarly, adult is under the scenes. That’s what we have with this particular image. Now, you may be saying, why is it that certain other services can identify celebrities? Because this is clearly a photo of Sir Paul McCartney. I think the answer is that all of this is going on on your device. Apple is privacy-focused, and this image is not being sent anywhere for the analysis to take place. I guess it’s possible that they could have had a database of celebrities on your device, but that might take up some space. That’s the sacrifice you make.

You can, of course, send this to any number of other services. Microsoft, Google, and a number of other players who’re into recognizing images. They will often tell you when a celebrity is in the image. One of the beautiful things about this is that everything’s happening on the device, you don’t need internet access to explore an image because this applies to photos in your photo library as well. There’s no potential compromise in terms of people looking at the photo. Because all of this is happening on your device, it is very fast. That’s the image explorer in iOS 15.

If you have a hearing impairment, there are some great new accessibility features in iOS 15, and you find them by going to Accessibility. Then there’s a heading called Hearing. We’re going to go to–

Automated Voice: Audio slash visual button.

Jonathan: And double-tap.

Automated Voice: Audio heading,

Automated Voice: Headphone accommodations off button. You can customize the audio for supported Apple and Beats headphones. Background sounds off button.

Jonathan: This is new. Let’s double-tap and take a look at background sounds.

Automated Voice: Background sounds off. Plays background sounds to mask unwanted environmental noise. These sounds can minimize distractions and help you to focus, be calm or rest.

Jonathan: You might say why on earth is this under the hearing category? The reason for that is that there are an increasing number of hearing aids now that offer this kind of functionality built into the hearing aids. The main reason they have it is for people who’ve experienced profound tinnitus. When sounds like this are played through hearing aids, it can help to mask the really annoying sound of tinnitus, the constant ringing in your ears that some hearing-impaired people have. You have a variety of sounds available.

Automated Voice: Background sounds off.

Jonathan: I’ll double tap to enable them.

Automated Voice: On.

[background noise]

Jonathan: It’s very soothing, isn’t it? Let’s flick right.

Automated Voice: Plays background sounds to mask unwanted environmental noise. Sound, balanced noise, button.

Jonathan: There are several sounds you can choose from. I’ll double-tap.

Automated Voice: Selected balanced noise edit button.

Jonathan: This is the first in the list.

Automated Voice: Balanced noise, bright noise, dark noise, ocean, rain, stream.

Jonathan: That’s all we have at the moment. If I want some rain–

Automated Voice: Rain.

Jonathan: Well, thank goodness. I haven’t had enough of that lately. I’ll double-tap.

Automated Voice: Selected rain.

Jonathan: Then you hear the sound has changed to rain. VoiceOver told me that actions are available, so let’s see what they are.

Automated Voice: Delete. Activate. Delete.

Jonathan: Just delete and activate. You also do have some control over the volume of these sounds. I’m going to go back.

Automated Voice: Background sounds on.

Jonathan: Double-tap.

Automated Voice: Off.

Jonathan: The background sounds are now off. You may remember that in the Mosen At Large episode right after Global Accessibility Awareness Day in May, we covered the extensive announcements that Apple had made in conjunction with that day about some forthcoming iOS 15 changes. One of them was the ability to do what they’re calling headphone accommodations. This works with specific Apple devices and allows you to do a custom audio setup, including importing an audiogram. If you have compatible headphones, this could really make a difference to how well something sounds to you.

Those are just some of the accessibility features that are new in iOS 15. It’s fantastic to see Apple continuing to add to the slew of accessibility features that they offer for a wide range of disabled people. Now, let’s take a look at some of the mainstream features that have caught my attention in iOS 15. To do that, we’ll segue from accessibility into something that has many accessibility benefits. I like this feature, not only because it’s useful, but also because it demonstrates that when we make an accessible society, everybody benefits. This feature is called Live Text.

If you’ve read any reviews of iOS 15, any reports about it, you’ll see that Live Text comes up quite often because it’s so useful to everybody. The name is pretty descriptive: Live Text can detect text in photos and then help you to perform actions that are related to that text. Now, clearly, this does have some accessibility benefits, and it’s a tool in the toolbox. The one thing I would say is that I don’t think that Live Text replaces the specialized apps, like Seeing AI, Supersense, Envision, and similar apps.

Those specialized apps still have an edge in terms of reading text to you, taking a picture, and getting the text out of it. I don’t think that this is going to replace those apps, but it does have some use cases that are quite unique and pretty exciting. First, let’s take a look at a scenario where it can sort of work, but where it’s not necessarily the best tool to use. For this demonstration, I’ve got a meal here that’s pre-packed; it’s nice and fresh, a low-carb meal.

It’s from a service called Muscle Fuel that Bonnie and I use here in New Zealand because we both have busy lives and we want high-quality keto-friendly food, and Muscle Fuel provides it. I’m going to open the camera app now. Open camera.

Automated Voice: Camera, take picture, button. Zero people.

Jonathan: I’ve got the frozen meal in the view of the camera now.

Automated Voice: Text detected, text detected.

Jonathan: It said text detected. Yes, okay, so you’ll know. Let me just go and have a look at the text then.

Automated Voice: Camera mode, photo detect text button.

Jonathan: I’ll double-tap the detect text button.

Automated Voice: Detect text. Text detected. Share Menu item.

Jonathan: Now, I’m going to go to the top of the screen.

Automated Voice: Detected text. Net weight 300gm, net weight 300gm. Swipe or tap to select text. Selected.

Jonathan: That’s what it got. It got some text from this meal that I have in front of me here. The way this is working is similar to what you are already used to. You’ll know that if somebody is in the view of the camera, it tells you that zero people are detected or one face in center of the image is detected, and that helps you to take a good quality picture. In addition to detecting people, it will now also detect text. This works particularly well for me with printed text on a piece of paper, even text on a screen. It’s not quite doing as well with this meal that I have in front of me. If I open Seeing AI. Open Seeing AI.

Automated Voice: Low high fat carb, high fat K-E-T-O-G-E-N-I-C meal. Chicken with lemon herb sauce. Muscle Fuel points.

Jonathan: Just by holding it out in front of me, I immediately heard with Seeing AI, without any fuss, that what I’ve got here is the chicken with lemon herb sauce. Sounds delicious too. It’s much easier to use one of the blindness tools to identify products like that. In making this point, I’m in no way saying that the feature isn’t fit for purpose, because the purpose is slightly different. It is not really designed for you to hold something in front of the camera and be told what it is, although that may work from time to time.

The idea here is that you should be getting enough information in order to take a picture, then store that picture in your photo library. This is where Live Text has been really impressive for me. Because in the world that I frequent, I get a lot of business cards given to me. I might go to a meeting, meet someone, and they’ll say, “Let me give you my card,” and it’s got their contact information on it.

To make sure I get a really good picture, what I sometimes do is say to that person, “Look, would you mind if I just handed my phone over to you and got you to take a snap of your business card?” It only takes a couple of seconds, because they can immediately see whether it’s completely in the view and the distance is just right, and they’ll take a picture. This is where Live Text, in my view, gets particularly interesting. Because if you’ve got a business card or something like that that’s got the text, then it’s stored in your photo library.

You can come back, and this works in the Photos app as well, and have a look at the Live Text in the photo. A couple of things happen here that are of interest. The first is that anything that’s an email address or a phone number, the typical things that you’re used to iOS being able to detect, becomes what they call a data detector. That means that it’s tappable. If I find a business card in my photos and it’s got somebody’s name, email address, and phone number, then if I want to call that person, I can just double-tap on the phone number, and I can make the call.

I can also copy the text to the clipboard and then insert it into my Contacts app if I wanted to create a new contact for that person, for example. It’s a very versatile feature. You will need to double-tap the detect text button on a photo for all this to go on. As all the good infomercials say, “But wait, there’s more.” This is where it gets really interesting. If you combine the features of Live Text with some of the new features that have been rolled out to Spotlight Search, you really have a powerhouse of a feature here. Spotlight Search has been in iOS for a long time.

I must admit I haven’t used it very much, but it’s becoming more and more relevant. It’s got quite a significant facelift in iOS 15. If I’d been traveling extensively and I’ve got a collection of business cards that I want to deal with or send an email to say hello, and that might be another way of getting contacts into your phone more easily, then I can search for these people in Spotlight Search.

When I bring down Spotlight Search, either by swiping down with three fingers from my home screen or pressing command-space on a Bluetooth keyboard, if I type in the name of a person whose business card I have in my photos, then, thanks to Live Text and its integration with Spotlight Search, I will find that person, and then I can go into the photo and perform whatever action I need to. Spotlight Search is also very good at finding photos that have specific things in them.

I’ll show you this, I’ve still got some of my wedding photos on my phone and I might like to show them off to people. I could fossick around in my photos, maybe I’ve been really organized and I’ve created a specific album but it takes some time. Instead, now I can swipe down with three fingers to bring up Spotlight Search.

Automated Voice: Spotlight Search text field is editing. Search insertion point at start.

Jonathan: I’ll type the word wedding.

Automated Voice: Top search result makeup girls wedding dress up, 63.

Jonathan: Even without having to press return or do anything else, that search has already been acted upon. This is one of those times when navigating by heading is handy because the Spotlight Search results are grouped by heading. If I do that–

Automated Voice: Series of jet photos heading.

Jonathan: There we go. I have actually found that photos is top in this instance. If I flick to the right–

Automated Voice: Image, a group of children wearing dresses and standing on a wooden shelf.

Jonathan: Spotlight Search has decided that this is a wedding photo. It may well be, I don’t remember that photo.

Automated Voice: Image, a photo containing a bouquet and a bridesmaid.

Jonathan: That is definitely a wedding photo. Now if I double tap here.

Automated Voice: Image back button.

Jonathan: We’ve got the image on the screen.

Automated Voice: The 27th of June 2015, 3:47 PM heading.

Jonathan: I know this is a wedding photo now because it tells me when it was taken. That was on my wedding day so I’ll flick right.

Automated Voice: Photo, the 27th of June 2015. More content available. Two children wearing blue dresses and holding bouquets of flowers. Possible text, cone.

Jonathan: That is absolutely amazing that this is all happening on the device.

Automated Voice: Toolbar share button.

Jonathan: Just by typing in the word wedding into Spotlight Search, I was able to locate that photo. Now, remember that we also have the image describer.

Automated Voice: Photo, the 27th of June.

Jonathan: It’s in focus now which means that if I flick down–

Automated Voice: Explore image features.

Jonathan: I’ll double-tap-

Automated Voice: Image explore heading.

Jonathan: -and flick right.

Automated Voice: Done button image heading. Flower near left edge. More content available. Document, cone. On [unintelligible 00:36:04], slight right profile of a person’s face with wavy blonde hair, laughing near top left edge. More content available.

Jonathan: That is a very good description, isn’t it?

Automated Voice: Slight left profile of a person’s face with straight blonde hair, laughing near top left of the edge my head. More content available. Image description heading, two children wearing blue dresses and holding bouquets of flowers. Scenes heading, bouquet, bridesmaid. Date heading, the 27th of June 2015. Time heading, 3:47 PM. Orientation heading, landscape.

Jonathan: That is all in the image explorer. Really amazing that, as blind people, and for me as a totally blind person who has never seen a photo in his life, we can have this sort of access to photos. It really makes them worthwhile to take and keep. If you view these photos in the Photos app itself, then you can also add Markup to the photos. We’ve all become familiar with ALT text on the web and ALT text in tweeted photos and on Facebook, where you can add additional image descriptions.

You can add Markup with text descriptions of your photos to make them even easier to find. In the past, some people have used the graphic labeling feature of iOS as a workaround; the trouble is that that data is not transferable if you give the photo to someone else. With Markup, if somebody marks up a photo and then sends it to you, you will get not only the photo but all the Markup information that describes the photo as well. There is so much more to Spotlight Search in iOS 15. You can search for an app even if you don’t have that app, and if it’s available in the App Store, Spotlight will often find it.

If you want to search for text in a handwritten image, then it will find the text even if it’s handwritten, unless, I guess, the handwriting is particularly bad. Do give Spotlight Search a try, even if you’ve thought it wasn’t particularly useful in the past. I am amazed at the way that it is working in iOS 15. Next, I want to go back to what in my view is the killer feature of iOS 15. I demonstrated this extensively in Episode 131 of Mosen At Large. If you haven’t heard it yet, I recommend going back to Episode 131 and taking a listen. This is Focus in iOS 15.

To give you a brief synopsis of what this does: the idea is that you can set up various Focus modes depending on what you’re doing and what’s important to you at any given time. A Focus mode can determine what apps are on your screen, who can contact you, what kind of notifications you receive, and from which apps. It’s very flexible. I’ve taken to using different Focus modes a great deal. It may even change the way that you choose to group your apps on your iPhone, because I’ve got these lovely, neat, and tidy folders.

As we have the ability to turn different home screen pages off and on with Focus, it could be that you want to group your apps onto different pages based on their purpose. That way, when you’re at home and you just need a break from work, you can hide apps like Teams or Slack or anything that is work-related so that you can recharge in peace, and of course block email notifications as well.

I am a shameless news junkie, a news tragic, so I have many, many news-related apps on my phone. To my great regret, many news apps are now getting way too willing to push notifications to you. It’s all about clickbait. When I first had an iPhone, most of the push notifications I got from news apps were breaking news, genuinely important things. Now, you’re getting all sorts of trivia pushed to you, to the point that for some of the news apps, I actually turn their notifications off. This is a big deal if you, like me, use your iPhone as your multi-purpose device. I don’t have a standalone book reading device, for example. There are times when you just want to switch off and enjoy your book.

With the Focus features, you can set a range of options, so you could have one for when you’re reading for leisure; indeed, I have a Focus set up like that now. Since I did my demonstration of the iOS 15 Focus features all the way back in Episode 131, because I was so enthusiastic about what Apple had done with them (they really have done a tremendous job), a few refinements have been made, and there’s one in particular that I just wanted to bring to your attention.

To show you this, I’m going to go into a Focus that I have created myself. There are some that come pre-bundled, if you will, with iOS that you can set up. Then you can set up any number of Focus options. I’ve got one here.

Automated Voice: Minimal notifications button.

Jonathan: I’ll double-tap.

Automated Voice: Minimal notifications switched button off.

Jonathan: It’s off right now but I can enable it at any time. The easiest way to enable a Focus when you’ve got one set up is to go to Control Center, swipe through to Focus, and then flick down. When you double-tap, you’ll have all your Focus options available to you. I’ll flick right.

Automated Voice: Allow notifications heading. People, Alistair, Amanda, Anthony, Bonnie.

Jonathan: I’ve got a bunch of people who can contact me when I’m in minimal notification mode, but not too many. This is not completely a Do Not Disturb; I do have that as well, and I will enable it when, for example, I’m recording or when I’m in a meeting and I just can’t afford to have anybody at all disturb me. This is a minimal notification feature, where I don’t want to be bothered by random apps that are being too pushy with their Push, but I do still want to be available to the world if anything urgent is going on.

Automated Voice: Apps, calendar, FaceTime, Life360, Messages, Time Sensitive button.

Jonathan: These are the apps that can get through. Also, I will allow Time Sensitive notifications. These can be defined by app developers. For example, one of my favorite apps is called Parcel, and it has time-sensitive notification options for when a parcel is out for delivery or when something’s been delivered, and those do break through. Also, calendar notifications and reminders are considered time-sensitive. If you enable time-sensitive notifications in your Focus, they will break through as well.

Automated Voice: Allow calls and notifications from people, apps, and those marked as time-sensitive when this Focus is on.

Jonathan: I’ll flick right.

Automated Voice: Options heading. Focus status on button.

Jonathan: What this means is that if someone tries to message you, they will get a notification that you are in a particular Focus mode.

Automated Voice: Home screen button.

Jonathan: This is where you can specify the home screens that are visible when this Focus is active.

Automated Voice: Lock screen button, name and appearance button, turned on automatically heading.

Jonathan: Now, this is what we didn’t show you in Episode 131 because, way back then, this part of the user interface was not there yet.

Automated Voice: Add schedule or automation button.

Jonathan: Double-tap-

Automated Voice: Cancel button.

Jonathan: -and flick right.

Automated Voice: New automation heading. Choose when you want the automation to happen. Example from 12:30 PM to 2:30 PM. Example from 12:30 PM to 2:30 PM button. Example when I arrive at work button.

Jonathan: I’m not sure why it’s speaking all of that twice but we’ll live with that.

Automated Voice: Example when I open books button.

Jonathan: Actually, that is a perfect example for the scenario that I’m demonstrating. When I open the Books app, I do want this Focus enabled because it gives me the bare minimum of notifications. I think I allow one New Zealand-based news app through that is not so spammy with its Push notifications. I’ll double-tap-

Automated Voice: Search field.

Jonathan: -and flick to the right.

Automated Voice: Section index adjustable. Amazon Alexa, American.

Jonathan: Now, we have all the different apps that I have on my phone and I have an awful lot but here we have–

Automated Voice: Books.

Jonathan: I’ll double tap–

Automated Voice: Books. Focus back button.

Jonathan: Flick right and see what we have now.

Automated Voice: Minimal notifications heading, minimal notifications switch button off. Allow people Alistair and–

Jonathan: We’ll navigate by heading actually.

Automated Voice: Options heading turned on automatically heading. Books on while using books button.

Jonathan: At the moment, and I can tweak this automation, it’s set up so that when I’m home and I’m using the Books app, it should turn this Focus on, and that’s actually okay. That’s how it’s set up at the moment. As you can see, there’s a lot of good configurability with this Focus system. Definitely get your teeth into this. It will really give you a great deal of control over who can contact you, what apps can bug you, and what apps are visible to you at any given time. Speaking of what apps can bug you, there is now a feature called Notification Summaries.

We are seeing a theme here with Apple updates over the last few years, where shareholders have become a bit concerned about how addictive these smartphones are for many people. The first big feature in this area was Screen Time, where you could monitor how much you were using your device and in which apps, and even limit certain apps at different times of the day, particularly where kids are concerned. We’ve got the Focus modes. We now also have Notification Summaries. The reason why this is significant is that when you get a notification, you get a little dopamine hit.

It’s true, and you’re getting addicted to it, apparently. Well, some people do. Then what happens is that you just have to deal with that notification. It does affect your ability to get things done. If you’re working away and your phone makes some sort of sound, you just have to check it. It takes a lot of discipline not to check it. It’s kind of like people who can eat just one potato chip; there aren’t very many of them. Notification Summaries is another way of working around that, another tool in this suite for taking back control of your life.

Well, it’s a lofty ambition, isn’t it? If you double-tap notifications under your settings screen, you will find–

Automated Voice: Scheduled summary off button.

Jonathan: I have this feature off, perhaps I’m just not disciplined enough, but I do like to know what’s going on. I’ll double-tap.

Automated Voice: Scheduled summary off.

Jonathan: If I want to know more, I have to turn it on. I’ll double-tap.

Automated Voice: Six.

Jonathan: I’ll flick right.

Automated Voice: Notification summary heading. Bundle non-urgent notifications and receive a summary at convenient times. Scheduled delivery choose when you’d like your notification summary to arrive. Get what’s important calls, direct messages, and time-sensitive notifications will be delivered immediately. Even for apps in your summary. Continue button.

Jonathan: Double-tap continue.

Automated Voice: Choose apps for your summary. Direct messages and time-sensitive notifications are not included since they’re always delivered immediately, ordered by daily notification average star.

Jonathan: Now, as you flick to the right with this enabled, what you will see first is the apps that give you the most notifications. In my case, it’s a very scary number of emails that come in every day, and tweets are actually next. I can go through and choose which of these apps I want to receive in a notification summary. Earlier, I talked about the news apps that are now much more willing to send nonsense, really, things that in my view do not warrant a push notification.

They should be left for when you open the app and you explore the app, but they want you to go into the app because so many of them are advertising-driven. If you’ve got some particularly egregious apps that bother you too much, you can choose them from this list and they will be added to a notification summary. You can schedule that summary or indeed a series of summaries at different times of the day so that you’re likely to get the summaries when you’re not particularly busy.

You might do it, say, at 6:30 or 7:00 in the morning, and 6:00 or 7:00 at night when, hopefully, you’ve stopped working and you’re having dinner and talking to the people that you care about. This is a cool new feature in iOS 15, the Notification Summaries. You’re given a lot of control over notifications as well if you don’t want to go the summary route, or maybe you have one particular app that’s just being really vexatious today. This also applies to message threads that may be quite spammy. That can be the case if you get locked into a group message thread.

They’re talking about something that you are not interested in and you’ve got to get some work done; you just want to shut it down for a wee while. Next, I want to talk about some big changes that have come to FaceTime in iOS 15. I think this is kind of a case of too little, too late for Apple. When the pandemic hit, Apple was not well-positioned to offer a service that allowed everybody to talk to one another to get business done. If everybody was in the Apple ecosystem, then it was great. You could use FaceTime if you had an iPhone and an iPad and a Mac.

If you needed to communicate with colleagues who choose to use Android devices and Windows, Apple couldn’t help you with that. You were left with a third-party solution such as Zoom, which exploded during the pandemic, or Microsoft Teams. Google Meet was also cross-platform and could work as well. Apple was this little walled garden that could only work with other Apple users. FaceTime in iOS 15 seeks to remedy that by providing a web-based user interface to FaceTime calls. This means that it works on non-Apple devices such as Android smartphones and Windows PCs, and the sound quality is actually really good. There are a couple of ways that you can set this up, and we’ll explore this now. Go into the FaceTime app first: “Open FaceTime.” And I’ll flick right.

Automated Voice: FaceTime heading, create link button.

Jonathan: Right at the top of the screen, so you can’t possibly miss it, is a new option called create link. I’ll double-tap this.

Automated Voice: FaceTime link heading, actions available.

Jonathan: That actions available is actually quite important. If I flick down, we can add a name to this link and activate it. I’m going to flick down again to add name and double-tap.

Automated Voice: Alert, add name.

Jonathan: We’ll flick right.

Automated Voice: Text field is editing. FaceTime link, character mode, insertion point at start.

Jonathan: I’m going to type “Mosen At Large test” and just verify that. I’ll flick right.

Automated Voice: Cancel button. Okay button.

Jonathan: Having created the link, we’re now popped into the iOS share sheet. I’ll go to the top of the screen.

Automated Voice: Mosen At Large test heading, close button, AirDrop button, Instapaper button, Twitterrific button, messages button.

Jonathan: We have all the usual suspects in the order that I like them and I can also copy this link to the clipboard if I want to. I can flick right.

Automated Voice: Mail button.

Jonathan: In the way that I have my share sheet organized, there is the mail option, so I can send this link to someone via email, and I’ll show you what that’s like. I’ll double-tap.

Automated Voice: Mail, cancel button. Mosen At Large test heading, send, dimmed.

Jonathan: There is the send button that’s dimmed right now, and you’ll see that “Mosen At Large test” has become the subject line of the email.

Automated Voice: To text field.

Jonathan: I’m going to email this text to myself so we’ll send it to jonathan@mushroomfm.com, just check that.

Automated Voice: Jonathan@mushroomfm.com, results Siri found in apps heading, Jonathan Mosen.

Jonathan: Jonathan@mushroomfm.com is fine. I’ll double-tap that.

Automated Voice: Message body send button.

Jonathan: I’ll double-tap. It has now sent. Because I’m in do not disturb mode at the moment while I’m recording this, we didn’t get the usual swishing sound, but it has sent that email. Now I’m going to go into Microsoft Outlook on my PC running JAWS. JAWS is running the Tom Vocalizer voice; remember, the iPhone is running Alex, so hopefully you will hear the difference. I’ll tab into Microsoft Outlook.

PC: Inbox, mushroom.

Jonathan: I believe this will be the last email that came in.

PC: Jonathan, Mosen At Large test, Thursday.

Jonathan: There it is. I’ll press Enter to open the email.

PC: Join my FaceTime graphic icon slash FaceTime icon wide one X link, FaceTime link link, John.

Jonathan: It just says join my FaceTime. It’s got my signature at the bottom of the email because I sent it through the iOS mail App and that’s all there is to it. I just have to press enter to activate the link when we are on it.

PC: Blank, link link.

Jonathan: There we go. We’ll try that.

PC: FaceTime, Microsoft Edge page.

Jonathan: I am running Leasey and I’ve got sounds enabled, so what you are hearing is that sound when pages load. Let’s explore this tab.

PC: FaceTime, personal, Microsoft Edge.

Jonathan: Go to the top.

PC: Beta heading.

Jonathan: It’s telling you it’s a beta, so anything can happen. It may well come out of beta designation when iOS 15 is officially released.

PC: Enter your name to join the conversation, name, edit.

Jonathan: I’m going to type “Jonathan on the computer” and we’ll verify that.

PC: Name edit Jonathan on the computer, more tab FaceTime continue button.

Jonathan: There’s a continue button. I will press space, but just to say that if this is the first time that you’ve done this, your browser will pop up and ask if it’s okay to use your microphone and your camera.

PC: FaceTime heading level two.

Jonathan: Let’s see what we’ve got now.

PC: FaceTime call controls heading level two, FaceTime call. Join as Jonathan on the computer, join button.

Jonathan: We’ve got the join button so we’ll press that. There are a few steps to go through.

PC: Alert, waiting to be let in.

Automated Voice: FaceTime, now. Mosen At Large test: someone requested to join. Time-sensitive.

Jonathan: Even though I am not in a FaceTime call right now, I got that notification. And even though I’m in a do not disturb mode, I do allow time-sensitive notifications through in the particular do not disturb Focus that I’ve set up. That’s why I’m getting this now. I will bring down the notification.

Automated Voice: Notification center, six notifications. FaceTime, now. Mosen At Large test: someone requested to join. Time-sensitive.

Jonathan: The cool thing is that you can give somebody a link and say, if ever you want to call me, just choose this link, and you don’t have to be sitting there. You will get a notification when someone joins. I’ll double-tap.

Automated Voice: New York Times, one hour ago.

Jonathan: Yeah, we’ll not bother with that one. Let’s go to the top of the screen.

Automated Voice: Me button.

Jonathan: And flick right.

Automated Voice: Mosen At Large test, one person waiting, button. Join call button. Open messages, dimmed, button. Selected, audio route button. Mute off button. Camera on button. Effects button. Selected, blur background button. Flip to back camera button, flip to back–

Jonathan: You can blur the background. This is a feature that I first came across in Microsoft Teams, and then Zoom also adopted it, but what I’m going to do is go back, and I’ll turn my camera off.

Automated Voice: Select effects, camera on button, camera off.

Jonathan: Now I’ll flick left.

Automated Voice: Mute off button, selected audio route button, open messages dimmed button, join call button.

Jonathan: If I flick left one more.

Automated Voice: Mosen At Large test, one person waiting, button.

Jonathan: I’ll double tap that.

Automated Voice: Mosen At Large, done button.

Jonathan: And flick.

Automated Voice: FaceTime, one person waiting. Maybe Jonathan on the computer wants to join. Add people, dimmed, button. Share link button. Silence join requests, off.

Jonathan: At this point, we can’t let them in because we are not in the call, but this is where you can see who’s here, so we’ll go back.

Automated Voice: Back done button done. Join call button.

Jonathan: Join the call.

Automated Voice: Join call reject, join request button, approve join request button.

Jonathan: I’ll double tap approve join request.

Automated Voice: Leave call. Maybe Jonathan on the computer joined button.

Jonathan: I’m just going to see if I can stop there.

Automated Voice: Maybe leave open message selected, mute off button mute on.

Jonathan: I am now muted, and the microphone is also muted on this side as well, because there are obviously two copies of me, watch out world. We now have the FaceTime call established with me on the phone and then another copy of me on the computer. Let’s test a couple of things. The first thing I want to do is wind up the mic on my mixer. It’s the same microphone you’re hearing me with, so you understand what the audio quality of this is like. How is it sounding on FaceTime? What you’re hearing now is me talking from my mixer, which has the Heil PR 40 microphone connected.

It is being picked up, though, by the iPhone and rebroadcast to you, and what you can hear is a slightly weird spatial effect called spatial audio. You may have heard about spatial audio in the context of music. This is where your voice doesn’t sound completely mono. Where this does shine on a FaceTime call, though, is that if you’ve got a group of people participating in that call, everybody is placed slightly differently on the stereo spectrum, so it is quite a cool feature. This is how I sound over the iPhone when I’m talking into the mic from my mixer. Let’s reverse the trick now. I am going to pick up my phone and let you hear how I sound on the PC.

Hello from my iPhone 12 Pro Max. I have the iPhone held up to my ear, just as you would normally do if you were making a call, and I’m talking into the iPhone. This is a good chance for me to show you that there is actually a range of modes you can choose from when you are in a FaceTime call. Sometimes they’re not available, and if they’re not available, it seems to be because of the device you have connected at the moment. When I have my Made for iPhone hearing aids connected to the iPhone, only two of the three options are available. The wide spectrum option, which we’ll come to in a minute, is not.

I’m just waffling away to let you hear how I sound in the standard mode when I’m talking on the iPhone. Now I’m going to bring up control center and flick right?

Automated Voice: Selected FaceTime button portrait M mode.

Jonathan: Now we’re going to go into what Alex calls “Mic mode”. It is, of course, mic mode, and we’ll go to the top of the screen-

Automated Voice: FaceTime, heading

Jonathan: -and flick right.

Automated Voice: Selected standard button.

Jonathan: We are on standard mode right now.

Automated Voice: Voice isolation button.

Jonathan: Now I’m going to double tap and you probably won’t hear too much difference here. I think it’s a slightly lower frequency response codec, but we are in a very quiet studio environment here.

Automated Voice: Selected voice isolation.

Jonathan: Now I’m in voice isolation mode. The idea here is that I think it narrows the microphone beam and does some noise cancellation. If you’re in a noisy environment and you want to reduce the background noise to make it easier for people to hear you, you can go into Control Center whenever a FaceTime call is active and enable it. The next one really will be good, I think, if you’re in a meeting situation and you’re a bit further away from the mic. Let’s choose this, the wide spectrum button. I’ll double-tap.

Automated Voice: Wide spectrum button selected. Wide spectrum.

Jonathan: Hello from wide spectrum mode. Now, to test this theory, what I’m going to do is put the phone on the desk a bit away from me. I’m a few meters away from the phone now and we’re in wide spectrum mode. Now, if I flick left.

Automated Voice: Voice isolation button.

Jonathan: We’re in voice isolation mode now. If I double-tap.

Automated Voice: Selected voice isolation.

Jonathan: See if you can hear a difference between wide spectrum mode and voice isolation when I’m further away from the device. If I flick left–

Automated Voice: Standard button.

Jonathan: There’s standard I will double-tap that.

Automated Voice: Selected standard.

Jonathan: Now at some distance from the device, this is what standard mode sounds like. Now I’m back on my microphone recording directly into REAPER, and we’ll take a look at the way that the screen is laid out in FaceTime on the computer.

PC: FaceTime personal Microsoft Edge.

Jonathan: We’ll navigate by heading.

PC: FaceTime heading level one,

Jonathan: And just explore this page.

PC: Displaying +64 heading level two call controls. FaceTime call, one person leave button.

Jonathan: You can leave the call of course.

PC: List of four items, full-screen toggle button microphone on toggle button pressed.

Jonathan: There you go. You can mute your microphone if you need to. I don’t know if there is a keyboard shortcut for that.

PC: Camera on toggle button pressed. Open participant list in settings menu button menu.

Jonathan: You can open a list of participants,

PC: List end.

Jonathan: That’s what we have on the FaceTime call. I suppose you can just Alt+F4 out of this. We’ll try.

PC: 12 points.

Jonathan: It seems to have just made me disappear at this point. Before we leave FaceTime, there is one other way that you can schedule a FaceTime meeting that is really simple, and it’s one of those Apple “it just works” things that could make FaceTime an option for you for just quickly getting people together. To demonstrate this, we’ll open Apple’s own Calendar app. Open Calendar. Here we are in the Calendar app.

Automated Voice: Search button, Add button.

Jonathan: There’s the Add button which I will double-tap.

Automated Voice: Text field is editing title, word mode insertion point at start.

Jonathan: It’s asking me to give this appointment a title. I’ll type Mosen at Large test and we’ll flick right.

Automated Voice: Clear text button. Location or video call.

Jonathan: Just double-tap this.

Automated Voice: Location search field is editing entire location or video call. Word mode insertion point at start.

Jonathan: You can type a location as you always have, but if you flick right–

Automated Voice: Current location, video call heading FaceTime.

Jonathan: Just double-tap this.

Automated Voice: FaceTime Cancel button.

Jonathan: What happens now is that if I save this appointment, there’ll be a FaceTime link in the notes field of the appointment, and it’s as simple as that. You will need to like using Apple’s own Calendar app, though; this does not work with third-party calendar apps like Fantastical. If you go in here, type the name of the event, set the location to FaceTime video call, and then add the recipients, it is a really easy way to set up a FaceTime call with many people.
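(For the developers listening: EventKit has no public “FaceTime video call” location type; the Calendar app adds the link itself. A minimal sketch of the programmatic equivalent, dropping a pre-made link into the notes field, with the title and link as placeholders:

    import EventKit

    let store = EKEventStore()
    store.requestAccess(to: .event) { granted, _ in
        guard granted else { return }
        let event = EKEvent(eventStore: store)
        event.title = "Mosen at Large test"
        event.startDate = Date()
        event.endDate = event.startDate.addingTimeInterval(3600) // one hour
        event.notes = "Join: <FaceTime link goes here>" // placeholder
        event.calendar = store.defaultCalendarForNewEvents
        try? store.save(event, span: .thisEvent)
    }

The Calendar app’s own flow above remains the only way to have the FaceTime link generated for you.)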

Although it is not in the initial release of iOS 15, where FaceTime will really come into its own is when they add the SharePlay feature. SharePlay was working for a while in the iOS 15 beta, and then they took it away. I think they’re doing a little bit more refining, but they will bring it back in time for the new groups feature in Apple Fitness+, where you can work out together on a SharePlay FaceTime call.

The way this works is that you can get as many people as you want together on FaceTime. Then, as long as you all have subscriptions to the same thing, like Disney+ or Apple TV+, you can listen to and watch things together. It also works with Apple Music. I got a bunch of people together on a FaceTime call and we just hung out listening to Mushroom FM.

It’s very cool because if somebody speaks, it’s like the audio ducking feature in VoiceOver: the music ducks down so that you can hear the person who is talking. Everybody has control over things like play and pause, and skip forward and back. It’s a nice way to have a virtual party in these pandemic-y times, actually, because you can assemble a playlist, then all add to that playlist, skip, do all sorts of things with SharePlay. I think this will be a big hit in the blind community.
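(For the developers listening, SharePlay is exposed as the GroupActivities framework. A rough sketch only, assuming the media app has adopted the framework; the ListenTogether type and its title are made up for illustration:

    import GroupActivities

    // A shared activity describing a listening session.
    struct ListenTogether: GroupActivity {
        var metadata: GroupActivityMetadata {
            var meta = GroupActivityMetadata()
            meta.title = "Listening together"
            meta.type = .listenTogether
            return meta
        }
    }

    // Offer the activity; with a FaceTime call active, participants are invited.
    Task {
        let activity = ListenTogether()
        switch await activity.prepareForActivation() {
        case .activationPreferred:
            _ = try? await activity.activate() // starts the shared session
        case .activationDisabled, .cancelled:
            break
        @unknown default:
            break
        }
    }

Once the session is active, the playback syncing and the ducking described above are handled by the system.)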

Being able to watch a Netflix video with someone on the other side of the world or in another city is really fun. SharePlay won’t be there in the first version of iOS 15, but hopefully, it is not too far away. One of the most controversial things that has happened in iOS 15 is the changes that have been made to Safari. Actually, they are so controversial that Apple has started to, if not roll them back, at least give people a choice about whether they stick with these new defaults or not.

What you’ll notice when you go into Safari is that the address bar is not only at the bottom of the screen now, but that whole section where you get the address bar, the reader view, and various other things is collapsed. A lot of people rebelled against this and bombarded Apple with feedback to say, “This is yucky.” You can change it back, and I for one have changed it back. We’ll go into Safari settings now. Open Safari settings. Let’s navigate by heading.

Automated Voice: Safari heading. Allow Safari to access search, heading general, heading tabs, heading.

Jonathan: Tabs is the heading you want if you want to change this back, and here are the settings that seem to be working okay for me, if you want more of a, shall we say, retro look to this.

Automated Voice: Tab bar button.

Jonathan: We’ll flick right.

Automated Voice: Selected single tab button.

Jonathan: I have single tab selected.

Automated Voice: Landscape tab bar on.

Jonathan: That is how I have it set up. Let’s see what happens when I go into Safari. We’ll do that now. Open Safari. Now our page is loading. If I go to the top.

Automated Voice: Page Settings button, address in [unintelligible 01:06:48]

Jonathan: There we go. I’m checking on a package there, and the address bar is at the top and page settings are there at the top as well, so it looks very traditional. If I double-tap the address bar.

Automated Voice: Page setting address [unintelligible 01:07:02] address tap.

Jonathan: And just flick to the right.

Automated Voice: Voice Search button.

Jonathan: There’s a new voice search button, and this works very similarly to dictation on your iPhone in edit fields, except that it times out automatically. If I double-tap: “Mosen at Large podcast.” I didn’t do anything to end it, it just went ping, and now when I flick right–

Automated Voice: Read results heading, Mosen at Large Apple podcasts on the App Store. Heading level three link.

Jonathan: There we go. Mosen at Large is there in the search. That’s a quick way to perform a voice search. They’ve also added pull to refresh in Safari, which is pretty cool. I’m not sure if it’s the most efficient way to get it done with VoiceOver though, because if you perform the three-finger flick down gesture, it will scroll unless you’re at the top of the page. What will happen is if you flick down and you’re halfway down the page, it will say page three, or page two, or whatever. When you get to page one and do another three-finger flick down at that point, it will refresh the page.

One of the things I’ve enjoyed about using the web with Safari is the ad-blocking extensions. Some of them do a little bit more than just blocking ads; they can get rid of social media links and general clutter that can add a lot of verbiage for blind people. I do sympathize with the fact that many websites are ad-supported and do suffer if you engage with ad-blocking technology, but the thing is, some of it is so obnoxious. Sometimes I hear things that interrupt the reading of speech, banners with really bad ARIA that interrupt what you’re reading.

In a situation like that, if you can block that, it could be the difference between being able to use the content on the site and not. If you’re going to use extensions like this, it’s a good idea to subscribe to a site that you appreciate if you can, because journalists and other content creators have to make money somehow. Meanwhile, extensions are being extended, if you will, in iOS 15’s Safari to do a lot more than just ad blocking and general cleanup. This is going to make Safari for iOS a lot more like Chrome or Edge, or indeed Safari on the Mac, where extensions have been available for yonks.

To have a look at this, let’s go back into Safari settings. I’ll invoke the app switcher.

Automated Voice: App switcher Safari app settings active.

Jonathan: Double-tap settings.

Automated Voice: Options of settings.

Jonathan: Let’s look for the general heading.

Automated Voice: Safari, allow Safari search general heading.

Jonathan: Flick right.

Automated Voice: Autofill but favorites. Block pop-ups extensions button.

Jonathan: We’ll double-tap.

Automated Voice: Allow these content blockers. Heading, no extensions installed. Heading, extensions customize the way Safari works. Extensions can change the appearance or behavior of web content, add buttons to Safari, and more. More extensions button.

Jonathan: We’ll double tap more extensions.

Automated Voice: App store today, back button.

Jonathan: Now we are in the App Store, and you get your extensions from here, just like you get pretty much any other third-party software for your iPhone. There are extensions here that you can take a look at at your leisure, but that’s where to find them. Go to Safari settings in the Settings part of your iPhone and navigate to the general heading, and you’ll find the extensions feature there. It’ll be well worth exploring in the coming weeks.
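(A peek under the hood for the developers listening: the simpler class of extension, content blockers, is driven by a JSON rule file inside the extension bundle, and the host app asks Safari to reload it. A minimal sketch, with a hypothetical rule and a placeholder bundle identifier:

    import SafariServices

    // blockerList.json in the extension bundle holds the rules; one
    // hypothetical rule hiding elements that match an ad-like selector:
    // [{"trigger": {"url-filter": ".*"},
    //   "action": {"type": "css-display-none", "selector": ".ad-banner"}}]

    // After the user changes rules in the host app, ask Safari to reload them.
    SFContentBlockerManager.reloadContentBlocker(
        withIdentifier: "com.example.MyBlocker") { error in
        if let error = error {
            print("Reload failed: \(error.localizedDescription)")
        }
    }

The new iOS 15 web extensions go further than this, but they follow the same install-from-the-App-Store model described above.)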

We’re going to talk before we go about iCloud Plus, but even if you don’t have iCloud Plus if you’ve just got the free iCloud account, there’s some really good stuff happening in Safari to protect your privacy this year. They have a thing called intelligent tracking prevention, and it prevents trackers from profiling you using your IP address. An IP address is essentially a unique identifier on the internet and your router or whatever you are connected to the internet with has to have an IP address.

There are various sites that can tell you what your IP address is, and indeed various utilities that can tell you it, even on your device, but one website that I’ve used from time-to-time that tells you is IPChicken.com. Interesting name that. If you go to IPChicken.com, you’ll be amazed by what it can find out about you just by your IP address. What your internet provider is, approximately, where you are located, and there are unscrupulous people who want to get hold of this information and build a profile on you.

In Safari settings, you can go to privacy and security, and there’s a feature there called hide IP address. Now, it is possible that this could break some things. If there is a site that legitimately relies on your IP address, because they use geo-blocking or something of that nature, then it’s possible that it might give you unpredictable results, but try it and see how it works for you. It is a free feature in iOS 15’s Safari. Let’s take a look now briefly at iCloud Plus, and first a definition: what is iCloud Plus?

It is a suite of features that are available to you if you subscribe to one of Apple’s paid iCloud plans. To be honest, it’s difficult not to, because they only give you five gigabytes for free, and that is not a lot these days. If you have any paid iCloud plan, you have access to iCloud Plus. What can it do for you? Well, let’s take a look. Open iCloud settings. Here is the storage heading. If I flick right: “iCloud Plus, 354.4 gigabytes of two terabytes used.” All right, I’ll double-tap.

Automated Voice: iCloud Plus 354.4 gigabytes of two terabytes used.

Jonathan: That confirms that I have iCloud Plus. We’ve got the two-terabyte plan, which we do share with our family sharing members, because we are nice like that. One of the new iCloud Plus features you’ll find on this screen, if you navigate through it, is this.

Automated Voice: Private relay data on button.

Jonathan: Let’s take a look at iCloud Private Relay. Now, iOS 15 is still calling this a beta at this point because there are some rough edges, so use it at your own risk. I’ll double-tap and flick right.

Automated Voice: iCloud private relay keeps your internet activity private, private relay hides your IP address and browsing activity in Safari and protects your unencrypted internet traffic so that no one, including Apple, can see both who you are and what sites you are visiting.

Jonathan: You may well ask, didn’t I just say that there was a feature in Safari that hides your IP address? Well, yes, there is. iCloud Private Relay is much wider than Safari. If you don’t have the iCloud Private Relay feature because you don’t have iCloud Plus, you may like to enable the Safari feature, but if you do have iCloud Plus, you may find that iCloud Private Relay gives you a much more system-wide level of protection. iCloud Private Relay is in beta right now.

I think it’s fair to say that it does have benefits in terms of protecting your identity, but it does have some downsides because the kinks are still being ironed out. The other big iCloud Plus feature is this. If you have used the Sign in with Apple option, you’ll be familiar with the way that this works. If you choose to sign in with your Apple ID, you’re given a choice.

When you create a Sign in with Apple relationship, essentially a partnership with a third party, they can either have your actual Apple ID email address, or Apple can create, well, what you might call a burner email address. It’s an email address that forwards to your real one. The third party never knows what your real email address is. Therefore, if you find them to be too spammy and you just want to sever the relationship completely, because sometimes third parties don’t respect that, all you have to do is delete the address and they can’t get in touch with you anymore. Genius.

What if you want to do a similar thing, but you don’t want to sign in with Apple, or Sign in with Apple isn’t an option? For example, I suppose you could use this if you meet somebody and you’re not sure whether you trust them sufficiently, and you want to give them an email address that’s disposable. You could use it for that as well. We’ll double-tap this.

Automated Voice: Selected. Hide my email button, iCloud drive hide my email, keep your personal email address private by creating unique random addresses that forward to your personal inbox and can be deleted at any time.

Jonathan: It took a wee while to come up, but it has now.

Automated Voice: Create new address button.

Jonathan: You can create it. If I double tap this.

Automated Voice: Create new address. Hide my email heading.

Jonathan: And flick right.

Automated Voice: Keep your personal email address private by creating unique random address that forwards to your personal inbox and can be deleted at any time. Ray under [unintelligible 01:15:54] force Nomanloud.com. Important if you need to reference or manage this address, you can find it in iCloud settings continue button.

Jonathan: Then you can go on and create that email address, which I’m not going to do, but as the little prompt so handily said, if you want to have a look at all the email addresses that are active, you can go to iCloud settings and choose the hide my email option. You’ll be able to review and, if you want, delete any of those email addresses. The other feature that Apple has added to iCloud Plus is the ability to use your own domain name for iCloud email. I own several internet domains.

One of which is mosen.org, which is a family email domain, so many of my family members have mosen.org email addresses, and they think it’s pretty cool that I can give them those. At the moment, they are hosted with the company that hosts a lot of the other websites that I manage, but it would be possible to keep that hosting for the web but tell Apple to host my email. Now, there would be some advantages in doing this, one of which is that Apple, of course, will push email right away to your iPhone.

If somebody sends an email to your iCloud account, it’s almost instantly received on any iDevice, or for that matter, Mac. That’s pretty nice. You can do this with a Microsoft Exchange server as well, though it might be a little bit more tricky to set up. Now, there is going to be some complexity in getting this going, because if you’ve got a domain name already and you want to move from your default hosting company to Apple hosting your email only, while leaving everything else the same, you’re going to have to go and change some specific DNS records for your domain.

Your host will hopefully help you to do that if it’s a good, reliable host. The best coverage I’ve seen of this feature so far is on MacStories, which is a really good geeky website. They get into the nitty-gritty of the detail of these things, and they do talk about how to get this going if you would like to give it a try. If you’ve always wanted to own your own personalized domain, so that you’re not on icloud.com or gmail.com but have a domain that’s personal to you, then you may be able to get this set up and have the, in my view, very high-quality iCloud email service powering your own domain name with your own personalized addresses. Pretty cool iCloud Plus feature.
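(To give a flavour of that record-changing step: pointing a domain’s mail at iCloud generally means replacing the MX records and adding an SPF TXT record. The entries below are illustrative only, an assumption based on the general shape of such setups; Apple’s setup flow displays the exact records to use for your own domain:

    ; illustrative zone-file entries only; use the values Apple shows you
    example.com.    MX     10 mx01.mail.icloud.com.
    example.com.    MX     10 mx02.mail.icloud.com.
    example.com.    TXT    "v=spf1 include:icloud.com ~all"

Your website, and anything else on the domain, keeps working, because only the mail-specific records change.)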

Let’s have a look at a couple of things relating to Siri. One of the big ones is that for many people, most Siri data is now handled on device. There are a couple of advantages to this. The first is that Apple and a number of other companies, including Amazon and Google, have been pinged, castigated, and pilloried for listening to people’s conversations. They say it’s all in the name of science.

They’ve got to do it to improve their algorithms, but a lot of people have felt nervous about this. If you have a newer device with an A12 processor or greater, and the iPhone XS and XR started with the A12, I believe, then you will be able to use this offline Siri. Now, nothing special has to happen. If it’s available, the files will just be downloaded the first time you run iOS 15, and you should notice a significant improvement in speed. This is the second big advantage. Sometimes Apple’s Siri servers, which get many hundreds of millions of requests a day, I think, get hammered. We’ve probably all been there, where we’ve asked Siri to do something basic, like launch an app. It says, “working on it,” and then you think, I should have just done the thing myself. Well, those days are past for those people who have access to Siri on their devices.

Now, because some Siri queries are handled on device, and we’ll talk more about specifically which Siri queries we are talking about, it means that more of Siri now works without an internet connection. If you’re in airplane mode, or you are somewhere cellular service isn’t available, you will have some limited use of Siri for functions pertaining to the use of your device. That’s very cool. Even if you have a newish iPhone, this may not work for you, depending on whether Apple has enabled it for your language. In the English (New Zealand) language, for example, it does not work. All your Siri requests still have to go to Apple and be processed off your device.

What you can do is try to find another supported language that works for you. In the case of those who speak English (New Zealand), you may want to try English (Australia), English (UK), or English (United States); they all have the offline support. What I found was that by changing to English (United States), Siri didn’t seem to be any less accurate, and I’ve got the on-device processing, which is a significant speed improvement. The way to check this is to go to Siri settings and then navigate to the section that says.

Automated Voice: Siri and dictation history button.

Jonathan: If you flick right, and you get a message telling you that Siri will help you get things done and all the verbiage that’s been there for years, then you don’t have the offline processing. If you see this.

Automated Voice: Voice input is processed on iPhone, but transcripts of your requests are sent to apple.

Jonathan: Then you do have the offline processing. It clearly says that Siri requests are processed on your device. That’s the way to tell whether you have this and if you don’t, it’ll be because you’re using an unsupported language or an unsupported device. Now, not everything is going to be done on your device. If you ask for a Wikipedia entry or something that clearly requires something to be looked up on the internet, then the same behavior as you’ve had before will occur. But if you’re launching an app, if you’re asking for a system function, things that can happen on your device, like bringing up a contact, something like that, then you will notice a significant speed improvement.
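(An aside for the developers listening: Siri itself isn’t scriptable, but the same on-device speech machinery is exposed through the Speech framework, which gives a feel for what “processed on iPhone” means. A minimal sketch, with a placeholder file path, assuming the app has already obtained speech recognition authorization:

    import Speech

    let audioFileURL = URL(fileURLWithPath: "/path/to/recording.m4a") // placeholder

    if let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
       recognizer.supportsOnDeviceRecognition {
        let request = SFSpeechURLRecognitionRequest(url: audioFileURL)
        request.requiresOnDeviceRecognition = true // audio never leaves the phone
        recognizer.recognitionTask(with: request) { result, _ in
            if let result = result, result.isFinal {
                print(result.bestTranscription.formattedString)
            }
        }
    }

As with Siri, on-device support depends on the locale and the hardware.)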

Another useful Siri-related feature in iOS 15 is the ability to share content with Siri and invoke the share sheet. I’m going to go into Lire, my RSS reader, and pick a random article.

Automated Voice: Index lead $12.2 million seed in source for a data play to make supply chains, greener supply chains can be in complex logistical.

Jonathan: Well, I’m not sure if Bonnie’s going to be interested in this or not, but I’ll double tap on it.

Automated Voice: Index loading ellipse. [crosstalk]

Jonathan: I can now say to Siri, “Share this with Bonnie,”

Siri: Send this weblink to Bonnie.

Jonathan: I won’t do that, but that’s how easy it is. I can just share anything with Bonnie from my RSS reader. Basically, anywhere that you can invoke the iOS share sheet, you can share with a contact. Unfortunately, it’s not quite as effective as I would like, because what I’d like to be able to do is go, “Share this on Twitterrific.”

Siri: Twitterrific hasn’t added support for that with Siri.

Jonathan: Well, that’s interesting, and I wonder if it ever can, because I would love that. At the moment, it appears only to work with contacts. I’ll just check that, though. Share this on Facebook.

Siri: Facebook hasn’t added support for that with Siri.

Jonathan: That implies that it’s the third-party app developer’s problem and that they might be able to fix it, and I certainly hope that’s the case, because being able to share articles easily with a contact is super useful. If you could do it on social media as well, that would be great, because as you know, if you follow Mosen at Large on Twitter, I do like to share quite a bit of technology information that way.
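(The share sheet Siri is piggybacking on here is the standard one that apps present themselves. A minimal sketch, with a hypothetical view controller and article URL, of how an app brings it up:

    import UIKit

    class ArticleViewController: UIViewController {
        // Present the system share sheet for the article being read.
        func shareCurrentArticle(_ url: URL) {
            let sheet = UIActivityViewController(activityItems: [url],
                                                 applicationActivities: nil)
            present(sheet, animated: true)
        }
    }

Whether “Share this on...” works for a given destination then comes down to what that app has, or hasn’t, added for Siri, as the responses above suggest.)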

Next, I want to take a brief look at the Voice Memos app in iOS 15. Over the last few years, it has received some love, which is great to see; it’s become an increasingly capable app. You can now share your voice memos across devices, and I know that many iPhone users who contribute to this show with an audio contribution, which I certainly do appreciate, use the Voice Memos app, because it’s built into the phone. It’s easy to make a recording and then just share it via email to me at jonathan@mushroomfm.com, and it really livens up the show when we get those contributions. The Voice Memos app is important, and there are a couple of really cool features that have been added to it in iOS 15. Let’s take a look. Open Voice Memos. We’re now in the Voice Memos app.

Automated Voice: Back button, all recordings, heading.

Jonathan: I’m in my all recordings folder. I’m going to make a new recording. I’ll go to the bottom of the screen.

Automated Voice: Record button.

Jonathan: Right there is the record button. It’s a very simple app to use. When I make this recording, I’m going to pause for a few seconds. Don’t adjust your sets; I’m doing this deliberately to show you a feature. I’m going to double-tap to start the recording, and when I’ve finished, I’ll flick right to the done button, which is just one flick to the right, and then double-tap that so we can have a play with what we’ve recorded.

Hi, it’s Jonathan and I’m recording this on my iPhone 12 Promax, which is soon to become, not the most current phone, but that’s okay because it meets all the needs I have right now. That is a big change for me though. I wonder if I should order the phone after all? No, I don’t think I will. I really don’t need it. I’m going to save this.

Automated Voice: Stop button.

Jonathan: Now we’ve stopped it and if we go to the bottom of the screen.

Automated Voice: Record button.

Jonathan: We’ve got the record button back again. Now let’s go to the top of the screen.

Automated Voice: Voice Memos, all recordings, edit button search, search–

Jonathan: It has my address there, which I’m editing out.

Automated Voice: 8:25 AM. More actions button, track position, zero seconds of 35 seconds. [unintelligible 01:26:05] playback settings button. Actions available.

Jonathan: That’s playback settings and if we flick down, we’ve got some actions here.

Automated Voice: Favourite, Delete, move to folder, more actions, edit, title, activate, default.

Jonathan: In this instance, activate is what I want to do.

Automated Voice: Reset playback settings, dimmed button.

Jonathan: You can reset these settings if you make any changes. I’ll flick right.

Automated Voice: Options, heading, close button, playback speed, heading, playback speed slide, 1X adjustable.

Jonathan: We all know many blind people are used to listening to audio at a clip, and if you record, say, a lecture with the Voice Memos app, it’s right there on your phone, so why not? You may even get an external microphone or something to make that even better, and then you can play back at a faster speed. We’ll go up here.

Automated Voice: 1.25X, 1.5X, 1.75X, 2X, 2X.

Jonathan: All the way up to 2X.

Automated Voice: 1, 1, 1 0.5X.

Jonathan: You can go down as well, to 0.5X. That could be useful if you are copying something down, say from a lecture, or somebody’s giving you some information. Maybe you handed over your phone; people do this, they hand over their phone and get people to dictate information, and then you can copy it down. You may like to slow the recording down in that eventuality.
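(Incidentally, variable-speed playback like this slider is close to a one-liner for developers. A minimal sketch using AVFoundation, with a placeholder file path:

    import AVFoundation

    let memoURL = URL(fileURLWithPath: "/path/to/memo.m4a") // placeholder

    if let player = try? AVAudioPlayer(contentsOf: memoURL) {
        player.enableRate = true // must be enabled before playback starts
        player.rate = 0.5        // AVAudioPlayer supports 0.5x through 2.0x
        player.play()
    }

The 0.5x to 2x range in Voice Memos matches what AVAudioPlayer itself supports.)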

Automated Voice: 0.– 1X.

Jonathan: I’m back to 1X and I’m going to flick right.

Automated Voice: Skip, silence, switch button, off, moon, portal.

Jonathan: Now you know why I was pausing because we’re going to see how good the skip silence feature is in the voice memos app. I’m going to toggle this on.

Automated Voice: On

Jonathan: Flick right.

Automated Voice: Enhance recording. Switch button off.

Jonathan: Why not? We’ll turn that on too.

Automated Voice: On. Switch button on.

Jonathan: There seems to be some sort of issue there, but we’ll go to the top.

Automated Voice: Reset playback settings button, options, heading, close.

Jonathan: Go to the close button. Now we’re going to [crosstalk] play this back and see if it skips the silences.

Automated Voice: Play button.

Jonathan: We will play it.

Automated Voice: Actions have–

Jonathan: Hi, it’s Jonathan and recording this on my iPhone 12 Pro Max, which is soon to become not the most current phone, but that’s okay because it meets all the needs I have right now. That is a big change for me though. I wonder if I should order the phone after all. No, I don’t think I will. I really don’t need it. I’m going to save this. Now, as you heard, that really did skip a lot of the silence there, and it didn’t sound unnatural. It sounded a bit processed to me, but the skipping wasn’t so crammed together that it sounded unnatural. All of that silence I recorded was just skipped over seamlessly.

I paused my recording, this podcast recording, and then I went back in and turned off the enhanced playback setting, but I’ve kept the silence detection on, and actually, to my ears, it sounds much better. Let’s replay that with the enhanced recording feature off, but silence detection still on. Hi, it’s Jonathan and I’m recording this on my iPhone 12 Promax, which is soon to become, not the most current phone, but that’s because it meets all the needs I have right now. That is a big change for me though. I wonder if I should order the phone after all? No, I don’t think I will. I really don’t need it. I’m going to save this.

That sounds much better to me with the enhanced features off. You can have a play with this, but the silence detection works a treat. I briefly want to mention Shared With You, which is something that you will see increasingly in Apple products. The idea here is that, where it’s relevant, you’ll see content that has been shared with you. For example, if you open the TV app and people have sent you links to TV shows, then you’ll find a Shared With You section there, and all those TV shows are there, so you can think, “Somebody recommended something to me to watch; maybe I’m in need of something new to watch.” You’ll find what friends have recommended to you. The Music app works the same way. If you go onto the web with Safari without any tabs loaded, so you get the default start page, there’s a Shared With You section there. You’ll see web-related content that has been shared with you that you might like to take a look at.

Apple historically has struggled with social things. We all remember Ping from iTunes, and then later in Apple Music, there was another social feature that they’ve abandoned. They’re trying to dip their toe in the social water in this unique way, and I think this way will work. It’s cool when you’re at a loose end and you want something to do, to bring up, say, the TV app and find all those shows that others have told you you should watch. Those are just some of the features in iOS 15 that I think are particularly noteworthy.

I have to say, I think it’s an impressive release this year; there are a lot of really nice quality-of-life features that have been added. If you want to find out a lot more about iOS 15, then the best place to do that is a website called MacStories.net. Despite the name, there is a lot of iOS content on there. You can really immerse yourself in being educated about how to make the most of the Shortcuts feature, which is a great power tool in iOS. They have a lot of great articles.

If you’re willing to pay for their premium subscription, you can get a weekly newsletter called Club MacStories, which has a lot of good stuff every week. It’s brilliant. Every year, they come out with the most extraordinarily comprehensive review of each new version of iOS. I would get iOS without the i out before they got theirs out, and I would often look to see whether I’d missed too much when the MacStories review of each operating system came out. Luckily, I generally didn’t, but they go into a lot of terrific detail.

Keep an eye out for that. It’ll be available very soon as I record this podcast, at MacStories.net, and you won’t find a more definitive look at iOS 15 anywhere. Should you update right away? Well, that’s a very personal decision. Whether you’re blind or sighted, many people would tell you that any major update to an operating system, whether it be iOS, Windows, or macOS, is going to have some bugs. If you’re nervous about that, if you really depend on your device to get things done, you may want to wait it out.

Obviously checking social media will tell you some of the critical bugs, and whether they would be showstoppers for you because one person’s showstopping bug is another person’s minor annoyance or perhaps they won’t even notice it at all. The good news is that you’re not going to be under any pressure this time to update to iOS 15. When you go into the software update section, when iOS 15 is released, you’ll be given the chance either to update to iOS 15 now, or to stay on the iOS 14 track for now and get important security updates.

We just saw one of those earlier in the week with iOS 14.8, which patches a really important zero-day exploit. You can stay on 14 for a while and get those fixes, upgrading perhaps to 15.1 or 15.0.1 when it is out. You do have that option. I think by and large this release seems to be pretty stable. Is it bug-free? No software is bug-free, but there’s also a lot to like in this release. If you go ahead and take the plunge and install iOS 15, let me know how it works out for you. Drop me an email, Jonathan@mushroomfm.com, with an audio attachment or write it down. Let me know what you’re enjoying and what you’re concerned about. We’ll get a lot of user feedback, I hope, on iOS 15, and I hope that this brief overview of some of my favorite features has been helpful.

Automated Voice: Be the first to know what’s coming in the next episode of Mosen At Large. Opt in to the Mosen Media list and receive a brief email on what’s coming so you can get your contribution in ahead of the show. You can stop receiving emails any time. To join send a blank email to media-subscribe@mosen.org, that’s media-subscribe@M-O-S-E-N.org. Stay in the know with Mosen At Large.

Jonathan: Let’s get some reaction from you to recent Apple things, and also tidy up some loose ends after the Apple California Streaming event. What we do is jump on right after that event concludes, record our initial thoughts, and get that published, but there is always new information that comes out afterwards. Let’s have a look at some of that now. It seems that the star of this particular show has definitely been the iPad Mini, which has the A15 chip in it. It is the same chip that’s in the iPhone 13, but it’s not that simple, because there has been some geeky testing done on the new sixth-generation iPad Mini, with its 5G and USB-C port. It all sounds really good.

What they found is that Apple has downclocked the A15 chip compared to the iPhone 13. Now, by downclocked, I mean that it is slower. Why would Apple do that? Probably to make the battery life on the iPad Mini a little longer. Now, the difference is not going to be huge, so I doubt that in real terms you’re going to notice much, to be honest. This is probably more of an academic thing. Just be aware that it is a little bit slower than the iPhone 13; real-world, you probably won’t care. If you are looking at buying an iPad Mini, then be prepared to get yourself a few dongles, because as we mentioned in the previous podcast, it has USB-C. I think this is a good bit of pain; sometimes pain can be good, I suppose. It is worth making the transition to USB-C, and I wish Apple would just do it for everything and get it over with, because they’re going to have to sooner rather than later. If you’ve got a few Lightning accessories, you’re going to have to get some Lightning to USB-C dongles. There is another dongle that you are going to have to get that we didn’t know about when we produced the last podcast, because if you use wired headphones, there is no headphone jack in the new iPad Mini.

If you want to use wired headphones, then you’re going to have to use a USB-C to 3.5-millimeter adapter. It’s like the Lightning to 3.5-millimeter adapter that you’ll most likely be familiar with; it’s just got a USB-C plug instead of a Lightning one. Also, there’s no mmWave 5G on the iPad Mini. This is a really fast standard that has only been supported in the United States by Apple, and that continues, by the way. I was asking the question in the last podcast, where we talked about this event: has mmWave been extended to other countries? The answer is no, it has not. With the iPhone 13 range, you can now have two eSIMs running at the same time. Let me talk about this, because I think this is excellent. Have you ever lost a SIM card? When I used to travel a lot, just to save money, I would have local SIMs in countries that I visited regularly.

When I was on the plane to the United States, I would take out my teensy tiny nano-SIM card from New Zealand and pop in an AT&T GoPhone prepaid SIM so that I had a US number when I landed, and those nano-SIMs are tiny. I know that in the United States, people don’t seem to play with SIMs as much as in some other countries. When I buy an iPhone, for example, it comes SIM-free. You have to take the SIM out of your old iPhone, put it in the new one, and do the transplant. They’re so small, you have to make sure you’re in an area where you’re not going to lose the thing, and trying to do that on a plane, well, that had its challenges.

eSIMs are a great technology; the e stands for embedded. A SIM, if you’re not familiar with this, is essentially all the brains that gives you, under the GSM system, your phone number, your identity on the cellular network. It stands for Subscriber Identity Module. For a couple of generations of iPhone now, you have been able to use an eSIM and a physical SIM, unless you’re in China. In the Chinese version of the iPhone, they have two physical SIMs, which is something that my shiny new Android device sitting here on my left also has: two physical SIMs. I’m not sure if it supports eSIM or not.

With the iPhone, you could be running both of those lines at the same time, the physical SIM and the eSIM, but you could not run two eSIMs at the same time. Now, with the iPhone 13 range, you can. That is really brilliant. It means, for example, that if I was heading to the United States, I know that T-Mobile makes extensive use of eSIMs. When Bonnie has flown back to the United States, I’ve set her up this way from here: provisioned the number, got the eSIM all organized, and it was sent to her phone so that the moment she landed, she didn’t have anything to worry about. She could turn on her phone and have her New Zealand number and her US number working at the same time, the New Zealand number on the physical SIM, the US number on the eSIM. Now you can have two eSIMs running at the same time, which makes it far more convenient. That is a great feature in the iPhone 13, but I accept it’s a bit of a niche thing.

The good thing about dual SIM, whether the SIMs are physical or not, is that so many companies these days want you to carry around a work phone and a home phone, and that’s a bit laborious: one more device to charge, one more device to carry around with you. If your IT people will play ball and let you have your phone numbers on the same phone, then you can have two different lines, maybe one on a physical SIM and another on an eSIM, or now, with the 13, both on eSIMs. At least for phone call purposes, you can have your home phone and your work phone on the same device.

It is interesting to consider what we were expecting from all of the usually reliable pundits that didn’t materialize. The Apple Watch is still the same shape; no complete redesign. In fact, it’s all pretty weird with the Apple Watch this year, because there’s no release date on Apple’s site and pretty sparse information. I’m not sure what’s up with that. Also, remember all the talk about the satellite stuff? Some very reliable pundits were talking about that, and I did make mention of it in the last couple of Mosen At Large episodes. It just didn’t materialize at all.

There was talk that there was going to be the ability for you to communicate in an emergency via satellite. That’s not in the iPhone 13 either. It sounds like the iPhone 14 could be a really big release, with a new form factor, probably Touch ID under the screen, maybe the satellite feature, mmWave being extended; much more meaty, beefy features than just an upgraded camera, an upgraded processor, and an upgraded battery. Gosh, let’s hope so.

We’ve also found out that the iPhone 13 has the same amount of RAM as the previous generation of iPhones. This is not storage we’re talking about, but the memory where all the computing functions take place. No RAM upgrade this year. Also, if you are a fan of the iPhone SE and you thought, “Huh, I’ll wait and see what’s in the new iPhones,” the iPhone SE 2020, the second-generation, has had its 256-gigabyte variant discontinued. Right after the Apple event, it vanished from the Apple store.

There are rumors that there’s going to be a new generation of iPhone SE in 2022. We don’t know whether that’s true given how badly the rumor mongers have done in recent times. I wouldn’t bank on it, but that’s what people are suggesting, that there might be a new iPhone SE 2022. Will it have a physical home button still? I guess that remains to be seen because that seems to be the primary motivator for some blind people going for that.

Let’s go to some listener comments and we’ve got a few from Petra. I will try and synthesize them into one. She says, “Hello Jonathan, I agree with you. I am disappointed with the iPhone offerings this year. I have an iPhone SE second-generation or 2020 and thought I would upgrade so I would have better battery life, LiDAR and better AirTag navigation support.” You are right, of course, the iPhone 13 and 13 Mini do not have LiDAR.

“I would have to get the 13 Pro, 12, or something larger than the Mini and more expensive. I’ve been pretty happy with my SE, but I will go into an Apple store when the new phones are on display and have a look. Do you have any idea why LiDAR isn’t in these phones? Do you think it’s worth the extra cost and larger size? I’m not really sure how often I would use it.” Thank you also, Petra, for your comments about the podcast. I appreciate that.

The reason why LiDAR is not in the iPhone 13 is purely and simply a product positioning decision. I’m sure it costs a wee bit extra to have the LiDAR stuff in the phone, but not that much. Apple has decided that LiDAR is a pro feature. If you care more about the size, then I wouldn’t worry about LiDAR too much, to be honest. LiDAR at the moment is, in my view, more of a proof-of-concept thing for blind people. There are one or two exceptions to this, especially in the era of social distancing, as I mentioned on the previous episode, episode 148.

I did have a great experience with LiDAR last week, particularly because, for me, it can be hard to hear in environments that are noisy. I was able to follow somebody at a safe social distance, thanks to LiDAR. When they moved, I was able to tell that and readjust. It was really quite impressive, but how often are you going to hold your camera out in front of you, looking like a bit of a nit, so you can use the LiDAR? It is really cool and I can’t wait to see how it goes, but is it one of those features that I couldn’t possibly live without? No, it absolutely isn’t.

If the size is more important to you, then go with the 13 Mini, and most of your objectives, with the exception of LiDAR, will be met. You’ll get better battery life than your SE, which has a reputation for poor battery life. You’ll get the precision finding, which is not available in the SE. It will be a very worthwhile upgrade for you. There is battery to consider; even if you went to the 13 rather than the Mini, you’d get better battery life, and of course, if you went all the way to the Pro Max, you would have your LiDAR and your amazing battery life. That one sounds like a killer in terms of battery life.

All these things are about trade-offs, aren’t they? Based on everything you’ve said, I think you’re likely to find the 13 Mini a good choice, but it sounds like you’re on the right track. Go into an Apple store, have a look at them, decide what you like, what feels okay in your hand. When Bonnie has my 12 Pro Max, she says, “Gosh, this is just too big for my hands.” Heidi feels the same way. That’s why I bought her an iPhone 12 Pro because it’s just too difficult for her to hold. All sorts of factors to consider.

I still like the Max size, although I don’t like the feel of them on the 12 range as much as I did the 11. The squarish shape has made the phone just feel so much chunkier and bigger to me, but I like it for the Braille screen input. That’s not to say that you can’t do Braille screen input on a smaller phone. Of course, you can, but a lot of it would depend on hand size and your degree of comfort. I also do like the battery life. Petra is also asking, “What about the AirPods then?” Yes, we were expecting AirPods 3 to be announced at these events.

They were not, but who knows whether you can believe these usually reliable sources about anything Apple anymore? It’s suggested that they are ready to go and that there will be a second Apple event in the near future, which will unveil new Macs with an M1X processor. This is the second generation of Apple Silicon Macs, and we are expecting AirPods then, but perhaps AirPods 3 are the new AirTags. Look how long it took to get those.

You asked about the cameras: do they protrude or are they flat? No, there’s a camera bump. There is definitely a camera bump; you’ll feel the cameras on the back of the phone. Petra says she appreciates the way Heidi explains things and even anticipates what we might want to know. “How do you pay for things without Touch ID?” asks Petra. Well, Touch ID is a form of biometric authentication, and Face ID is another. It’s the biometric authentication used in all new Apple iPhones except for the SE second-generation.

What you do is you double tap the side button on the right of your phone when you’re ready to do an Apple Pay transaction, and then you authenticate with your face. You hold the phone out in front of your face and double tap the side button, you’ll get a wee bit of haptic feedback and the Apple Pay transaction can proceed.

Stan: Greetings, Mosen at Largers. This is Stan Warren Latrelle in Medford, Oregon. I do have to share my views on the Apple event that took place last Tuesday. I listened to it, and like many of you, I was quite underwhelmed by the whole thing. I think part of the reason is that Apple did things a little bit differently than I would have liked; part of it is that there are still problems, probably, with their supply chains and what they can do. I was hoping for a fingerprint sensor on the new iPhone 13, and unfortunately, that did not happen.

One of the reasons why I’m considering the upgrade, even though I probably shouldn’t, is that I’m eligible for one, but more importantly, the battery is a little bit better than the 8’s. I thought I didn’t have a problem with it, but I realized after receiving the phone that I had to turn off a lot of features just to save battery consumption. Even with that, it just goes to zero. If you’re showing a sighted person something, you have to turn the brightness on; well, there is no other way to say it other than that.

The reason I’m possibly going to upgrade is the camera situation. I think the camera would be better for doing things like recognizing text and getting the text in the picture, so to speak, in terms of scanning. Also, I like the idea of LiDAR, what it can do, and what it has the possibility of doing; that’s surely something that is of importance to me. I think it could mean a lot in terms of GPS and things like that. I think there’d be better performance for Seeing AI and apps like that, and even Envision AI and that sort of thing. I’m really excited by what I’ve heard about iOS 15, anyway.

Jonathan: Good on you, Stan. I’m sure that you will enjoy the upgrade, it’s going to be a big one for you, and I think it will be a beneficial one for you given the device that you are coming from. LiDAR is interesting and I think where LiDAR will really start to come into its own is when Apple’s glasses come out. We don’t know when they are due but we know they are being worked on, that and the Apple Car. Oh, my goodness, I can’t wait until that comes out.

LiDAR in the glasses, I think, will be much more useful, but it is really good to play with. I guess, if you really want to get into using LiDAR regularly, then you can use devices similar to those which some Aira users are now using, given that there are no Aira glasses anymore, where you put the phone in a harness or something like that to make sure it’s angled correctly. That may well work, and it may be beneficial if you’re doing a lot of traveling. I hope you enjoy that upgrade. It’s always nice to get a new shiny thing that you can see will improve your life.

“Thank you,” says Kathy Blackburn, “for the special podcast recapping the Apple event. I tried to find the event stream in the TV app on my iPhone 8 and couldn’t find it. I found a link from a source I trust on Twitter, but I could never get anything to play. I did look at the coverage from the Washington Post. After hearing your podcast, I have to say that nothing about the new devices attracts me. Had Apple decided to put a fingerprint sensor on the new phones, I would consider upgrading. For now, I don’t need either a new phone or a new Apple Watch.”

Thank you, Kathy. What you do on your phone to listen, if you don’t want to use the YouTube app, and you do want to get the audio description, is just go to apple.com in Safari. You will find that normally when the events are on, it’s right there on the Home page, and if it isn’t, there’s a really easily findable link that you can go to. It was in the TV app on the Apple TV which is how we watched it, but normally when I’m watching on an iPhone, I just go to Apple.com in Safari and it’s right there.

Marissa says, “I caught some of the Apple event on September the 14th, 2021. I personally have not seen much innovation on the side of Apple. In terms of the products, it seems like there is nothing new and exciting. It’s always the same designs for the iPhones and iPads. Sure, this year the Apple Watch got a bigger display, and Series 7 finally has a software keyboard now that FlickType is gone. The pricing is still the same on most items. The internal specifications are always going to be faster year after year. I am one of these people that, if my current devices work well, I don’t see any reason to upgrade.

While Apple touts their products by saying, “Made in California,” we all know that they’re most likely produced in China and shipped to California, and then Apple charges an outrageous price for their products. The only good thing about Apple products that I’ve noticed is they tend to last quite a long time. That is of course if you take care of them. I really wish with the events that it would be similar to the podcast where you can skip around portions you don’t want to see or that are not of interest to you.

For example, when they start talking about all the games that you can play, and all the graphics and things like that, I just want a quick rundown of all the products and that’s it.” Thank you, Marissa. Yes, the watch keyboard is a very controversial decision, and it has prompted Apple to make further comments on the FlickType situation. We covered the story extensively when the developer of FlickType said that he wasn’t going to be able to maintain the accessible keyboard that so many blind people use, because of a dispute with Apple where they claimed that the app did not comply with Apple’s guidelines.

Now, in a discussion with AppleInsider, Apple has agreed that they were wrong on that front. They do dispute quite a few things that the FlickType developer is asserting, but they acknowledge that they got it wrong when they did not approve that fairly mediocre update to FlickType, just a few bug fixes, which started this most recent controversy. In that interview, they said the previous version of FlickType remains in the App Store, and that he is welcome to submit a new version, which will be approved.

A couple of days later, the FlickType developer responded to a journalist working for The Verge, and he said the following: “I will be delighted to bring back the accessible FlickType keyboard for iPhone when Apple finally fixes their broken third-party keyboard APIs on iOS and allows developers to fairly compete with Apple’s own keyboard. They must also ensure that every single reviewer has basic VoiceOver training. We keep getting rejections due to reviewers not knowing or even understanding how to use VoiceOver.”

He continues, “I’ve already poured thousands of hours into developing my app, working around keyboard API issues, and dealing with app review. I’m really looking forward to Apple’s improvements and will promptly resubmit the FlickType VoiceOver keyboard when sufficient progress has been made in these areas.” I guess I feel a bit conflicted about this. On the one hand, I understand his frustrations; on the other, it does seem unfortunate that we do have some blind people who are now highly dependent on FlickType for efficient use of their iPhones.

Not everybody knows Braille, so Braille screen input isn’t an option. If you’ve been totally blind since birth, you might not know how to do the handwriting thing and even if you do, it probably isn’t as efficient as FlickType. It is unfortunate that blind people are collateral damage in this war between FlickType and Apple, but at the same time, what he’s asking for is not unreasonable. It should be the case that VoiceOver is understood by those reviewing apps.

If you, like me, wrote to Phil Schiller and expressed your concern about this, congratulations, advocacy does pay off. Sometimes people think there’s no point in writing, that people will do what they do, but we have now got an admission from Apple that they got it wrong, and I think that is a significant victory. Writes Rebecca, “This is the year of the Mini: the iPad Mini and the iPhone 13 Mini. However, no Touch ID on the iPhone. I am glad I bought the fifth-generation iPad Mini. The latest iPad Mini is slightly larger. I’d like to see the iPad Mini have a case with a keyboard included. I wonder if keyboards could be connected to the USB-C port?”

Yes, Rebecca, they absolutely could, and they can also be connected to that little side port, that special Apple charging connector that iPads have, which is on the iPad Mini as well. She continues, “I think the motion feature that follows the user during video calls to keep them in focus is neat for totally blind users. I wonder how big the iPhone 13 Mini is, and why Apple is selling it given the poor performance of the iPhone 12 Mini.” Well, I think the reason they’re selling it is that it’s basically the same insides.

They’ve got the casing, they’ve done the manufacturing and the tooling; given that they are not updating the form factor, it’s really no big deal to just keep it on the market. “I will not give up my iPhone SE 2020 anytime soon,” says Rebecca. “The amount of discussion on photography and displays bored me because I have no usable vision.” You can be a photographer these days and be totally blind. I take photos. You can read the Judy Dixon book, Capturing and Sharing the World, I think that’s what it’s called, and you’ll get some great tips on being a photographer.

Anyway, she continues, “I would have been happier if we got a new iPhone Mini with Touch ID and Face ID. I wish the new watch supported blood glucose monitoring. I look forward to seeing what next year offers.” Yes. We are going to be waiting for that blood glucose monitoring for some time, it seems. There’s quite a lot of tricky tech involved in getting this right. We expect it next year, but who knows anymore? We also expect to see a temperature sensor next year. You’ll be able to assess your body temperature with the next Apple Watch.

How small is the Mini? Well, let’s take a look. Width-wise, it is 2.53 inches. That is 64.2 millimeters. It is 5.18 inches high. That is 131.5 millimeters. The depth is 0.30 inches or 7.65 millimeters. The weight is 4.97 ounces. That’s 141 grams.

Ioana: Hi, Jonathan, greetings from Montreal. This is Ioana, and this is my first audio submission. I thought I would take a moment and talk about the iOS 15 beta and my experience with it. It’s funny, you mentioned that it’s one of the more stable ones, and I agree, except that they broke a feature that is very, very important to me, and they broke it so completely that I found myself having to return to iOS 14. What I’m talking about is Siri shortcuts. I really like Siri shortcuts. I use them for all sorts of convenient tasks. I could talk about that for a long time, but I will not do so here.

What they broke was the ability to fine-tune shortcuts using variables. They made it very, very hard to edit a shortcut and add magic variables or variables. This might get a bit technical, but basically, if you want to create shortcuts that are more customizable and a bit more complex, it became, as far as I can tell, basically impossible. I submitted the feedback to Apple, and I do hope that any other blind people using shortcuts in this way will do so also, because really, I had to return to a phone running iOS 14 so that I could get my work done.

The other thing that I had problems with, and that made me decide to return to iOS 14, was that I got that Apple battery pack that you reviewed a few weeks ago, the MagSafe Battery Pack. That one was very cool. I like the design and everything, except that it was not charging my phone, or it was charging it so slowly that if I was using any GPS app, even if the phone was locked, the charge was going down. I had no idea if it was the beta, if it was some problem with battery drainage on my phone, or if it was the battery pack. To clear all that up, I had to return to iOS 14, and the good news is that the battery pack is working better now.

These are my thoughts on iOS 15 beta. I love it in general and I hate what they’ve done with the shortcuts. Thanks again for a wonderful podcast and all the best.

Jonathan: Thank you, Ioana. The first thing I want to do is sincerely apologize, because for the longest time, I have been mispronouncing your name. I’m so, so sorry. I hate doing this. I’m glad you sent in this audio contribution. Now I know how to say your name. That’s really interesting about the shortcuts. I’d love to hear more from people about that. When you have the time, please do feel free to send in a contribution, as long as you want, about how you’re using shortcuts, any tips and tricks that you’d like to pass on. I know that listeners would appreciate that.

Speaker 3: What’s on your mind? Send an email with a recording of your voice or just write it down. Jonathan@mushroomfm.com, that’s J-O-N-A-T-H-A-N@mushroomfm.com, or phone our listener line. The number in the United States is 864-60-Mosen. That’s 864-606-6736.

[music]

Jonathan: Now it’s time for more adventures in Android.

[music]

Jonathan: If you’ve been listening to my podcast for a while, you’ll know that I have dabbled in the Android ecosystem several times. I think the first Android device I had was running Gingerbread, and that was pretty slow and clunky. Then I got a Google Nexus 6, and I did quite enjoy that; I could see some innovation going on in the Google ecosystem. The one big thing that I could not get on well with was the angular gestures in Talkback. They are bad for my mental health. I know that there are people who are using them day in and day out, and they don’t have a problem. That is okay.

There are probably things that I do with my iPhone that I don’t think twice about that some other people find difficult. Different strokes and all that stuff. Okay. The Android angular gestures were a showstopper for me. They made me very grumpy. Then in 2017, I got a Samsung Galaxy S8. I was doing Mosen Consulting then, and it was important that I could evaluate the accessibility of apps and provide advice. I decided that if I got the Galaxy S8, which had Samsung’s version of Talkback with multi-finger gestures, then I could make some really good progress.

If you go back into the Blindside archives, you can still search, and the Blindside Podcast is still publicly available out there on the internet. You will see a series that I did called The Blind Person’s Guide to The Galaxy, and people have said to me that they appreciated that. When Mosen Consulting closed, I didn’t really have a need to have an Android device. I’m an iPhone user. I can’t see myself changing from an iPhone as my primary device. The reason for that is that I have Made for iPhone hearing aids. I have seen the evolution of iOS accessibility features, and I think they are quite impressive.

Of course, there were problems, and we talk about those problems, but it is a very functional device for my needs. Also, I’m a Braille user. Braille is really important to me, and I am actually quite productive on my iPhone, particularly with my Mantis, which has worked out really well for me. From time to time, people approach me saying, “Could you do a review or a demonstration of a certain Android app?” I’m delighted to do that if I can. In fact, we did one earlier in the year where my son, Richard, came over. He brought the Galaxy S8.

I actually gave it away to him because I think his Android device had some problem with it. It had seen better days. I said, “Look, for the most part, my Galaxy S8 is sitting in the drawer. You may as well have it.” He brought the Galaxy S8 over and we did that demo, but it has been in the back of my mind for a wee while, now that Talkback supports multi-finger gestures, to just have another Android device, to dabble in the ecosystem and see what’s going on, because I am interested in this stuff. I like to keep current.

I’m not doing this with any expectation whatsoever that an Android device will become my primary device in the foreseeable future. For that to even be a possibility for me, Braille would have to be a part of the screen reader, and it would have to be at least as robust as the Braille in iOS is. Look, Braille in iOS has some problems. There is scope for some churn in the Google ecosystem if they got the Braille right. Right now, they are far behind what iOS is offering in terms of Braille. That’s the context of these Android experiments that I am now doing.

When I was underwhelmed by the iPhone 13 range and I knew that I was not going to spend the money that I had budgeted for the iPhone this year, I decided I could actually spend a little bit of that on an Android device and chronicle what I’m experiencing. Now, I consider myself a power user of the iPhone, but I obviously don’t dabble in Android very much. I am going to give you my impressions as I go through this process and come up to speed. I certainly welcome people’s advice and, possibly, corrections. If I am not fairly representing something, then I’m certainly happy to be corrected. I hope that Android listeners will tune in in droves and send in responses.

I’m quite happy for as much Android discussion to go on on the show as we have iOS discussion, even more if Android users want to get into this. Perhaps, by having an open and honest conversation about strengths and weaknesses, we can inform people and also clear up any misconceptions that I might have and that other diehard iPhone users might have. Now, the first thing I had to decide was if I’m going to buy an Android device, what device should I buy? I actually think the answer to that for many people may be a Google Pixel. It’s the device that’s manufactured by Google. It’s running Stock Android, and the devices are quite respectable. There’s a good range of them.

In New Zealand, the Google Pixel is not sold. Some places import them, but you can’t actually buy them here. If you want support, that could potentially be an issue. I’m happy to deal with that. The other reason why I ruled the Google Pixel out right now is that I know the Google Pixel 6 is on the horizon. I thought I might buy a cheaper Android device and just dip a toe in the water. If I find myself enjoying it, maybe I’ll get the Google Pixel 6 when it comes out. With that in mind, I first looked at a Nokia X20, because it is running standard Android, or what they call Stock Android.

What this means, for those not familiar with the way that Android works, is that manufacturers do have the ability to modify Android because Android is open source. Some manufacturers give the operating system their own flavor. That can potentially create some accessibility challenges. While there’s a plethora of devices out there, you do have to be mindful of this. When I read the reviews of the Nokia X20, some people said it was a bit sluggish, and that made me a little nervous because I thought, “Man, if it’s sluggish without Talkback running, maybe it’s going to be sluggish with it running.”

Some people recommended a device by a Chinese manufacturer called OPPO, and I was looking at the A94. Now, OPPO devices are great value for money. The build quality is excellent. Right after the Apple event, when I was absolutely certain that I wasn’t going to buy an iPhone this year, I ordered an OPPO A94, and it arrived the next day. I decided that I’d have some bonding time with Richard, my Android-using son. He’s the black sheep of the family. He’s the non-iOS-using child out of all of them. He came over on Friday and we set this up. I was impressed when I got it out of the box, given that it only costs NZ$549.

You might be able to ask your voice assistant of choice to do the conversion from NZ$549 to the currency of your choice, but it’s pretty cheap given what you’re getting. It’s a 5G smartphone, it’s dual SIM, and it’s got a pretty fast processor in it. If you’re Googly inclined, you can Google the OPPO A94, and you’ll find that it’s a very high-spec phone. When you get it out of the box, you find that it actually does come with a charging brick, and it’s a 30-watt charging brick. It’s really fast. It also has a 3.5-millimeter headphone jack in it. Imagine that. It also comes with a little protective case.

You get a lot in the box, and with the deal that I got it for, you get a redemption thingy where you can get a Bluetooth speaker as well. We set this up, and the first thing I noticed when I powered the phone on was, you guessed it, a little bit of haptic feedback. I have no idea why Apple refuses to put this in an iPhone, but it is so reassuring when you press the button and you get the haptic feedback to tell you, “Yes, there is juice in this thing. Yes, the phone is starting up.” Because if you’re a blind person, how else are you going to know? You have to hope that it’s powering on and that in due course you’ll be able to do the setup process.

I held two fingers on the screen. I think the preferred way to get Talkback running is probably now to hold down the volume up and volume down buttons. I’ve learned that from experience over the last day or so. Holding two fingers down on the screen on this one did actually cause Talkback to start up. The first thing I was reminded of was what a great setup experience you get with Android. The way that Talkback comes up and takes you through this very helpful tutorial, introducing you to the gestures, is wonderful. Apple would do well to copy something like this.

The moment I had Talkback up and running, I performed a three-finger single tap on the screen. I did this because I’d done my research, and I know that when you do a three-finger tap, if you’re running a device that supports multi-finger gestures, you will get the Talkback menu. My heart sank because I did not get the Talkback menu, but I completed the setup process, most of which was really quite doable, except that I could not scroll through my list of Wi-Fi networks. Flicking around wouldn’t do it. I had to drag my finger around the screen, and I did find the network that I wanted. That wasn’t so bad.

I then got into a really strange position where I had the terms and conditions, the end user license agreement from OPPO, on the screen, and that was completely inaccessible when I flicked through it. It would just make the sound to tell you that you were on an element, but you wouldn’t hear anything on the screen. I did manage to find the ‘I agree’ button, having not actually understood what I was agreeing to, because that was totally inaccessible. There were some issues right from the get-go regarding accessibility.

When I got it set up, logging into my Google account and doing all of that kind of thing, I updated all the things I could find to update. There was a really big operating system update. There was a Google security update, and we got all the way up to date. We also, of course, made sure that we were running the latest version of the Google Accessibility Suite. Still no multi-finger gestures for Talkback. I verified this by going into the list of gestures, and there were no three-finger gestures showing up. I then did a bit of Googling and found that some people said if you go into the advanced Talkback settings and choose developer settings, you may find an option under there to enable the gestures. That option was not available either.

Now, when I can, I buy from Noel Leeming here in New Zealand. The reason why I buy from them is that they are amazingly good to me. At least the stores here in Wellington are. If something like this happens and I need to exchange or get a refund, they have never given me a hard time about that, especially if I explain that there is a genuine blindness reason for doing it. I knew that if I had to put up with these angular gestures, I was just not going to use the thing at all and not get familiar with it. I can’t stand those angular gestures. Okay. It’s a showstopper, it’s a deal breaker.

I took it back. Thank you, Richard, for driving me there. I got a refund, because what I then decided to do was go with my original plan A and buy the Nokia X20, which has Android 11 and is running Stock Android, no modifications at all. This also has a 3.5-millimeter headphone jack in it. The build quality is great. I have to say, when I powered it up and heard that good old Nokia sound, it really was quite a nice feeling to be back with a Nokia phone again. I pressed the volume up and volume down keys to enable Talkback. It’s a two-step process that tells you to do that again and hold them down for three seconds if you really want to enable Talkback.

I understand that people sometimes enable Talkback accidentally. In fact, if you Google Android Talkback, most of the results you get are about how you turn the jolly thing off, which is a little bit sad. I got Talkback up and running, and the very first thing I did was a three-finger tap to see if I had the gestures working. I did not. Then I set up, updated, and did the same things I did before. I checked the menu of gestures, and then I went in and checked the advanced settings. There was no way, it seemed, for me to enable the multi-finger gestures in Talkback.

Yet again, it was time to take another device back. I have to say, though, I really liked the build quality of the Nokia X20. I couldn’t really tell how fast it was performing because we didn’t do much with it, but it does have a fingerprint sensor on the power button, and it’s quite sensibly laid out. It also has a dedicated Google Assistant button, and it just seemed like a really nice phone, so nice that it even made me think, “Can I live with the angular gestures? Do I really have to have the multi-finger gestures?” That’s how dramatic it all was, but no, I don’t want to live with the angular gestures.

Back into the car we go, into the Richard Mobile, on a very cold, sleety Wellington day, and device number two goes back. This time, I relied on a good old Mosen At Large episode from good old Nick Zammarelli, because I’ve heard good old Nick Zammarelli on this very podcast demonstrating his Samsung Galaxy S21, which definitely has the multi-finger gestures. By this time, Richard is saying, “Are you sure this is going to work, Dad?” I said to him, “Richard, I’ve heard it working on Mosen At Large. It must work.” We got the Samsung Galaxy S21 home, which, to be honest, was a lot more money than I was hoping to have to spend on an Android device for these experiments.

We got it all powered up, pressed the volume up and down keys, and we got Talkback going. The first thing I did was a three-finger tap, and as they like to say, third time lucky. In America, they say third time’s the charm, I think. I am now rocking a Samsung Galaxy S21 with Talkback that has multi-finger gestures. It’s early days and I’ve got lots to do, but it’s really interesting that a much more expensive phone did not come with a charger. It does not have a 3.5-millimeter headphone jack. The Nokia X20 did, by the way, as did the OPPO device that we bought.

I just think that it’s a little bit cheeky, to be honest, but anyway, it is what it is, as people like to say. It just comes with a USB-C cable in the box, and I use my Apple charging brick to keep it charged. I haven’t had a lot of time to use it, but I do have some initial impressions. When I was setting up the device, I had to enter my Wi-Fi key and establish a PIN, that kind of stuff. I was flabbergasted to find that when I tried to do that on the Galaxy S21, when you are moving around the keyboard, instead of telling you what you’re moving across, it just says bullet, bullet, bullet. Now, this is a security feature, where it only speaks passwords if you have headphones connected.

What I want to know is why on earth, why on earth, would a manufacturer turn this on by default? It wasn’t on by default on the OPPO or the Nokia, and it shouldn’t be on by default, because the dilemma that I had was that I don’t have headphones that can plug into this thing, because I don’t yet have a USB-C to 3.5-millimeter adapter. Now, I was able to get at the Talkback settings and change this, but that’s because I’m a reasonably geeky person and I have used Talkback before. It just seems to me a very, very poor decision on Samsung’s part, presumably, to turn this feature on by default when someone’s trying to set up the phone.

The other thing is that I had quite a time setting up the fingerprint sensor. The fingerprint sensor is under a particular part of the screen; I guess I’d describe it as center-ish right, maybe bottom center-right or something. You can’t detect it tactually because it’s under the screen. It did take me a while. You get used to anything new over time, I guess, but it took me quite a few goes to get my fingerprint registered. I then found that rather than powering on the device and then trying to unlock it, the thing to do is just to feel around on the screen until you find the fingerprint sensor, and you get a bit of tactile feedback, and then you will unlock it without having to press the power button.

I haven’t done too much else with the device at this stage because I spent an inordinate amount of time trying to work out how to get a device with multi-finger gestures working. Just to say before I go that I did call Google’s accessibility support, and I called it through Be My Eyes. Thank you, Be My Eyes, for doing that, and thank you, Google, because as a hearing-impaired person, the audio quality was so good, and I got somebody who spoke fluent English and was able to understand the issues I was raising. I had an absolutely brilliant first call to Google’s disability support.

The man who I spoke with was interested; he understood what I was getting at. My question for him was, “How do you tell which devices support multi-finger gestures with Talkback?” Because it seems to me that if you purchase a reasonably capable Nokia device running the latest version of Android, Android 11, and it clearly supports multi-finger gestures in its everyday life, why don’t they just work with Talkback? I get that the OPPO phone, which runs ColorOS, their own variant of Android, was always a longer shot. I was pretty relaxed about having to take that one back. It was always a bit optimistic, I suppose, that that would work, but why wouldn’t Talkback do multi-finger gestures on a Stock Android device with pretty capable hardware?

He, to his credit, said it’s a really good question, and he was going to try and find out for me. I will try and find out too, because it seems to me what must be happening is that Google is essentially saying in Talkback, if a particular device is identified, enable multi-finger gestures. I would have thought, given the culture of Android, it should be possible to enable those multi-finger gestures even if you get a warning: enable these at your own risk and take your chances. But I could not find them. They were not in the list of gestures; they were not under developer settings in Advanced.

That was unfortunate. I didn’t really want to get the S21, to be honest, but I am pleased that I am rid of the angular gestures and can now get on with becoming familiar with the Android experience. There will be more adventures in Android in future editions, and I certainly welcome people’s recommendations for apps, for launchers, any tips and tricks, and we’ll talk Android in subsequent weeks.

[music]

I’d love to hear from you. If you have any comments you want to contribute to the show, drop me an email written down or with an audio attachment to Jonathan, J-O-N-A-T-H-A-N@mushroomfm.com. If you’d rather call in, use the listener line number in the United States, 864-606-6736.

[music]

[02:24:50] [END OF AUDIO]