Podcast Transcript: Mosen At Large episode 165, workarounds for Uber's buggy apps, running on a treadmill without holding on, the latest version of iOS Access for All is out
This transcript is made possible thanks to funding from InternetNZ.
Jonathan: I’m Jonathan Mosen. This is Mosen At Large, the show that’s got the blind community talking. Today, workarounds for those experiencing bugs with Uber’s apps, blind people running on a treadmill without holding on, and Shelly Brisbin has updated iOS Access for All. She joins us to discuss this revision.
Jonathan: Nice to be back with you. As I record this, it is really humid and just muggy here in Wellington. It has been that way for a lot of the country at the moment but I guess one expects that in February at the height of summer but the weather is pretty bizarre. I hope all’s okay, where you are. I know that in some parts of the Northern Hemisphere, there’ve been quite the opposite conditions in recent times with a lot of blizzards and things going on. Stay wrapped up and warm if that’s you.
I wish we could do an exchange. I’d be happy to just exchange a little bit of this hot weather for a little bit of the cooler weather that you’ve got. We could find a happy medium, perhaps you could Dropbox me some snow or something like that. That would be good.
Before we get underway with another bumper crop of useful contributions, I did want to mention that my radio show on Mushroom FM, the Mosen Explosion, has had a bit of a facelift. I don’t think I’ve mentioned it on the podcast since I’ve done this, that the Mosen Explosion has been going for 20 plus years now, a very long time. In recent times it’s been on Mushroom FM every weekday at 2:00 AM and then repeated at 2:00 PM US Eastern time. If you go to the Mushroom FM website, we’ve got some great programs there and the schedule is displayed in your time zone. It makes it easy to convert. You’ll be able to find out when 2:00 AM or 2:00 PM North American Eastern time is where you are, assuming you’re not in that time zone.
This year we’ve ramped it up quite a bit. We’ve got a lot of interesting little bits of news and trivia. It’s all pretty light though, and we have this quiz that has got quite a few people talking every day called The Brain Exploder. It really is a fun thing because we ask you some pretty obscure questions that get you thinking about what the answer might be. Of course it’s all interspersed with the four decades of magic Mushroom memories that Mushroom FM is famous for. We play music from the 50s right through to the 80s. If you haven’t checked out the Mosen Explosion lately, I encourage you to do that. It’ll be great to have you along. We will take requests from those decades. We love to say hi to listeners. Of course there’s plenty of other great programming on Mushroom FM as well.
Hello to Alexander who says, “Hi Jonathan, thanks once again for the podcast. Also, the feedback from your listeners is always great. I also updated my iPhone 8 to iOS 15.3 and see the Braille panning issue. What I have not seen so far is the freeze issue some people seem to get.” Lucky you, Alexander. He says, “The freezing issue may be hard to track down. The panning issue should be easy to resolve by Apple as this is reproducible. I also agree with those saying Apple should pay more attention to their special functions for disabled people, otherwise they may lose their outstanding points in the community. Hope a new beta of 15.4 will have some improvements in the Braille area.”
Thanks Alexander. I’ve got the 15.4 beta 2 installed on my iPhone now, and there’s another odd anomaly going on that wasn’t there before. This relates to the apostrophe character. Whenever I read an email or a tweet or any document with an apostrophe, I’m now getting a very long string substituting for the apostrophe instead of dot three. This only seems to have started with 15.4 beta 2. Now, I have been able to get around it by switching from the default system Braille table to the Liblouis Braille table. When I do that, everything’s restored and I can get the apostrophe back without it taking up a very large number of characters, which obviously slows me down when I’m reading things like this podcast.
It could be something unique to me because I have been fooling around with Braille tables lately, trying to be helpful and seeing if I can find a pattern to reproduce these lockups. I have also seen the panning issue persisting in 15.4 beta 2. I have not yet had a lockup since installing it, but sometimes I have been able to go a few days without a lockup happening, so I can’t call that fixed yet. Still, it’s a very encouraging sign.
If anyone else has anything to report on this, please feel free to be in touch. The email address is Jonathan@mushroomfm.com. You can attach an audio clip or write the email down. The listener line number is 864-60-Mosen, 864-606-6736.
Christopher: Hi Jonathan and all I heard the podcast today about problems that people were having with the Uber app. I believe I may have found a solution, which I’m actually going to show you. This works both with locations that are already set up in the app, if you’ve got your favorite destination set up but likewise, it will also work just as well if you type in a destination and then hit search. But the key here is, after you’ve entered your destination, do not use the search button. Instead, take one finger up to the top of your phone. Then run your finger down until you start hearing voiceover read the destinations. I’m going to open the Uber app now.
Voiceover: Uber back button.
Christopher: I’m going to go back to my main menu.
Voiceover: Back button, back menu button.
Christopher: I’m going to choose ride.
Voiceover: Turn way to button, turn one ball three, 201 Lygon Street, Carlton. Set the mood on your ride. Tell your driver your conversation preferences in the app. Ride in comfort button.
Christopher: It’s giving me a promotional message.
Voiceover: Wintu button turn one ball three, 201 Lygon Street, Carlton Vic 3053 button.
Christopher: It’s telling me a restaurant in Carlton. If I double tap on that, that will select it. Then everything else will work as per expectation. What I’m going to do now is close and reopen the Uber app.
Voiceover: App switcher, Uber active closing Uber app, switcher, voice memos, active. Home, social folder, Uber. Uber menu button.
Christopher: Now for the purposes of this, I’m just going to choose that I want to request a ride.
Voiceover: Turn Wintu button turn one ball three 200 win button.
Christopher: It’s remembered my last location. That’s great but in this case, we want to put in a different destination. I’m going to hit the where to button.
Voiceover: Destination search field is editing, where to insertion point at start.
Christopher: However, instead of typing in something at this point, I’m going to just–
Voiceover: Hotel Esplanade, 11b Esplanade, St Kilda button.
Christopher: That’s a famous hotel in St. Kilda. Let’s select that.
Voiceover: Selected, Hotel Esplanade, back button.
Christopher: Basically from here, the Uber app will work as you would expect. You can make all of the normal selections and it will work. I hope that gives people a guide as to how to put a destination in. It works just as well if you search for a destination.
Jonathan: That’s Christopher Sims with that information. Thank you, Christopher. I can confirm this works. The short version of this is: don’t flick left and right in the Uber app; explore by touch, and then everything seems to be okay.
That is nice to have that workaround and hopefully the issue will be fixed in due course. It is interesting though that it is only affecting a few people but for those this is affecting it’s a bit debilitating so don’t flick around that app just explore your screen by touch. Very handy tip, Christopher, thank you.
Staying in Australia, Tristan Claire says, “Hi, Jonathan. The following is in response to the segment on the last podcast, dealing with accessibility problems in the Uber and Uber Eats app. I have a couple of possible workarounds that I’d like to share with you and your listeners.
“Like you, I experienced the problem with Uber, where it was impossible to access a destination. Not only had all my saved and recent options disappeared, I couldn’t write one into the text field. I ended up uninstalling and reinstalling the app, and normal service was resumed. It was a few weeks ago now, so I forget whether I had to log back into my account or not, so I suggest that people have their password handy if it’s not already in a system like 1Password.
“The Uber Eats workaround is a bit more tricky. Unfortunately, the problem you encountered with the option to text or call your driver seems to be related to the double-tapping action in VoiceOver. As you know, that option stays dimmed when you tap on it. The workaround I use is as follows: navigate to the part of the screen that says message driver. VoiceOver will announce it as being dimmed. Spatially, it’s towards the bottom left corner of the screen. Once you’ve found it, keep your finger hovering above the location, then use a different finger or your other hand to turn off VoiceOver by triple-clicking the side button on your phone.
“I don’t suggest you use Siri to turn off VoiceOver, because you then have to contend with the Siri button, which you probably don’t want to do when you have no speech. Once VoiceOver is off, single-tap the part of the screen you navigated to earlier. You should hear the noise that signifies that the onscreen keyboard has appeared. Turn VoiceOver back on, and you’ll be able to text or call your driver as normal. Braille screen input and keyboard shortcuts will work, and the send button will no longer be grayed out.
“I’m not saying this workaround is a substitute for proper accessibility practices. I imagine it would be fairly hard for people to use if they don’t interact with the touch screen of the phone. I just thought I’d pass on what works for me in case anyone else feels hungry for food from their favorite restaurant. As always, thanks for the interesting podcast.”
Well, thanks for the hint, Tristan. Yes, I do sometimes use that method myself when something is dimmed that I know jolly well shouldn’t be, and it can do the trick. Unfortunately, uninstalling the Uber app and reinstalling it didn’t fix the issue for me. Christopher’s workaround is working, but when I got your message after Christopher’s, I thought I might be able to uninstall and reinstall and have the app go back to normal. I was thrilled, only to have my hopes dashed, because that fix isn’t working for me.
I didn’t have to log into Uber again when I reinstalled the app, which was curious, because typically Apple’s sandbox approach does remove everything, but perhaps it was because I still had Uber Eats on my phone. Maybe if I uninstall Uber and Uber Eats and then reinstall Uber it will work. If I get brave enough, I might try that, but certainly just uninstalling and reinstalling Uber didn’t fix the pesky problem for me. Thank you, though. I’m glad it worked for you, and that is a very good hint that can apply not just to Uber Eats, but also to a range of apps where you might experience an accessibility issue like this.
Every time an Uber update comes along I get excited. I test it with great anticipation only so far to have my hopes dashed. I do hope Uber gets it back to normal soon. In the meantime, for those people in Australia and New Zealand I have rediscovered Menulog which was an app that I first used in Australia before it was available here in New Zealand. It wasn’t perfect from an accessibility perspective.
Now, Menulog is a fantastic accessibility experience, and they actually have a lot of restaurants here where we live that Uber Eats does not. I think it’s because their rates are a little bit more competitive; they don’t take as much of a cut as Uber does. In a way it’s been a blessing, because I’m reacquainted with Menulog and we’re using that a lot more than Uber Eats now.
Brian Gaff is responding to several things that have come up on the podcast of late, and says that, “Re funny inaccessibility issues on apps. This sounds like it might be settings related to me. If you recall, I had issues with an app due to the picture description and graphics on screen et cetera. I’m sure you’ll have thought of that, but maybe something else is now misbehaving. You all could be using the same versions of the software, but it’s configured differently.”
It is a good thought Brian. Unfortunately it is not the case in this particular one with Uber. I obviously checked things like making sure that the screen recognition hadn’t been inadvertently turned on, which can cause all sorts of bizarre behavior when it’s on and you don’t need it on. It is nothing obvious like that I’m afraid.
He says, “I do wish Amazon would fix their devices tab jumping into introducing favorites, as it means that if I go there I have to shut the app, then reopen it and go to settings and get to devices from there instead. I did tell them about this soup drinker bug, but just got a thanks for your feedback. Treadmills. Now, I’ve not tried this, but something came to me while I was having a doze. How about some bungee cord either side of where you were running, or indeed all four sides if you want, making a cell where the bungee just touches your thigh at some point as you drift. Obviously somebody would need to make some holding frame that was held down by the equipment base to make it safe, but it would hardly be high tech, would it?
“The biggest issue with all exercise gear is the apparent total indifference of the industry to making them accessible in the first place. I think the word rehab is often used nowadays for people who are addicts of drink or drugs or gambling or whatever it might be. It’s another way that words creep with their meanings. I have not heard of people here being refused services because they made a bad comment, but I guess it probably happened in the past, often in the medical setting. Even now, although the Accessible Information Standard for the NHS has been the law for six years, the ineptitude of health services means we seldom get it.
“Much of it is down to lack of staff training, poor communication, and the contradictory issues around the Data Protection Act and the Accessible Information Standard. The lawyers want it all governed by the data protection bubble, as they are more likely to end up in court over that, since a lack of accessible information is not enforceable with anything but a slap on the wrist.
“I do find that few blind people want to actually stick their heads up and campaign compared to those with other disabilities. Maybe it’s the fact that charities like RNIB do not really support us enough, and have a sighted CEO that is part of this issue.”
Thanks Brian. Well, historically blind people have been right at the forefront of advocacy in the disability sector. In fact, I think it has caused a little bit of contention from time to time, because blind people have won privileges and legislative victories a little ahead of others with disabilities. Andrew Walker writes in on the subject of treadmills and he says, “Hello, Jonathan here is my take on positioning on treadmills. My treadmill is located in my garage. Above the bed of the treadmill I have fixed a heavy duty hook to an equally heavy duty rafter. I have a long bungee tied on the arms of the treadmill and looped around the hook making an inverted print V shape.
“The bungee is positioned so that when I am running my hands brush the bungee. Since the bungee slopes upwards at a steep angle, it is immediately evident if I’m moving laterally to either the side and it is therefore easy to stay in the middle. Obviously the bungee is positioned so that it is possible to reach the console and controls located on the arms of the treadmill without being too close to the front or back of the treadmill.
“A note of caution, however: it is possible to do this, I think, with rope or cord rather than using bungee, and I think this is probably safer. If there is a lot of tension on the bungee and should it become detached for any reason, or even worse, should the hook break loose, this could cause serious injury. If using a bungee, I would recommend using as little tension as possible and ensuring that the fastenings are secure.
“The advantage of using a bungee, however, is that it is possible to use a second bungee or a strap to pull the sides of the bungee inwards, so that the slope of the bungee is not so steep. If you can imagine a print capital A, the second strap or bungee is located where the horizontal line is, and pulls the sides closer together if necessary so they can intercept the runner’s hands at the best possible points.
“Using this system I can run on my treadmill with the same action as I run with my guide runner. When I use a treadmill at my local gym, the console itself is at the correct height for me to use as a reference point, but this is only possible, because the moving bed of the treadmills at the gym continue under the console. Every home treadmill I’ve come across has the console at the end of the bed of the treadmill, and to try touching the console when running results in kicking the structure of the treadmill. I hope this makes some sense. Keep up the fine work,” writes Andrew.
Thank you, Andrew, that’s a very interesting solution. And an email on this subject from Robin Williams, who writes, “Hi Jonathan, you asked if any of your listeners are able to run on a treadmill without holding onto one or both handles. I am able to do this, although it took me quite a bit of practice. As a former England Blind Football (aka soccer) athlete, it’s probably fair to say that my athletic ability is above average, without meaning to sound arrogant. I think it’s really a matter of confidence.
“My advice would be to start slowly. Even at a walk and practice not holding on, then build up the speed very slowly. I also find it easier to run on a fairly narrow treadmill. That way if I do go slightly off-center and clip the side I can adjust back to the center quickly. I also prefer not to attach the safety cord to my person as I found I would often catch it with my arm, which would result in the treadmill stopping very quickly. This might sound counterintuitive, but I think the safety cord actually increased the risk of nasty accident for me, although that’s just in my personal case, and isn’t something I’d necessarily recommend in general.
“I would also certainly be holding onto a handle if trying to control my phone. I tend to just put some music on and switch off while running. I’ve been up to speeds of about 19 kilometers per hour running like this, although I don’t last long at that speed, especially now that age is catching up with me. As you often so rightly say, though, we all have our strengths, and you don’t really lose a lot by holding on to a handle, especially if you’re not worried about your performance and how fast you’re running.
“My main motivation for not wanting to hold on was I found that I was running at a slight angle and causing an imbalance in my back. Sidenote, I really try and get out on the roads wherever possible, rather than using a treadmill, simply because I’ve become bored with treadmill running over time and the roads present a different challenge.
“This does require reasonable weather though and one or more reliable guide runners who are able to run at a quicker pace than you. These aren’t always easy to come by. All the best with your exercise and thanks for great content you produce.”
There you go. An alternative strategy from Robin. Start slowly and work your way up.
[music– You rehabilitated yourself.]
Yes, you rehabilitated yourself because we’re going to be talking about that word rehab that Dan objected to so strongly. “Hello Jonathan, this is Mike from Rochester, New York. Although I’m not an i-device user, I’ve been a subscriber to your podcast for several months now and I’m really enjoying it.” Well, that’s good to hear, Mike. Thank you, because we don’t just talk about i-things. For example, what we are talking about now.
“I just want to take a brief moment,” he says, “to comment on something that Dan from Illinois spoke of on last week’s show regarding whether or not people should continue using the word rehabilitation or changing it to something else. For me personally, I don’t mind the word at all. If someone’s being rehabilitated, it doesn’t necessarily mean something’s wrong with them. It just means that something unfortunate has occurred in a blind person’s life, which has caused them to go through a retraining phase of sorts.
“It’s really not much different from using the word blind versus handicapped or, for those of us living here in the United States, using the word Indian versus Native American.” I think some people might argue with that, but anyway. He continues, “So Dan from Illinois, while I can certainly understand why the use of the word rehabilitation might be offensive to you and many of our fellow blind counterparts, these are really no more than words. I really don’t think companies and others who use such a word mean any offense. Try not to let it bother you.”
Ad: On Twitter, follow Mosenatlarge for information about the podcast, the latest tech news and links to things we talk about on the podcast. That’s Mosenatlarge, all one word on Twitter.
Tim: Welcome to the third episode of–
Speakers: Tim’s new hearing aids.
Tim: In the second episode, I talked about rechargeable versus non-rechargeable hearing aid batteries and the batteries of the Signia Motion Charge&Go hearing aids. This week: is the Signia Motion Charge&Go a good hearing aid for blind users?
I think the answer is yes, but my intermediate– and I must stress intermediate– conclusion is that Oticon is better. I will start by discussing two things I can say from an objective perspective. The first is that streaming from the iPhone, and especially the VoiceOver speech on the iPhone, in combination with auto audio, is great.
It’s at least as good as my Oticon hearing aids, if not better. The sound is very stable, no interruptions, even with the use of other Bluetooth devices, especially my Rivo, it works just fine. I didn’t try it with an Apple Watch or some other Bluetooth devices, which I know can sometimes cause interference, but I think it’s good.
At first, I didn’t like the tone of the sound too much, but my hearing professional did some equalization in the hearing aid settings for the iPhone sound. We added some bass and then it was great. The sound mixes in nicely with the environment, assuming of course that you set the relative volume right. All in all, great experience, expect good results. I didn’t test it with a streamer on my computer, but I assume that will be equally good.
The other objective issue is the accessibility of the Signia app and that is not nearly as good. The app is somewhat usable. Creating an account is not entirely accessible, but I think most people will get there. What’s also interesting, there’s a remote assistance feature in the app so your hearing care professional can remotely control the settings in your hearing aids, which means that you don’t have to physically visit him for every change you need to make.
We didn’t test this, but I think most blind people will manage to use this function. You can change the settings of the hearing aids to a large extent, but some features such as separate volume control, or even equalization that you can do from the app are just not accessible and some features just don’t work. The app is not great.
Just assume that for most things, you will have to rely on the iPhone’s built-in hearing aid controlling functionality when you’re blind, which works fine. You will get some results from the app. It’s not as bad as other hearing aid apps that I’ve seen, but they didn’t specifically consider accessibility like Oticon seems to have done, because the Oticon app is by now mostly, if not completely, accessible. Signia is very far from completely accessible, and it’s one of the main reasons for returning the hearing aids.
What I will get to now are more subjective matters. How does it sound, and how do I like the programs? To start with the programs, the hearing aid has a lot of options for programs. Of course, you’ve got the automatic program, which on a properly configured hearing aid should really be all you need except in exceptional circumstances. My current Oticon hearing aids have some programs on them, but I almost never switch away from the standard or master program.
Signia’s programs include noisy rooms and echoey rooms. I didn’t get to properly test those programs because in the Netherlands we were in a pretty strict lockdown. Although I did travel, the one thing that I didn’t encounter was spaces with a lot of people talking at the same time.
The other programs that I did get to try a bit that were interesting included walking, which is designed for situations where you are walking with another person and you want to talk with that person. Unlike what I thought initially, it’s not designed to make you hear your environment, which is obviously relevant for blind people, but actually to perhaps take away the environment and make sure that you understand your walking partner or partners. That program didn’t really work.
There was also the outdoor sports program, which I suppose is designed to make sure that you hear somebody hitting a football or a coach yelling over the field or whatever. When I was walking with that program, it didn’t work better for me than the automatic program. I wasn’t really impressed with those two programs for outdoor use.
As to the general question, are those hearing aids good for spatial orientation based upon your ears? I’m not the best judge of that, because I do have some usable vision. I don’t depend on my ears for orientation as much as a totally blind person does, but as far as I can tell, the Oticon hearing aids work better. Again, that is a subjective opinion. I can imagine that people with different hearing impairments, or different needs, or different preferences would like those hearing aids for spatial orientation. I’m not saying they’re doing badly; they’re doing well. I’m just saying, at this stage, I think Oticon is doing a bit better given my situation and preferences.
Then as for the performance of the master program in non-mobility situations, again, it’s doing very well. I could definitely work with those hearing aids very well, but I like Oticon better for now. I think the explanation is that Signia and Oticon have different philosophies. Signia is clearly geared more towards inexperienced users. One thing I can tell that from is their own-voice feature, which is supposed to make your own voice sound less irritating; it suppresses your voice a bit. It’s nice to have it on, and I noticed some improvement, but for me, as someone who has been a hearing aid user for over 25 years, it’s not really relevant. I’m so used to hearing aids that such functionalities might be fun, but they’re not very interesting. For beginning hearing aid wearers or older people, such things are extremely relevant. I think Signia has given the comfort of such inexperienced users a lot of consideration in their algorithms. I understand that Oticon’s philosophy is rather that the user’s brain will filter out the sounds it needs, and that is more suited towards experienced users.
One annoyance that I had with the Signia hearing aids in the first week was that loud noises, such as [claps hands] or loudly putting a glass on the table [bangs glass] were pretty much distorted. Upon discussing this with my hearing aid professional, we discovered that this was due to the sound smoothing feature, which makes such noises less annoying for users, but which for me was just annoying because I’m so used to hearing those noises and to dealing with somewhat louder noises. Of course, hearing aids have limiters built in but explicitly filtering out those louder noises doesn’t help me and actually prevents me from hearing what’s happening in my environment. After we turned off the sound smoothing feature, the hearing aids did sound a lot more natural to me, they performed a lot better.
Another issue was that the Signia hearing aids make an attempt to very much focus on the speaker, especially in a group. The idea, I think, is that it’s really able to pinpoint a speaker amongst background noise; it really cuts out that background noise altogether. It’s really amazing how it does that and focuses on the speaker. It’s a really cool feature. It sounded almost as though the speaker was wearing a microphone. Potentially, this could be very powerful for certain users in certain situations. Again, I didn’t get to test it properly because I never was in very busy environments, so this could be a reason to maybe try it again when I can get into noisy environments. Nevertheless, it also had the disadvantage of overdoing this sometimes. Sometimes when I turned on a tap, it would consider the tap to be the speaker, and then it would totally focus on the water streaming out of the tap, fading out the rest of my environment, which obviously is not what I want.
It’s entirely possible that we would have been able to tweak the settings to improve this for me, so that could be something for follow-up tests. Nevertheless, even with this strong focusing, in the limited tests that I was able to do with speakers speaking a bit farther away, I didn’t have the impression that I could understand them better. They were a bit more isolated, but I didn’t necessarily understand more of their words than when I would use my Oticon. Overall, it doesn’t convince me, but, again, it is very interesting. A small remark: no, Signia does not have direct audio input. If you need that, it’s a no-go.
Overall verdict? Very good hearing aids, great audio from your iPhone, a not-so-great but somewhat workable app, and very interesting sound features, definitely worth giving a try. We always need to try different hearing aids, and not everybody likes the same aid. It depends a lot on all the parameters that are in play in your specific situation. So definitely worth considering.
Next time, I’ll discuss Oticon receiver-in-canal hearing aids. Even though I’m very skeptical, I want to try them, even if it’s just to understand the present state of technology, because technology does evolve all the time.
Jonathan: Fascinating. Thanks so much for that, Tim. I agree with you about the Oticon philosophy and I don’t know whether it’s just that it suits advanced hearing aid users or experienced hearing aid users. I think also it suits a lot of blind people who do need to know about their environment. A lot of these hearing aids are very clever filtering out noise but for those of us who are blind, some of that superfluous annoying background noise is actually critical information. That’s why for me, when I switched to Oticon in 2019, it was a game-changer to still be able to hear well in many situations, but not be deprived of my environmental sounds.
This next contribution comes from Steve Bauer.
Steve: We have heard a lot of talk lately about future things coming to the Apple Watch. A lot of health-related things, maybe check your blood pressure, or your oxygen level in your blood, or maybe even your blood sugar level. I have now been encouraged by my doctor to start monitoring my blood sugar levels. Of course, I’m not wanting to prick my finger and do it that way, but there is a new device on the market that has really worked nice for me.
Before I get into my demonstration of this new device I want to state that I am not employed in the medical field, I’ve never worked in the medical field, I have no medical background, and I do not have any connection at all to the manufacturer of the device. If you think it is something you would like to try, then I would encourage you to contact your doctor to see if it will work for you.
The device that I want to demonstrate is the FreeStyle Libre spelled L-I-B-R-E 14 day sensor. It is called a CGM device or Continuous Glucose Monitor. It automatically tracks your glucose day and night, and you can check your glucose level at any time without having to stick your finger. It will detect trends and patterns for you and your doctor to review, which can help inform you and your doctor on making decisions on medication, activity, and nutrition.
CGM can help you better manage your diabetes. Highs and lows can often go undetected, even with multiple daily finger sticks. The sensor uses a thin flexible filament inserted just under the skin which measures your glucose by the minute. The FreeStyle LibreLink app works with VoiceOver quite nicely on the iPhone. More comments on the app in a moment. You can upload and store your glucose data in the cloud. It makes glucose reports easily accessible for you and your doctor.
The FreeStyle Libre 14 day sensor is manufactured by Abbott Labs, and it is suggested that you check your glucose at least once every eight hours to avoid gaps in your daily graph. The sensor measures glucose every minute and stores readings every 15 minutes for up to 90 days. The sensor can be worn while showering, bathing, and swimming as long as it is not submerged more than three feet down or kept underwater for longer than 30 minutes.
The box that it comes in is about four and a half inches wide, three and a half tall, and two inches deep. We tear off the little strip in the middle and the box opens. There’s a paper that tells you stuff. A couple of pieces of paper anyway. There are two items. There’s one, which is a sealed compartment, which contains the actual sensor, and then the applier, the applicator, whatever you want to call it. You’ll also find in the box two small packages of alcohol sheets that you can wipe your arm with before you install the device.
We untwist the applicator and open it up. Now, I take the seal off of the part that contains the actual sensor and take that aside.
Now, the applicator has a little pointer on the side and you line that up with the edge of the part that has the device in it. Now, it’s connected. Now, I pull the device out. Now, you don’t want to touch the inside because that’s where the little needle is. I pull my shirt up and I go to my arm where I want to place it and I press it against my arm and push down and it’s on. It is that simple.
The LibreLink app, and again Libre is spelled L-I-B-R-E, is, well, I guess, fairly accessible, although on the surface it doesn’t look that way. A lot of the information is displayed graphically, but if you click the item, which is not a button, it’s just an item followed by a graphic image, then the information will be displayed in text. So it does give you some information.
The really cool thing about this app is that your doctor or your doctor’s nurse, with your permission, which you give through email and through the app, can actually log into your account with the FreeStyle Libre people and look at your readings and see how you’re doing. That is really cool, I think, to be able to have your health professionals help you take care of yourself that way.
Let’s bring up the app.
Steve: I will double-tap on it.
Voiceover: Link, sensor in, eight days, menu.
Steve: All right, it’s telling me my sensor is in and eight days. It’s been in eight days already. That means I’ve got six days left on this particular sensor. I’ll swipe to the right.
Voiceover: Sensor icon, button.
Steve: It’s the sensor icon, which we go past that. Then there’s just a silent spot. There’s nothing there.
Voiceover: Check Glucose button.
Steve: Then there’s the Check Glucose. I’m going to double-tap on it.
Voiceover: Alert, ready to scan.
Steve: All right, it’s ready to scan, but just to be cool, I’m going to turn off VoiceOver because I’ve got the speech option turned on in the app because I like that. I’m going to turn VoiceOver off.
Voiceover: Speech off.
Steve: Now I’m going to double-tap right where I’m at. Then I will move the phone to my arm where the sensor is and you’ll see how quick it goes. I’m going to double-tap the phone now and move the phone.
Voiceover: 90 milligrams per deciliter and changing slowly.
Steve: There you go. It was that fast. Now, I’m not really sure why you’d want to, but with this app, you can check your blood sugar levels every five minutes, whatever you want to do. You’re no longer having to stick your finger. That to me is the main reason that this is so cool. Plus, it is accessible and you can completely install it yourself. You need no sighted help. You can run the app and handle everything yourself and manage your own diabetes without having to bother other people or be frustrated.
Now the sensor is maybe about an inch and it’s a circle. It’s about an inch wide and it’s very small and it fits nicely under your clothing. The cool thing is that you don’t have to take your shirt off to take your reading. The iPhone picks up the signal from the sensor and you can do it right through your clothing and do it quite quickly in public and nobody really knows what you are doing. Right now, each sensor will last 14 days and the app will tell you how many days you have left. Then it will tell you when the sensor does need to be replaced.
Now, one thing that people ask about is how expensive is it? Well, it all depends on your insurance. Medicare will probably not cover the cost of this device unless you are a severe diabetic and taking insulin every day. However, your health insurance may cover part of it, but I have had more success using GoodRx to cover the cost of the 14-day sensor. Now, I’m not going to talk about prices because they vary constantly and it just doesn’t make sense to deal with numbers. But let me tell you this, using GoodRx saved me a lot of money. Also, I have made several attempts to speak with someone from Abbott Labs about the FreeStyle Libre 14-day sensor, to learn more about it, to talk about future plans, and so on. Unfortunately, none of my requests have resulted in a phone call from anyone at the company. Regardless, I think it’s a cool device, and if you think it’s something you’d like to have, talk to your doctor about the FreeStyle Libre 14 Day Sensor.
One other interesting point is that should you wind up with a sensor that is defective, you can contact the company and they will need information from the app, such as the serial number of the device, and some other information that you’ll be able to provide them right from the app. They will provide you with a replacement at no cost. There is a lot of information out on the web about this, and you can learn more at freestyle.abbott, that’s A-B-B-O-T-T, .us. If you’d like to call and speak to a representative, the toll-free number is 855-632-8658. Again, it’s 855-632-8658. I hope this information will be of help to you and that if it’s right for you, you’ll be able to get it and better manage your diabetes.
Ad: Be the first to know what’s coming in the next episode of Mosen At Large. Opt into the Mosen media list and receive a brief email on what’s coming so you can get your contribution in ahead of the show. You can stop receiving emails anytime. To join, send a blank email to firstname.lastname@example.org. That’s email@example.com. Stay in the know with Mosen At Large.
David: Mosen, Happy New Year to you. I’ve been in [unintelligible 00:48:07] for a week with my dad and it’s been good. Anyway, I wanted to ask what happened to the Radio 2000? Because I can’t find any South African cricket commentary anywhere, not on TuneIn or iHeart or Rover. I was wondering whether Gary G or something who lives there. I was wondering what’s happened to the South African radio commentary. I can’t find it online anywhere. You’d think with the streaming age, there should be more cricket commentary online to listen to.
Jonathan: That’s David Harvey with that question. Thank you, David. I don’t know the answer, but we do have some South African listeners who may be able to tell us about listening to South African cricket over the internet these days. How do you do it? It could be a Stephen Jolly question. The walking talking encyclopedia of sports radio coverage on the interweb. He might be able to help us with this as well.
The one observation I would make, and I think I’ve made it on this podcast before, is wouldn’t it be great if the ICC, which for those who aren’t familiar is the international governing body of cricket, took the same approach that other sports like MLB and NHL and all those alphabet soup acronyms take to their game. I realize that cricket is international so it’s a slightly different kettle of fish, but if there was one subscription you could pay to the ICC to have one fully-accessible app where you could get all the radio coverage of cricket in the one place, man, I would be down with that, as my kids say.
To Auckland, New Zealand, we go where Sally is writing, “I was sent an Omnisense cane tip from my friend in Australia. I’m finding it very noisy, but good tactile feedback. Any reviews from other listeners?” I’ve not heard of this, Sally so I’d be interested in learning more. Let’s go across the ditch and hear from Dawn Davis who says, “Hi, Jonathan. Thought I would give you my opinion on the Beatles, Get Back. Generally, I thought it was fantastic. However, at times I found it most annoying that another voice was used to cover the actual conversation when someone decided that it was too soft. Apart from the fact that I didn’t like the person they used on these occasions,” oh, I hope that person isn’t listening, Dawn, “the main audio description was very well done. I find that I have become quite spoiled lately and don’t like not having any. Hope you had a happy new year,” says Dawn.
Thank you so much for writing in, and the same to you. TV has evolved, hasn’t it? When TV started, it was literally radio with pictures and a lot of radio series made their way over to TV, and there was a lot of dialogue and usually, you could get the context, but that’s not the case now. Audio description is often essential. It makes the difference between understanding what the deuce is going on and not. In the case of this Get Back thing, I think what’s happening is that they’ve made a judgment call that the audio quality on the tape is so bad that they need to put subtitles on the screen. An audio describer is reading for you the subtitles when they pop up on the screen. Absolutely wonderful enthralling documentary that Peter Jackson did and I’ve watched it, I think three times now.
To Canada, we go and Rebecca is writing in and says, “Hello, Jonathan, happy New Year.” Same to you, Rebecca. She says, “I would very much like to know about resources for blind people to learn basic carpentry skills. I’m tired of sighted people patting me on the head when I pick up a hammer or ask about drywall anchors. Any tips from the community would be appreciated.” Thanks so much, Rebecca. Well, I’m definitely not qualified to make any kind of comment whatsoever on this. It does remind me of a show that we had on ACB Radio when I was directing that called The Blind Handyman Show, hosted by Phil Parr and friends and that was amazing, the stuff that they would talk about, but if anybody can direct Rebecca to some resource that might help somebody who’s blind with an interest in carpentry get started, please feel free to chime in.
David van der Molen is writing in and says, “Hi, Jonathan. I’m hoping someone in Mosen At Large land can answer the following Zoom questions for me. One, when using JAWS 2022 with Windows 10, is there an accessible way I can change my Zoom name as others see it? At my place of work, we’re sometimes asked to change our names to things like our favorite food or around Christmas, we would be asked to change our name to that of an elf.” Well, I hope you picked Louie, David.
“I always have to ask the host to do that for me.” Yes, there’s an easy way to get this done, David. When you are in your Zoom meeting, press Alt-U to get into the list of participants and find yourself. See, some people go to far distant lands to find themselves, but all you have to do is press Alt-U in your Zoom meeting and you will find yourself in that list of participants, and then if you tab through, you’ll find an option that says more options for your name. If you activate that menu, there’s a rename button there, and if you choose that, you’ll be able to type your name and press enter, and voila, that’s the actual French there, you are renamed. But David has a second question so wait, there’s more.
“Two, is there an accessible way to choose which breakout room you want to join? I seem to be able to find the breakout rooms, pick one, confirm that I want to join that room, but I don’t end up getting moved to the room I selected. If these two Zoom issues can’t be dealt with using JAWS, would iOS handle them better?”
Thanks, David. I will defer to the Mosen At Large audience on the breakout rooms because the only time I’ve been involved in Zoom breakout rooms, I’ve been zapped or raptured into one, not of my choosing. It’s been a situation where the host has apportioned different people to breakout rooms. I’ve not experienced this one myself, perhaps others can comment.
Ad: Mosen At Large podcast.
Jonathan: This email comes from Keith Wessel. He says, “Hey, Jonathan, longtime ACB member, and listener to your work. I’ve got a question that I hope you can help me with or possibly give me a lead. I’m wondering if you know of any accessible VU meters out there, either software or hardware. I’ve been doing sound, radio, live and studio work since the ’90s. I took a break when I got married and my boys were little, but I’ve just started back into it in the past couple of years, doing a couple of podcasts and running sound for the live streams of my church’s services.
“In the past, I usually did audio work with my best friend who is sighted. He lives in another state now, though, and that all feels like a lifetime ago. Since I’m working solo nowadays, I’m looking for a way to sanity-check my levels. I’m pretty good at just keeping a reference level, then keeping the sound at that level, something I’m guessing you can also attest to yourself, but it would be nice to have some equipment to confirm that my levels are loud enough, but not peaking. I can’t find anything online that will do this, software or hardware. I reached out to APH today thinking their APH Connect Center might know of something or someone.
“In talking to the APH employee, who was trying her best to be helpful, you came to mind. Do you know of any accessible VU meters or if not, do you have any suggestions beyond just getting a reference level and sticking with it? The best friend who I mentioned above is now an engineer with National Instruments and he and I have talked about building an accessible VU meter that buzzes or something similar when the levels exceed a certain volume. Before we go down that road, though, I thought it might make sense to reach out to someone who’s been doing this stuff a lot more than me in recent years. Thanks for any suggestions you might have.”
I’m pretty sure there are some gadgets that will help you here, Keith, in a live setup, but what I do is all in software, whether it be broadcasting or recording, and there are a number of options. There’s REAPER itself, which is a wonderful tool for blind people to use, thanks to the work that’s been done on OSARA, and of course, Jim Snowbarger has done some fantastic REAPER scripts for JAWS. REAPER has a tool in it called Peak Watcher, and that is fully accessible. It allows you to check your levels on a regular basis. There’s also a thing called Accessible Peak Meter, which you can install, and when you’re recording it’ll beep when you go into the red, when you go above a certain threshold.
If you Google on Accessible Peak Meter, you should find it there. Those are two things that come to mind. There may be others who have suggestions on this, so by all means, let Keith have those via the podcast. You can drop me an email to firstname.lastname@example.org. The listener line is 86460Mosen in the US, 864-606-6736.
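For listeners curious what a tool like Peak Watcher or Accessible Peak Meter is doing under the hood, the heart of any peak meter is one small calculation: take the loudest sample in a block of audio, convert it to decibels relative to full scale (dBFS), and alert when it crosses a threshold. Here is a minimal sketch of that calculation in Python; the function names and the -3 dBFS threshold are my own illustration, not taken from either of the tools mentioned.

```python
import math

def peak_dbfs(samples):
    """Return the peak level of a block of samples in dBFS.

    `samples` are floats in the range -1.0 to 1.0; a full-scale
    sample (1.0) reads as 0 dBFS, and quieter audio reads negative.
    """
    peak = max(abs(s) for s in samples)
    if peak == 0:
        return float("-inf")  # digital silence
    return 20 * math.log10(peak)

def should_alert(samples, threshold_dbfs=-3.0):
    """True when the block peaks above the threshold, i.e. 'in the red'."""
    return peak_dbfs(samples) > threshold_dbfs

# A block peaking at half scale sits around -6 dBFS and stays quiet;
# a block that touches full scale trips the alert.
quiet = [0.5, -0.25, 0.1]
hot = [0.99, -1.0, 0.3]
print(round(peak_dbfs(quiet), 1))              # -6.0
print(should_alert(quiet), should_alert(hot))  # False True
```

A real accessible meter would feed blocks of microphone or loopback audio through a check like this and play a beep, or raise a buzzer as Keith’s friend suggested, whenever the alert fires.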
An email now from Brian Blair who writes, “Jonathan, I’ve been listening to your Mosen At Large podcast sporadically for the past year or so and enjoy it very much.” Thank you, Brian. “I wanted to comment on the interview you did about JAWS tandem. I am one of the JAWS users who has asked for the ability to initiate a tandem direct session on a host computer, or at least have the potential host be listening so that I could initiate a tandem session remotely. This is an option that NVDA remote has had from the beginning, and I think it’s a huge plus in the practicality of a remote session.
“The only other viable option I know of for blind users to initiate remote sessions is with remote desktop under Windows. I have used this for a number of years with great success, but it has its drawbacks. I work for a radio station and when it’s my turn to monitor the stations and fix issues, I need to access a number of computers remotely. Unfortunately, since we’ve gone to Windows 10, I’ve found that several processes running on our broadcast automation host machines get interrupted when a remote desktop session is initiated or terminated.
“This is simply unacceptable. What I have done is to open tandem direct sessions on the computers I need to access and type in a meeting ID. Unlike tandem center sessions, which would only remain open for 15 or 20 minutes waiting for the remote connection, tandem direct sessions will wait much longer for connection, maybe days. The problem is, that once I use one of these connections, I can’t use it again unless I physically go to that computer.
“Another clunky workaround I’ve found is to start NVDA before I close my remote tandem session, then use NVDA to initiate a new JAWS tandem direct session, then close NVDA.
“I understand Freedom Scientific’s concerns about security and I also am very concerned about security. I have only run tandem direct sessions over a VPN or over my local network. I really think Freedom Scientific should incorporate functionality into tandem direct to allow a remote user to initiate a tandem direct session on a host. I don’t know of other viable remote control options for blind computer users. It seems to me that this could allow blind people more employment options, especially in light of challenges caused by the pandemic.
“I don’t normally use NVDA for remote access, because I’m running applications that have custom JAWS scripts. I’ve tried running JAWS and NVDA concurrently, but this causes some keyboard conflicts. Again, thanks for your podcast.”
Jonathan: Thanks, Brian. I agree with you, it would be great to have JAWS Tandem allow for unattended remote sessions, and perhaps you’ve hit upon the compromise. If they enabled it for Tandem Direct sessions, and not Tandem Center sessions, perhaps that might be a middle ground. I would love to see that happen.
In the meantime, I did want to come back and explore this issue that you are facing where you say that when you go into a remote desktop session with Microsoft Remote Desktop, you find that some processes are terminating. I’m not having this issue with the Mushroom FM PC and I’m wondering if the issue might be a configuration one with the remote desktop session. You can go in on your remote desktop clients that you’re going to be remoting in from, and set up the properties for the session ensuring that the local audio stays on the local machine.
I have seen things go very badly wrong if you remote into a radio station PC and the setting isn’t set up correctly, so that suddenly the audio is moved from the local machine to your machine. If you’re just doing maintenance, you probably don’t need this. It doesn’t affect your JAWS speech, which is coming through a different channel. If you’ve tried that, and you know about that, my apologies for telling you how to suck eggs. [chuckles] Just in case it makes a difference, I thought I would throw that in there.
Charlie: Hey, Jonathan, I just want to ask something quickly. I want to go into commercial radio stations, maybe nationally or locally, but my problem is reading things such as WhatsApp messages and Facebook messages to the station, comments that listeners would bring to the show or give to the show and stuff like that, and reading them out live. What is the best way of doing it, because a BrailleNote is just too expensive to get?
I’ve tried to actually do something and try and maybe take JAWS and let it come through my earphones and not through the mixer itself. Is there a special rig that I must put in or is it something else that I must do in order for just JAWS to come in through to my earphones and not through the mixer itself?
Why am I saying this? Because I’m plugged in right now, I’m sitting in front of a six-channel mixer. There are four microphones on this one, a broadcaster microphone, it’s the works one, and then you have your telephone, and then you have your line in that you can plug in here. This desk, because I’m sitting in front of this, comes in two parts. The computer is using the line ins of the second part of the mixer. I don’t want my speech actually to come through like this onto my mixer.
Jonathan: Okay, Charlie, let’s see if we can get you to radio stardom. I’m going to list these options in order of preference. In other words, the best one first. Assuming that you are a competent Braille reader, as a Braille reader myself who worked in commercial radio for a while, I know that I always will prefer Braille where it’s available.
There are a number of things you could do. You mentioned that a BrailleNote is really expensive in South Africa. When you talk about BrailleNote, I’m not sure whether you are talking about Braille displays generally, or the specific BrailleNote product. Yes, the BrailleNote is expensive, because it’s a note-taker product as well, but it may be overkill for what you want. If you can simply have a Braille device with a simple scratchpad on it, or that interfaces with– I was going to say your iPhone. Don’t go there right now [chuckles] but some computer or something like that, then that may be all you need and they are quite a bit cheaper than your full BrailleNote.
There are products like the Brailliant, and the Focus displays, and those ones that are around about the same price point. Even if you’ve got a 14-cell Braille display, that may work. You could also go to the Orbit. My only concern about using the Orbit in a radio context is it can take a wee bit of time to scroll each line and they are quite noisy. If your radio studio is using a condenser mic and they tend to be more omnidirectional, then people may be able to hear the [unintelligible 01:04:44] noise of the cells popping up.
I can appreciate that a Braille device may be out of reach for you personally, but I wonder whether there might be service clubs available like Lions and Rotary, that would be willing to fund it for you. I’m not really familiar with what is done in South Africa in that regard, but certainly, in many other countries, you can go to a service group, if you really need to get funding for something. You sound like a pretty personable guy, you’d probably give a very pleasant presentation, and you may be able to get a Braille display but let’s say that that’s not possible. There are certainly broadcasters who’ve gotten very good at hearing the speech and repeating it back.
I did a piece for FSCast when I ran that some years ago, where I interviewed Nas Campanella from the ABC in Australia, that’s the Australian Broadcasting Corporation. She’s totally blind and she doesn’t read Braille and she’s moved on to full journalism now. At the time, she was reading news for Triple J, the Youth Network in Australia. She would listen to Eloquence and repeat it back. I said to the ABC, “Can you give me a feed of Nas reading the news, and what she’s hearing through her headphones?” We played that on FSCast, and that episode will still be in the archive somewhere. People can do this and they get really good at it over time if they practice enough.
That comes back to your question, “How do you do this when everything’s coming through a mixer without it going out on the air?” You need the right kind of mixer. If you’re in a radio station environment, I’m pretty confident that you would have the right kind of mixer.
I have never worked in a radio station that does not have a feature called Pre-Fade Listen. Most radio stations I’ve worked in also have the ability to send audio through another bus that doesn’t go out on the air, because this is how people record phone conversations. Say, you’re doing a competition and you want to record the lucky caller. Just in case they say something outrageous, like “I love soup” or something and you wouldn’t want to broadcast that. You typically record that while the previous song is playing.
Many studios do have that second bus, but whether it’s Pre-Fade Listen or a second bus, you keep the fader all the way down, and you push the Pre-Fade Listen button, or you press the bus button, and you will be able to hear your speech coming through your headphones but it will not go out on the air. Indeed, that’s what I do with my Allen & Heath ZED-22FX mixer in the studio. It’s got both Pre-Fade Listen and another bus. I have JAWS running in the background and I can hear JAWS but you can’t. It’s a good system.
Now, if you have a mixer that doesn’t offer that, the next step you could do is to have a separate audio interface, just a little cheapy sound card will do, plugged into the PC. You just plug a pair of earbuds into the headphone jack of that audio interface and you route your screen reader to that audio interface, that’s not in any way connected with the mixer. Now it can be a bit annoying to have another pair of earbuds in as well as headphones, but it’s doable, and you’d be able to hear your speech that way.
I hope that helps and I wish you the best of luck in your broadcasting.
Jonathan: iOS Access For All is a venerable guide to iOS accessibility from the perspective of people with a range of disabilities and the iOS 15 version is now out, from the venerable Shelly Brisbin. Now I don’t get to use venerable very much and I lost my Thesaurus, so I get to use it twice in one thing. Shelly, it’s wonderful to have you back on the podcast. Thank you so much.
Shelly Brisbin: Well, thanks for having me. Venerable?
Jonathan: Venerable. [crosstalk]
Shelly: I tell you, I did find when I was putting together the materials to go with the book, when I wrote out that it was the ninth edition of the book, I was even startled. I was like, “Wait, did I do that?” [chuckles]
Jonathan: Yes, it’s amazing, isn’t it? Yes, you’ve been at this for a while. How did you find the writing of this one? Was there a lot to add?
Shelly: Well, there were sections where there was a lot and then there were a lot of things that didn’t change. There were brand new features like Focus and some of the iPad Multitasking features that counted as brand new features. Obviously, there are some VoiceOver features where I had to write a whole new section but there is a lot that’s the same.
I think we’re seeing a very mature operating system, where the updates that happen, there was no point at which I was tempted to write a whole new chapter. Let me put it that way.
Jonathan: I know that people were a little bit underwhelmed by Apple’s revelations last year. I must admit, for the first time in a very long time, I didn’t buy new Apple hardware because I had the 12. I didn’t think the 13 really warranted it for my use case but I have to say, I was really impressed with iOS 15. Maybe there wasn’t the number of new features that you might typically find but what they added I thought, was particularly kind of meaty. The Focus thing is something that’s changed my life significantly for the better.
Shelly: I feel Focus is something you either love or don’t get. I understand Focus but I haven’t personally made friends with it if that makes any sense. I appreciate what it’s trying to do and I think it’s great. I just haven’t entered it into my own personal way of working.
It’s a great idea and Focus, for those who haven’t used it, allows you to essentially customize people’s access to you, it’s not just a blanket, “Do not disturb.” You can say, “Well, this person can come through” or “This app can come through, but this person and this app cannot,” and for periods of time. You can associate them with automation, so it’s a great idea. I find the interface to it a little bit hard to grok, but I love the idea.
Jonathan: We have an ongoing dialogue on the podcast that pops up from time to time where we say, “Why is it that so many blind people who are still pretty proficient iPhone users own a Victor Reader Stream?” One of the answers that has come back in the past is that you can be reading in Kindle or in Apple Books, and you get a notification and it interrupts your reading and it’s really disruptive.
The nice thing is with Focus, I’ve now got it set up so only the most important notifications from people who I need to hear from, or from apps that I really trust to be pretty sparse on the breaking news alert, get through. I find that incredibly beneficial.
Shelly: Yes, as I say, I think it’s a great feature. I think that the interface probably could use a little bit of work. They give you some default Focuses. I guess I feel that Apple hasn’t– and I don’t think they need to hold my hand personally. I guess I wish they had provided a little bit of easier access for folks who might not have used this thing before. I think for a power user though, it’s great, because you’re like, “Okay, I instantly get it.” Knowing how to use it in your own life, I guess, is the big thing. You gave a real concrete example just there.
When I’m making podcasts, it might be the same thing. If you’re reading a book, you have a specific app that you want to give priority to and eliminate access from other apps. You don’t want Twitter bothering you while you’re reading a book, but at other times you might be following your notifications on Twitter and you might want to hear them.
Jonathan: Right. Similarly, if I’m recording an interview with you, for example, I don’t want anything to interrupt that. That would just be absolute sacrilege, right? If I’m sitting here reading emails from listeners and recording that part of the podcast, I don’t mind being interrupted by a few important people. It’s really quite an amazing feature, but you do make an interesting observation about Apple’s documentation or handholding. I guess if they did a better job of that, you and in earlier times, I would have been out of business.
Shelly: I suppose. That’s true. I do think about that because I make it a habit to download the manual for each iOS version and they do put one up in the bookstore for free. I do make a habit of downloading it, usually so that I can find a reference in case I don’t know what the name of something is, but rarely do I use it much anymore, to be honest. I think back, in the beginning, I might have.
Though they have beefed up the accessibility section, for example, it really hasn’t kept up and they really don’t do a good job of walking you through, “Okay. These are the options that you have. This is how to use them.” It’s a little bit haphazard, but Apple isn’t focused on documentation. They’re focused on making a device that’s, hopefully, intuitive for most users.
Jonathan: One of the observations people have made over the years is that there’s a lot of goodness going on in Apple Land with accessibility. Typically, every year there is something of significance. It might not be for blindness, but typically, Apple is adding something good for some disability, some impairment type or another. It gets very little attention at the keynote, at WWDC, that kind of thing.
It was a different approach that Apple took last year, where on Global Accessibility Awareness Day, they essentially preempted WWDC and the forthcoming version of iOS, iOS 15, by telling people a lot of the new accessibility features, well in advance of any other talk they were doing about the operating system.
Shelly: It was super interesting that they did that. I gave them a lot of props for it at the time. When people in the mainstream world, my podcast friends, or whoever asked me about those announcements, said “Hey, what do you think of assistive touch on the watch or the other things that they announced?” I said, “Those are great features, but the most fascinating was how Apple chose to announce it.”
I did everything I could to express my delight that they had done it and encouraged them to do it because Global Accessibility Awareness Day is one of those days where big companies like to be seen doing the right thing.
Apple was really smart about making these announcements, both about how they’re changing things in their stores– they have this SignTime feature, where you can actually get sign language-based assistance in an Apple Store. That’s great.
What’s really exciting for most people are the product-based announcements. They did a lot of those at the Global Accessibility Awareness Day. Of course, it was funny because they never said coming in iOS 15, because they hadn’t announced iOS 15. Apple gets backed up in its own secrecy. It’s like “Coming later this year,” but it was great to see. I was happy they did it and I hope they keep doing it.
Jonathan: Yes, I hope this is a tradition now that on Global Accessibility Awareness Day this year, we will get a little bit of a hint about what’s coming in iOS 16 for accessibility. It was a very good move.
Let’s talk about the book in general because you’re serving two audiences, I guess. You have people who are buying this year on year to really get a handle on what’s changed in iOS, and then you will be getting people who want to understand how to use their iPhones. How do you balance those audiences and their needs?
Shelly: Well, I would say a third kind of audience are people who are training others. I have a lot of teachers, a lot of educators. What I do is the level of comprehensiveness, the level of detail I provide, and that includes not only the material itself but the table of contents, is hopefully something that makes it easier for people to find exactly what they’re looking for and not feel like they have to wade through. The structure of the content is in step-by-steps and tips, and a very accessible– forgive the expression– description of how to do things.
All of that is wrapped up in chapters and table of contents entries that are organized to be very specific. If you’re wondering how to type on your iPhone with VoiceOver, it won’t take you very long to find out how to type on your iPhone. You’re not going to be bothered with information about Focus or iPad Multitasking unless that’s what you want. That’s where I start.
The second part of that is, I try to address new users by adopting a fairly conversational style. The nice thing about self-publishing is I don’t [chuckles] have an editor telling me, “Oh, you’re being too casual” and I can use humor on occasion. In fact, in the VoiceOver chapter, specifically when we’re talking about how to do basic things like how to use touch typing or how to edit text, I use some language in there that’s actually intended to provide encouragement to somebody who might find it difficult, because I know there are a lot of things about using VoiceOver to edit text that I found difficult when I first did them. At the risk of alienating somebody who might be a power user, I’m going to try and help the folks out who may not have as much experience.
Jonathan: It’s a very delicate tightrope you’re walking to try and cater to all those audiences, isn’t it?
Shelly: Yes, very [chuckles] much so. I feel the feedback I’ve gotten has helped me do it because when I did the first couple of books, I was writing– I’m a professional writer. I’ve been doing this a long time. I know how to provide guidance for how to use tech, but I hadn’t worked with content that had to do with accessibility or an audience that was focused on that.
The feedback I got that told me that my approach was good was on things that really surprised me. For example, there’s the section in the apps chapter where I explain how to use the phone app. How basic is that? Most of us can figure out how to make a phone call, but I had people telling me that they learned things that they didn’t know about how to use the phone app in a way that made sense for them. That was just me digging in and thinking “If somebody’s never used it, if it’s my mother who has macular degeneration, or if it’s somebody else who hasn’t had very much tech experience and might feel a little threatened by it, then I’m going to try and speak to them as best I can.”
Again, using tips and using other content organizations, I’m also going to hopefully give advanced information to somebody who might not need to read all of those steps, but who’s like, “Oh, that tip was helpful. I learned the little thing that I didn’t know before.”
Jonathan: The Mac has really gone ahead in leaps and bounds with the M1 and subsequent processors. Catalyst means that you can now run a lot of your iPhone apps on the Mac, and I think that is a huge selling point for the Mac. Is the convergence getting to the point now that you might extend the book to cover macOS as well?
Shelly: It’s probably something I should do and people have asked me just in general, whether I would write about VoiceOver for the Mac. To be honest, it’s a matter of time and just a matter of not being sure whether I would get the support in the community to do it. It’s probably the thing that if I did it, I’d have to have, [chuckles] probably some folks to help do some more of the editing and more of that work. I would do a Kickstarter or GoFundMe or something like that but it’s absolutely something I have to consider.
Even if I don’t write a full-fledged book that covers the Mac as a platform, I’m absolutely going to have to address “Okay, well, if you have a Mac and you’re using these iOS apps, how do you interact with them differently?” Even though VoiceOver is called VoiceOver on the Mac, it’s very different than the experience on iOS, because it’s not a touch-based interface.
Jonathan: Yes. I find that people get such a positive experience from iOS that often it inspires them to try the Mac. Not everybody stays, though.
Shelly: Right. I guess some of that probably is people who have long experience on the PC side and they’ve used JAWS or NVDA, and VoiceOver on the Mac behaves differently. You come from the iOS, you’ve had this experience and you think, “Oh, well, the Mac must be just like this.” Not only is it not just like this, it’s not just like JAWS or just like NVDA.
I don’t know what to suggest Apple do to change that. I think the convergence of iOS and macOS is going to end up being good for people who want to use macOS as blind users, but I also think we have to see what’s going to happen in terms of the Mac.
People are always talking about, are we going to have a Mac touch screen at some point? At the point that we do, then I think all bets are off and it’s a really different ballgame because you have– “Ah, I used two clichés in one sentence, good for me.”
Jonathan: Venerable, I tell you.
Shelly: Right, exactly. That’s why I’m venerable. I feel that’s going to change everybody’s relationship with a computer-based operating system if there is a touch screen. I’m not even necessarily advocating it. I love the Mac. I think it would be a super different experience to work on it in a touch screen environment, but I do think for VoiceOver users, it could potentially be a great thing.
Jonathan: Yes, they’re adamant they’re not going to do that. That would be a significant reversal for Apple if they ever did a touch screen Mac now because they’ve just been so insistent about it.
Shelly: It’s true and I’m not saying that I think they will. The iPad comes closer and closer to the Mac all the time, whether it’s improving the Multitasking or having more keyboard shortcuts. Even if they don’t make something called a Mac that has a touch screen, I think, there is going to come a point when you’re going to have to, as an iPad user, confront the fact that it’s becoming more and more like a computer. That’s probably the direction it’ll go that instead of the Mac coming toward the phone, the iPad will come towards the Mac.
Jonathan: I know that, like me, you are watching Apple developments carefully and when you’ve been around long enough, you know which sources are trustworthy. It does seem like Apple is making an attempt to get some augmented reality, virtual reality headset product out this year, but it’s by no means certain at this point that they’re going to succeed with that timeline. What if any impact, do you think that product’s going to have on the blind community?
Shelly: It’s hard to say what the initial product would do because it’s going to be both a proof of concept and a gaming device. I think for us, ultimately, what we probably want is a device that would help us with navigation and magnification and tasks of daily living, that sort of thing. I just don’t feel like that is in the cards for the first iteration of this product.
Even if the product has some accessibility, I just can’t imagine that it’s going to be a fully functional system that’s going to give us the navigation and magnification assistance that I think a lot of us want.
Jonathan: There is talk of a Glasses type of product in the offing, a bit further down the pipeline from that original headset. That’ll be interesting.
Shelly: Yes, that’s, I think, where the really interesting stuff happens, because the initial headset, even if it isn’t as bulky as an Oculus or a Gear or something like that, is going to be intended for a different experience than walking down the street or reading your bills. Glasses will probably be more closely associated with the iOS operating system, too, because you’ll probably be bringing apps into the Glasses in some way. In other words, you’re going to be doing things that are beyond gaming and speculative “Hey, I wonder how this couch would look with this carpet in an Ikea app” or something like that. I feel that’s the thing to be excited about.
I guess what I’m going to be watching for is what they do in the first iteration in terms of accessibility. Are we going to be able to use it at all? Even if we don’t choose to buy it, or if it doesn’t make sense for us, are there going to be things on there that make it possible for developers to create apps that we can use? Is there going to be an encouragement that says, “Hey, this is a platform that, as it moves forward, you’re going to have an opportunity to benefit from?”
Jonathan: My gut reaction to that comment is to say, “Look, we’ve just become so used to everything being accessible with Apple that it’s got to be,” but then, while you were finishing your sentence, I looked back at Apple Fitness+ for example. They really dropped the ball on that, and I was surprised and disappointed by that because I was looking forward to Apple Fitness+, but really it is quite hard to do a lot of those workouts because they don’t contain any audio description.
Shelly: Right, I know a couple of people who are blind, who are fans of Fitness+ and they’ve made it work for them, but you’re right. I think Apple will sometimes say it’s accessible, but that means in a literal sense. There’s a screen reader that can read things that the screen has on it, but that doesn’t necessarily make the actual features accessible. Fitness+ is a perfect example.
Again, this headset– I can imagine you might be able to access the menus, but then can you actually do anything with the content once you get in there? If you’re talking about augmented reality, are we going to augment the reality of the visual aspects of your living room or a gaming space or something like that? That’s why I say what I’m looking for is what accessibility they put in the first version, because that, to me, is like a marker that says, “Okay, you’re not going to get what you want or expect right away, but we do have it in mind.” They always say, of course, “Don’t tack on accessibility at the last minute. Design from the ground up.”
I’m looking for them to design the hardware from the ground up to support what we need, whether it’s a VoiceOver screen reader or whether it’s something else, as opposed to just saying, “Oh, accessibility, that’s a thing we’ll do later.” That’s not the Apple way, and I hope that that’s not the case.
I agree with your Fitness+ example. I think that’s more of an issue with services than with hardware because– oh, I remember how much consternation I had about the Apple Watch, because until I got my hands on the first one, [chuckles] I was like, “Is it going to be accessible?” and they didn’t really address it. Sometimes that happens too: the accessibility exists, but they haven’t explained it.
Going back to what we’re talking about before with Global Accessibility Awareness Day, what I would hope Apple would do, is give people a heads up about “Yes, the accessibility is built into this hardware, into this operating system, more to come. Watch this space.”
Jonathan: Yes, we’re looking for some implicit roadmap statement.
Shelly: Yes, and roadmaps are hard to get from Apple. [laughs]
Jonathan: Yes. We hear of course that on the roadmap for quite some time, even before the iPhone’s development, has been the concept of this Apple Car. You up for one of those when they’re available?
Shelly: Sure, why not? That sounds good. [laughs] I’m not as scared of self-driving cars as I should be probably, but [laughs] I love the idea of a vehicle whose destination I can control, that’s always a good thing.
Jonathan: Yes, look, I’ll be in line for whatever first self-driving car comes out from any manufacturer that I can buy, but I do have this little scenario in the back of my head that pops up whenever people talk about the Apple Car. You can say, “Hey Siri, take me to the local park” and it will say, “I found something on the web about take me to the local park. Take a look.” [crosstalk]
Shelly: That’s right.
Jonathan: Now, tell me about the formats in which iOS Access For All is available for those who haven’t bought a copy before, and how you buy one.
Shelly: I sell the book from my website, which is iosaccessbook.com. I sell an ePub version and a PDF version. If you’re not familiar with ePub, it’s the same format used by the Apple Bookstore and you can actually buy the book in the same format from the Apple Bookstore. I just get fewer dollars, but the links to the book are on my site as well as to the Apple Bookstore on their site.
I also sell a combination ePub and PDF version. It’s two books but it’s a zip file because there are a lot of people who really like both formats and so there was a demand for it. For US$25 you can get either the ePub or the PDF version from me. For US$30 you can get a zip file that contains both the ePub and the PDF. From Apple Books, you can spend $25 and you’re getting the ePub version.
Jonathan: Now with a great voice like yours. Did you ever consider an audio version?
Shelly: [chuckles] I considered it; it would take a long time. This version of the book is 214,000 words, so I don’t know. People used to ask me, and of course, with ePub– which is how DAISY books work– you can make an audio version that has bookmarks and everything. It’s entirely within my power to do it, but it’s not necessarily within my time to do it.
Jonathan: Yes, that would be a huge undertaking.
Shelly: Oh, it would. Maybe I could get guest readers to come. Jonathan, do you want to come read a chapter?
Jonathan: I could be persuaded to read a chapter. [chuckles]
Shelly: There you go.
Jonathan: Yes, because there are people who just learn best with narrated audio.
Shelly: I’m one of those. I absolutely appreciate that point of view. Anytime I can turn text into audio, I do, always. [chuckles]
Jonathan: Audio that is searchable in the way that you’re describing is the gold standard, isn’t it, because then you’re not slowed down by trying to find the right spot in the audio. You can just search for what you’re looking for. [crosstalk]
Shelly: 100%. Yes.
Jonathan: Well, it’s another great read and suitable for a raft of people, whether they’re new to iPhone or seasoned users who just want to make sure that they’re making the most of what iOS 15 is offering. That’s iosaccessbook.com. It’s always good to chew the fat with you and catch up on Apple things. Thank you so much for coming on the podcast.
Shelly: Thanks for having me. It was great to be here.
Ad: What’s on your Mind? Send an email with a recording of your voice, or just write it down, Jonathan@mushroomfm.com. That’s J-O-N-A-T-H-A-N@mushroomfm.com or phone our listener line. The number in the United States is 864-60Mosen. That’s 864-606-6736.
Jonathan: Let’s take a look at some Apple things. Holger’s writing in and says “Hello from Bella and me. Hope you had a great holiday. I just got my iPad 9th generation and a smart keyboard. I like it so far. I’m using an iPhone 12 Pro and iOS 15.3. When turning mute on, VoiceOver speaks notifications. Is this a bug? No issues with iOS 14. Also, on the Series 7 with watchOS 8.4, when I added the Unity face and added the complications I use, VoiceOver didn’t speak them. I only got the time. I checked the other faces and VoiceOver speaks complications.”
Thanks, Holger. I haven’t checked out that watch face. As for the mute feature, I think what’s going on here is that when you use that mute option on an iPad, it’s only supposed to mute the sounds. It’s not supposed to mute VoiceOver speaking notifications. If you want that to happen, then you should go into Focus mode and set up a Focus, be it just plain old do not disturb, or you can be more selective and that will stop VoiceOver speaking the notifications. If you go back in the Mosen At Large archives, you’ll find quite an extensive look at the Focus mode, which I think is a brilliant feature in iOS 15.
You may also want to have a play with the auto speak notification setting, which could have an impact on the experience that you get. To do that, you go into accessibility settings. That reminds me to say that in recent versions of iOS, I can’t do that with Siri anymore. I used to be able to say to Siri, “Open accessibility settings,” and it would take me right there. Now, when I do that, it takes me into the general settings. I don’t know whether anybody else is experiencing this. It is a fairly new one.
Anyway, go into accessibility settings and then VoiceOver and then verbosity, and there is an option there called autospeak notifications. You could try toggling that and see if it gives you the experience you want. Most importantly, I hope you’re enjoying that shiny new iPad.
Peter: Hi, Jonathan, it’s Peter in Robin Hood County, wishing you and your family and all Mosen At Large listeners, a Happy New Year. Let’s hope that the coming year is better than the last one.
For those people that like tracking their sleep, there is an app called Sleep++. This was mentioned on iOS Today. For more information, go to the App Store or Play Store or wherever you get your apps from. I’m glad you’re back, mate. Sunday mornings haven’t been the same, but I’m glad you enjoyed a very good break, and we look forward to some more interesting podcasts and some more interesting tech news.
Jonathan: Oh, no pressure then Peter, no pressure. Thank you so much for the welcome back. I bought Sleep++ when it first came out for the Apple Watch because I thought that sleep tracking on the Apple Watch would be really cool. This was before Apple introduced its own fairly basic sleep tracking, I have to say, into the Apple Watch product.
I just got out of the habit of using it because there were one or two accessibility issues from time to time. It just got fiddly, but your message prompted me to check Sleep++ out again. I am so glad that you sent that message because it’s a night and day difference now. They have this automation mode, so you don’t even have to have the Sleep++ app installed on your Apple Watch if you don’t want to. You can still do it the old manual way, but all you have to do to get Sleep++ working is, wear your Apple Watch when you’re sleeping. It is simple. Nothing else to do.
You’ve got to set this up. You’ve got to go into the Sleep++ app on your iPhone and turn the automatic mode on. Once you’ve done that, it does a pretty good job of detecting when you are asleep. It has this really cool new feature as well since I last looked at the app, where it shows your readiness score. How ready are you to take on the day when you wake up in the morning? It simply expresses those as a percentage value. Very cool. I think my readiness score today is 90% so I’m rocking it. That’s what I’m doing.
It will tell you how long you slept for, it will tell you when your best sleep was, it will tell you how many minutes you were fitful for. It is really a very cool app. Do check it out if you’re interested in sleep tracking. It’s way better and way easier than it used to be. I’m really glad that, thanks to you, Peter, I’ve rediscovered and reinvestigated Sleep++.
Just while I’m on the subject, a lot of people say to me, “Yes, but when do you charge your watch?” If you’ve got the Watch Series 7, of course, they charge so quickly. I haven’t got it, but that would be a big advantage of getting the Series 7. Even in earlier times, what I find works is to charge it once I’ve closed my Move ring and my Exercise ring and my Stand ring for the day. Typically my Stand ring is the last one to close, because I do my 30 minutes of exercise as soon as I get up in the morning.
I get up at 5:00 AM, and by 5:15 or 5:30 at the latest, I’m rocking the treadmill, the weight machine, the rowing machine for half an hour. I’ve done my Exercise goal by 6:00 AM. The Move goal is pretty easy to get done throughout the day, and normally by 5:00 PM, my Stand goal is also complete. I put it on charge.
I also put it on charge when I am having a shower or a bath or a sauna. We do have a sauna at Mosen Towers and that’s really great for your health as well. By the time I’m ready to sleep, my watch is at 100%. I put it on when I’m sleeping and I just keep it on when I need it. That works really well for me. It’s not too much of a pain at all to get into that rhythm.
TI Emulator: Hello. Jonathan, how are you doing today? I am the TI-99/4A computer from Texas Instruments. Actually, I am at my computer. I am really running in an emulator called the classic [unintelligible 01:36:17] harmless [unintelligible 01:36:19] come when the last time you heard me [inaudible 01:36:25] hardware.
Joe Norton: Hello, Jonathan. This is Joe Norton in the United States, here in Dalton, Georgia. I didn’t figure you wanted to hear too much more of that TI speech. It was just something I was playing around with, with a TI Emulator. I don’t know if you had one of these things or not, but years ago, of course, some of us thought that this would be our computer because it came with a speech synthesizer. At least you could get it with one, but we soon discovered that the speech synthesizer was limited to its built-in vocabulary, which we gave you a sample of here, or you could get a couple of text to speech options, but they had to be programmed and each of them had various disadvantages to them.
We found out quickly that this wasn’t really going to be as useful a computer as we had hoped. Like most other people, I graduated to something that I could put a screen reader on. Anyway, I’ve been playing around with a few retro things in my spare time, not that I have a lot of it. Anyway, I’ve been playing with some DOS emulation and, with the combination of some old and new technology, I can even do something like this.
?Speaker: See [unintelligible 01:37:30] greater. Hey [unintelligible 01:37:32], do you want color? Why? Hitchhiker’s Guide to the Galaxy. You wake up. The room is spinning very gently round your head, or at least it would be if you could see it, which you can’t. It is pitch black. Darkness, zero, zero, greater.
Joe: I edited that for brevity because, again, I didn’t figure you wanted to hear the whole startup sequence of this game and whatnot, but maybe it’s something you’re familiar with. Daniel gives it a little bit of an attitude. I always thought that Daniel has had a funny attitude, a smug attitude almost, is how I could describe it. I’m not exactly sure of the best way to put it; maybe you’ve got an adjective for Daniel that you can throw out at me.
I wanted to mention that I’m using a Surface Pro 7, and I don’t know if anyone’s done a review of it, but I’d like to mention a few things that actually make it a fairly blind-friendly device, or at least easy to use as a blind person.
This is a tablet-type device. It has a touch screen on it and it’s fairly small for a laptop, but then again, I don’t necessarily need a large one. I wanted to describe the unit. On the left side, as it’s oriented towards me right now– I’ve got it standing up on its kickstand– there is nothing but a jack for a 3.5-millimeter plug. That’s for headphones, though a microphone can be plugged in there as well.
It uses the three-contact arrangement, like iPhone or Android headsets use, but the keys on the headset don’t do anything, so that doesn’t really figure into things. On the top of the unit, or probably on the side of it if you were using it in portrait mode, there’s a power button, and then a volume up and down key. Here’s one of the things that, as a geek, I find very interesting.
This is one of the easiest laptops I’ve ever had, where I could boot into a USB stick that might have Linux or some other operating system on it. All I have to do is power the unit down completely, do a complete shutdown, that is. Once the unit is totally shut down, I can insert a USB stick, hold down the volume down button, and press the power button. After a few seconds, release the volume down button and it boots into whatever operating system it finds on the USB stick. That is very easy to work with.
Most of them, you have to go into a menu and even figure out how to get that boot-up menu. In a few devices you may even have to go into BIOS and enable it. That was one thing I found nice. Let me just finish the description though. I digress, as they say.
On the right side, there’s a USB-C port, and underneath that is a regular USB-A port. I think it is USB 3.0, anyway. At the very bottom, there is a magnetic connection for the charger. Underneath the kickstand, hidden away, there is a slot for a microSD card.
Another thing that I like about this laptop is the little type cover that comes with it. There’s the keyboard and it’s a fairly standard, small laptop keyboard, but one of the nice things about it is, normally with most of the laptops, the first thing we all run into is how to get those darn F keys to become function keys because everyone seems to think that you want them to be media keys and of course, a lot of us don’t want that.
Well on this type cover, the nice thing is you don’t have to go into BIOS or anything else to get that changed. The only thing you do is tap the FN key by itself. You just tap it and that toggles the setting from function keys to media keys and back. If you don’t find that it’s doing what you want, you can just tap the FN key and that should fix things. One way I can tell if I’ve got it set the wrong way is if I press Alt+F4, everything mutes because that’s the mute button on the media keys. If I’m trying to close a program and I suddenly find I have no sound, I usually know why that is and can get it fixed fairly quickly.
That might be an inconvenience for some, but I found it very easy to work with. In fact, most of the time I use a USB keyboard that’s plugged in. It’s a wireless Logitech Wave keyboard because I like to use a full-size keyboard with a numeric pad and on and so forth.
Another thing it has that I find annoying is that mouse pad, of course, which I never use. It isn’t too hard to disable that because I can just go into Windows 10’s settings and Windows 11 will do the same thing. I can just find the touchpad in there and turn it off and that takes care of that. This device I have came with Windows 10. It came with eight gigs of RAM, which some would find on the cramped side. Not only that, it also has only 128 gigs of NVMe storage. Again, I don’t have many demands that I place on this unit so it’s doing what I wanted to do, and I’m having a lot of fun with it.
I bought it back in August 2020. I had to go back and look and see when I ordered it. It’s been a fairly good laptop. I have to be careful with it. Usually, I have it sitting on a desk or a table or something like that. It doesn’t really get moved around too much. I don’t carry it around with me for the most part, because I did have one case where I dropped it and it got a little scratch on the corner. I said, “No, I’d better be careful with this thing because it’s not the most rugged unit that I’ve ever had.” It’s probably not as rugged as, say, a MacBook Pro or MacBook Air, but I doubt you’d really want to give a serious drop on the concrete to either one of them.
Anyway, I thought I’d say hello, and congratulations on the presents that you got. I know you’re having a lot of fun with those by now. I hope that you get your flooding situation under control. Hopefully, you do have that under control by now. Water is good, but you want it to stay where you want it to be. You don’t want it to go in places where you are trying to live at the same time.
Oh, so one final thought I had. The other day, I was asking Alexa what time it was in a couple of places. I asked what time it was in Honolulu, then I asked what time it was in Auckland. It turned out that the time in Auckland was one hour behind the time in Honolulu. Of course, I know it’s a day ahead, so I was just thinking, “A person in Honolulu, if they had a friend in New Zealand, wouldn’t have to worry about what time it was. They would just have to worry about what day it was, because if they call you, say, on Sunday morning at nine o’clock, they’re going to get you Monday morning at eight o’clock. They might catch you on your way to work or something like that, whereas they would possibly be off work if it were a Sunday, depending on the schedule, of course.” That gave me a whole different perspective on the time difference.
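Joe’s observation checks out while New Zealand is on daylight saving: Auckland (UTC+13) is then 23 hours ahead of Honolulu (UTC-10), so the clock shows one hour less but the calendar shows one day more. Here’s a minimal sketch using Python’s standard zoneinfo module, with an assumed date picked from February 2022 to fall inside New Zealand daylight saving:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# 9:00 AM on a Sunday in Honolulu (UTC-10, no daylight saving)...
honolulu = datetime(2022, 2, 6, 9, 0, tzinfo=ZoneInfo("Pacific/Honolulu"))

# ...is 8:00 AM the next day in Auckland (UTC+13 during NZ daylight saving).
auckland = honolulu.astimezone(ZoneInfo("Pacific/Auckland"))

print(honolulu.strftime("%A %I:%M %p"))  # Sunday 09:00 AM
print(auckland.strftime("%A %I:%M %p"))  # Monday 08:00 AM
```

Note the shortcut only holds for part of the year: once Auckland drops back to UTC+12 in the Southern Hemisphere winter, the gap becomes 22 hours.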
Just touching base with you. I’m glad to see that the podcast is still going and Mushroom FM is still going. I hope everything goes okay for you all. Take care, and I’ll talk to you later.
Jonathan: Thank you so much, Joe. Always good to hear from you. Glad to hear you’re still out there yourself. Yes, Mushroom FM is still here and the podcast is still here. I’m glad you are, too. [chuckles] You asked for a description of Daniel. I think, particularly with the premium Daniel voice, the description that comes to mind for me is pompous. He does [chuckles] sound a bit pompous, but those Infocom text adventure games are great, aren’t they?
Thanks for the review of the Surface Pro 7, as well. I had the original Microsoft Surface, which came out in, I think, 2012. I got that when I was working with Freedom Scientific because we were looking at the whole tablet thing, running JAWS on a tablet. It was a really viable option even then. Of course, it’s become more powerful, it’s become more polished.
I got a Microsoft Surface laptop, I think, it was called. Their branding is quite confusing to me because you have the Surface Book, but I think the thing I had was the Surface laptop. It was the one where you could actually pull the screen off. [chuckles] Don’t try this at home. You could pull the screen off and the screen became a tablet. It was a very nice computer.
I thought I would buy one because it would be a bit like buying an Apple computer: if Microsoft develops the hardware and the software, it might be a really good fit. That was the first time I ever encountered the dreaded Realtek hibernation stuff with the audio. I called Microsoft Tech Support about that and they didn’t really seem to care very much. I didn’t appreciate at the time, because I had come from a Toshiba computer that didn’t do this, that the problem was actually quite widespread.
You would think that with Microsoft’s commitment to accessibility on their own devices, they would find a way to get around that horrible Realtek Audio driver hibernation issue that infests so many computers. I returned that particular laptop, but it’s really good to hear about the features that you like of that Surface.
Jonathan: I love to hear from you so if you have any comments you want to contribute to the show, drop me an email, written down, or with an audio attachment to Jonathan, J-O-N-A-T-H-A-N@mushroomfm.com. If you’d rather call in, use the listener line number in the United States, 864-606-6736.
[01:46:51] [END OF AUDIO]