Podcast Transcript: Mosen At Large episode 166, revolutionary blindness navigation with Biped, review of the WeWALK Smart Cane, ongoing iOS Braille issues and more

This transcript is made possible thanks to funding from InternetNZ. You can read the full transcript below, download the transcript in Microsoft Word format, or download the transcript as an accessible PDF file.

 

[music]

Jonathan Mosen: I’m Jonathan Mosen, this is Mosen At Large, the show that’s got the blind community talking. On the podcast this week, cutting-edge blindness technology called Biped, my review of the WeWALK Smart Cane, customer service that’s anything but service, and my spotty nephew has a guardian angel.

[music]

Jonathan: It’s great to be back with you. Thank you for making time for Mosen At Large in your week. I know that you’re busy, and if this is your first time listening to Mosen At Large, a special welcome to you. It’s our 166th episode, but I’ve been podcasting since all the way back in 2004. The industry has changed a lot, but a lot of the fundamentals are still the same. When I go to my podcast hosting company, which is called Pinecast, you may remember that we’ve spoken about Pinecast before and interviewed its creator, and I look at the number of people who listen to the show, the thousands of people from around the world and where they come from, I think, wow, and it’s a little bit overwhelming.

I try to just remember that every week, I’m talking to you, just you and I hope you’ve had a good week. By the way, if you would like to try your hand at this podcast malarkey, Pinecast is really good. It’s reasonably priced. It’s fully accessible and the developer, Matt, really cares about accessibility. You can score yourself a couple of free months of Pinecast while also supporting this show by using my affiliate code for Pinecast. You can try it obligation-free and if you want to do that, then when you sign up for Pinecast, when you’re asked to enter a referral code, you can just enter R-DAC7DA. Even my Apple Watch is excited about the prospect of you signing up for Pinecast. It wants me to breathe, calm down, no doubt.

That’s Romeo-Delta, Alpha, Charlie, 7, Delta, Alpha. If you do want to sign up to Pinecast, and you didn’t get that code, just drop me an email, I will send you the referral code because it’s good for you, it’s good for me, what they call a win-win outcome. Well, as you will hear in the show, water has figured quite a lot in my week yet again and I had an interesting experience last night, which means that I’m up quite early this morning. What I do with the show is I produce a little bit each day and I do a little bit in the morning and a little bit at night. That’s the only way with my busy lifestyle that I can make it all work.

I’m up really early this morning getting it polished off because last night I had quite an interesting experience. My wee treat at the end of the day with my evening meal is a glass of kombucha and I highly recommend a New Zealand company called Good Buzz Kombucha. I have no idea if they export their product overseas, but it’s great. Richard and I used to make our own kombucha and it was quite a bonding time. If you used to listen to The Mosen Explosion on Mushroom FM, you will remember the stories from Richard and me about making this kombucha.

One day, Bonnie and I were just watching Netflix minding our own business, next thing you know we hear this massive explosion from the kitchen and we were so traumatized by this. We actually said, “What the actual soup was that?” I’m sorry for the profanity, but man, and it turned out to be one of the kombucha things exploding, so that was fun. Now we just find it’s easier to buy the quality Good Buzz kombucha, which comes from a little place called Tauranga here in New Zealand and it’s really good. I have my glass of Good Buzz Kombucha and like a dutiful person, I was rinsing out the bottle before putting it in the recycling bin last night.

Maybe I wasn’t being mindful or something, but I turned the tap on a little bit too aggressively and my shirt got soaked with the water from the tap. No big deal. I thought nothing more about it until it was time to put my phone on charge for the night. I know there are all sorts of gadgets like wireless Qi chargers, and those MagSafe things, but I still like how quickly the iPhone charges when you use the big wattage charger and you plug it directly into the Lightning port. I connected my phone to the Lightning port and it made a little alert pop-up sound and I thought, that’s unusual.

Then when I checked the notification, that little dialog that had popped up on the screen, it said, “You have got water in the Lightning port.” Then they had an emergency charge override button thing, but it basically said we don’t recommend you do this, and so I did a bit of Dr. Google and I found the article on the Apple support knowledge base thing. Apparently, this is something that you get if you’ve got an iPhone XS or newer and there’s a bit of liquid in the Lightning port, and it’s designed to protect your phone. I thought, what should I do, because I’ve always heard you’re supposed to maybe use a hairdryer to blow it dry or maybe you put it in rice, all these things.

The Apple knowledge base article was really clear about that. It said, “No, do not do any of those things.” It said all you have to do is turn the phone Lightning port down and tap it gently to help the water do its thing and come out of the port. It said you can charge your phone using a Qi charger, and I do have one of those, otherwise, just give it time. That was stressful, man. I knew that it would probably sort itself out and it actually did. I left it, went to sleep, and when I woke up quite a bit earlier than I was hoping to wake up, which is why I’m in the studio at this very early hour, I thought, I’ll check it again, and lo and behold, it worked.

That’s pretty cool, isn’t it? Because there was a time when if you got liquid in your phone, it would trigger the secret liquid sensor that Apple had and they would say we’re not going to honor the warranty because there’s liquid damage or something like that. We’ve come a long way in terms of dealing with water on these devices but it certainly was something I wasn’t expecting when I just did my routine of putting the phone on charge overnight. Good on Apple for detecting it and letting me sort it and here we are again, problem solved. Crisis averted.

Time for more of the continuing story of the latest experiences people are having with Braille in iOS. I should say that I recorded the last episode, episode 165, a little earlier than I normally do because my weekend was taken up with the Disabled Leadership Now campaign, so I recorded it just a little ahead of iOS 15.3.1 coming out, in which they actually did make mention of the lock-up issue with Braille being fixed. Certainly, I have not had a lock-up in iOS 15.4 either with beta 2 onwards, so that is really good. As I put this together, I’m just starting to use beta number three, which is what we’re up to now.

Let’s get into a few responses on this issue because clearly, it pushes some buttons for people when their productivity or their safety is affected by issues like this. Starting off with Pete de Vasto, who says, “Hi Jonathan, I like so many others have also experienced the frustrating occasions where my Braille display freezes while working with iOS 15. I actually have two displays, a Focus 40 5th Generation that I got last year, and a Vario Ultra 20. Interestingly for me, the hang-ups only occur with my Focus display, not my Vario Ultra.

I’ve been following the discussion about how folks have gotten around the situation, which apparently so often involves doing a hard reset of your phone. When my Focus freezes, I have not had to reset my phone, believe it or not, all I have had to do is to exit VoiceOver and turn my display completely off then restart both. Here are the steps I take that have worked for me, and you should do them in the order I have listed as timing is important. One, turn VoiceOver off on your phone in the usual way by triple tapping either the side button or the home button, depending on your phone model.

Two, completely turn off your Braille display, not by putting it to sleep, but completely shutting it down. Three, wait about 30 seconds before continuing. Four, now first, turn your Braille display back on and make sure it is completely up and running. Five, finally restart VoiceOver on your phone as you normally do by triple tapping either the side button or the home button. After a few seconds to a minute, my Focus display once again is communicating with my phone and I can continue on from where I left off. Once again, these steps seem to reliably work for me, but they may not work for everyone. I’m sharing this with you in the hopes that perhaps even one or two others will benefit.

It’s possible that I may have missed a similar suggestion, so please forgive me if I’ve sent duplicate information. As others have also said on so many occasions, these podcasts are extremely informative, and the community does appreciate all the good and hard work you are doing. Keep it up.” Thank you, Pete, it’s good to hear from you again after all these years. Sadly, those tips didn’t work for me. I’m pleased to say the lockups are gone now and hopefully, they’re gone for everyone else, but I think there may be a difference in the degree to which this bug bothered people depending on whether they were using human interface devices or not.

It’s good to be able to talk about this bug in the past tense, yay, but I suspect it’s HID Braille displays that may have locked up more dramatically than non-HID displays. I could be wrong about that, but that’s possibly what the variable is there because the Focus is not using the HID protocol. Eden is back in touch and she says, “When I first started having the apostrophe problem, I was on the system table. I changed to Liblouis and no help. Then back to system default with bad luck. Today, I went back to the Liblouis, and now it works. Apostrophe looks normal again. Change it a couple of times, and maybe it’ll work for you as well.” I had already tried this and nothing has changed, but there you go.

Well, hopefully, that will help those still rocking 15.3.1. If you are on the 15.4 beta 3, I can report that I changed my Braille table back to the UEB system default when I installed beta 3 and it’s all good. It’s working normally again. The apostrophe is rendering properly. Are we happy about this? Yes, we are. Wow. Here’s an email from Nikki Keck. “Hi, Jonathan. I was so excited this week when the iOS update came out, touting prominently the fix for Braille displays. Yes, maybe they fixed the lockup issues, though the jury is still out for me on that.

However, the biggest issues that I have dealt with, as you rightly pointed out, are not fixed. I read Kindle books often with my Braille displays. I have a Brailliant BI 20X, and some Focus displays. Focus, meaning VoiceOver focus, not the Focus display, will jump either ahead or back without warning, and I’m left to find my place again as I’m reading. Also, I like to use the auto-advance mode in iOS. I have had panning buttons fail on me in the past, and I have this compulsion not to wear it out from overuse of the right panning key or thumb key on the Brailliant.

I like to use auto-advance in order to save wear and tear on my panning controls. It also probably doesn’t hurt in preventing carpal tunnel or other repetitive motion problems in my hands. At this point, auto-advance is completely unusable because of this tendency to jump forward or backward in the book I am reading. Another issue I have noticed with consistency and much frustration is the tendency for the Braille display to freeze for a few seconds. I was noticing this, in particular, today while trying to read Bible passages in the Bible Gateway app, though I have seen it in other apps while reading/panning through text.

Again, I can’t use auto-advance reliably at all right now so I was panning through text when every few lines the Braille display would suddenly freeze for no reason. I thought maybe it was locking up but after waiting a few seconds and pressing the right thumb key a couple of times, my Braille display finally started responding again. It moved me ahead the number of times I had pressed the thumb key trying to get the display to respond. Then I had to go back to where I was, which only required panning back once or twice.

However, it is really annoying, because it’s happening to me repeatedly today after three or four lines of text. I haven’t tried the new update with the Focus since it came out on Friday, but as I noticed the same issues before this update, and they are not fixed with the Brailliant, I doubt they have been fixed with the Focus. Also, since others are still mentioning them, I think the issues happen with all displays.

Personally, I wish Apple hadn’t gotten our hopes up by claiming the Braille display issues introduced in the previous updates have all been fixed unless they truly did so but that is just my opinion and probably won’t even get you a cup of coffee or tea. In any case, I just thought I had to share my experiences and frustrations since you encourage those of us having such experiences with these Braille display bugs to get in touch. Thanks again for the podcast. You always provide lots of information and help those of us dealing with these quirky issues to feel that we are not alone and/or going crazy.”

Thank you, Nikki. It’s great to hear from you. I have to say I had the devil of a job reading that email. I’m reading it using iOS 15.4 beta 3, and very frequently it was not panning when I was pressing the thumb keys, so I had to do a lot of editing because I would sit there reading away, push a key wanting to pan to the next line, and it just would not. Well, I don’t think I really need to say any more, but thank you for your email.

Advertisement: Like the show? Then why not like it on Facebook too. Get upcoming show announcements, useful links, and a bit of conversation. Head on over now to facebook.com/mosenatlarge. That’s facebook.com/M-O-S-E-Natlarge to stay connected between episodes.

Jonathan: Over 40 years ago, I was fortunate to be part of an experiment led by the world-renowned Professor Leslie Kay from the University of Canterbury here in New Zealand. Professor Kay led the field of research and development relating to Sonicguide technology for blind people. Despite being only nine when I started working with Dr. Ed Strehlow, who taught me the technology and observed my use of it, I still remember it clearly. The idea was that using a transducer, you could take the ultrasonic frequencies bats use to navigate and bring them down to a frequency hearable by humans.
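For the technically curious, the down-conversion Jonathan describes can be modelled as a simple heterodyne: mix the inaudible ultrasonic echo with a local oscillator and keep the audible difference tone. Here is a minimal Python sketch of that idea; the frequencies and the crude filter are illustrative assumptions, not the actual parameters of Professor Kay's device.

```python
import numpy as np

# Hypothetical parameters, purely for illustration.
SAMPLE_RATE = 192_000  # Hz, high enough to represent ultrasound digitally
ECHO_FREQ = 45_000     # Hz, an assumed ultrasonic echo frequency
LO_FREQ = 44_000       # Hz, an assumed local oscillator frequency

t = np.arange(0, 0.1, 1 / SAMPLE_RATE)  # 100 ms of signal
echo = np.sin(2 * np.pi * ECHO_FREQ * t)
local_osc = np.sin(2 * np.pi * LO_FREQ * t)

# Multiplying the signals yields a sum component (89 kHz, inaudible) and a
# difference component (1 kHz, audible); a low-pass filter keeps the latter.
mixed = echo * local_osc

# Crude low-pass filter via a short moving average, enough for a demo.
window = 9
audible = np.convolve(mixed, np.ones(window) / window, mode="same")

# Confirm the dominant surviving tone sits in the audible range.
spectrum = np.abs(np.fft.rfft(audible))
freqs = np.fft.rfftfreq(len(audible), 1 / SAMPLE_RATE)
print(f"Dominant tone after mixing: {freqs[spectrum.argmax()]:.0f} Hz")  # ~1000
```

In designs where the transmitted frequency is swept, the difference tone grows with the echo's delay and hence with distance, which would be consistent with Jonathan's recollection later in this piece that the pitch lowered as he got closer to an object.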

By the time I became part of the experiment, that technology was tried and proven. The device I was given was a sonic headband, which was connected to a box with a couple of switches for on and off and adjusting the volume. The headband had little tubes that you inserted into your ear, but they were thin enough that they didn’t block environmental sounds. I was one of a few kids chosen for this project. The idea was to see how young people who were taught to work with this technology from an early age would incorporate its use into their daily lives.

After some training, I found this technology very useful. I remember being put in an open area where a series of poles had been erected at random and my job was to navigate around the poles without any other mobility aid. I got very good at this. I got used to the different kinds of sounds certain objects made. Thin metallic poles sounded different from thicker wooden lampposts. Trees and bushes emitted a different kind of sound from buildings. I found that the headband gave me information that supplemented what I was getting from my cane and when I eventually attended a mainstream school, I wore it when walking to school for a while until the experiment ended.

I have very positive memories of ultrasound technology and, thinking about it now, I’m surprised that sonic headbands or glasses aren’t more commonly used today, especially when they could potentially be combined with a camera and paired with a smartphone. The ultrasonic space isn’t completely dead though. One player in it is the WeWALK Smart Cane. If you’d like to hear a demonstration of working with the cane and its companion app, you can listen to Episode 76 of this podcast in which Nick Zammarelli demonstrates the app as it was at the time of recording and takes the cane for a spin.

When I heard Nick’s review, it wasn’t one of those moments where I was bowled over and where I immediately thought, I have to have this, but I’ve kept tabs on the development of the WeWALK Smart Cane, following them on Twitter and installing the app. When they offered a 30% discount for Black Friday last year, I decided to take the plunge and try the cane for myself, so I’ll share my thoughts on the product, and if you have one, I’d welcome your thoughts as well. It’s true what they say: a website is a company’s shop window and can be a potential customer’s first impression of a company.

Sadly, the WeWALK website is not a shining example of accessibility. Given that this is a blindness product, it’s a particular concern. For example, at the time I’m recording these thoughts, the homepage contains items that are repeated more than once and a set of videos where the position of the play button doesn’t make it clear what video you’re about to play. Even worse, on two systems where I visited the site using Microsoft Edge running JAWS, none of the play buttons actually played any movies. I also found the ordering process ambiguous.

Having completed the order, I wasn’t certain I had successfully selected the length of cane I was after, so I had to include this information in the notes just to be sure. There’s a hardware and a software component to WeWALK. The hardware is of course the white cane, with the grip containing all the intelligence. The software, as you’ll hear in Nick’s review in Episode 76, is an app available for Android and iOS. If you want, you can download the app for smartphones from the usual places and pay a subscription fee to unlock the features. That way you can use the GPS functionality while working with any cane.

If you purchase the cane, this automatically unlocks access to the app and there’s no need to pay a separate ongoing subscription. Simply pairing your cane with a smartphone is enough to unlock the full functionality of the app, so it’s a very well-thought-through system. Somewhere on the WeWALK site, I read that once you had ordered a WeWALK Smart Cane, you’re entitled to have the app unlocked so you can become familiar with the app’s features while waiting for your cane to arrive. It was here that I first came across what for me has been the disturbing lack of responsiveness from the company.

Initially, I thought that given that I was already registered with WeWALK with the same email address as the one associated with my WeWALK order, they might process the unlocking of my app automatically. After waiting several days and concluding that perhaps this isn’t the way that it works, I used the contact function of the WeWALK app to ask if my app could be unlocked as described.

No one at WeWALK ever replied to that message, so it wasn’t until my cane arrived that my app became unlocked. Unfortunately, I have more examples of the company’s unresponsiveness later. Shipping was fast, the cane arrived quickly on the same day as my ThinkPad and indeed with the same courier. As regular listeners will know, I’m enough of a sad geek that I do tend to read every last word of any user guide for products that I own.

In this case, I have no doubt that I could have assembled the cane without any instructions. It’s super simple to do, and the assembly is well thought through. Still, it was great to see quickstart instructions in the box, including a Braille version. An ordinary but lightweight white cane is included in the box, and into the top of it you screw the cane’s grip, which has all the features that make it smart. This includes the ultrasonic sensor and the touchscreen.

Those smarts understandably require power, so there’s a battery in the cane’s grip that needs to be charged. It was disappointing to buy a new device in 2021 that didn’t contain a USB Type-C port. Except for Apple, which is on its own proprietary Lightning path with iPhone for now, every other device I travel with now uses USB-C, which makes charging them easy. To travel with WeWALK and charge it, I must remember to pack one of the older USB cables. Not only that, you’re probably going to have to remember to pack something else too, another regular non-smart cane. This is because the user guide expressly cautions you against using WeWALK in the rain.

Now while I would like the luxury of canceling all my appointments so I don’t get my WeWALK wet, that’s not viable. If you’re traveling with your WeWALK and you need to get somewhere in a downpour, you’d better have a backpack or a briefcase in which you can store your WeWALK or your regular cane. I think water resistance in a product like this is a must. Running the WeWALK app and pairing your cane with your phone is accessible and easy to do. I smiled when I did this because, for the first time in my life, I had a cane that required a firmware update.

Some of the English in the app is a little confusing, and I found myself waiting a long time for an update that I thought was happening but in fact was not, because the yes and no buttons for the update were ambiguous. Once I worked that out, I got good progress information as the cane updated itself. Weather permitting, you can simply use the WeWALK as a regular cane by not switching it on.

Frankly, if I want an ordinary cane that has no smarts, I find that I’m better off using my good old white cane, which used to be named Henry until my oldest daughter married someone by that name. I haven’t had the heart to rename either the cane or the son-in-law. The reason I’m better off with the ordinary cane is that the technology in the grip of the WeWALK makes this thing heavy. I really notice it. The grip is fatter as well as heavier and I don’t find it comfortable to hold. My arm gets sore after a few minutes.

The tip on the WeWALK is also different from what I’m used to and it seems to get itself stuck on uneven surfaces. I presume I can change that for a more conventional tip. If you want, you can also switch the cane on and benefit from its ultrasonic sensor without the need to even pair it with a smartphone. The ultrasonic sensor vibrates to tell you about obstacles you’re approaching at around chest height and above.

The entire cane vibrates but there are also little buttons on which WeWALK recommends that you rest your thumb to give you optimal feedback. When you’re using the cane for the first time, it’s going to feel like you’re being overloaded with a lot of information so you’re going to have to give this thing some time and use it strategically. By that, I mean that if you use the ultrasonic functionality in a very close environment like in an office or your house, the thing isn’t going to stop vibrating.

If you use it in more open environments and adjust its sensitivity to match how fast you walk, there’s no question that WeWALK might save you from a nasty bump on the head or thumps to the chest. I think my perception of WeWALK is influenced by the wonderful experience I had as a child with Professor Kay’s sonic headband because not only could I often tell with that device what an obstacle was by the sound that it was making, the pitch lowered as I got closer to the object. That was liberating.
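As a rough illustration of why sensitivity should track walking speed, here is a small back-of-the-envelope sketch in Python. WeWALK's actual detection algorithm and ranges are not published as far as I know, so every number below is an assumption.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees Celsius

def echo_distance(round_trip_s: float) -> float:
    """Distance to an obstacle implied by an ultrasonic echo's round trip."""
    return SPEED_OF_SOUND * round_trip_s / 2

def required_range(walking_speed_ms: float,
                   reaction_time_s: float = 1.5,
                   margin_m: float = 0.5) -> float:
    """How far ahead obstacles must be reported for a walker to stop in time."""
    return walking_speed_ms * reaction_time_s + margin_m

# A 10-millisecond round trip puts an obstacle about 1.7 metres away.
print(f"10 ms echo -> {echo_distance(0.010):.2f} m")

# A brisk walker needs more warning distance than a slow one, so a slower
# walker can use a lower sensitivity and suffer fewer spurious vibrations.
for speed in (0.8, 1.4):  # m/s: a slow stroll versus a brisk walk
    print(f"{speed} m/s -> alert at {required_range(speed):.1f} m")
```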

I don’t find WeWALK’s information nearly as useful, but it’s still useful, augmenting the usual feedback you get from your cane with information a regular cane can’t give you. It is very busy, since you’re picking up vibrations from things that may be at chest height or above, but also from things at ground level, with inadequate data to differentiate the two. You can also use WeWALK with any navigation app you like, be it blindness-specific or a mainstream app like Google Maps and Apple Maps.

When you use the entire package that is to say you have the cane paired with your smartphone and you’re running the WeWALK app, you can control the critical functions of the app either with a touchpad on the cane or with WeWALK’s voice assistant. The idea here is that you don’t have to be distracted by working with your smartphone, you have one hand free and obtain the navigation info you need via your cane. It’s a cool concept.

Basic navigation is achieved as you would expect by flicking right and left on the touchpad. There are other gestures as well including those requiring two fingers. To help you get used to the gestures, there’s an excellent tutorial section where you can practice them. It was at this point that I discovered that my touchpad is frustratingly unresponsive to gestures to the point that it’s just easier to use the phone.

I wondered if my unit might be defective, but googling this issue revealed that others feel the same way. I’ve used various touchscreens from a variety of manufacturers and I do know that some are more responsive than others, but this one is, for me, on the verge of simply not being fit for purpose. Another way to use the cane is with its voice assistant, which is actively under development.

Because it’s using Bluetooth to get the signal back to the smartphone which then sends the request to a server, it can be quite slow to respond at times but you can learn to make requests with it. If you’re in a high-traffic area or somewhere where there’s a lot of other noise, the voice assistant may not always be practical. The cane also contains a speaker for the output of instructions.

Because of my hearing impairment, I may not be the best judge of the utility of this, but I suspect that people with regular hearing may have a hard time hearing the speaker in a crowd so be prepared to use an appropriate headset that doesn’t obstruct your hearing in traffic. When you do, you need to pair those headphones with your phone, not the cane. I won’t spend too much time on the app since Nick’s demonstration of it is very thorough.

It has of course been improved since his demo which was recorded in 2020. For example, WeWALK recently announced a partnership with Moovit for better information about public transport. One curious thing about the app that I haven’t been able to figure out caused me to try and engage with WeWALK again with no success. When you search for a destination, you have several ways of navigating to it.

This includes walking, using various methods of public transport such as buses and trains, and rideshare services. Uber and Lyft are supported. We have Uber in New Zealand but not Lyft. We also have Ola here, but the app doesn’t support that. Interestingly I cannot find a way to specify that you are being driven in a private vehicle. Effectively, this is the same as using the Uber option except that you can’t get the directions unless you request a trip.

It’s possible that I’m missing something here because clearly many blind people have sighted spouses or family members or colleagues who might drive them. Of course, you can use other apps if you want directions in this scenario but if I’m going to invest time or create favorites and become generally familiar with the user interface, I want it to work for something as obvious as driving.

When I found this limitation, I had just received an email welcoming me to the WeWALK community and inviting me to ask any questions, so I replied asking about this and once again, never received a response. On Twitter, Jeff Bishop asked me over the summer how I was getting on with my WeWALK and I commented in my reply on the company’s lack of responsiveness. Now, this did get a reaction. It resulted in someone from WeWALK replying to my tweet and I yet again asked the question about an option for when blind people are being driven in private vehicles.

At the time of recording many weeks after that exchange, I am still awaiting a reply. There are a lot of good blindness GPS apps out there now. With the former Sendero app now being available free as GoodMaps Outdoors, I hope that WeWALK might consider releasing an API that would allow other GPS apps to support their technology. Since they’re bundling the app free with the hardware anyway, I don’t really see that this would lose them any revenue, and it may in fact be a profit center for the company if they license the API to other apps. After initially playing with it, the novelty of the WeWALK has quickly worn off.

When adopting new technology, there will always be pros and cons, and people will weigh those pros and cons differently depending on their own personal preference and lifestyle. As a WeWALK user, I must remember to charge my cane. I have to put up with the unwieldy weight that literally becomes painful quite quickly. The sensor doesn’t give me the specificity of information I had with sonic devices over 40 years ago. I think there are better GPS apps out there for the blind community. I have to conclude that WeWALK doesn’t meet my needs. Your mileage, of course, may vary and I’ll be interested to see what future generations of this product might be like.

After all, I didn’t use the first accessible iPhone because of the limited range of input options, but that was soon rectified and now the iPhone is a critical part of my life. Who knows what WeWALK might be cooking up for the next generation of this product. A better touchscreen and water resistance surely must be right up there on their list of priorities. The thinking behind it is sound and exciting. They’ve got a lot of backing, including from Microsoft.

The white cane has been wasted real estate for too long. I hope good things are in store for this product and other smart canes that are being developed. If you want to find out more about WeWALK, you can go to wewalk.io, that’s W-E-W-A-L-K dot I-O. Of course, you can check out episode 76 of this podcast.

Advertisement: Jonathan Mosen, Mosen At Large Podcast.

Jonathan: Listeners have questions, listeners also have answers, and that’s one of the cool things about this podcast. Jim O’Sullivan is the first to reply to David Harvey’s question about where the deuce do you get South African cricket on the radio these days? He writes, “Hi, Jonathan. 1 World Sports Radio now have the online rights for international cricket commentary from South Africa. I find their YouTube channel is the best place to listen. The commentaries also get archived on the YouTube channel.

I assume the forthcoming series between New Zealand and South Africa will be on Magic Talk from New Zealand. I find when the cricket is on, Magic Talk is geo-blocked here in the UK and I need to turn on my New Zealand VPN to be able to listen. Handy things, those VPNs. I’ve also briefly used that station’s app called Rova. It’s spelled R-O-V-A. On Android, this app seems accessible, but I did need to use a VPN and turn off my location on my phone to play the cricket.” So says Jim in Southampton in the UK.

Thank you, Jim. It will be the last cricket they ever broadcast because Magic Talk is not long for this world. Magic Talk is closing down on the 20th of March. It is being replaced with a new talk station and I don’t think it’s at all clear that they will retain the cricket rights. To the best of my knowledge, they only had a two-year contract with New Zealand Cricket and that two years is up. I think what you might find is that going into the next season, the SENZ Sports Network will try to have the cricket rights, but I guess that remains to be seen.

Look at this, it is Stephen Jolly from Australia. He says, “Hello Jonathan. I did happen to hear the contribution last week from David about South African cricket. Sadly, Radio 2000 appears to have abandoned this public responsibility. I did tune in on the evening of Boxing Day anticipating an evening of relaxation, catching some of the South Africa versus India series following a day of MCG Ashes.”

I’m sure you were delighted, thrilled with how the Ashes went, Stephen, although I do know that even some Australians wished it might have been a little bit more of a contest. Too much cricket, says Stephen, is barely enough. Right on, man. When to my surprise, it wasn’t on. Oh no. An internet search turned up a couple of articles which explained Radio 2000 have decided that long days of cricket commentary would not be compatible with their move to more commercialization, whatever that means.

I was pleased to note that in expressing their disappointment over this, Cricket South Africa regretted that the change would be very disappointing for many blind people in South Africa who appreciate radio broadcasts of cricket, so not good. Let’s hope there is a change of mind from the management of Radio 2000. On the brighter side for David and followers of South African cricket, I believe the Magic Talk coverage on the Rova app of the current NZ versus South Africa series will be available without geo-blocking.

By the way, like you, I have for years thought it would be good to have some one-stop place for subscriptions to international cricket radio commentaries. Best to all the Mosen At Large community. Thank you very much, Stephen. It’s been a disappointing summer for cricket because of course New Zealand and Australia were due to do a bit of playing and that has been abandoned due to the Rona.

Tim: Welcome to the fourth episode of Tim’s New Hearing Aids. Before we dive into the Oticon More receiver-in-canal hearing aids, which I’m currently testing, first a small reaction to Jonathan’s comments regarding Oticon’s philosophy, which we seem to agree is that the hearing aid is supposed to transmit as much sound as it possibly can to the human ear, of course with reasonable adaptations, but without excessive manipulation, under the assumption that the human brain can filter out the sound it needs itself.

I said that this philosophy is more suited towards advanced hearing aid users and Jonathan rightly remarked that this philosophy is especially suited towards blind hearing aid wearers because, unlike sighted people for whom all that annoying background noise can just distract from the main goal of the hearing aid, which is to understand speech, the goal of a blind hearing aid user is also to hear this annoying background noise because that provides him essential information about his environment, which he does not get through his eyes.

This reminded me of a workshop from Dan Kish, who’s a world-famous echolocation expert, in 2017 in Brussels, and the one thing that I took home from that workshop is that humans, blind and sighted, understand their environment by integrating information they get from all their senses. Sighted people might think that they only use their eyes and of course, they rely a lot more on their eyes than blind people, but that’s not true, as you’ll notice if you walk with your ears closed.

I’ve tried that. That’s simple for me to do: I just turn off my hearing aids. I will get somewhere based upon my usable vision, but I feel far less confident. I just miss a lot of information. An even clearer illustration of this: with my usable vision, in another workshop, I was able to drive quite safely on a mobility scooter as long as I didn’t drive too fast. There was also a simulator of a mobility scooter in which I only had to rely on my sight.

As my sight is very limited, I crashed the virtual mobility scooter in a couple of seconds. This very clearly illustrated that besides my usable vision, I rely on the haptic cues of movement, plus auditory cues to control that mobility scooter. Anyway, at the risk of sounding like an Oticon marketing guy, I would say that even sighted people need this annoying background noise far more than they consciously realize.

Investing in learning to appreciate a richer, less filtered sound and learning to filter out everything you don’t need is worth the effort. On to the review of the Oticon More receiver-in-canal hearing aids. As discussed, the Oticon app is quite accessible but, as anticipated, the rechargeable batteries of the Oticon More hearing aids are not adequate for my needs. I think that with moderately intensive use, the batteries will last me 8 to 12 hours, which in practice means that if I traveled to an event which lasts for even a normal business day, I absolutely have to bring my charger and charge my hearing aids in the middle of the meeting or on the way back.

As it is, I have to carry around far too many items and I sometimes forget or lose items which can have really negative consequences. This poor battery performance is probably going to be a reason for me to stick with traditional behind-the-ear hearing aids. If there’s a rechargeable hearing aid that lasts me two to three days, like the Signia rechargeable behind-the-ear hearing aids, which I discussed in the previous episodes, it’s okay because then at least I’m sure that if I remember to charge it at home, it will survive the day, but if I can’t even be confident that the hearing aid will survive for a long day, then it’s just not acceptable for me because going home with dead hearing aids is a safety issue.

Picture, I return home late at night very tired, I’ve missed the bus and I need to walk 15 minutes on dark roads and I don’t properly hear the cars. It could literally kill me. It’s not an acceptable risk. Concerning charging the hearing aids, the charger and the hearing aids themselves have LED indicators, but compared to the Signia charger, the advantage is that you are very unlikely to misplace the Oticon hearing aids in the charger. If you put them in correctly and you’re sure that the USB cable is plugged in, you can be very confident that they will actually charge.

Besides the battery issues, my experience with the Oticon More is positive. I will give a lot more details in the next episode, but I can already say that the receiver-in-canal hearing aids now work for me. There are no issues with feedback. The connection to the iPhone is at least as good or perhaps slightly better than the connection is with my Opn hearing aids. It’s a slightly improved version of the Opn, I would say. There was quite a bit of hype around the More, that it was going to be revolutionary, but so far, I’m satisfied. It’s an upgrade, but I’m not noticing a lot of difference, but again, I will dig into this much deeper next week.

Sally Britnell: Hi Jonathan. I really enjoyed today’s podcast. I really liked all of the tech information, which was quite cool. There’s two things I want to talk about, some tech innovations I’m working on and have found some interesting applications for, and my hearing aids, because I really enjoyed the discussion about the hearing aids. I have the Starkey Livio AI 2400, I think it’s the Edge version. Anyway, my audiologist spent quite a long time researching what would work the best with an iPhone for me at that time and came up with these having the best connectability and feature set, because I particularly wanted to try to get it talking to some of the tech I have that isn’t just my phone.

It does have some interesting features. You can control cutting out a lot of noise, more than I could with my old Made for iPhone LiNX hearing aid. I quite like the fact that you can cut out a lot of the things like background noise, hissing and adjust those settings more than you could in the others. The feature that I wanted to talk about kind of goes along a little bit with what John, I think his name was, was talking about, where it uses AI. It will translate between languages for you and it will do speech to text in the hearing aid app.

The other thing it has is a falls alert, which I don’t use, a wellness score, which I don’t use, and using AI and a connection to, I think, the Soup Drinker API, it will have a voice assistant and they call it the Thrive Assistant. I actually think it’s just Soup Drinker in disguise and I’m keen to actually go and have a look at the API and see what it does myself. There is one disadvantage I have found to this in that it gives you voice feedback in one ear like many of the speakers do. I’ve tried Apple, I’ve tried Soup Drinker and I’ve tried Google and it’s fairly comparable. The problem I’m having though is the voice assistant, or the Thrive Assistant as they call it, you can’t adjust the volume of that.

At the moment it’s only playing it into one of my hearing aids, and it’s my worst ear, and it’s playing at a very low volume. I can’t actually hear what it’s saying in my hearing aids. There seems to be no setting in the app to adjust the volume of the Soup Drinker integration. Having said that, it does say that it’s beta. What I think I might do is actually contact them and have a discussion and see what happens from there and I’ll let you know. The second thing I wanted to talk about was general disability and accessibility work that I’m looking at doing in research.

I’ve made some really interesting connections lately about this and that is looking at analyzing a scene using computer vision and giving recommendations of how to improve accessibility of an environment. I’m starting to work on that research now. If any of your listeners are interested and have any ideas, feel free to get them to contact me. I don’t mind giving my email address out. It’s sally.britnell@aut.ac.nz.

I’m planning on applying for MBIE funding this year to have a look at that. Cross fingers on getting the funding, that’s the problem, but anyway, there’s a lot of things going on in that arena. I will update you more on some of the other projects as time goes by. I’d love to hear more about hearing aids and some of the emerging tech that’s out there, because I think with a lot of the emerging tech, we could actually quite easily do some cool things in the university setting, or the students I supervise could.

Jonathan: Thanks for another interesting email, Sally. Wow, it sounds like Starkey is doing some innovative stuff. I have not used a Starkey hearing aid. That is one manufacturer that I have not tried. It’s interesting to hear about the work that they are doing with voice assistants. The Oticon aids have IFTTT integration. For those not familiar with that, it stands for If This Then That, and it allows you to link apparently disparate things together. We use it on Mushroom FM, we actually use it on Mosen At Large as well to get blog posts out to various places, and the hearing aids use it.

Soup Drinker supports IFTTT as well, and that means that if I say to the Soup Drinker, “It’s time to watch TV,” then it sets my hearing aid to the TV adapter program, but it is good to talk about hearing aids and all the technology out there. As far as I’m aware, it seems to me based on what I’ve read and heard that the most open hearing aids in terms of standards are the new Phonak ones, which support good old Bluetooth.

They kind of come up like a Bluetooth headset. That means that you can pair with all sorts of things. I don’t know how well they are working with iPhone these days, but there’s a lot of good innovation happening in the hearing aid space. It’s just a shame that certainly in New Zealand if you want to have the government pay for these hearing aids, you only get it every six years or so now.
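For readers curious what If This Then That looks like in practice, here is a minimal Python sketch of the trigger-action pattern Jonathan describes, where a spoken phrase switches a hearing aid program. This is not IFTTT's or Oticon's actual API; real applets are configured through IFTTT's own interface, and all names below are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Applet:
    trigger_phrase: str         # the "this": a phrase the assistant hears
    action: Callable[[], None]  # the "that": what happens in response

def switch_program(program: str) -> Callable[[], None]:
    def action() -> None:
        # A real applet would call the hearing aid vendor's cloud service.
        print(f"Hearing aids switched to the {program} program")
    return action

applets = [
    Applet("it's time to watch tv", switch_program("TV adapter")),
]

def handle_utterance(utterance: str) -> None:
    # Run every applet whose trigger phrase appears in the utterance.
    for applet in applets:
        if applet.trigger_phrase in utterance.lower():
            applet.action()

handle_utterance("It's time to watch TV")  # -> switches to the TV adapter
```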

Advertisement: For all things Mosen At Large check out the website where you can listen to episodes online, subscribe using your favorite podcast app and contact the show. Just point your browser to podcast.mosen.org.

Jonathan: In episode 165, we had what I think is a really important discussion that was initiated by Scott Rutkowski. In his message he asked the question, what do you do when you’ve got some sort of problem and you can’t break through that frontline support that doesn’t seem to understand your issue or take it seriously enough. Scott was looking at this from an accessibility point of view, and indeed what I want to tell you about starts off first with an accessibility problem, and then it degrades from there, but it really is a problem that I’m seeing these days where you get these companies that try to trim costs to the bone and can’t give you a good quality support experience when you need a resolution.

Last week, I recommended Menulog. Now, Menulog is an alternative to Uber Eats that is available in Australia and New Zealand. It’s been okay from an accessibility point of view, but I never really used it until Uber Eats had a few accessibility issues. I thought I’ll see what else is out there. I checked the Menulog app, and I found to my delight, as I said last week, that it is almost a hundred percent accessible. It really is very good to the extent that I’m sure somebody has actually gone to some trouble to make sure that it’s accessible.

Last week I recommended it and said to people in Australia and New Zealand, if you are wanting a bite to eat and you’re finding the Uber Eats experience a bit frustrating, give Menulog a try. Apart from the accessible experience, one big advantage for us in the suburbs of Wellington is that the Menulog delivery area seems to be much wider, so we can get a lot more restaurants delivered than we can through Uber Eats. I know a lot of restaurants don’t want to be on Uber Eats because of the cut that they take. I don’t know about the way that Menulog works, but given that there are quite a few more places on Menulog, maybe they treat their restaurant partners a little bit better than Uber Eats does.

What’s not to like? It’s just as well that I’m not a superstitious person though, because almost the moment after I recorded that recommendation, things started going downhill really fast with Menulog. The very lunchtime after I put that little piece together, because I recorded that section early in the morning before my job started at about 6:00 AM, I recorded it, and then at lunchtime, I went to order something from Menulog and I found that I could not complete the order because a CAPTCHA had been installed. Literally, it wasn’t there the day before, and then there it was, impeding my ability to place my order.

Now, it did have the checkbox that says I am not a robot, that’s the Google CAPTCHA feature, I believe, where if it detects that you’re running a screen reader, it tries to simplify the experience by letting you check the box. Now, I think what was happening is that because I was in some browser window in-app, I couldn’t check that box no matter what I did. I couldn’t double tap on the checkbox and have it toggle from unchecked to checked. I did try and turn VoiceOver off and try tapping in just the right place, but I suspect what’s happened here is that that checkbox is so small, it can be quite difficult to do that, and I wasn’t successful in doing it.

I finally managed to get past the audio CAPTCHA after several attempts. I am hearing impaired and I find those audio CAPTCHAs very difficult. They are garbled for a reason, in the same way that visual CAPTCHAs are garbled. It was not a good experience, and it was just so weird that suddenly it appeared. I checked on Bonnie’s phone, same problem there. The CAPTCHA was on her phone. It wasn’t something to do with my device. I seem to recall in the back of my mind that there was a web user interface for Menulog as well. Actually, that is really accessible too, so I went to the Menulog website and sure enough, there was the CAPTCHA.

Now the good thing about this is that the I am not a robot checkbox did work and I was able to complete my order on the website. I was curious about whether this CAPTCHA was there to stay and what the reasons for its implementation were. They boast on the Menulog app and website that if you have an issue, you have a problem, you have a question, you can call their friendly human team anytime and get an answer, so I called the number. Now every time I have called the number, and I have needed to several times as the saga unfolds, this is what happens.

Operator: Welcome to Menulog. Calls are recorded for quality purposes. If you are a restaurant partner, press one, if you are a Menulog customer, press two.

Jonathan: I’ll go ahead and press two.

Operator: Unfortunately, we are unable to answer your call at this moment as we are currently experiencing very high call volumes. We apologize for any inconvenience caused. Please email your inquiry to enquiries@menulog.com, and we will endeavor to respond to you as soon as possible. This call will now end.

Jonathan: It doesn’t matter what time of day you place the call; it doesn’t matter how regularly you try to place the call, you are not going to get through and talk to anyone. I followed the advice and I emailed enquiries@menulog.com. In the subject, I put, “Suddenly being presented with a CAPTCHA when trying to check out,” and I wrote this, “Hi, I am totally blind and hearing impaired, and I use Menulog on my iPhone in conjunction with VoiceOver, a screen reader built into all iPhones.

I have been using and enjoying Menulog recently, but today I have been required to complete a CAPTCHA before placing an order. This has happened to me twice now. This is causing me considerable difficulty. I am wondering why this CAPTCHA has suddenly appeared and whether it can be disabled for my account, given the accessibility problems it causes. While an audio challenge is offered, my hearing impairment makes that difficult to complete. If this CAPTCHA is going to continue, I may have to return to using Uber Eats, which I would prefer not to do. Thank you for reading.”

I received a reply within a few hours of sending that message and it said, “Hi, Jonathan, thanks for getting in touch with today.” That is what it says. It continues. “The CAPTCHA was installed for a reason. We understand that this has been inconvenient for you. We have forwarded your concern to our appropriate team for review. You should expect to hear back from them in 48 hours. Thank you for choosing Menulog and have a great day.” I think that’s a reasonable response. Clearly, they understood that this needed to be escalated and they acknowledged that the CAPTCHA was installed for a reason. They acknowledged the existence of the CAPTCHA. That’s the important thing in the context of what comes next.

I presume there’s some ordering scam going on or something like that, and they were trying to circumvent that. The next day actually, so far less than 48 hours, I got a response from their head office and it says this, “Hi Jonathan,” I won’t give the name, but he gives his name, from the solutions team at Menulog head office in Sydney. “I have been passed on with your concerns regarding the CAPTCHA request when ordering. We apologize for the immense inconvenience this is presenting. However, it seems that the CAPTCHA request may be triggered by the VoiceOver app or another phone app that you may have installed, therefore falling out of our control. Thank you for choosing Menulog and we hope you have a great day.”

All this meditation that I’ve been doing had to help, and so I took lots of deep breaths and walked away from the computer and thought about responding to this, because this is absolute nonsense. It’s absolute nonsense. I finally wrote back and said this, “Hi, name redacted. Thank you for taking the time to reply. It is disappointing that you would suggest assistive technology is somehow the cause of an issue of Menulog’s making, and I suggest that this issue be escalated to someone who actually has experience with VoiceOver.

I am an IT professional, and I can tell you with certainty that VoiceOver or some other app is not the cause of the CAPTCHA. In fact, when I raised this issue yesterday, a member of the support team told me, and I quote, the CAPTCHA was installed for a reason. The CAPTCHA was occurring in two places for me, when I use the iOS app and also when using the website on different Windows-based devices.

If you are saying this is not a general change, then the only thing I can think of is that this is being applied to my IP address, which is static. As a technology professional, I can tell you that the developers of the Menulog app have clearly gone to some lengths in recent times to make the app accessible. I have been using it effectively for some weeks. It was only yesterday, on multiple devices, that the CAPTCHA occurred. Having just checked this issue again, I am relieved to find that the CAPTCHA is no longer occurring on either device. I’m really hoping this remains the case.” The next day, it certainly remained the case.

We had Heidi and Henry over here, my daughter and son-in-law and we were in major crisis mode. As I have mentioned on this podcast before, we’ve had some issues with flooding. Last weekend we got hit by some rain and ferocious wind as a result of a tropical cyclone, and we experienced heavy flooding this time. The previous week, we had a lot of heavy rain in Wellington and we survived that unscathed inside because we did a bit of drainage unblocking work last year. As I think I mentioned in the past on this subject, one of the problems we’ve got is that there’s just been a wee bit of erosion in terms of the lip that goes up from the street to our internal garage.

It’s just enough that it seems to have upset the equilibrium of the world, and water is now flooding in and coming into our house. We do need to do some more work on this. It’s actually quite difficult to find a tradesperson to do this at the moment in Wellington, and it is going to cost a bit, but we need to get it done. Anyway, the water was so bad, we had to place a bit of a mercy call to Heidi and Henry. They stopped off and bought a water vac that we paid for. We did some sandbagging and generally tried to remediate some of the issues that we have been experiencing, so it’s not pleasant.

At the end of all of that, they definitely deserved being fed. We initially looked at Uber Eats, it said there were no drivers available. I went back to Menulog, the CAPTCHA was still gone, and the kids wanted KFC. That’s fine. I had some healthy salmon that I was going to eat, so they were quite happy with their KFC. We ordered the KFC from Menulog at, I would say, about 11:30 to 11:35 AM, and the estimated delivery time was set to between 12:05 and 12:25. Now, there seem to be two categories of restaurant on Menulog. One is where Menulog is using its own drivers. The other is where the restaurant delivers. I believe KFC is in that latter category where they do the delivering, but 12:05 came and went without any update on the app. Normally the updates are pretty frequent and accurate.

12:25 came and went, and it was still saying estimated time between 12:05 and 12:25. 12:35, 12:45 all came and went, no food. By this stage, they’re all coming into my office where I was working, even on a Sunday, saying, “Where’s the food, Dad? Where’s the food?” I’m saying, I don’t have any updates.

Finally, I go rummaging around in the Menulog app. There is a button that you can press that lets you text the restaurant in a situation like this, where they are doing the delivery, to ask where the order is. I pushed that button and it said, “You should hear a response within about five minutes.” Well, I waited my five minutes and I didn’t get a text response back from the restaurant. By this stage, it’s 1 o’clock, we’ve given up on the idea that the KFC will ever materialize.

Henry goes out in the pouring torrential rain and ferocious wind and picks up McDonald’s from the local McDonald’s, which is now what they’ve decided they want. I then call KFC. You can call them from the app and this thing rings and it rings and it rings. I leave it ringing for some time because I appreciate that those places can be busy, nobody ever picks up.

I start looking for ways to seek a refund of our order. This is pretty easy to do in Uber Eats, and I’ve actually found, maybe it’s just how often we use Uber Eats, that they are really good about refunding our money. I have not had too many issues with getting a refund from Uber Eats.

While I’m doing that, I’m surprised to see that after all this time where the delivery ETA was stuck, it suddenly says that it was delivered at 12:10 PM, which it most certainly was not. We had two sighted people looking out the window for a vehicle. We had the Ring video doorbell, which records no motion at the door during that time. We had two sighted people who quickly went out in the rain and checked any weird places that it might have been delivered. There was no food, no food was ever delivered. It’s pretty suspicious that about 50 minutes to an hour after they claimed it was delivered, suddenly the status changed to delivered.

I call KFC again and there’s no answer from them again. I’m encouraged when I go through the process to call the friendly humans at Menulog to talk to them about this issue. I do, and I get the message that you’ve already heard, because that’s all I’ve ever had when I’ve called that number. Finally, I find, deep in the bowels of the app, that they have an online chat function. I go in there and I find that it is not accessible with VoiceOver. When you try and double tap on the edit field, you can’t get it to have focus. I did a little bit of tapping around with VoiceOver off as well, but I wasn’t successful.

I go to my Windows computer, log in, and after a lot of ferreting around, I finally find the online chat, which is super accessible under Windows, even to the extent that it makes a ping every time they send you an incoming chat message. I start talking to them about the fact that we don’t have our food, and before I can go any further, they make me confirm my first and last name, my street address, my phone number and the story of my entire life, which is ridiculous because I’m actually logged in with my account to initiate this chat. Anyway, I go through the motions, then they say, “We’re sorry,” all that kind of speak that these companies have in their chat services. “We will give KFC a call and we will talk to them about what’s going on.”

I say to them, “Well, I hope you have more luck than I have, because they don’t pick up their phone.” I’m waiting in the chat for about five minutes, and finally, ping, I get the message back. There’s the Menulog person who says, “I’ve tried calling KFC but they don’t answer.” I say, “Yes, I know. Can I have my refund please?” They refuse to give me the refund, which is like $34. It’s not a huge amount of money, but by this stage it’s the principle of the thing. They say KFC’s got 48 hours to respond to this, because they need the restaurant’s confirmation that it didn’t deliver the order. Essentially, they’re doubting the customer.

They’re saying it’s possible that you, customer, are pulling a swifty. Well, okay, I suppose some customers do. They say they’re going to email KFC, and then they will consider what to do next. I say, this is fine, but you need to understand that I have purchased goods from Menulog. Menulog is the entity that I have a contract with in this situation. You promised me that I would get the food and I haven’t got the food. Therefore, under New Zealand consumer law, you really need to just make this right and refund me. She comes back and says, “I understand, we’ll be in touch.” I say, “Will I hear via email?” She says yes, and before I have any chance to say anything else, she exits the chat. I might not like it, but I’ll follow the due process.

48 hours go by, and of course I haven’t heard a peep from Menulog. Around about, I would say, 54 hours after the thing was supposed to be delivered, I send another email. The interaction that I had on the Sunday was on chat; now I’m sending an email, because I tried to call them again, of course, and what do you think I got? In the subject line of the email, I put “Still seeking a refund for order number”, and then I gave the order number. Then I said this: “Hello. We placed the above order on Sunday. The estimated delivery time remained stuck for a very long time, well past the end of the window. I tried using the option to text the restaurant provided in the Menulog app. I did not receive a reply. I then tried to call the restaurant, but they did not answer. Finally, around 90 minutes or more after placing the order, it was marked as delivered, but we did not receive the items.

We have a video doorbell and can confirm no one delivered the order.” I didn’t want to mention the two sighted people, because we clearly understand that this is the kind of company that blames blindness or blindness technology for things that have nothing to do with them. Then it continues. “We used the Menulog online chat. The Menulog representative also tried calling the restaurant and received no reply, but she was not prepared to refund my order without confirmation from the restaurant, who will not answer, that they hadn’t delivered it.

She said she would email the restaurant and they would have 48 hours to respond. She promised me I would hear back via email. That 48 hours expired a few hours ago. We have not heard anything. As we did not receive what we paid for, we would like a prompt refund of our order please. Thank you for your help.” It’s a courteous email, and I gave them all the information they need, because I gave the order number. I also sent it from the email address associated with my Menulog account. Finally, roughly 36 hours, maybe a little less, after sending that message, I get this back.

“Hi Jonathan. Thanks for your prompt response. We’re sorry to hear about your experience with your order. Please allow us to investigate further by responding to this email with the following details: nine-digit order ID, complete address linked on the order, phone number indicated on the order. Did you check if there was food left outside your door/porch/in? We can assure you that we take all feedback seriously. Once we hear back from you with the above information, we will investigate this further.” Now, this is ridiculous, because the order number was right in the subject line.

By the way, it’s eight digits, not nine. Given that I have emailed from the email address associated with the account, the order number will provide all of the information they’re asking for. I don’t know what purpose they hope to serve by making me repeat the street address and the phone number, given that I’m emailing from the address through which the order was placed. Nevertheless, I played the game. I wrote back and I provided all of that information. I said to them, “If I don’t get my refund for the items I did not receive within 36 hours of sending this message, then I am going to begin the dispute process with my credit card company and get the transaction reversed.” Just before that deadline was to expire, I got an email from them offering me, finally, either a refund or a $35 credit on my Menulog account.

I thought, well, I’ll take the credit, because that’s going to be the easiest way. It’ll be resolved and I will eat there again. I thanked them politely for resolving the issue, and didn’t even mention all the rigmarole I’d been through to get it resolved. Menulog had the last laugh, though, because I said to Bonnie, “Let’s celebrate, let’s have dinner with Menulog now that we are going to use it again,” because I wasn’t going to give them any more money with this hanging over me. I went to Menulog and found that in the few days we hadn’t been using it, the geographical boundary had shrunk to about the same as Uber Eats. That means that all these central business district restaurants that we were enjoying so much are suddenly not available to us.

Now, who knows whether this is going to last, whether it’s like the CAPTCHA and it’ll all be okay tomorrow, or whether, for whatever reason, the access to restaurants has been severely curtailed since we started this whole process. As you might appreciate, I’m not perhaps as keen on Menulog as I was when I did the last episode. Customer service really is not alive and well in some parts of the interweb. The margins, I guess, for the things that these companies are doing are so small that they just can’t afford to provide good quality customer service. If they are going to provide customer service, you would think that they would be equipped to just make it right for the customer. It’s not hard.

Advert: What’s on your mind? Send an email with a recording of your voice, or just write it down. Jonathan@mushroomfm.com. That’s J-O-N-A-T-H-A-N@mushroomfm.com. Or phone our listener line. The number in the United States is 864-60 Mosen. That’s 864-606-6736.

Jonathan: Many of us say, myself included, that we want to be the first in line whenever an autonomous vehicle is released that a blind person can purchase and use. We’ve got some way to go yet, but there is technology in the offing, available as early as later this year, that is using some autonomous vehicle technology to help blind people navigate. It was a bit of a talking point at the recent Consumer Electronics Show. When I read the stories about this, I thought that we should find out more about Biped. To tell me about it is one of its founders, Maël Fabien, and it’s great to have you on the show. Thank you so much.

Maël Fabien: Thank you so much for having me.

Jonathan: How did you get the idea of getting into the blindness space and doing a product like this?

Maël: I come from the field of autonomous driving, mostly research in artificial intelligence. I was actually doing research in that field, and on the other side, I was living quite close to the ophthalmic hospital in Lausanne, where I had the chance to meet a couple of blind and visually impaired people, and mobility trainers too. I was, I would say, getting quite familiar with the field. At some point, it became quite obvious that I should just bring these two together. I started looking up on the internet what existed as solutions, figured out there was not that much going on in this direction, and saw that there would be a chance for us to build something valuable.

Jonathan: Since you have some knowledge of the autonomous vehicle industry, I’m really interested to get your take on how long you think it will be before a blind person gets into their own autonomous vehicle that they have purchased for themselves.

Maël: Yes, I would say a couple of years, as vague as that is. There are ongoing tests by a couple of companies, including Cruise, right now in San Francisco, with fully driverless systems. At this moment, the car is able to pick up a person, drive the person to their destination, and then drop them off. There is still the problem of identifying the position of the user precisely enough to make it very accessible, how to get in the car, but I would say it’s probably a matter of, at best, a dozen months, maybe just a couple of months, before we start to see actual tests that really include taking a blind person from one point to another using a driverless autonomous vehicle, and then a couple of years for commercial applications would be my guess.

Jonathan: That’s exciting. It sounds like the technology is getting there, but is society ready? Obviously, these cars are going to have to make some difficult decisions, the classic trolley problem, for example. What do you do when somebody runs out in front of the vehicle, and you can choose to either swerve and kill the occupant of the vehicle, or swerve another way and save the life of, say, a toddler who’s run out into the street? Obviously, because this is a computer, it is going to have to make a conscious decision. It’s programmed to do something.

Maël: Absolutely. It’s also one of the reasons why we raise the standard so much in terms of safety when it comes to letting an autonomous system drive you around the city, because we would never tolerate a system that performs merely as well as a human driver, for example. There’s definitely a trust issue, and there are definitely, at some point, ethical concerns regarding the final decision that the system can make in very urgent situations. I think, in general, the more transparent this industry gets, and the better the systems get, the more likely it is that we are heading towards mass adoption at some point.

Jonathan: In the meantime, though, we have Biped, can you give me your elevator pitch? What does this thing do? Can you describe what it physically looks like and what value you hope it’s going to add to a blind person’s life?

Maël Fabien: Sure. Biped is, in a sense, an AI copilot for blind and visually impaired people. It’s worn on the shoulders, a bit like a vest or a harness. There’s a part that goes behind your neck where there’s a small battery system, and then there are two parts that come onto your chest at the front. On the right, you’ll find the whole camera system. Just like an autonomous vehicle, we are using 3D cameras to capture a 170-degree field of view around you.

Afterward, we process that information to detect all the important elements around you and predict their trajectories a few seconds in advance. What we are essentially doing is, instead of just vibrating on detecting that there’s something somewhere, we are able to tell what it is and where it is going. That gives us, a few seconds in advance, the information about what will matter in a couple of seconds, what has a risk of collision with the user, and then we warn the user with special audio feedback. We use bone conduction headphones to generate 3D sounds around the user.

There’s, for example, a small piece of music that we developed with sound designers from the gaming industry, to make you think of a car or to make you think of a person, and you hear those sounds very intuitively moving in space and coming towards you, or turning right, and then the sounds are dropped whenever the object is no longer a risk to you.
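For the technically curious, the anticipation Maël describes can be sketched in a few lines: track each detected object, linearly extrapolate its position over the next few seconds, and warn only when the predicted path passes close to the user. This is a minimal Python illustration of the general technique, not Biped’s actual code; the labels, velocities and thresholds are all invented.

```python
# Toy sketch of "warn a few seconds ahead, only about what will matter":
# extrapolate each tracked object's position and flag anything whose
# predicted path comes within a danger radius of the user.
from dataclasses import dataclass
import math

@dataclass
class TrackedObject:
    label: str   # e.g. "person", "vehicle", "urban furniture"
    x: float     # metres to the user's right
    y: float     # metres ahead of the user
    vx: float    # metres per second
    vy: float    # metres per second

def collision_risk(obj: TrackedObject, horizon_s: float = 3.0,
                   danger_radius_m: float = 1.0) -> bool:
    """True if the object is predicted to come within danger_radius_m
    of the user at any time within the next horizon_s seconds."""
    steps = 30
    for i in range(steps + 1):
        t = horizon_s * i / steps
        px, py = obj.x + obj.vx * t, obj.y + obj.vy * t
        if math.hypot(px, py) <= danger_radius_m:
            return True
    return False

# A cyclist 10 m ahead, slightly to the right, riding towards the user.
bike = TrackedObject("vehicle", x=1.0, y=10.0, vx=-0.2, vy=-4.0)
if collision_risk(bike):
    bearing = math.degrees(math.atan2(bike.x, bike.y))
    # A real system would place a spatialised sound at this bearing;
    # here we simply print the warning.
    print(f"warn: {bike.label} approaching from {bearing:.0f} degrees right")
```

A production system would use proper object tracking and motion models, but the core idea is exactly this: trajectory prediction plus a distance test, with the warning sound placed at the object’s bearing.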

Jonathan: I’m interested in this concept of wearing this with the cameras and everything. Are you going to stand out and look a bit dorky wearing this thing?

Maël Fabien: That’s one thing we iterated on quite a bit, because, to be fair, the first versions were going in that direction. The first versions were a bit of a nightmare to attach or clip on, and they looked a bit weird. We’ve really brought it to a level where anyone can wear it in a very crowded place, and very, very few people are going to notice that you’re even wearing something.

From afar, it looks like you’re just wearing a backpack, for example. We’ve really brought it to a level where it’s a very discreet system. It’s also one of the reasons why it’s meant to be used as a complement to a white cane or to a guide dog, which really do this work of signaling to others that you have a visual impairment. In tests we did at some point, we dropped the white cane, for example, and worked independently, because theoretically we could capture this type of information, like a small step, for example, and then warn the user about it. The main danger that we encountered was that other people around were not paying attention to the person anymore. The whole challenge became even harder than it used to be. It’s definitely a complement to the cane or the dog, and very discreet by nature.

Jonathan: How much training does a blind person need to be able to interpret the information that Biped is giving them?

Maël: In the usual tests we run, within around five minutes you’re already able to tell what is an obstacle, where the obstacle is located and where it is going, and to distinguish between two to three classes of objects around you. It could be as simple as vehicle, person and urban furniture, for example. Over time, the aim is that, via an app that we provide, there is a small training system where you can just get used to having those sounds played, and you can also customize what type of sound you want to attach to what type of object.

I would say that 95% of users really grasp the 3D understanding, or the 3D positions of sounds, in the very first minutes. Then it’s a matter of how fine-grained you want to get in terms of environment description, and how many classes of objects you really want to be warned about, or whether you want to remain at a bit higher level. Over continuous use of a couple of days, you definitely get to the level where you’re proficient in the use of Biped, I would say.

Jonathan: If you’re walking along a street and there’s a series of doorways, say you’re passing stores as you walk down the street, will this tell you when you’ve reached a door, when there’s an open door on your right, for example?

Maël: Yes. You would be walking on the sidewalk, and you would have a small beep coming from the right, for example, just to tell you that there’s an obstacle; the obstacle would be the wall. As long as you hear the beep consistently, it means that there’s a constant obstacle on your right. Then at some point, you would get to an opening, a door or something. If the doorway is not marked by a physical door, you would just hear that the sound on the right has stopped. That would mean there’s an entry you can go through. Otherwise, if it’s a door that you have to push, for example, then our cameras would detect it and generate a specific sound to tell you that there’s a door just on your right. Then if you pass the door, the beep starts again; it means you’ve just passed the door and you’re hearing the wall again.

Jonathan: You’ve made several references that make it clear you’ve done a lot of user testing and product testing, and that’s great, because sometimes I talk with people who have the next big idea for blind people. The only trouble is they haven’t actually checked it out with blind people, and they get a bit of a shock when they do. Clearly, that’s not the case in this instance. What sort of feedback have you received in terms of the value that’s being added for, say, a really competent white cane or guide dog user?

Maël: Indeed, you highlighted quite well that user testing was at the core of what we wanted to do since day one, actually. In the couple of days after we had the idea, we contacted user associations, like the Swiss Federation for the Blind, which is close to where we’re based, started drafting something together, and asked them how we could apply this technology to bring something valuable to the market.

From the very first tests on, the software was really going in the right direction. Even for proficient white cane users, what they gain as the addition is an environmental understanding of their surroundings. Proficient white cane users can walk very fast. They can avoid a lot of obstacles. They know their way very well, but the problem comes in dynamic environments, where lots of things can be silent, and where there are lots of obstacles that might be above hip level.

What we’ve really seen in the tests is that people like this aspect of being able to anticipate and take action based on what the cameras are analyzing a couple of seconds ahead. For example, if there’s a group of people walking in your direction, and you have that information a few seconds ahead and know that the group is slightly on the left, you can just take one step to the right, for example, and you will not have the case where your white cane is hitting the foot of someone. All these aspects that let you anticipate were things that were really appreciated.

Jonathan: Sometimes you get the feedback that low vision guide dog handlers have to learn to trust the dog, because the dog is aware of obstacles. It will weave when it’s ready to weave around the obstacle. Is there a danger that for a totally blind person wearing Biped, that the information it is conveying may undermine the relationship with a guide dog?

Maël: That’s a very interesting topic. I’m not sure I can provide an exact answer to that, because I think that’s also what longer-run tests and market feedback will give us once we launch the device. We really want to frame it as a complement to whatever the person might be using on a daily basis, whether it’s a white cane, a guide dog, or any other assistive device in the end.

We’ve made the choice to really focus on audio feedback only, no haptics, for example, just so that there’s a very specialized channel for the processing of that type of information. Anything that comes from the Biped device is audio, and then any other feedback, tactile information from the white cane, for example, is just dedicated to the white cane. That gives you the ability to distinguish the two; that’s at least the way we thought about it. Then I guess longer-run tests will give us this type of information, and we’ll be able to learn quite a bit more from the moment we put the device in the hands of end users at home, definitely.

Jonathan: We get a lot of user feedback on this show, with people emailing in and phoning in. We’ve had some very lively discussions in recent times, actually, about whether autonomous technology, or robotic technology to some degree, will eventually replace the guide dog. Obviously, there are some disadvantages of guide dogs. Sometimes you get refusals, which can be very stressful. Sometimes they get sick. Of course, they shed hair and some people are allergic, all sorts of things like that. It sounds like you’re not making that claim at this point, though.

Maël: I don’t think there is really value for Biped in saying that we want to replace the guide dog, because, from all the users we talk to, I guess it brings a lot more than just guidance and navigation. Even though there might be some downsides to owning a guide dog, because it can be a lot of responsibility and, as you highlighted, the dog might be sick, or you might not be able to go everywhere you want to go, I think it’s also bringing a lot more than this. For those who own a guide dog, we really want Biped to become a complement. The dog is never going to tell you that, for example, there’s a bike coming from a specific direction, with that precise information. The dog might stop, or just guide you a bit to the right to let the bike pass, but then you get that additional layer of information.

I think for all of those who do not want to own a guide dog, for various reasons, Biped can be a good replacement at this level, but just as an option for those who cannot access a guide dog or do not want to have one. I don’t really believe in the very strong claim that Biped will replace guide dogs; that’s also not the vision I want to have for the future of what technology is bringing. It should enhance, and not replace, in a sense.

Jonathan: On the flip side, guide dogs don’t run out of batteries. What do you expect the battery life of this device to be?

Maël: So far, I would say it’s around four hours of continuous use. It’s a removable battery. The battery that you have in the back, you can just slide out and put in a new one. We’re talking about four hours of continuous use, so walking and getting feedback for four hours, which is usually quite a long time. It would definitely last more than one day for a normal use case, I would say. Then, if you’re going out for a very long day and visiting lots of places, you can definitely pack a second battery. There will be one in what we ship with the device, so there will be a replacement battery. You can pack two and have eight hours of battery.

Jonathan: Will that have a standalone charger so you could effectively have one in the Biped and one in its charger?

Maël: Yes, it will. We made the battery such that there is no specific direction in which you have to plug it in. It basically plugs in from any direction, and there’s a magnetic charger. It’s very intuitive to use.

Jonathan: Is there a companion app with this or is Biped completely standalone?

Maël: Both work. Obviously, the smartphone brings additional layers of settings, personalization, training, et cetera, and that’s what’s covered by the companion app that we will launch. We’re also aware that not everybody wants to use a smartphone, knows how to use a smartphone, or has a smartphone, in which case the training part can be done with a mobility trainer.

We’re working on setting up partnerships at this level, so that one is just able to reach out to a mobility trainer and ask for training on the specific device. We have a specific training program, dedicated to mobility trainers, that we’ll go through with them. Then you lose the flexibility of being able to change the settings, customize the sounds and everything, but at the same time, it’s just a simple on-off button on the device and you’re ready to go like this.

Jonathan: I guess Biped is conveying a lot of information, but many blind people will still want turn-by-turn directions. They will want to know what businesses they are passing, all of those things that blindness-specific GPS apps will do. I guess you would just use your GPS app of choice in conjunction with this.

Maël: Yes, that’s something we’re actually working on. To be totally transparent, we are in the process of integrating the navigation layer. So far, I would say we have coverage of the anticipation of obstacles across 23 or 24 classes of objects around you, with a range of up to 30 meters. Then we also want to have this more fine-grained navigation information. We’re thinking of two ways to do it: have a base integration of, for example, a system like OpenStreetMap, and use that to provide information on businesses.

That would also give detailed navigation information, but with the ability for users to mute that and then use the navigation app they want on top. One thing we’re trying to work on is optimizing the moments at which sounds for the navigation aspects are generated, so they don’t collide with whatever important information there might be at the obstacle level, just so that everything integrates very smoothly.
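Here is a minimal sketch of one way the sound scheduling Maël mentions could work: give every audio cue a priority, and always play obstacle warnings before navigation prompts. The names and priorities are invented for illustration; Biped’s real mixing logic is not public.

```python
# Priority queue of audio cues: obstacle warnings always pre-empt
# navigation prompts, so the two never talk over each other.
import heapq
import itertools
from typing import Optional

OBSTACLE, NAVIGATION = 0, 1   # lower number = more urgent
_arrival = itertools.count()  # tie-breaker preserving arrival order

class CueScheduler:
    def __init__(self) -> None:
        self._queue = []

    def submit(self, priority: int, message: str) -> None:
        heapq.heappush(self._queue, (priority, next(_arrival), message))

    def next_cue(self) -> Optional[str]:
        """Pop the most urgent pending cue, if any."""
        if self._queue:
            return heapq.heappop(self._queue)[2]
        return None

sched = CueScheduler()
sched.submit(NAVIGATION, "turn left in 20 metres")
sched.submit(OBSTACLE, "person approaching, front left")
print(sched.next_cue())  # the obstacle warning plays first
print(sched.next_cue())  # the navigation cue follows once it is safe
```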

Jonathan: If you wanted to use OpenStreetMap, would you have to download that data for wherever you are into the device?

Maël: No, that’s something we would handle on our side directly.

Jonathan: Is there Wi-Fi in the device, or cellular, something like that?

Maël: The device basically is connected, but not to the internet. The device is connected just to your smartphone over Bluetooth, and then we use the smartphone to download data on the fly.
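As a hedged illustration of that architecture, the phone-side lookup could be as simple as querying OpenStreetMap’s public Overpass API for named businesses around the user’s coordinates, then relaying a compact result to the device over Bluetooth. The sketch below shows only the map query; whether Biped actually uses Overpass is an assumption made purely for illustration.

```python
# Ask OpenStreetMap's Overpass API for named shops near a coordinate.
# Nothing needs to be preloaded onto the wearable itself.
import requests

OVERPASS_URL = "https://overpass-api.de/api/interpreter"

def nearby_businesses(lat: float, lon: float, radius_m: int = 100) -> list:
    query = f"""
    [out:json][timeout:10];
    node(around:{radius_m},{lat},{lon})["shop"]["name"];
    out;
    """
    response = requests.get(OVERPASS_URL, params={"data": query}, timeout=15)
    response.raise_for_status()
    return [el["tags"]["name"] for el in response.json().get("elements", [])]

# Example: shops within 100 metres of a point in central Wellington.
for name in nearby_businesses(-41.2866, 174.7756):
    print(name)
```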

Jonathan: It makes perfect sense. What’s your launch plan for this? I read somewhere that you were hoping to launch in Europe first. Is that still the case?

Maël: It is still the case. Around September, we’ll have the public launch. We’ll have a private launch before that with a couple of end users, and then a public launch in September targeting most European markets, some of which will be a bit more advanced in terms of reimbursement of the solution. The aim is then to target expansion into the American market at the beginning of 2023, or a bit earlier, at the end of 2022.

Jonathan: The big question, of course: do you have any feel at this point for how much this will sell for?

Maël: We’re trying to set that to be, I would say, as accessible as possible by compressing the production costs on our side. That’s the sort of problem with all these technologies that rely on optical systems and cameras, especially since we cover quite a big range; we have three cameras inside. I would say that so far we would be able to offer the device for around $3,000, maybe a bit less. That’s what we’re working on, along with seeking reimbursement for the solution in as many countries as possible. We also want to offer a subscription model, because a lump sum payment is a lot to require from users, especially when it’s a couple of thousand dollars, and that’s something we’re totally aware of. With the subscription model, the cost would drop to around 100 bucks per month; end users would have one month for free, and then start paying for the device on a monthly basis.
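Taking those numbers at face value, the arithmetic on the two options is straightforward: at roughly $100 a month, the subscription matches the roughly $3,000 lump sum after about 30 paid months, or 31 months of use once the free first month is counted.

```python
# Back-of-the-envelope comparison of the two pricing options as stated.
lump_sum = 3000          # approximate one-off price in dollars
monthly = 100            # approximate subscription price per month
paid_months = lump_sum / monthly
print(paid_months)       # 30.0 paid months to match the lump sum
print(paid_months + 1)   # 31.0 months of use, counting the free month
```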

Jonathan: How long have you been working on it for?

Maël: It’s been about a year and a half; we incorporated the company in January 2021. It’s quite a new venture for us. There are now about six of us in the company, and there will still be approximately six of us at launch; then we expect to expand the team a bit later.

Jonathan: What response did you receive from the Consumer Electronics Show? What was the feedback like?

Maël: It was an interesting edition of the Consumer Electronics Show. There were not as many people as in other years. You might have read that there was nearly a 75% drop in attendance overall, but it did bring quite a lot of people to the [unintelligible 01:31:28] which we were located in. We were, I think, quite happy to meet a couple of end-user associations that are present here in the US, with whom we’ve had very good discussions. We also met a couple of people who were interested in other fields of application for the device, which we hadn’t really thought of previously.

It was interesting to see whether we can engage in pilot projects for other applications, with people coming up with their own applications, I would say. It was also very interesting at the level of press coverage, because it gave us a bit of exposure at that moment, and we had lots of signups on our beta testing program, and of course, those are very good contacts. We’re overall very happy with the outcome of the Consumer Electronics Show.

Jonathan: Is that beta testing program taken care of or are you still looking for users?

Maël: It’s still ongoing; we’re still looking for users. The way we do that is you can sign up on the website, it’s Biped.ai, and you can sign up wherever you’re located. Then we try to arrange group tests, and if not, we try to see what we can arrange, and we really try to travel to as many places as possible to let pretty much anyone test the device, because for the moment, we don’t have the network of partners to distribute the device and organize test and demo sessions in all countries. That’s still something we manage ourselves directly.

Jonathan: We’ll be watching this with a lot of interest, and I wish you luck. It’s a difficult space to get into, obviously, because there’s so much need, and yet this market often has so little to spend, so it is often the case that one has to get involved in government programs, and they can be a bit slow to adapt. There are lots of complexities in addition to the really innovative technology that you’re using. I hope it all works out, and we’ll look forward to keeping in touch.

Maël: Thank you so much. Thank you very much for the invitation.

Jonathan: I think this is definitely a technology to watch, and if you would like to watch it, you can head on over to the Biped website. I see they are still looking for testers, so if that interests you, you can go to Biped.ai. That’s Biped.ai.

[music]

Scott: Hi, Jonathan, it’s Scott in London, England. I love the show. Two things from me. I have the same laptop as you, the Lenovo X1 Carbon Gen 9. I got mine back in May last year. I have been having some issues with audio drivers; the Realtek HD audio drivers were causing some significant issues on this laptop for me. I had to remove those and install the Intel drivers, which I’ve been having much better luck with. Just something to be aware of there. The second thing from me is on 1Password version 8. I have been finding that it’s much less accessible than 1Password version 7. I’m not sure if you or anyone else in the community has come up against this, but I would be really interested to hear if anyone has any workarounds or tricks and tips. I’m in contact with 1Password about them looking to improve the accessibility. This app is something that I use a lot. I would really appreciate them improving the accessibility of 1Password on Windows.

Jonathan: Thanks for your contribution, Scott, good to hear from you. Knocking on wood again, I have not had any issues with the audio drivers on my ThinkPad, my ThinkPad X1 Carbon ninth generation. I have, however, seen all sorts of audio problems with Realtek drivers. In fact, did you know that if you look up Realtek in the thesaurus, it comes back with trouble? Okay, I may have made that up. I may not have, though. I mean, it wouldn’t surprise me if that’s what you got if you looked up Realtek in the thesaurus. Those things are terrible, and one of the big issues of late, and by of late I mean the last four or five years of laptops, is the aggressive way that those things hibernate. They take half a second or so to wake up, which can be just really debilitating for a screen reader user.
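Utilities such as Silenzio, which Jonathan mentions in a moment, work around that hibernation by keeping the sound device permanently busy with inaudible output, so the first syllable of speech is never swallowed. Here is a minimal sketch of that keep-alive trick, assuming the third-party Python sounddevice library; it shows the general technique, not the code of any particular utility.

```python
# Continuously stream silence so a power-saving audio device never sleeps.
# Requires: pip install sounddevice numpy
import numpy as np
import sounddevice as sd

SAMPLE_RATE = 44_100
SILENCE = np.zeros((4_410, 1), dtype="float32")  # 0.1 s of pure silence

try:
    with sd.OutputStream(samplerate=SAMPLE_RATE, channels=1) as stream:
        print("Streaming silence; press Ctrl+C to stop.")
        while True:
            stream.write(SILENCE)  # keeps the device active, emits nothing audible
except KeyboardInterrupt:
    pass
```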

I was expecting to experience that with my ThinkPad X1 Carbon, and expected to use the avoid speech cut-off feature in JAWS, or Silenzio, but I didn’t need to do that. The audio has been the best I have experienced in many years on a Windows laptop. I wonder what the difference is between the one that I have and the one that you have, but it is often the case that installing those generic Microsoft drivers provides a workaround. I’m glad that’s working for you. I am still on 1Password 7.9.8, and I haven’t upgraded to version 8. I’m finding 1Password’s branding increasingly confusing, because they have two versions of 1Password. They used to call the browser-based one 1Password X, but they don’t appear to do that anymore.

I think they just call it 1Password, and if you want the old one, which I like much better than the fully browser-based one, because it has a standalone app with really good accessibility, they seem to be calling that 1Password Classic now. I did feel like I had to go really fossicking around for it when I got my ThinkPad and was installing applications. I hope they don’t take 1Password Classic, as it’s now called, away, because if they do, I think I will look for another password manager, and we may well have a robust discussion on the show about what password managers people are using, and how they find them for accessibility.

Certainly for me on iOS, and also Windows, which is where I get most of my work done, 1Password has been really good. I’m hoping that I don’t have to make a change there. If you’re using the 1Password that’s all browser-based, maybe you could try going to the Classic version and seeing if you like that better, but if they’ve messed up 1Password Classic as well in version 8, that would be really disappointing.

Rebecca Skipper writes in and says, “First, have you tried installing the ARM version of Windows 11 on a Mac, and will it work with third-party screen readers?” I have not tried this, Rebecca, and what I do know is that at this stage there is no JAWS for ARM processors. I’ve got a perfectly good ThinkPad and I’m quite happy with that, so I feel no need to install Windows 11 on the Mac. She continues, “Windows 11 SE is rolling out for the education market on PCs with 4 gigabytes of RAM and 64 gigabytes of storage. Is this sufficient for JAWS and NVDA?”

Well, probably; it might be a little bit sluggish, but it might be okay. For those not familiar with this, Windows 11 SE is a way for Microsoft to have a go at the Chromebook market, although I’m wondering why they’re bothering, because I mentioned in my Chromebook feature last year what an explosive growth period Chromebooks had gone through during the pandemic, and that honeymoon is well and truly over. Chromebook sales are tanking at the moment, but anyway, Microsoft is late to the party. They are not making Windows 11 SE widely available. This is a scaled-down, cloud-based version of Windows, and they’re sending it exclusively to the education market. If somebody needs to use JAWS or NVDA with it, I imagine it will work all right. As I say, it could just be a little bit sluggish.

She then says, “Should we still emphasize writing in Braille, or focus more on learning how to read grade one and grade two? I remember the problems I had translating between Braille and print. At the time, I was using a BrailleNote Apex and Windows. I can’t tell you how many times words like BrailleNote were not translated correctly, leading me to use a QWERTY keyboard for most things, and Braille writing for personal notes.” That’s exactly why, Rebecca, UEB has been so important, because there were lots of ambiguities, and I can imagine what happened there. If you write the word BrailleNote in contracted Braille, some devices might translate the dot-six N in the middle to the ation contraction. Despite the protestations of some, UEB was absolutely necessary, and I think it has largely taken care of those anomalies.
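To make that ambiguity concrete, here is a toy back-translator in Python in which one cell sequence has two legitimate readings, so a naive translator can only guess. The mappings are invented for illustration; they are not real Braille translation tables.

```python
# Toy demonstration of pre-UEB back-translation ambiguity: the same
# cells can legitimately expand two ways, so a naive translator guesses.
AMBIGUOUS = {
    "dot6+N": ["ation", "N"],  # one cell sequence, two candidate readings
}

def back_translate(cells: list) -> list:
    """Return every reading a naive back-translator might produce."""
    readings = [""]
    for cell in cells:
        options = AMBIGUOUS.get(cell, [cell])
        readings = [r + opt for r in readings for opt in options]
    return readings

print(back_translate(["Braille", "dot6+N", "ote"]))
# ['Brailleationote', 'BrailleNote'] -- only one is what the writer meant
```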

Here’s an anonymous contributor who says, “Hey Jonathan, I’ve been working with Lighthouse now for almost 10 months and thought you might be able to spread the word. Lighthouse Central Florida/Lighthouse Works is looking for APD agents. It’s not a job suited to most customer service representatives, but they’re desperate. The agents they do have are good at what they do. The job involves troubleshooting for agencies with disabilities. It is level one troubleshooting; it’s mostly writing tickets. Unfortunately, the universal campaign they offered, which I originally signed up for, is full because it is off-peak season. They do pay you during your training, well worth it if you need a start. Applicants don’t need any prior experience, and opportunities are available to grow. Lighthouse Central Florida/Lighthouse Works does have good insurance options, 401K plans, and other incentives.”

If you’re looking for an employment opportunity, visit www.lighthouseworks.org, and fill out an application online.

[music]

Mosen At Large podcast.

Jonathan: Well, I mentioned him at the beginning of this season, the 2022 season of Mosen At Large, and he has been on my podcast on and off talking about iPhone things mainly, but it’s a long time since we’ve had the spotty nephew on the show. We’re welcoming him back. Welcome, spotty boy.

Anthony Horvath: It’s an honor and a privilege.

Jonathan: How are you enjoying all the iPhone goodness these days? Because you, like me, didn’t get an iPhone 13.

Anthony: No, I didn’t. I’m still rocking the 12 Pro Max and I’m loving it.

Jonathan: Well, that’s the main thing.

Anthony: A few issues of course with Braille.

Jonathan: Well, yes, but of course we had a fun time at David and Joanna’s wedding. We were all at this table. I think that was my last reference to the spotty nephew, when we were all at the table doing our thing and you were on that boring, old, slow 4G. This is Anthony Horvath, for those who aren’t Mushroom FM listeners. Anybody who listens to Mushroom FM will be familiar with Anthony’s voice, because he has hosted all sorts of things. Your first show was what?

Anthony: It was the Beer Fridge all the way back in 2010.

Jonathan: Then you had another one.

Anthony: The Monday Mayhem.

Jonathan: Monday Mayhem.

Anthony: Monday Mayhem, which started in 2011, and The Friday Free For All, which was in 2012. Then the short-lived one was The Kiwi Connection in 2013, and then The Shed.

Jonathan: I don’t know whether listeners realize, because obviously I listen to your Anthony Unleashed show, which is on every weekday, that you’ve upgraded from that shed.

Anthony: I have. I’m no longer in a shed. I’m in a mansion.

Jonathan: It’s very nice in there.

Anthony: It is.

Jonathan: You’re upwardly mobile, even if the mobile is only 4G mobile, whereas I’m upwardly 5G mobile. Anyway, I brought you on here because you may not know this, but you’ve got an Oklahoma guardian angel.

Anthony: Who?

Jonathan: It’s true. I shall introduce you to her now.

Alison Fallon: Hi Jonathan. This is Alison Fallon in Tulsa, Oklahoma.

Jonathan: See.

Alison: I really enjoy your show, even though it’s a lot techier than I am. I do enjoy it; I’ve learned a lot from it. One thing that you do which seems odd to me is that every time you talk about your nephew, you call him spotty Anthony.

Jonathan: That’s his name?

Alison: If I were Anthony, I would not be happy about that. Especially for someone who doesn’t like people to be labeled. I’m wondering why you do that. That’s all I had to say. Thank you.

Jonathan: Why do you do that? Why do you burst out like that? There you go. Somebody is outraged on your behalf, Spotty.

Anthony: What have you done, boy, what have you done?

Jonathan: I’m in trouble with Alison. How did we get the spotty thing going? I don’t even remember. It’s been so long.

Anthony: I don’t know. It was in the early days of Mushroom FM anyway. I was on one of your shows, and you introduced me once upon a time as your spotty nephew.

Jonathan: I thought it went back a bit further than Mushroom FM, but maybe I’m wrong about that.

Anthony: I don’t recall it.

Jonathan: The origin of the term spotty nephew actually comes from The Goon Show, where they talk about a spotty Herbert. The Goon Show is a British show from the fifties with Peter Sellers, Harry Secombe and Spike Milligan. It’s very popular among the blind community in a lot of Commonwealth countries. It’s amazing how The Goon Show has endured among the blind community.

Anthony: I love it.

Jonathan: I can assure you, Alison, he calls me much worse. You should listen to Anthony Unleashed and hear. We should call it Spotty Unleashed.

Anthony: That would confuse people.

Jonathan: Unleashed Spot. When we were having the coronavirus thing and the level four lockdowns, we called it Anthony Locked Down, didn’t we?

Anthony: Locked Down. That’s right, back in 2020 then. I’ve still got that jingle somewhere.

Jonathan: For those who don’t know, Anthony and I are nine years apart in age. When Anthony was born, it was kind of like having a little brother in some ways, but because he used to call me uncle when he was a youth, when he was a very little youth, he was a bit confused because it was kind of like having an older brother and yet he was calling me uncle. One day he was trying to work out where I fit into the whole stratosphere of everything. He said, “Uncle Jonathan, are you a kid or what?”

Anthony: Yes. I just couldn’t work out why you were always getting in trouble and being told off for things.

Jonathan: As if, because I was an uncle, the term uncle was somehow an immunity pass from being told off, that sort of thing. You were a groomsman at my wedding to Bonnie, right?

Anthony: Yes, I was.

Jonathan: It was pretty exciting. We go way back. It’s good that Alison is sticking up for you, but don’t worry, Alison, it’s all done in very good humor. In fact, I kind of regret having told you that this message existed before you came on here, because I was just about bent over double laughing when I called you to say I’d got this message, and you were laughing hard too.

Anthony: It was the best. It was the best thing I heard all day.

Jonathan: We all need people in our lives who are going to be our protectors.

Anthony: Absolutely. 100%.

Jonathan: Anytime you need a little bit of reinforcement, like with the boss or, I don’t know, government departments, maybe Alison could be your guide, you know?

Anthony: That’s an idea.

Jonathan: Thank you for coming on the podcast.

Anthony: You’re welcome.

Jonathan: Goodbye, spotty nephew.

Anthony: Goodbye.

[music]

Jonathan: I’d love to hear from you. If you have any comments you want to contribute to the show, drop me an email, written down or with an audio attachment, to Jonathan@mushroomfm.com. That’s J-O-N-A-T-H-A-N@mushroomfm.com. If you’d rather call in, use the listener line number in the United States: 864-606-6736.

[music]

Mosen At Large Podcast.

[01:47:47] [END OF AUDIO]