Podcast transcript, Mosen at Large episode 194, Aira comes to the Envision Smart Glasses, Apple's got new iPhones coming, and a final discussion about literacy in a blindness context

This transcript is made possible thanks to funding from InternetNZ. You can read the full transcript below, download the transcript in Microsoft Word format, or download the transcript as an accessible PDF file.


[theme music]

Jonathan Mosen: I’m Jonathan Mosen. This is Mosen At Large, the show that’s got the blind community talking. This week, Aira comes to the Envision smart glasses, I catch up with Karthik Kannan to talk about all things Envision, we wrap up our discussion about literacy in a blindness context, and we prepare for Apple’s next big event.

[theme music]

Jonathan: As always, it’s wonderful to be here, it’s certainly a thrill. It’s episode 194, and I’ve got a couple of announcements before we get into things. The first and most important one, from my point of view, is this, it’s a girl.

[applause]

Jonathan: Thank you. Thank you. Thank you. Yes, it’s very exciting. They did the ultrasound, they got the scan, they found out that it’s a little granddaughter. Well, a granddaughter for me, a daughter for them on the way in January, and it looks like the name at this stage is Florence. Sometimes, that changes, doesn’t it? People have a change of heart at the last minute, or sometimes a baby comes out and they say something like, “ah, it doesn’t look like a Florence.” This is very exciting and I can’t believe it. I mean, it just makes it seem even more real, so Granddaughter Florence. Some people have already said, “Ah, she’s going to get called Flow Mo,” but that’s all right, you could get called worse. [chuckles] That’s all happening and we’ll keep you posted, of course, as January crawls closer. Man, it seems to be taking a long time.

The other thing I wanted to tell you before we get into things is that this is the last podcast for August, and it is the last podcast where the transcript will be funded by a grant from InternetNZ. I do want to extend my thanks to them for funding the podcasts for around about 16 months now. I really do appreciate their support. I know some people find the transcripts of the podcasts just convenient, and some people prefer it, but there are others for whom the podcast transcripts have been essential to allow them to participate and partake in the information and discussion that takes place on this podcast.

Of course, I’m referring specifically to the deaf-blind community, who are far too often neglected and forgotten. It’s been a pleasure to be able to do this. I didn’t want these transcripts to die. Unfortunately, there is not a funding round available from InternetNZ at the moment, but I’m very pleased to be able to tell you that the podcast transcripts are going to continue under a new arrangement: we have accepted sponsorship for the transcripts of the podcast from Pneuma Solutions. This will begin with the first edition of September.

Don’t worry, we’re not going to litter your podcast with ads. There’s a very specific goal in mind here, and that is to ensure that deaf-blind people get their transcripts. That is absolutely critical to me. Starting next week, you will hear a sponsorship message from Pneuma Solutions, one short one, and another read-through from me to tell you about some of the things that Pneuma Solutions are doing. They’re doing some great things, so it won’t be difficult at all for me to tell you with a lot of enthusiasm about the work they are doing. I really do appreciate Mike and Matt at Pneuma Solutions stepping up and sponsoring the podcast transcripts so that we can keep these transcripts coming. Way to go, guys; I look forward to working with you and to keeping the transcripts coming.

It is nearly time for another Apple event, and I must say I’m quite relieved about the timing. The Apple event is called Far Out. Yes, and it’s going to be on the 7th of September US time, and you’ll be able to stream that live at 1:00 PM Eastern, that is 10:00 AM Pacific, and that equates to 5:00 AM in New Zealand, oh my goodness, and 6:00 PM in the UK. Hopefully, you’ll get a rough idea of when it’s on where you are from those time zones. I’ll just complete it actually and say it also means that it will be at 3:00 AM in Eastern Australia. Golly, gee, there’ll be people up no matter when it’s on, to find out what is in the iPhone 14.

There are some murmurs that maybe this is when we will hear about Apple’s new classical music service. That’ll be interesting as well, but obviously, the key thing that we’ll be interested in is what’s in the iPhone 14 range. I’m not committed. I actually skipped the iPhone 13 last year because I didn’t think it had enough of an upgrade for me compared with my iPhone 12. I may well skip it again, but I am convinceable. To help us make sense of it all and make those big decisions, should we commit to an iPhone 14 or not, we will have our panel getting together and recording a special episode of Mosen At Large right after the keynote concludes.

Normally, we don’t edit this very much, we just want to get it out to you quickly, so it doesn’t have the usual standards of editing that Mosen At Large normally applies. Timeliness is key in this situation, and so, we will get that out to you after the keynote as soon as possible on the 7th of September in the Northern Hemisphere. It will, of course, be the 8th of September here. The reason why I’m relieved about the timing is that I did think we might have to try and record this remotely from a hotel in Europe somewhere, but Apple’s going early this year, and that means that the delivery date for the iPhone 14 is most likely going to be the 16th of September with pre-orders opening on the 9th. Maybe they’ll take it out another week, but I don’t think they’re going to.

The iPhone 14 is pretty close, and that also means that iOS 16 is pretty close. If you’ve still got some significant issues with iOS 16, I’m keen to hear what those are, how you’re finding iOS 16 at this point, because I think they’re likely to be locked-in, baked-in at this point, and that any changes you might want, any bug fixes you might want won’t appear until a subsequent release. iPhone time again, we’re on top of it here on Mosen At Large.

[theme music]

We’ve been having a lengthy discussion about Braille literacy in a blindness context, and by the time this segment is over in this week’s podcast, we probably will have spent a little under an hour and a half on this subject in the last three episodes. I thank those who have sought to contribute constructively to the conversation. I am going to draw it to a close after this lengthy segment this week because I think we’ve reached a point now where anybody who was convinceable on either side has been convinced, and people who are firm in their beliefs will not have their minds changed, and we’re starting to get a little bit repetitive.

After this week, it will be time to move on, and I won’t be including any further contributions on this. As it is, I’ve probably received about three times more contributions than I’m going to include this week, so I’ve done my best to include a representative sample, a sample of the more constructive, thought-provoking feedback that we’ve received. I’ve also endeavored to make sure it’s roughly proportionate.

Let’s begin with Debbie Lieberman. She says, “Hello, Jonathan. I’ve been a weekly listener for several years. This is my first time commenting.” Well, welcome and thank you for listening. “I’ve been pondering your view on literacy. While I agree that if one is listening to an audiobook, having someone read a book, or having a machine read a book, one isn’t decoding symbols to form words, if someone who cannot read print or Braille, this is the first time,” she says, “I’ve typed Braille with a capital B, except when referring to Louis Braille himself. You and other Mosen At Larges have changed my thoughts.”

“If someone who cannot read print or Braille can sit down at a keyboard and compose well written text, I believe that person is literate. I would say a person who can decode or encode symbols to form words is a literate person. I always look forward to spending two hours with you and the amazing global community you have created. Keep up the good work.” Thank you, Debbie. Debbie is in New Mexico.

I completely understand what you’re saying, and I’d like to take you back to my childhood. I don’t know how old you are, Debbie, so maybe you had a similar experience, I can’t be sure. I went to school, for the first part of my schooling anyway, in the 1970s, so this predates any computers. I started school in 1974, and I think when I was about seven or eight years old, we started to learn to type. We were given touch typing lessons, it was a dedicated part of the curriculum, and when we got good enough, we got given a typewriter in the classroom to use. This was a huge badge of honor, and people aspired to be given the typewriter in the classroom.

We were encouraged when using the typewriter, first to Braille down what we wanted to write, and then to type it. The reason for that was that when we were brailling something down, we were fully literate, we were able to write it down, and then read it back. We could perhaps think as we were reading what we were writing, maybe try and tweak it, although that wasn’t easy on a Perkins Brailler, of course, and correct it. I can remember several heartbreaking occasions where I typed out maybe a two or three-pager on my typewriter. It was an old Olympia typewriter, and I would hand it in, only to find that the ribbon had run out or there’d been some sort of other technical problem which meant that it wasn’t possible for the teacher to read my typing. The reason why I got into that predicament was because I was using a device that did not allow me to be fully literate.

When I look back now, I think, surely, it was the teacher’s obligation to learn Braille rather than me being forced to have my education compromised by using a machine that rendered me illiterate because I could not read back what I had written. I was also a pretty fast typist, and often my brain would go faster than my hands could, so my typing wasn’t the best. It really wasn’t until I got a spell checker on a computer that my typing became a lot better, because I could use my screen reader to have it read what was on the screen and I could make corrections, and, of course, I could connect a Braille display as well and read it to myself.

If you can write something down but not read it back, my view is that doesn’t constitute literacy. I mentioned last week the importance of this discussion from the perspective of making good public policy for the next generation of blind kids. You could, for example, have people argue, well, we don’t need to give blind kids Braille devices that have a Braille display on them because why don’t we just give them one of these new Braille input keyboards? There are several of them out there now that allow you to Braille into a device quite nicely with this little keyboard but they don’t have a Braille display.

I’m not saying those devices don’t have a place. They can be incredibly handy when you just want to input Braille somewhere you don’t want to take a full Braille display: you want to have your screen read back to you, and you want to Braille in. But they shouldn’t be a substitute for a Braille display, because they’re an input-only device.

Sandra: Hello, Jonathan and fellow listeners. I’m Sandra from Germany, and I’m a Braille reader. In the last show, you discussed whether someone listening to material, for example, an audiobook or to their computer screen reader reading to them, could say that they read a book. Spontaneously, I would probably have said no, they’re not because they’re not technically reading, similar to what you said in the last show.

However, Merriam-Webster’s dictionary says one definition of to read is “to become acquainted with or look over the contents of (something, such as a book).” Now, if you apply that definition, listening to an audiobook might also be reading a book because it helps you to become acquainted with the contents. Same as if you have your computer read to you. I’m not sure which definition of to read we should use, however, I do think that having this discussion isn’t very helpful because it is unnecessarily divisive.

Now, we might wonder why blind people listening to books want to say that they read them. I think there are good reasons to do that. We live in a culture where writing and reading is highly valued and information that is imparted through texts is taken more seriously than something that has been said and heard. If people say that they’ve read something, there’s more weight to that information, and maybe blind people who don’t read Braille would like to be taken as seriously as people who do read Braille or people who just, in fact, read in the literal sense of the word.

Now, why are we actually having this discussion? I think what we really do want is that everyone who wants to learn Braille has good access to good Braille instruction, regardless of whether they’re children or adults. I think, to that end, it’s not very helpful to argue that people who don’t read Braille can’t read because it’s just off-putting to the people we are trying to reach. I would rather that we talk more about the benefits of reading Braille and, of course, it’s that you’re more likely to be successful at work if you can read Braille.

Now, of course, someone will say, “Well, I don’t read Braille and I have had a successful career in banking,” like the listener, who started the discussion in last week’s show. Of course, you might be able to do that, but as far as I know, there’ve been studies showing that the likelihood of you being able to do that is greater if you know how to read Braille. It’s just a fact and you don’t really have to offend people or create unnecessary arguments about semantics. That is the approach I would rather like to see, instead of us discussing who’s able to read and who’s not able to read, and if reading or listening is better, and so on.

Jonathan: Thanks so much, Sandra. To some degree, I agree with you in the sense that if somebody said to me, and I quite often hear this in the blind community, “I read this book on Audible,” I wouldn’t stop and say, “Oh no, you didn’t.” That would just be a waste of time. However, I don’t agree with you that the conversation isn’t helpful, for this reason. You commented about the people that we are trying to reach.

Now, I have been having this literacy discussion for 35 years. That’s how long I have now been advocating on blindness issues. The people that I’m trying to reach when we have this discussion about literacy are not other blind people; they are public policymakers who are looking at every turn to shortchange blind children, and those blind children aren’t able to speak up for themselves.

We are so used in society to having low expectations of blind people that sometimes even blind parents don’t know what expectations to have of their blind children. That, by the way, is why it’s absolutely essential that parents of blind children and blind children themselves get access to adult blind mentors and role models as soon as possible so that those expectations can be raised.

You may be able to forgive some sighted people for not knowing any better. What concerns me is that from time to time, you do come across blind adults who say, “I got by without Braille, therefore blind children don’t need Braille.” I think that is deeply unfortunate and incredibly damaging because I’ve met so many adults with profound regrets and even bitterness about the way that they were treated as children, not given the chance to learn Braille, and I would go as far as to say that is a form of abuse. Neglect is a form of abuse. If you were deprived of access to literacy as a child, then that is a form of abuse.

I’m not sure whether Europe has had the same assault on the education of blind children that some of us in countries like the United States, Australia, New Zealand, and I think the UK have had to contend with since talking computers came along. When I talk to people from Europe, I hear stories like, “Well, we’ve been able to get a Braille display at home and a Braille display at work or at school.” It seems, anecdotally, to me, that countries like Germany have been much more supportive of an education in Braille, but that’s not the case for all Western countries and it really has been a battle that we’ve had to fight.

Unfortunately, what seems to be happening here is we get some people who, when we point out this literacy issue, seem to take it as a personal insult that we’re pointing this out. If anything’s being insulted, I’m insulting the policymakers, the system that has deprived people of literacy, an example coming right up.

Carolyn: Hi, Jonathan. I just wanted to comment on your comments with regards to the education of blind and vision-impaired children. Well, I too love the idea of everyone being able to go to their local school like you, but it’s not always ideal, and the supports are not necessarily there. One of the chronic problems is that there is a shortage of support teachers with the knowledge to be able to assist students who need access to Braille or other resources.

I do believe very strongly that vision-impaired children should be given access to Braille as another tool in their tool kit for a number of reasons. For me personally, I was a vision-impaired child who was denied access to Braille, and because of that, I’ve suffered from bad posture. Spending hours in my childhood leaning over a desk looking through a magnifier to read one or two words at a time made reading aloud something I was absolutely terrified of when I was sent out to my local schools because my reading aloud was not smooth. It was slow and stunted just because of what I was seeing through the magnification.

Had I been given access to Braille, my flow of reading would’ve probably been much better. The fact is, too, my eyesight as an adult failed and I took it upon myself as an adult to learn Braille. Had I not done that, I would’ve been in a real pickle, and I couldn’t participate as an equal in Toastmasters without Braille. I’m very much pro Braille, and if I could talk to parents of vision-impaired children before they start school, I would say, “Look, don’t discount Braille because it gives you more options.” I’m really passionate about this, but I also have a theory on how to resolve the issue of giving children more time to improve their Braille skills.

A lot of schools have volunteers that come in, whether that be a parent or a grandparent, to help the children with their reading and writing. These are not paid positions. These are volunteers that come into the schools. Now, there is a whole pool of very talented blind and vision-impaired adults out there. Why can we not be part of a pool of volunteers to go into the schools and give that same support to a blind or vision-impaired student in a school? There’s also something else that adds to that: by us doing that and being there for that student, we’re giving them a mentor. Someone they can relate to, someone they can look up to, “Oh, wow, that’s a blind adult and they’re doing this,” and that can make the world of difference as well.

I think in some ways the education system needs to start thinking outside the square and that schools need to start looking for volunteers that can help these students that have different needs to the rest of the student population. This could be done for deaf students with sign language. It’s got endless possibilities, but it’s just an idea that I’ve had, but I am most definitely pro Braille. I am definitely pro Braille for vision-impaired students as well as blind students because of what I know is possible, what I know I could have done if I’d had Braille a lot earlier.

Jonathan: Thank you for sharing that Carolyn. I couldn’t agree more. There are so many benefits of a blind mentoring program. Many people haven’t come across blind people before, or maybe the only blind person they’ve come across is an elderly person who is struggling to come to terms with their sight loss, and normally it is sight loss. The vast majority of people go blind later in life, it’s an age-related condition, and so when a blind child comes into a parent’s life, they really don’t know what to expect. They may have very low expectations because they’ve never been challenged any differently. When the time is right, blind mentoring is absolutely critical, in my view, to set parents’ expectations correctly and blind mentoring can also be so beneficial for blind kids themselves.

I count myself very fortunate that going to the school for the blind, there were a lot of blind adults around, particularly because there were a number working at the transcription department, which was also on the school for the blind premises in those days, so I got to meet some blind adults and also having an older blind brother, I got to meet some blind adults that way and that had a huge influence on me.

I can remember discussing this 20, 30 years ago with one particular person who was at the time involved in the provision of services to blind children, who was dead against this idea that blind adults had a place in mentoring blind children and the parents of blind children. This particular individual was so full of psychobabble and jargon, making it out that somehow bringing up a blind child was this super complex thing.

Some of the blind kids that were under this person’s care, if you will, were detrimentally mollycoddled. Part of that was also because this person was just giving nonsensical, complicated, ridiculous advice. We need to have blind adults as a part of that system. Not necessarily blind adults doing super amazing things or anything, just living life, being independent, getting on a bus, and yes, reading Braille.

“Hi, Jonathan,” says Dawn, “You talked about the level of education of blind kids who go to normal schools. I do agree that blind kids should be integrated into the normal education system, but I also feel that they are not encouraged to learn Braille. My mantra is Braille is the key to literacy. I believe it is too easy for teachers, rather than encouraging children to learn Braille, to let them use speech which means that their spelling often suffers. I know that Braille displays are horrendously expensive, but I’m sure there could be a system where Braille displays were made freely available either through some system of sponsorship or government scheme. I think the extra time it takes to teach Braille to blind children is well worth it for them to receive the essentials of a good education.”

Christopher Wright says, “Hi, Jonathan. I agree with you concerning Braille. It’s a literacy tool that everyone should have the opportunity to learn, especially children. I was fortunate to learn it, though sadly, I find now I’m out of the education system, I don’t use it nearly as much as I once did. Still, it’s a valuable skill and if I needed to read and write Braille, I could, which is the point.

The major problem, as I’ve said before, is the fact Braille hasn’t fully merged with the digital age. I’ve made my opinions known about the NFB, but if they care so much about Braille, why aren’t they trying to make it more accessible to everyone? Why are we still stuck with ridiculous, ancient single-line refreshable Braille devices that only let you read eight words at a time if you’re lucky? I’ve heard the small market argument so many times, but you’d think by now someone would have come up with a system to represent many lines of Braille without it being prohibitively expensive for the people it’s trying to help, particularly if it’s as crucial as organizations like the NFB want you to believe.

I’m amazed every single day at the level of access we have. We made our own solution to the problem of horrendously expensive screen readers with NVDA, TTS is only getting better. We have access to mainstream libraries like Kindle. I can write either on a computer keyboard or in Braille and have that information seamlessly translated into the print alphabet I have next to no knowledge of, et cetera. Where’s Braille in all those advancements? Maybe it’s time for us as a community to create a solution to this problem just like we’ve done with NVDA and Remote Incident Manager.

I was able to go through a couple of very long Kindle books using TTS in a matter of a couple of days. I shudder to think how long those same books would have taken to read on a single-line Braille display, or even worse, via mountains of hard-copy Braille books. Until we have a page of refreshable Braille, I’ve accepted a TTS engine reading to me. As I like to explain to sighted people, the current system we have for refreshable Braille is the equivalent of your monitor showing a single line of information at a time, which makes it super inefficient for all kinds of tasks.

It’s better than nothing and there are some genuine uses for it, such as using devices silently, fixing your system audio when it glitches and you otherwise have no access to do so, proofreading, or efficiently reading to people in real time; trying to quote a TTS engine sounds like a nightmare. Aside from those specific use cases, I find it extremely tedious for most things such as reading for pleasure.

I agree that Braille helps with spelling because you can feel the words which helps to embed that concept into your brain. Having said that, I’ve always been a proponent of proofreading. I check what I write several times to ensure it makes sense. If I run into spelling errors, I do my best to learn from those mistakes to make myself a better person. It’s not as efficient using TTS, but I like to think the content I write makes sense 99% of the time. If I really want to make sure, I can connect my Braille display and go through the tedious process of reading eight words at a time.

I’m not saying I don’t like Braille because that’s not true. The sad reality is we’re not where we should be in terms of good digital Braille access for everyone, so we have to use the tools we currently have the best way we can. If someone eventually comes up with a multiline refreshable Braille system that doesn’t cost horrendous amounts of money, I’ll buy and promote it like crazy. I thought the Canute was going to be a solution, but it’s sadly over $1,000. That seems quite low for a Braille device, and I’ll give them credit for making the price as low as they did, but it needs to come down a little more so it’s accessible to as many people as possible. Thanks, Christopher.”

Well, this is something I know a wee bit about, having been involved in the product management of Braille displays. It’s very hard to get Braille devices that do three things. One, refresh quickly enough so that when you’re scrolling through your document, it’s keeping up with you. Two, do that refreshing in a relatively quiet way. I need a quiet Braille display here in the studio, that’s important. Three, make sure that it actually feels like Braille.

It’s interesting that the old piezoelectric cells that were first used in the early 1980s, perhaps even slightly earlier than that, are still the preferred technology that works. For the use case that you talk about, I’m not sure what you think you might gain if you had a multiline Braille display for reading text, say, a Kindle book or proofreading an email as you talk about. Because even on a single-line Braille display, you can scroll it without your Braille reading fingers leaving the display, and because the human eye can take in so much more information at a glance, I’m not sure that it’s analogous to make reference to a monitor versus a single-line Braille display.

I don’t think most Braille readers would be able to read, say, two lines of Braille at once. Technically, because you’ve got two hands, you might be able to do it, but I think few Braille readers could. I’m personally not overly worried about the single-line Braille display thing for reading text, but I am encouraged by the work that Dot Pad is doing in the graphical area.

We talked with APH earlier about this. Very soon, we will talk to the chief executive of Dot Pad about this very subject, because STEM subjects are so critical for blind people to be involved in as well, to have the opportunity to work with those, and having a multiline Braille display that can produce tactile graphics, tabular data, that kind of thing really is significant, and I think we’re nearly there. I’ve been hearing for years about this, the holy grail of Braille as it were. I do think we are quite close.

In terms of the cost of Braille, I know it doesn’t feel like it when you’re on a fixed income and it just seems out of reach, but the cost of Braille has actually gone down quite dramatically. While I was involved in the industry of manufacturing Braille devices, I personally saw two separate 40% reductions in the cost of refreshable Braille as the source of Braille cells was changed and we could find them cheaper, so that was significant. Also, when you look at the inflation rate, in real terms, the cost of Braille devices has gone down as well.

In terms of a percentage of one’s income, Braille devices are cheaper than they used to be, but they’re still jolly expensive. That’s why the Orbit Project was devised and those involved came to realize that actually, it’s not as easy to manufacture good quality Braille devices as some people might think. That’s the thing. People can be critical of how expensive Braille devices are, but if it’s so easy and Braille display manufacturers are ripping everybody off, then why doesn’t some big disruptor come along and disrupt? The answer is because it is a very technically complex process to manufacture refreshable Braille devices.

As I said, I think last week, the really exciting thing though, is seeing what’s happening in the United States with the NLS e-Readers. That is such a positive, exciting program, getting Braille in the hands of more people. Hang in there, Christopher, because I think the big revolution when it comes to multi-line Braille devices is not too far away. Whether it will be at a price point you’re happy with is the big question because these things won’t come cheap, but you’ll be interested when we talk to Eric Kim, the chief executive of Dot Pad, and that interview’s coming up soon.

John: Hi, Jonathan. John Won in Los Angeles. I’m going to jump in on the Braille literacy question that you’ve been discussing. Number one, I cannot agree with you any more vehemently that Braille literacy for young kids being taught Braille, nothing could be more important. I’m 100% on board with what you’re saying. You said it better than I did. I’m very active with the Braille Institute of America. You know about the Braille Challenge. You know how much we promote Braille literacy. I’ve been lucky enough to be at that award ceremony and see these young kids who work so hard to become Braille proficient and it’s awe-inspiring. Those kids are going to just do so well in life. Believe me, take it on faith as I do, but it’s just wonderful.

Having said that as someone who was born sighted, I learned how to read and write I guess around age four or five, went through life. I’m a voracious reader, I always was. I was so voracious that I was one of those nerdy kids whose mother had to tell him to go out and play to get some exercise, which I am so thankful my mother did tell me that, but you get the idea. I just loved reading, always have. I went through college, law school, had a career, always read newspapers, magazines, books, et cetera, till my vision went away. Now I listen to books and the NFB-NEWSLINE, et cetera.

Having done what I did, I guess I overreacted, as maybe others did, at the idea of being called illiterate. I had a problem with that but I calmed down. I took that and took a deep breath. I respect you enough from having listened to you to know that you did not mean it as a pejorative term, even if I took it that way. You said you didn’t and I understood what you were saying.

However, I think it is a pejorative term to call someone illiterate unintentionally or not. I just thought about it and took a little– it was a bit tongue in cheek, a little bit sarcastic, but if literacy is the ability to read and write, I think we can agree that you are not literate when you’re sleeping or if you’re sighted and you’re in a dark room and can’t read Braille, you are not literate, I guess, because you can’t read or write in a totally dark room. Therefore, most people that you run into in life are probably illiterate for a third of their life because they sleep eight hours a day.

If you don’t think it’s a pejorative term, I suggest you go up to employers or police officers or judges, thinking about some of the judges I’ve appeared in front of, I don’t recommend this, and tell them that, “Your honor, you’re illiterate. You’ve been illiterate for a third of your life,” or anything to that effect. Clearly, again, I’m being tongue in cheek here, but you get the point why someone might take that the wrong way. Hopefully, you understand what I’m trying to say. I also took your point that I listen now to words. By the way, I, in recent years, have learned Grade 1 and 2 of Braille, so I guess I’m Braille-literate now, although I’m not proficient but I at least have taken it and I can do the alphabet and things like that.

I was thinking about all the legal briefs and documents, et cetera, that I prepared over the years before word processing days. I didn’t write those. I dictated them to a Dictaphone or to a secretary who took them in shorthand. Now, I think I wrote those, but I never once typed them or wrote them, et cetera, but I think that was writing just as I think listening is reading or the equivalent of reading. We’re nitpicking here and I realize your point is to encourage Braille literacy, and on that, we totally agree.

Jonathan: Thank you so much for calling in, John. I appreciate it. I think I’ve mentioned this before, but it’s relevant for me to mention it again. In New Zealand, we subscribe to the social model of disability. When I’m advocating on the subject and explaining to employers or others how the social model of disability works, I make the point that, let’s say it’s 3:00 in the morning and there’s been some sort of major disaster. There’s been some power cut. The street lights are completely out. It is pitch black outside, and for some reason, I as a blind person and a sighted colleague need to go into our office to get stuff out or something like that, who would be the disabled person at that point? Not me.

I’d be as familiar with the building as I ever was, I’d be able to locate the steps. I’d be going up. I’d be guiding that sighted person, which illustrates the point that disability is created by barriers. Sure, if you put a sighted person in a dark room, then they would be rendered illiterate at that point. No argument from me about that at all. That’s why good public policy removes the barriers that turn an impairment into a disability. Yes indeed, I am familiar with the Braille Challenge and I’m very grateful to the Braille Institute for doing such excellent work, promoting the use of Braille among children, and thank you for your involvement in it.

Dan Teveld is writing in and says, “Hi, Jonathan. I agree with you that Braille is literacy. All too often, blind people who should know Braille don’t present themselves well. I have been in situations where a blind person was doing a public presentation and I had to wait for them to listen to JAWS while they were determining what they needed to say. A sighted person could never get away with this. The issue which bothers me the most is the fact that many blind people don’t proofread email. I recently raised this issue on an email list by pointing out that poorly written email would not help a blind person make a good impression when interacting with a sighted person.

I made the mistake of calling out one individual where I should have made a general comment. I was so disgusted by this individual’s whining tone that I lost my temper when responding. I got a lot of criticism for that, but I stand by my conviction that for better or worse, we are judged by sighted people every day and need to put our best foot forward. This is something we all need to work on throughout our lives. I don’t expect perfection, but I would like to see some people try to help themselves rather than passively asking for help because they either can’t read or don’t get in the habit of using Google to search for an answer to their questions. Obtaining information solely by listening is a passive activity.

I do know blind individuals who have done well for themselves and didn’t learn Braille, but this is the exception rather than the rule. Almost every blind person I know who learned Braille is working or retired. The people I know who are retired had long-term careers. The people I know who are currently working and use Braille are writers, lawyers, teachers, and IT professionals.

I would also like to thank you for mentioning the Optacon. I would like to learn the print representation of every character, number, and special symbol I encounter in Braille. The tactile arrays under development will initially be too expensive for many blind people to afford. I would think it would be possible to develop a smaller and cheaper tactile array like the one used in the original Optacon.

The original Optacon had the added advantage of a moveable camera which could be placed on an object, which, for some people, is easier than holding a camera above a printed page or other surfaces like boxes, bottles, and computer screens. I used an Optacon in high school to read books in Spanish and liked the fact that I could learn and read Spanish characters just like my sighted classmates.

I am subscribed to an Optacon discussion list on the FreeLists website. One of our list members has compiled a list of use cases for the Optacon. It’s unfortunate that neither assistive technology companies nor advocacy organizations like ACB and NFB are interested in the Optacon. I know one person who has three Optacons because she is worried about getting them repaired. The original Optacon patents and training materials are in the public domain. There have been attempts to fabricate a new Optacon, but they didn’t go anywhere due to lack of funding.”

Thanks for your email, Dan. This one was a bit marginal for me to include. I’ve been trying not to include those that are pretty explosive on both sides because I do appreciate that this is a sensitive discussion and we endeavor to have these discussions in a respectful way. Clearly, the word illiterate is much more triggering than I would have anticipated to the extent that I’ve had a couple of people saying, “We are not going to listen to your podcast anymore because of what you’ve said.” That’s people’s choice. Obviously, it’s my choice to say what I genuinely believe in as respectful a way as I possibly can manage.

As long as I have breath in my body, I can tell you that I will continue to advocate for high expectations of blind children, for the expectation that a blind child should read just as a sighted child should read, and that for many blind children, including those with low vision, with a prognosis of deteriorating vision, or for whom using print is exhausting, Braille is the only tool of true literacy that we have. It’s called Mosen At Large for a reason and it’s people’s choice not to listen if they don’t want to if they can’t cope with an opinion that differs from their own.

I think though that we have to be really careful that we don’t make assumptions about what’s led to this situation for every person. This is why I’m so passionate about this subject, because I know there are blind adults who were not given the option to learn Braille when they were kids, perhaps because they had a little bit of vision at the time, as we heard with Carolyn’s contribution, and there was pressure on Braille resources. Braille resources were rationed, and so only the blindest of kids got Braille when clearly other kids should have been taught it. It’s the system that has failed those people, and that’s why this issue is so important, because the system keeps failing so many people.

The consequences of failure are immense. It’s more likely to end in lower socioeconomic status, under-employment, or, most likely, unemployment. That can lead to mental health issues as you feel like you’re not able to contribute. These are really serious things that we are talking about. When we see somebody not coming out and reading with eloquence in a public situation like that, it may be because they just haven’t had the exposure to Braille that they ought to have had. Should they learn Braille? Well, that’s an option for them, of course.

It is a lot harder to learn the code when you’re an adult and it may be that you’ll never get the fluency that somebody who’s been learning from the age of four or five has got. We know that Braille is not a language, but I think there are some similarities in terms of kids who can be multilingual when they learn languages at an early age and people who try and pick up a language later in life. They don’t tend to have the fluency a lot of the time. It’s not impossible and nor is Braille fluency impossible when you’re an adult, but I think it is harder, so we do need to show some compassion for people who have been victims of the system in a situation like that.

You do raise a really interesting separate but important point regarding rights and responsibilities. As anyone who listens to this podcast or even an episode will know, I have been advocating for the rights of blind people for a very long time, but rights do come with responsibilities. If you have the technology to listen to this podcast, then no matter how difficult the circumstances, we all have a lot more technology access, a lot more opportunity than many blind people from around the world do. Even if you use a screen reader and a text-to-speech engine to proofread your material, most apps these days come with spell-checking technology and it doesn’t take too much effort to use the spell checker.

Microsoft Office, if you have access to that, even has a grammar checker where you can get things picked up. It can be a bit frustrating to use, I find, but it is available to you. Again, I’m really conscious that this technology is a lot more intuitive for some people than others. Sometimes people have multiple disabilities. Sometimes it’s just that what comes naturally to some of us doesn’t come naturally to others. This is why I’ve always been a fan of things like the Victor Reader Stream or the BlindShell Classic 2, or whatever technology is out there that allows people to engage with all of this information that we have access to.

I do understand that sometimes there are people at their computer or using their smartphone who find it difficult to understand how to work with the spell checker. I’d like to hope that those with the ability to will take the time to learn how to use the spell checker and the grammar checker because whether you’re writing to a blind person or a sighted person, it’s really hard to get past that perception that if somebody’s sending you an email riddled with spelling mistakes, they don’t care enough about how they are perceived or their correspondence with you, and it just lessens the impact of that correspondence.

We’ve got so many tools at our disposal these days. If we can manage it, it is important to use them, but this technology really is a struggle for some people, so it’s important, I think, to have compassion for people’s circumstances, but to try and encourage people to persist, particularly with things like spell checkers and that kind of thing, because you’re right, it does have a negative impact.

Regarding the Optacon, which I haven’t used for many years now, when I think back on it, what amazes me is how little latency that thing had. It was immediate. You would run your camera over the print and immediately get translation. It was real-time, and I think that was one of the really important, compelling things about it. People may have some comments on this last point about using spell checkers, about our use of technology. As I say, I am drawing the literacy discussion to a close after a lengthy discussion about it, but I thank everybody who has contributed to what is a pretty challenging subject.

Ad: Be the first to know what’s coming in the next episode of Mosen At Large. Opt into the Mosen Media list and receive a brief email on what’s coming so you can get your contribution in ahead of the show. You can stop receiving emails any time. To join, send a blank email to media-subscribe@mosen.org. That’s media-subscribe@M-O-S-E-N.org. Stay in the know with Mosen At Large.

Jonathan: Time to hear from fellow Braille fan, fellow ABBA fan, Jackie Brown. She says, “Hi, Jonathan. Glad plans for your forthcoming trip are well underway. I was born and brought up in London so all your excursions sound pretty familiar and will take me on a trip down memory lane. As regards Sony headphones, I have had the WH-1000XM3s for a couple of years and love them. I resisted buying the XM4s because one upgrade seemed a little extravagant. I have my heart set on the XM5s given my current pair have received a lot of love and play. Another justification for hoping Santa will consider the XM5s, hint, hint, Martin,” hey Martin, you’re hearing this? “is the fact they can pair with two devices, which is always useful.

I’ve always been a big fan of Beyerdynamic headphones, and I’m treasuring the pair I’ve had for around 25 years but haven’t come across anything by them with Bluetooth and noise canceling at an affordable price, though, I accept they do exist. Personally, I prefer Sony over Bose, but it’s all very subjective and whatever works for your lifestyle and budget are the most important considerations. I look forward to any comments on the WH-1000XM5 headphones you may have when you try them out. They get excellent reviews in the press.” Indeed, they do. Glowing. Positively glowing, but Jackie says it is always good to hear word-of-mouth opinions. “Thanks as ever,” writes Jackie.

Good to hear from you, Jackie. Yes, I got these headphones, and as I’m putting this part of the podcast together, I have had the WH-1000XM5s for about four days now. There is a major consideration for me that is not a major consideration for most people listening to the show, and that is that I have to try and make them coexist with my hearing aids. I don’t have enough hearing to make it sound really nice if I take my hearing aids out completely. I can actually crank them up loud enough, but they just don’t have the frequency response that I need because of my hearing.

Usually, I keep my hearing aids in, adjust them, and wear the headphones over them. I can make that work with these because they have a nice big cup, so I can get the hearing aid in. You’ve got to be careful with feedback, but I have been able to make that work, which is great. They sound really wonderful. I’ve listened to a lot more music since I got them. The app is idiosyncratic, but it’s accessible. Sometimes you have to try and get some context about labels and switches and things, but it’s workable. It’s not a problem at all.

I’ve had no difficulty adjusting to the little touch panel. It’s not really a touch screen because it’s not a screen. It’s a touch panel on the right-hand side and it’s very iPhone-like. You flick up and down to adjust the volume, you flick left and right to skip between tracks, you double tap to answer a call if you get one, you double tap to end the call, you tap and hold to get your virtual assistant. The really nice thing is you can choose which virtual assistant you want. Because I find Siri so problematic, I’ve actually got it set up to use the Amazon one, what we affectionately call the soup drinker on here, and it works really well. It’s cool to be able to sit there with your headphones on playing a game like Song Quiz or something like that.

I’ve made phone and FaceTime calls on it. People say they sound really good. One feature I also like is that they have good microphones in there for ambient mode and also to help with identifying noise to cancel. If you’re in a situation where you are wearing the headphones and you can tell that somebody’s making an announcement, you might be on a train or a bus or a plane, and you want to just hear that without taking off the headphones, you cup your right ear. What you’re effectively doing is covering the whole touch surface. When you do that, it pauses what you’re listening to and turns the microphones on, genius, really simple. Then when you finish listening to the announcement or whatever or talking to whoever it is, you take your right hand away and the music resumes.

There are also a couple of other features that are nice. There’s a mode where it detects voices around you and pauses if you’re being talked to; that’s really handy. As a blind person, sometimes you’re on those long-haul flights, oh boy, I have this to look forward to again. They’re trying to attract your attention, but you can’t see them there and you are grooving to the tunes. You know what I mean? You’re grooving to the tunes. What I used to do is I would say to the flight attendant, “Look, if you need to attract my attention, just gently tap me on the shoulder or something, and then I’ll know you are there.”

Well, with this, they’re detecting voices. I don’t know how well this works yet. I’ve tried it at home, but that’s probably not the best test. If someone’s talking to you, it’ll pause the music and turn the microphones on. Then when the conversation stops, there’s a detector on it that you can set in terms of how much silence it needs, and then it will just resume the music for you. I like the equalization feature in the app where you can actually change the EQ based on presets, or you can go in and manually adjust each band of the equalizer yourself. When you save, you’re actually saving that preset to the headphones. Once you’ve saved, it’s actually built into the headphones.

Now, I have a comment, and please don’t take this as definitive because I’m very new with this and somebody may be able to come up with a solution. I was pretty excited about the idea of pairing two devices at a time, but I have not been able to get this to work yet so I switched it on in the app, and that’s the first important consideration. You do have to go into the app and tell it that you want the feature where you’ve got two devices to connect at the same time on.

Once I did that, I paired my ThinkPad and I also paired my iPhone. This is a pretty common use case because sometimes I’m curled up at night, editing the podcast, and I want to still hear the notifications from my iPhone where all my news apps are. I thought this would be perfect for that but so far, I haven’t got it to work. When I pair the ThinkPad, it seems to hold on to the signal.

Now, there is a feature in JAWS called Avoid Speech Cutoff. When you turn that on, it’s essentially sending silence out, and I thought, okay, that’s probably holding the Bluetooth connection open, and ghastly as it is, I might switch that off and see if it helps. It didn’t seem to, so I’ll keep playing with that. If anybody has the Sony WH-1000XM5s and they’ve got this feature to work, say between a PC and an iPhone, I’d be really interested in hearing. As I say, I will keep playing, but look, they’re wonderful. They sound great.

I haven’t been in too many noisy environments yet to check the noise cancellation features, but I like the way that you can program the noise control button. I like the fact that you can just tap the power key and get a voice prompt telling you how much battery is left. The battery does show up in the little widget, the iOS widget, which shows battery notifications, so it’s very good.

One observation though, and it’s a bit of a non sequitur, but it’s also relevant, I think, is I’m reminded how laggy VoiceOver is with these sorts of Bluetooth devices. I have a Sonos Roam and we’ve demonstrated the Sonos Roam on this very podcast. I was struck at the time by how much lag there is and I still am every time I use it. When you flick around the screen, for example, with VoiceOver, there really is a noticeable lag. This is one of the things that made me nervous about switching to Bluetooth-based hearing aids.

It does seem that the Made For iPhone hearing aids, at least the Oticon ones that I have at the moment, have overcome this in some way. Perhaps it’s to do with the Bluetooth profile being used or the specific Made For iPhone specification, but it’s much more snappy whizzing around the touchscreen with my hearing aids connected via Bluetooth than it is using one of these devices.

Maybe it’s just using more bandwidth because of how much high-quality stuff it’s sending, but it’s really noticeable. It makes me wonder whether some of these newer Phonak aids have the same problem. If you are a Phonak hearing aid user, I would be interested in knowing whether you find flicking through VoiceOver laggy with them. Because one of the big advantages of Phonak is that they say they’re made for anything. You can pair all sorts of Bluetooth audio devices with your Phonak aids, not just iPhones. That really does appeal to me as a principle, but if the Phonak aids are as laggy as these headphones are when you’re using VoiceOver, then no. If anyone has any intel on that, I’d be really interested.

Voiceover: Mosen At Large podcast

Jonathan: All the way back in Episode 67, we spoke at length with Karthik Kannan from Envision about their technology, how it got started, and the Envision glasses, which were quite new at the time. Two years have flown by and a lot has happened with Envision. A lot is happening right now. You may have heard over the last week about some pretty exciting developments. Karthik is back with me and we’ll explore Envision in-depth. Karthik, it’s great to be talking with you again. Thanks for being on the show.

Karthik Kannan: Thanks so much for having me on the show, Jonathan. It’s a pleasure to come back. There’ve been so many of our users who’ve spoken very highly of your podcast. I’m always excited to be on the show.

Jonathan: All right. Let’s talk about the app, first of all, because it’s free now. How did you make that possible?

Karthik: It’s been a really exciting few months, a few weeks for us here at Envision. They say that years’ worth of things sometimes happen in weeks. That’s really how it’s been feeling at the Envision HQ because yes, the app has gone free, and it’s really something that has been in the works for years. It’s very funny sometimes because users tend to have one of two theories. They assume that A, Envision’s got a really fat check, I don’t know, from a sheikh in the Middle East or something who just decided to fund the Envision app forever, or B, they think that we’re going bankrupt.

The truth is, as they say, a little bit more plain and simple, because we’ve been working on trying to make the technology of the Envision app and the technology of the Envision glasses as efficient and as operationally inexpensive as possible. It’s been a work in progress over the past five years. To put it simply, what we’ve been doing is bringing more of our AI offline. All the AI that used to be online earlier, we made more efficient, and wherever possible, we tried to bring those pieces of AI entirely offline. When you bring a piece of AI entirely offline, it becomes operationally inexpensive to run. Just like any other piece of offline technology, it just doesn’t cost us to keep running it. That’s exactly what happened with the Envision app.

Over the past few years, we’ve been taking some of the most used features on the Envision app, working really, really hard, and bringing them offline without compromising the accuracy of those particular features. For example, the scan text feature, where people read PDFs or scan documents: we brought that feature entirely offline for a lot of Latin-based languages, along with Japanese, Hindi, Chinese, and a few other languages as well. We brought that entire thing offline. That allows us to basically cut the cost of that feature dramatically and offer it to users for free.

The main goal with the Envision app has always been to try and make this AI technology as accessible as possible. That’s what we just did by bringing the whole thing offline. We thought it’s the right moment to make the app entirely free. That’s basically how it happened. Of course, it took us five years to get here, but we eventually did.

Jonathan: That’s a good technological background. In terms of the business proposition, I guess the app is now a loss leader, right? If you can get people hooked into the app, into the Envision ecosystem, the idea is that hopefully, you’ll encourage people to buy the glasses. Is that really how it works from a business point of view?

Karthik: Well, you could say that, yes. As of now, there is still a cost component associated with the Envision app, which is why, within the Envision app, if users really find the app useful, they can go ahead and make a donation, which further goes into making the app free for more people. In fact, the way we structure these contributions within the app is you can pay $1 and make the app free for 5,000 users, and go all the way up to $1,000 and make the app free for a million users.

The main idea with the app is that, yes, it is a bit of a loss leader, but we do accept contributions from users, and yes, it is also a way for people to get introduced to what Envision is. What we noticed is a lot of glasses users actually first started off as app users. As more people came to interact with the Envision brand through the Envision app and really got to know us, they eventually started to see the value in the glasses being this device where you can get the same technology, and in places even better technology, without having to hold your phone in your hand and keep pointing at things.

That is the business strategy, and right now the glasses subsidize the app in a way. Over the long-term horizon, our aim is to bring the operational cost of the app as near to zero as possible, and then make sure that we’re just focusing on product improvement going forward.

Jonathan: You’re competing with some big juggernauts. You’ve got Seeing AI that Microsoft is doing and they can afford to give that app away and keep tweaking it. On the Android side, you’ve got Google itself doing Lookout. What is the value proposition for Envision in terms of the app? What do you think you’ve got that those apps don’t have and that should encourage users to give Envision a go and to switch to that app?

Karthik: I think there are a lot of things that Envision offers. First of all, it’s the fact that Envision is a cross-platform app. We’ll also be bringing Envision to the desktop and the web by the end of this year. Envision is not just going to be on iOS and Android, but also on the web. If there are users who are using both platforms, for example, they might scan something with their iPad and then read it on a Pixel device, that’s entirely possible, because Envision is a completely cross-platform app.

The second thing is Envision offers more features when it comes to reading text than either of those two apps. For example, Envision is the only app that offers people the ability to import a PDF, import an EPUB file, import a Word document, and we’re constantly adding more import formats. Anyone who has an inaccessible PDF, which is a lot of people, can just go ahead and import their files directly into the Envision app. They can also save these files within the Envision app for reading later on as well. If you’re looking to scan documents, Envision is the only app out of the two that you mentioned that offers the ability to scan multiple documents at once, along with providing guidance on how to move the phone in order to capture the picture.

I think the third thing that a lot of people absolutely love about Envision is it offers more languages. Envision offers more than 60 different languages for people to read from. There are people who use the app to read Arabic, people who use the app to read non-Latin languages. The two apps that you mentioned have a really good background when it comes to Latin-based languages, but with non-Latin languages, we offer a lot more variety. It’s a more global app in that sense.

I think, lastly, Envision, especially on the Android side, is available on any smartphone that’s running Android 7 and above. Unlike, say, Lookout, which is still largely restricted to very high-end devices, Envision is able to offer our AI on any device running Android 7 and above, including tablets. There are a lot more features that Envision offers when it comes to reading, and the fact that it’s available on both these platforms and, of course, coming to the web later this year, those are all huge pluses, along with the number of languages that Envision reads.

Jonathan: That should mean that you’ll be able to use your Windows PC or your Mac to do some of these tasks by the end of the year.

Karthik: Exactly. For example, if you get an inaccessible PDF, or if you want to read a document, or even recognize a CAPTCHA, all you have to do is right-click on the document and choose to Envision it on the desktop. It would then open up the Envision app on the desktop, process the PDF or the document for you, and read it out right on the desktop itself. You’ll be able to save the document on your desktop and then have it reflected on both your iOS and your Android device and read it on the go. It’s becoming that really interconnected ecosystem, with the web and the desktop being added to it.

Jonathan: Is Envision on the BlindShell Classic 2 at this stage?

Karthik: We’re in talks right now. We’re going to be on the BlindShell, and on a whole number of other devices as well. Along with the BlindShell, we’re talking to the folks at Polaris, and we’re also in talks with the folks at HumanWare. Envision is going to be pretty much available on all these devices. We’re building a custom version for each of these different devices as we speak.

Jonathan: How do you think the Instant Text feature stacks up against your competition? I realize this is really anecdotal, but to give you an example, we get these meals delivered every week because my wife and I have a pretty busy lifestyle. I find that if I use Lookout on my Android device or Seeing AI or a similar product on iOS, it pretty much gets it every time. It doesn’t matter what way I’m holding the meal up, I get told what it is. When I use Envision’s Instant Text, sometimes I don’t get anything intelligible back and I wonder what the difference is there.

Karthik: I would definitely urge you to try the latest versions of the app, because one of the big improvements we made to Instant Text over the past few weeks is, one, you will be able to read text regardless of its orientation. You could be holding it upside down and the app would still be able to read it. Two, we also improved the app’s ability to read text on curved surfaces. If you’re holding a bottle or a prescription bottle, or even packaging that isn’t a regular rectangular shape, that has creases on it and so on, the app will still be able to read it.

We did put a lot of effort into revamping Instant Text on the app and on the glasses over the past few weeks. I would definitely encourage you to try out the latest version and give it a shot.

Jonathan: I definitely will do that. Now, when you do something like this, make an app free, a lot of the world celebrates and they say this is great because we all know about the dire socioeconomic status of many blind people. There are people who have paid for Envision who might be a bit grumpy, I suppose. Are you getting any adverse feedback about this?

Karthik: I would say it’s a very small percentage. Yes, there is. Of course, not everyone is going to be perfectly okay with the situation all the time, but we actually have something called the Envision Pioneer program. Anyone who’s ever purchased an Envision app subscription over the past five years, we’re calling them Envision Pioneers.

Now, Envision Pioneers actually get a whole bunch of perks that regular app users don’t have access to. One, they get 10% off anything in the Envision store. For example, if they want to buy the Envision glasses, straight off the bat they get a 10% discount, no questions asked. There’ve been so many people who have made use of that particular discount over the past few weeks, ever since the app went free. Second, we will still continue to offer priority support for customers who have ever purchased a subscription.

Of course, as the app goes free, and as you know with any free app at scale, it’s always going to be hard to give priority customer support to everyone, so we are working on a really robust knowledge center that answers 90% of the questions that people might have. If they still want to ask us questions, they can always write to us as well, but for the Envision Pioneers, we’re offering priority support just like we always did when Envision was a paid app.

Along with that, we have a few other perks that we’re working on right now. We’re going to be setting up a hall of fame on our website and within the app, which will list pretty much all the subscribers who have ever paid for Envision. We’re also going to be setting up something called impact metrics, where, for example, if someone has been a subscriber for two years, they can see how their money has benefited not just Envision but also the wider community. Anyone who has ever been a subscriber of Envision has contributed in a very objective way to making the app free for maybe another 5, 10, or 15,000 users who are using the app as of today.

We have all these different perks that we are offering Envision Pioneers, and around 99% of the people are extremely happy because they do understand that at the end of the day, we are trying to democratize this technology. We’re trying to make this available for as many people as possible, and they do feel happy with the perks that we’ve given them, and they do feel happy about the fact that at the end of the day, their contribution has made this technology become more accessible to people over time.

Jonathan: Count me in there. I bought a lifetime subscription to Envision some time ago, and if my little contribution has helped those who can’t afford it to access this technology, then personally, I say bring it on. I’m glad to have been able to make some difference there. That’s good. Are you going to still work on the app? I guess there is some concern that if you make it free, it might become less of a priority for you in terms of feature enhancements.

Karthik: Absolutely not. The app team is still intact as it is. We have a whole bunch of features that we have been working on for the past few months, all going to be coming onto the app over the next few weeks, including, like I said, a web version of the app as well. Yes, the app team is committed. Envision is super committed to continuing to build the app, because, like I said, the Envision app is the low-barrier way for people to experience Envision as a company, as a product, and as a community.

Anyone who installs the Envision app can also join the Envision community which is on Telegram. Super, super live, and super robust. We just want more people to be able to have access to this technology as a starting point, so we are definitely committed to keep working on the Envision app. I think the best way to answer the doubts in people’s minds is when they see the stuff that we are going to be pushing in the next few months, and I think people are going to understand that the app is alive and kicking.

Jonathan: Yes, and you mentioned the Envision community. It’s really creditable, the amount of social media you’re doing. You’re popping up everywhere, on Twitter Spaces and Clubhouse, all over the place, with these sessions that talk about Envision and the difference that it’s making in people’s lives, and that’s really encouraging. When we last talked, the glasses were just rolling out. There were a few people with them. It was very much the beginning of a journey, and that was two years ago. How have the glasses gone for you? Have they met your expectations, and most importantly, the expectations of those who have them?

Karthik: Yes to both. Actually, it’s a resounding yes to the second question. On the first question, yes, we really had a lot of hopes for the glasses, in the sense that we really envisioned going out there with the glasses, having a road show. If you remember, we launched the Envision glasses at the very beginning of the pandemic, in March 2020. Our thinking all through the development in 2019 was we start a pre-order campaign, we then go ahead and hit different centers across the US and, hopefully, across the world, and just keep showing the glasses to as many people as possible and generate buzz for the pre-order, but the pandemic happened and we couldn’t really go out there.

Also in 2021, a lot of the events that we really hoped to be present for were either virtual or they were canceled. In terms of getting the glasses out into the hands of more people and then having them experience it, I’d say it’s like a 50/50. We accomplished some things and we couldn’t accomplish some things. As of today, Envision has a completely hybrid approach to getting people to experience the glasses, which is why we are being so active online.

Because we realized that we’re still living somewhat in the pandemic and it’s still going to take a few more weeks maybe, I don’t know, a year or so, for things to fully open up in terms of events and travel and things like that. Luckily, we don’t have another wave coming. All things considered, I would’ve loved if there was no pandemic. I think everybody would’ve loved it across the world if there was no pandemic, but unfortunately, that happened and so we couldn’t get it into the hands of more people.

The second thing, yes. For anyone who’s purchased the glasses in 2020, in fact, we’ve only had three or four returns so far from the original 150-odd units that we shipped at the beginning of the pre-order. Three or four units out of 150 that we shipped, that’s, to me, a huge success, and that’s because the glasses have evolved dramatically over the past couple of years.

In fact, the product that I am talking about, the Envision glasses of today, is a completely different product from the one that we first shipped. That’s how much the glasses have evolved over time, in terms of the number of features, the accuracy, and the stability. It’s a completely different beast today than what it was when we first shipped it.

What is very satisfying for me personally is that, unlike other companies, which go through multiple generations of hardware in order to show tangible improvement in the product, we’ve kept the hardware the same. We’ve provided over 24 different updates completely free of cost and really evolved the glasses that way. For people who bought them during the pre-order campaign, the hardware has not changed, but the underlying software has changed dramatically, and that’s a huge testament to the hard work that the team at Envision puts in. It’s crazy to me that two years later, the product is still alive and kicking with updates, and we still have so much that we get to push, so it’s a completely different beast.

Jonathan: There’s no threat to ongoing supply of that particular hardware at this stage?

Karthik: No, there is no threat to that. Even if a new version comes out, we definitely are committed to supporting this hardware. Like I said, we have a whole bunch of things yet to be pushed to this hardware device, and I definitely foresee us supporting it for a while to come.

Jonathan: It’s an expensive product of course, and you can understand that some people might be a little nervous about that kind of outlay. How easy is it for people to evaluate the glasses and just ascertain whether it fits into their lifestyle, whether it makes an appreciable difference?

Karthik: This was one aspect where we dramatically changed our approach from 2021 to 2022. In 2021, we were still dependent a lot on showing the glasses physically and things like that. In 2022, we dramatically changed course. One, we slashed the price of the glasses by $1,000, making it way more affordable for people to buy directly. Earlier, we relied a lot on distributors to go to events and do demonstrations one on one, and since we realized that the world was probably not going to open back up for a while, we decided to go ahead and slash the price of the glasses and make it more affordable for people to purchase directly.

Earlier, I think at the beginning of the year, the Envision glasses cost $3,500 US; today, they cost $2,499 to buy. The second thing is that we opened up demonstrations of the Envision glasses to anyone across the world. As of today, people can go on the Envision website, request a free demo of the Envision glasses, and that puts you directly in touch with a member of the Envision team.

We set up a time with you, we need about 45 minutes to an hour, and then we get on a call together. Then I, or anyone from the Envision team (we have a really nice demonstration team within Envision now), will walk you through the entire glasses, including any questions that you might have. We’ll show you how the video calling works, show you how Aira works, and everything. People can ask questions at any point during the demo, and so on and so forth.

Once the demo is done, we offer a 30-day, no-questions-asked return policy. People can buy the glasses on the Envision website, use them for 30 days, and then return them if they don’t like them. No questions asked, you can just put the glasses in a box and ship them back to us, and we’ll take care of the cost and things like that.

What we also do is, as soon as you get your pair of the Envision glasses, you will get an email to book a 90-minute onboarding session with the Envision team. Someone from the Envision team will actually get on a call with you, sit with you for 90 minutes, help you set up the glasses from the moment you get them, help you pair them with the Envision app, show you how to add contacts to make video calls, and walk you through the whole thing. Two weeks later, we check in with you once again, and so on and so forth. We’ve made it completely possible for you to sit at home, experience the glasses, get them risk-free, and get onboarding or training from us completely free of cost. If you still think it’s not the right product for you, you can just return it, no questions asked.

Jonathan: I love the way that you just slipped Aira in there. Just a quiet little mention. This is the big one. When we talked in 2020, we discussed the fact that the Envision Glasses are a platform and that it’s an ecosystem that will build up over time. Meanwhile, Aira decided to trim back quite considerably. They concluded that being in the hardware business is expensive, all those things. Now you and Aira have announced this partnership. Could you talk us through that and what it means?

Karthik: Yes, sure. Like you mentioned, from the beginning, one of the big goals for the Envision Glasses was to become a platform that allows other developers to build accessible apps for this smart glasses platform. That’s been the whole idea from the beginning. When people buy a pair of Envision Glasses today, they not only get Envision’s technology, but they also get a whole bunch of other apps.

The first app that actually came to the Envision Glasses was the Cash Reader app. The Cash Reader app is super popular amongst people. They’re on iOS, they’re on Android. We partnered with them, and today, anyone who gets a pair of Envision Glasses basically gets the Cash Reader app entirely free, including all the currencies and everything like that. The next big announcement, of course: everyone has been telling us from day one that you guys should have Aira on your glasses, ever since they pulled out of their own hardware.

Even when they had their own hardware, I think people liked the versatility of owning the Envision Glasses, which have all these different AI functions, and then also having something like Aira on board. Yes, we spoke to the folks at Aira back in 2020 when we first launched the glasses. At that time, like you said, they were going through a phase of restructuring and things like that. We got back in touch with them, and we’ve been in touch with them ever since. They’ve seen how much the glasses have been accepted by the community. They themselves have actually gotten so many requests from their users to get onto the Envision Glasses.

Earlier this year, we reconnected again at CSUN, and that’s when we kicked off the development of Aira on the Envision Glasses. As of today, Aira is available as an app on the Envision Glasses. Not just that, anyone who has a pair of Envision Glasses actually gets 300 minutes of Aira entirely free of cost. You might already be an existing user of Aira, but if you are an Envision Glasses user and you update your glasses to version 1.8.0 and connect your Aira account with the Envision Glasses, Envision gives you 300 minutes’ worth of Aira to use as a one-time offer.

We now have this partnership with them. We’re also partnering with BlindSquare. BlindSquare is going to be on the Envision Glasses as well. You’ll be able to scan any QR codes, and the Envision Glasses will also recognize BlindSquare QR codes. Apart from that, we have a whole bunch of apps coming that we will be announcing later this year. Some very, very popular navigation apps, some other calling apps as well, really popular ones that everyone’s been using in the community. They’re also going to be coming onto the Envision Glasses soon.

Jonathan: Those 300 Aira minutes, do they expire at any point?

Karthik: They don’t expire. It’s a one-time addition of 300 minutes. Of course, we will have a website linked in the show notes that’ll give you more of the terms as well.

Jonathan: For those who aren’t familiar with the glasses, we will, at some point on this podcast, be doing a demo. I really thank you for setting that up so we can put it through its paces. Can you talk me through the way that a user works with the glasses and maybe some information about how people are using them in their daily lives? The difference that you’ve heard that the glasses are making.

Karthik: Sure. The glasses basically look like any other pair of sunglasses, and everything is on the right-hand side of your face. Just in front of your right eye is a display, and right next to the display is the camera. This is a wide-angle camera with a much wider view than your regular smartphone camera, so it’s able to take in more information. If you travel further down from the camera, you have the trackpad. This is a multi-touch trackpad.

You can use the glasses using voice and touch. You could go ahead and say, “Envision, what’s in front of me?” or “read this,” or “open scan text,” or “instant text,” and the glasses will operate entirely with voice, or you have the trackpad, with which you can use touch gestures. If you travel further down from the trackpad, which is located right next to your temple, there is a speaker. There is an on-board speaker on the glasses, and there is also the facility to connect the glasses to Bluetooth headphones or speakers, like your AirPods, Aftershokz, and so on.

If you travel further down from your right ear, right behind your right ear is where you have the battery. The glasses come with about five to eight hours of battery life on continuous usage. You can also put the glasses on standby by just folding the glasses. That gives you an additional 10 to 12 hours of battery life when you put the glasses on standby. The glasses charge through USB-C fast charging, which means they go from 0% to 50% in about 20 minutes’ time.

You can use any fast charger, including the one that comes with the glasses. The glasses have a lot of features in them, but what they do really, really well is reading text. You can use the glasses to read short pieces of text. Let’s say you are enjoying TV and you want to read what’s on the screen. You could use the glasses to read it using instant text. You just look at the particular display and it speaks out everything that’s on there for you. Or, for example, say your screen reader breaks down when you’re using your computer.

That’s another big use case for instant text. When the screen reader breaks down while you’re using the computer, if you’re using Narrator on Windows, for example, and that breaks down, you could use instant text to read the text as well. In fact, we just had a gamer who was using instant text on the glasses to read what was on his Nintendo Switch. The Nintendo Switch is a very popular console, but it doesn’t have any screen reader built in, which makes it completely inaccessible.

He was actually using a very clumsy setup, Seeing AI on an iPad, to read what was on the display. With the Envision Glasses, he can play the games that he wants to play, just turn on instant text, and it reads out everything that’s on the display. The glasses also read documents really well. They guide you on how to take a picture of, say, a multi-page document. For example, you could be holding a book in front of you. The glasses first guide you on how to move the book around so both pages are in frame, and when both pages are in frame, they take a picture automatically, detect which is the left page and which is the right page, and then speak the information to you accordingly.

They also detect headings and columns. You could be reading a newspaper article or a magazine article; the glasses automatically know what the headings are and how the text should be read, because it’s in columns, and then read it out to you. Those are all possibilities. You can, of course, also scan multiple pages at a time. We have something called batch scan, where you can just keep scanning through multiple pages. All this is with reading. Another massive feature that people love is the video calling feature. We call this Call an Ally. Using the Envision Glasses, you can make a video call directly from your glasses to a friend or a family member.

They get a notification on their phone. We’ve built a special app for this particular video calling feature called the Ally app. It’s a free app available on both iOS and Android, coming to the web as well later this year. People install the app on their phone, they get a notification, your sighted friend or a family member can see everything from your perspective directly from the glasses. Since you’re not holding a phone in your hand and pointing it around, it’s completely hands free, you can do whatever you want to do.

A lot of people use it when they want help with cooking. A lot of users use it when they’re out walking around and want some help, or they use it for some very personal moments. I know a very young person using the glasses. Before she goes on a date, she just wants to check in with her friend about how the outfit looks and things like that. Those are very personal moments that are possible using the Envision Glasses.

Right now, like I mentioned, you also have Aira on the glasses. Again, instead of having to call the Aira agent and then point your phone around, you can make a call directly from the glasses. It connects to the Aira agent, who can see things from your perspective while you are completely hands-free, and they can also take images from the glasses. If you are outside and they want to take a picture of a sign, they can do that directly from their side on the glasses, just like they do on their phones.

Jonathan: The scanning of multi-page books, I take it you’re able to save that text somewhere on your device to refer to later.

Karthik: Yes. That is a huge plus as well, because unlike other competitors in this space, the Envision app and the Envision Glasses are actually connected together, though the Envision Glasses are a completely standalone device. You don’t need to have your phone with you, or the phone open, in order to use the glasses. You can leave your phone behind, but if you want to save the text from your Envision Glasses and then read it later, we have an export text functionality.

You can export the text that you’ve scanned on the glasses. It shows up in the Envision library in your Envision app. You can just go ahead and open it there and read it on your phone, or copy it and send it to people elsewhere, and so on. Since Envision is available on both iOS and Android, you can just export the document and it ends up in your Envision account on both these platforms. You can open it wherever you want, read it, share it, do whatever you want to do with it.

Jonathan: I can understand why you wouldn’t adopt FaceTime for this because it’s not truly cross-platform, I guess it kind of is, but it’s also not. Have you considered using WhatsApp as a video call solution which is cross-platform?

Karthik: Yes, but the thing is, these video calling platforms are not open. We would like to use them, but they don’t have any software development kits that allow other developers to make use of their video calling technology. Moreover, we have a lot of plans for the Ally app as well. For example, in the future, you’ll be able to share your location with an Ally directly when you’re on a video call.

Those are all things that are possible when we build our own app, and we have a lot more features on the Ally side coming, which is the main reason why we decided to build our own video calling app instead of adopting one off the shelf. One, the off-the-shelf providers don’t offer their solutions for other developers to build on, and two, we do have some custom requirements and custom features in mind, very specific to the glasses, that require us to build our own tech.

Jonathan: What should I have in terms of expectations of being helped by the glasses when, say, I’m traveling around, either outside or in an office building or a shopping mall, and I’m looking for a specific sign to tell me that I’m getting to a store, or a particular door that I want to go into? Of course, that’s the challenge with GPS. It can get you there, but it’s that last 100 meters or so that’s the challenge. Are the glasses at the point where they would help with that kind of signage, or is it better to phone an Ally?

Karthik: It’s better to phone an Ally. The instant text feature is pretty powerful in the sense that it can read all the pieces of text around you, but we believe it’s going to take a little more time for the technology to become smarter and be able to guide you in a way that’s safe, which is why, when people are outside and they want help with things like signage or directions and so on, I think it’s best to call a friend or a family member through Ally, or if you have Aira minutes, you can always call Aira and they might be able to help you.

I think getting human help there is a much better thing to do than using the glasses for that. Having said that, once you do get used to the glasses and you understand how the instant text feature works, there have been lots of power users of the glasses who, after a few months of using them day in and day out, understand exactly how to use the tool to get what they want, and they become really adept at using the glasses to navigate those situations. They get really good at it. It is possible to get really good at it, but I would say in the beginning, if you’re just not getting it right, there’s always a human to help you.

Jonathan: Facial recognition, is that something that you’re pursuing?

Karthik: Yes. The Envision Glasses can be taught faces. You can teach them the faces of friends and family members, and they can detect faces at a distance of 1 to 1.5 meters. They can also recognize multiple faces at once. [unintelligible 01:34:17] A, B, C, after three faces that you’ve already trained it on. Yes, it is capable of recognizing faces, and we are working to extend that functionality to objects as well. Someday you’ll be able to teach Envision objects just like how you teach it faces.

Jonathan: The competitor, I guess, for you in this space would be OrCam, I would think. How do you stack up if people ask you, and I’m sure they do, what’s the difference? What do you tell them?

Karthik: I think there are two parts to this question. One is the tangible stuff, which is all the features that the Envision Glasses offers. I think the biggest feature that I would put forward today and the reason why we’re on this call is that Envision is an extensible platform. It’s not a closed device, which is one of its biggest strengths. Because it’s an open platform, it’s an extensible platform, it allows us to have apps like Aira, or Cash Reader, or BlindSquare, and many other apps that are coming down the line onto the Envision Glasses.

When people buy a pair of Envision Glasses, they get all these other apps along with the Envision Glasses itself, and wherever possible, we try to sweeten the deal. In the case of Aira, we’re offering 300 free minutes of Aira to anyone who buys a pair of Envision Glasses, and even if they’re not users of the Cash Reader app, they get the Cash Reader app on the Envision Glasses entirely free when they buy the glasses. Those are all things that would never be possible with a device like OrCam, because at the end of the day, their approach is very different from the open approach that we take.

The second thing is Envision offers more features than the OrCam or other similar devices in this category do. We also offer more languages. If you are someone who is not just an English speaker, but a German speaker, a Spanish speaker, an Arabic speaker, or a Japanese speaker, all of these languages are available to you without any additional cost, completely free out of the box. When you buy the Envision Glasses, you have all these languages available, and the Envision Glasses are also translated into about 30 different languages as of today.

We keep adding more languages every few months, as and when we have people translating it. I think the fourth big thing is, when you buy a pair of Envision Glasses, you get a free 90-minute onboarding/training session, along with help from Envision whenever you want it. That is the intangible aspect: the people aspect, the community aspect, which is, again, very, very unique to Envision.

Of course, when you buy a pair of Envision Glasses, you get two years’ worth of software updates free of cost, so you don’t have to keep paying for new features that we put out. All these features, for example, the voice commands or Aira: anyone who owns a pair of Envision Glasses today, and even someone who bought a pair two years ago, gets the Aira update free of cost.

We also offer the same 300 free minutes of Aira, and other such perks that might come with the glasses, to people who purchased the glasses in the past as well. You don’t have to pay for subsequent updates. After two years, you have the option of paying €100 or $100 per year to get continued updates for the glasses. Even if you choose not to pay for these updates, you’ll still have all the features that you have accumulated over the two years since you first bought the glasses, and you’ll also get the mission-critical updates that we offer.

I think the big difference, which is easy to quantify but hard to compare, is the pace of evolution of the glasses. You can ask anyone who purchased a pair of glasses back in October of 2020, when I first came on the show. They would testify to how dramatically the glasses have evolved, without their having to pay an additional cent for these updates over the past two years. That’s the kind of commitment we have to the glasses, to the product, and to the community: we continue to push out these updates and refine the product as much as we can.

Jonathan: That €100 or $100, it’s a software maintenance agreement which is a pretty standard model in this industry.

Karthik: Exactly. It’s a way for us to sustainably keep improving the glasses, and a way for us to cover the cost of the software and keep things going after the first two years.

Jonathan: You’ve got Aira there on the platform now, how about Be My Eyes? Is it coming?

Karthik: Well, I would come on a different episode and talk about it someday. Let’s just leave it at that.

Jonathan: Yes.

Karthik: Let’s leave it at that for now, but yes, our aim is to try and get as many apps that the community loves to use onto the glasses as possible, so never say never.

Jonathan: One of the biggest challenges for assistive technology startups is trying to get into the various systems around the world and they all differ and they’re all complex in terms of getting funding so that blind people can go to their agency, whether that be a government entity or some other thing, and apply for funding for a device like this. Screen readers, Braille displays are well and truly established, but it can be a lot harder for newer companies to actually state their value proposition and get in that system. Are you having any luck there? Are there places in the world where people can now go to a funding entity and say, “Actually, my life would be significantly enhanced by having the Envision Glasses. Please fund these for me”?

Karthik: No, you’re so right. It’s definitely been a challenge doing that. We work on that stuff equally hard, or even harder, than what we do on the technology side, on the product side, because that’s super hard as well. For example, in Europe, we have something called the medical device regulations. We are MDR certified, which allows the Envision Glasses to be covered, or to qualify for coverage, under insurance in Europe. For example, in the Netherlands, since we do have the medical device regulation certification, we’re able to be part of all the insurance companies.

Say, for example, a person in the Netherlands could write to the government and request funding, and they can actually get these devices funded. We’re trying to do the same thing for all the countries that we operate in. Of course, it is going to take a bit of time, and it is very dependent on each country. When someone comes to us and says, “Hey, I would like to go ahead and buy the glasses, but then I would like to get funding for it,” we put them in touch with our local distributors. We have distributors, as of today, across the world, including PVI in Australia and New Zealand, great distributors.

What we usually do is when someone comes to us and asks us for funding, we put them directly in touch with the distributors and the distributors can help a lot with getting funding or checking if funding possibilities are available and then helping the people through that process. In the UK as well, Envision is now covered under Disabled Students’ Allowance, which allows students to request funding for the Envision Glasses in the UK and it’s possible to get that covered. We also have insurance coverage in the UK as well.

Wherever possible, we also try to offer customers a payment plan so customers don’t have to feel the pinch of having to pay for this upfront. Again, we’ve partnered with a lot of local distributors, and wherever possible, distributors offer customers the ability to make payments in installments, which is actually very helpful as well.

Jonathan: I look forward to learning more about what’s happening and also at some point soon, we will do a full demo of the glasses as well. Thank you for coming on the show. It’s a pleasure to catch up, and I know that there’ll be a lot of excitement about the Aira announcement, so congratulations to both organizations for making that happen.

Karthik: Thank you so much for having me, Jonathan. We’re definitely going to get you your glasses as soon as possible for you to take it out for a spin, and yes, once again, thank you so much for having me here. Absolute pleasure.

Speaker 3: Like the show? Then why not like it on Facebook too? Get upcoming show announcements, useful links, and a bit of conversation. Head on over now to facebook.com/mosenatlarge, that’s facebook.com/M-O-S-E-Natlarge, to stay connected between episodes.

Peter: Hi Jonathan, it’s Peter from Robin Hood county, hoping you and your family are well. On listening to the problems that people are having with Siri, personally, I don’t have that many issues because I find that I do most of my work using Siri during the early morning, because I’m an early riser. This is because I was always told that internet use is lighter during the early hours and many, many moons ago, I used to work nights and use the internet at the same time. Well, I suppose it was a sort of intranet because it was from firm to firm.

I do find on the odd occasion that Siri can be a bit difficult, but I think it needs to be improved vastly. The other thing I’d like Apple to do is a router. I would drop my WIFI service immediately if Apple did a router. Fix Siri and do a router, that’ll do for me as far as Apple’s concerned. Nice to hear you and all the emails again, and I hope you enjoy your trip, especially the one to the UK. As I always say to people in London, I know some hotels where they don’t ask a lot of questions. That is an old Kenny Everett joke, by the way.

Jonathan: It’s like that Mr. and Mrs. Smith. Thank you, Peter. Well, I can tell you that we’ve got gigabit fiber, which by New Zealand standards is slow now because the big fiber plan you can get in New Zealand is 8 gigs down and 8 gigs up. Even I can’t really conceive of a use case for that where we are so I haven’t invested in the technology that would make the 8 gigs down and 8 gigs up possible.

If there’s a server crunch, it’s at the Apple end, and if that’s what’s going on, if it’s simply a server crunch thing, then Apple’s got the money to invest in more robust server technology. Interesting what you say about the router because Apple did use to make them, the AirPort Express and the AirPort Extreme and they were really very good. I guess they just made a determination that routers aren’t their core business and they’ve got out of them now.

Now, we are invested in the Ubiquiti ecosystem, and every so often I have a grumble about some accessibility challenges I have with the full web user interface of Ubiquiti, but largely, it is really a good experience, so I wouldn’t go back now. Gene Richberg suggests that if you’re having a problem getting Siri to get things done and it just gets stuck like that, try going into the control center and toggling airplane mode on and then back off again. In most cases for me, says Gene, this seems to fix that problem. It’s the equivalent of turning it off and back on again without having to go through the process of a reboot. It shouldn’t be necessary, but it may help in some situations. That’s a useful tip.

It’s funny, says Gene, this is a tip a friend of mine found out quite a few years ago from Apple’s accessibility department. She continues, now, as far as notifications not coming in naturally without having to wake the phone up, if you remember back when iOS 15 came out, Apple introduced the scheduled summary that is supposed to keep all non-important notifications silent and only notify you about things it thought were important. Well, it doesn’t look like that works as expected. In fact, I had this very thing happen to me in the beginning after I upgraded.

I had made an order through Walmart and I didn’t get any notifications about where my order was in the process and had to constantly check the app and found out that my order had been delivered. Thank God I was keeping up with it, but man, I was furious trying to figure out what was going on. Then I remember hearing about that stupid scheduled summary feature. When I turned it off, things started acting normal.

To get to this, just open the notification settings; it seems to be the very first option there. If it’s turned on, just double-tap it to turn it off. If you and Steve could try these suggestions and see if they work for you and others who are having these issues, hopefully they will work for them too. Thank you, Gene. It’s a timely reminder for those who may not know about the notification feature. Unfortunately, this issue has been around a lot longer than iOS 15 for me. It’s been an issue for quite some time now, and I have all that grouping of notifications switched well and truly off, because I’ve got FOMO, the fear of missing out, so I like all my notifications coming in in real time.

Unfortunately for me, it isn’t that issue and it’s not that it happens to me when I haven’t woken up the phone. The phone is awake. It’s on the home screen, just lying on the desk because I have the auto lock mode disabled. It’s sitting there, switched on, and sometimes it will just go into this mode. Looking it up, it does appear to be a bit of a common problem. I believe Apple said they fixed it. For me they have not. Maybe for those who don’t know about the notification summary, it might be worth checking into that, so thank you so much for passing on those ideas.

Costa: Hello, Jonathan, this is Costa from Finland. First, thanks for the podcast. The topics you cover are very interesting. I would like to let you and your listeners know how good Apple currently is at receiving feedback, because I’ve been a developer subscriber for a long time. I’ve been using the developer betas of iOS for years and years. I’ve noticed that within the past few months, Apple has greatly improved at responding to feedback and taking it into account. I am an Eloquence user on the iOS 16 betas, and I have noted some bugs with the Finnish version of Eloquence, which I have reported to Apple via the Feedback Assistant application.

I’m amazed at how good they are at responding and fixing the bugs. For instance, the quality of Eloquence was questionable, or rather, it was not as good as it could have been, in betas 1, 2, 3, and 4. However, in beta 5, they added a 16 kilohertz version of Eloquence. Currently, there is a bug where Bluetooth audio has better quality, like an uncompressed version of Eloquence, while the iPhone’s own speakers have a compressed or distorted version, but the quality has been improved immensely by the addition of the 16 kilohertz version of Eloquence in the fifth beta.

I’m amazed because, of course, I can’t know how many others have requested this, but I have an ongoing feedback thread with Apple, and I had requested the 16 kilohertz version less than two weeks before it became available. That’s amazing. I never would’ve thought it would happen so fast. They’ve also fixed some issues with intonation, pauses, and other things.

Again, I can’t be sure how many others have submitted the same feedback as I have, but according to the Feedback app, there are no similar feedback submissions, so go figure. Just for your information and the information of your listeners, if you notice any bugs with accessibility or otherwise with Apple products in the betas, and you have the chance to submit feedback via the Feedback Assistant, I would greatly suggest you do so, because they are obviously listening and improving stuff based on our feedback.

I have a feeling that we, the blind community, matter more these days than before. Maybe it’s just me, maybe I just feel like that because the issues I described have been solved, but still, it feels great to know that they haven’t forgotten us.

Jonathan: This email says, hello, Jonathan, my name is Kyla Golden. I’m writing this email in regards to the current iOS 16 beta cycle. You might think that I’m crazy, but I’m not much of a fan of Eloquence, though I do sometimes switch to it when I’m reminiscing about my days as a third grader using BrailleSense U2, which used a modified version of Eloquence. I did however manage to get my hands on some of the other voices available for download, including my current favorite, Evan.

I also tried my luck at the novelty voices, out of which Box, Zarvox, Trinoids, and Super Star were my only useful candidates. My only complaint with the current stage of beta testing is that Apple haven’t thought to add a startup sound to the iPhone like we hear on desktop computers such as the MacBook. This is still an issue, as sometimes I do have to power off my phone for troubleshooting reasons. I just don’t get why Apple didn’t decide to make the startup sound universal, because it can be done on a mobile phone.

For example, the SmartVision 2 running Android. Speaking of the BrailleSense U2, I am currently using a loaner BrailleSense 6 while mine is being repaired, and I noticed that HIMS have released a patch software update to fix a few bugs. The last time they released a small version like this, they also changed the version number, calling it V 1.6 instead of simply 1.5 patch. However, this time I was surprised to find out that they have simply kept the V 1.7 moniker rather than calling this one 1.8.

They did also separate the web radio and voice recorder functions into their own respective applications, which is a welcome change. I also would like to mention that Google are finally letting Android users use the high-quality versions of the voices in their standard TTS engine. I only found this out when updating the Speech Services by Google app on my BrailleSense to the latest build and switching it from the default robotic-sounding female voice. We’ll call her Chromebook. I hope to hear from you soon. Keep up the great work. God bless you. Thank you very much, Kyla, appreciate you writing in. You have fun with those toys.

[music]

Jonathan: In days of yore– days of my what? No, not that kind of your, the other yore. In days of yore, the halcyon days of Mosen At Large episode 183, we were talking with Paul Edwards about whether Braille should be capitalized or not when referring to the code. In an attempt to be conciliatory, I said to him, “Well, maybe if we’re talking about Braille as a verb, I brailed this, maybe in that situation you could get away with not capitalizing it.”

In an attempt to find a precedent for this, I said, “When we talk about ‘I Googled this, and here’s what I found,’ do you capitalize the G?” Paul said, “I definitely don’t.” I thought to myself, “I think I always have.” In an attempt to be conciliatory, because I’m a really conciliatory guy, I said to Paul, “Well, all right, maybe we could agree that you should always capitalize Braille when you’re referring to the code, but maybe we can park the verb question,” because to me, there are just so many obvious precedents regarding spelling Braille with a capital B when you are referring to the code: other codes, other scales, other things that are similar.

We covered those in episode 183. It is a compelling, no-brainer of an argument. Well, Darryl Shandro has written in, and he says that The Grammar Girl Podcast, which I know is highly regarded (I’ve never listened to it, but I have heard people talking about it), has said that when you’re talking about Googling something, using Google as a verb, you should spell it with a capital G, and others have said this too. You can have verbs that are capitalized. Just because a word is a particular part of speech doesn’t mean it shouldn’t be capitalized.

It looks like the jury is in on this, and I will provide a link to The Grammar Girl Podcast episode that Darryl references in the show notes. Thank you so much for sending that in, Darryl.

Mike May: Hey, Jonathan. As for the question about the person who couldn’t set their ride to the pharmacy and back: the problem with Uber is that the way they have you add additional stops is a bit confusing. The way to do it is to set your home as your destination, even though that’s your starting point and your destination, and then your first stop is the pharmacy. If you mess around with the destination setting, you can get that sequence correctly input and there won’t be an issue.

The driver can change that if they choose to. You would have to approve it, but they can change it. They just don’t want to, or don’t know how. In Lyft, it’s easier. It’s really clear: stop 1, stop 2, stop 3, destination. It’s easier to set. I use Uber more often, but that’s the way that they work it. You mentioned Aftershokz. I have a contact with the developers and the head of Aftershokz. They are now called Shokz in the US. Occasionally, when I hear things, I pass along comments about connectivity. I think the problem that the user described probably has a bit to do with Aftershokz, but a lot to do with just multiple audio things going on when you have VoiceOver.

I just find this problematic with a lot of different apps and on the PC. If you do get Aftershokz and you use them for Zoom on a PC, they have a dongle which makes things much more solid, so that everything behaves the way it’s supposed to. It seems like every time I reboot my computer, my audio destinations get messed up. If you get the Aftershokz with the UC designation, that means they come with this dongle, which will make them work better.

Of course, if you want good audio and to be picked up clearly in noisy environments like a car or a train, or in your home office where you don’t want a lot of echo, the Open Comm, O-P-E-N C-O-M-M, is really the best model for that purpose. A while back, you talked about why people would want to use the FaceTime sharing of movies. Gina and I use that frequently, because we’re back and forth between two different towns and sometimes we want to share the evening together and watch some Netflix. It’s really slick the way it works.

It amazes me how good the audio is when sharing it. I just launch Netflix on my side, I call on FaceTime with video, of course (I don’t think it works with FaceTime audio alone), and it shares nicely, and you can talk to each other and hear the movie really clearly. That’s the use case that’s good for us.

Jonathan: Thank you, Mike. That’s Mike May in with that contribution, and yes, I agree actually, because Bonnie was stranded when the airline canceled her flight to come home from a conference and the final episode of season three of For All Mankind, which is just our favorite show, came out. We used SharePlay to watch it together and it was super exciting, a very, very slick implementation they’ve got there. I love to hear from you, so if you have any comments you want to contribute to the show, drop me an email, written down or with an audio attachment to Jonathan, J-O-N-A-T-H-A-N @mushroomfm.com. If you’d rather call in, use the listener line number in the United States, 864-606-6736.

[02:00:39] [END OF AUDIO]