On this episode of The Vergecast, we’re going to dive deep into why accessible design is universal design. First, guest host Victoria Song will chat with Jason Valley, a visually impaired Verge reader. Jason initially reached out to Victoria after her Live AI hands-on, challenging the notion that the feature was a “solution looking for a problem to solve.” Jason shares how the tech has helped him live a more independent life, what he’s hoping to see improve, and how the blind and low-vision community has enthusiastically embraced the technology.
After that, Victoria sits down with Be My Eyes CEO Mike Buckley. Be My Eyes is an app that pairs blind and low-vision users with sighted volunteers to help them go about their day. Buckley gives his thoughts about how accessible tech design benefits everyone, why smart glasses and AI are a natural combo, and what challenges and opportunities in this space remain.

Transcript
00:00:00Welcome to the Vergecast, the flagship podcast of what AI smart glasses are actually good for.
00:00:08I'm your friend V, and we are in week two of hot girl Vergecast summer.
00:00:14Anyway, today's show is really exciting and quite personal for me, actually.
00:00:19A few months ago, I was reviewing the Meta smart glasses, and I kind of flippantly commented that I didn't understand why anyone would need AI to give a really
00:00:30obvious description of something that they're looking at.
00:00:33Well, a bunch of visually impaired VergeCast listeners and Verge readers wrote in and said,
00:00:39actually, me, me, I am the person that this helps.
00:00:43And so, you know, a few of them actually challenged me to rethink my framing, and I said, you know what?
00:00:49You're right, because here at the Verge, we do love good feedback.
00:00:53So today, we are talking about smart glasses and accessibility.
00:00:58First, we'll hear from one of the listeners who wrote in to me, and he's going to tell us how the Meta glasses have changed his life for the better.
00:01:06And then I'll talk to the CEO of Be My Eyes.
00:01:08That's a company that provides AI and live volunteer services for people with visual impairments.
00:01:14They're integrated into those smart glasses, and we'll hear about how they're thinking about how this technology can help the visually impaired going forward.
00:01:22And finally, my colleague, Mia Sato, will be joining me to answer a very spicy, potentially not safe for work question about smart glasses.
00:01:30All that coming up right after the break.
00:01:34This is the Vergecast.
00:01:38Support for the Vergecast comes from Shopify.
00:01:41When it comes to growing your business, you need a partner you can count on.
00:01:44A partner like Shopify.
00:01:46Shopify is the commerce platform behind millions of businesses around the world,
00:01:50and 10% of all e-commerce in the U.S.
00:01:53From household names like Mattel and Gymshark, to brands just getting started.
00:01:57With Shopify, you can build a beautiful online store to match your brand's style.
00:02:02Just use their design studio and choose from hundreds of ready-to-use templates.
00:02:06Or you can accelerate your content creation by taking advantage of a host of helpful AI tools.
00:02:11And easily create email and social media campaigns wherever your customers are scrolling or strolling.
00:02:18If you're ready to sell, you're ready to Shopify.
00:02:20You can turn your big business into reality with Shopify on your side.
00:02:25You can sign up for your $1 per month trial and start selling today at shopify.com slash Vergecast.
00:02:32Go to shopify.com slash Vergecast.
00:02:36Shopify.com slash Vergecast.
00:02:44Welcome back.
00:02:45Our first guest is Jason Vallee.
00:02:47He's one of the listeners who wrote in to me about how AI smart glasses changed his life.
00:02:52He also sent me a picture of his guide dog, Noelle, who is cute as a button, and we'll put a picture of her in the show notes.
00:02:59Let's get into it.
00:03:00So, Jason, this is kind of a really special for me kind of interview because I do read all the reader feedback we get.
00:03:10Maybe not the comments sometimes, but usually if it comes into my email, I definitely try and take time to read it.
00:03:16And so when you reached out to me, it was because I had done a hands-on of the meta AI, live AI feature, and I sort of just kind of posed it in the framing as a solution in search of a problem.
00:03:35I was kind of asking, like, who is asking for this, and you sent such a thoughtful email about, like, hey, I use this tech.
00:03:43This tech actually dramatically impacts my life.
00:03:47So I guess what I wanted to ask you is, like, what drew you to the Meta Ray-Bans in the first place?
00:03:54And, like, can you explain the appeal of live AI for your life?
00:04:00Live AI is a visual interpreter for me.
00:04:05In my life, if it's not something that I can perceive with my remaining vision clearly, I need to ask somebody, hey, do you mind telling me what this is?
00:04:15Sometimes that can be a challenge.
00:04:16Not everybody is capable of saying, hey, you know, this is such and such.
00:04:21This is a widget.
00:04:22It's blue.
00:04:23It's round.
00:04:24It's this or it's that.
00:04:27Having the live AI, I can go into environments where I would need a chaperone.
00:04:33That could be a Dollar Tree.
00:04:35It could be going to a restaurant, and I can engage live AI and ask it to read the menu or tell me what's in front of me so that I'm not inconveniencing my companions and I'm getting on in a normal life.
00:04:49I'm enjoying my lunch.
00:04:50I have a menu in front of me that I can actually do something with.
00:04:54It's not just, I guess, a fan in front of me while everybody else is picking out their stuff.
00:05:00I know what I want.
00:05:01I can decide for myself.
00:05:02And that's kind of a powerful thing.
00:05:05Yeah, that's definitely not something that I, as a sighted person, would think about.
00:05:11Because me using that feature, it was very much like a, well, yeah, I can read the thing in front of me.
00:05:18But, you know, I think when we talk about blind and low vision people, that is a wide spectrum of, you know, ability to see.
00:05:29So, could you maybe describe what your condition is most like so that our listeners understand?
00:05:35Because I think when us sighted people hear blind and low vision, we just think of the main image that most people think of, which is someone who can't see at all.
00:05:46Correct.
00:05:48Vision loss in general affects a wide variety of people from no light perception at all to somebody who has low vision, which I believe is defined as 20/60.
00:06:02And that's at a point where a person can no longer drive.
00:06:06In my particular case, I contracted a staph infection in 2015 and it affected my blood pressure.
00:06:14I had two emergency surgeries to correct it and it essentially damaged the retina in the back of my eyes.
00:06:23I had three separate injuries that occurred over the course of a year.
00:06:27And the official diagnosis that I got was NAION, which stands for non-arteritic ischemic optic neuropathy.
00:06:36What that means to the average person is an optic nerve stroke.
00:06:41My retinas, which would normally be in the shape of a volcano, where your central vision is a very tightly compacted group of nerve cells in the middle, mine are entirely flat.
00:06:53So what that translates for me is in my left eye, I have no light perception from about 11 o'clock to about four o'clock.
00:07:03And then from four o'clock to seven o'clock in my left eye, I can look at things.
00:07:08If I look at it directly, it disappears.
00:07:10If I look past it with my eccentric vision, I can still perceive it.
00:07:14I can still read things if the text is large enough.
00:07:17If I have it white text on black, it makes it easier to see.
00:07:21In my right eye, I have no central vision.
00:07:25I have just the periphery.
00:07:26And again, when I look at something, it disappears.
00:07:28But if I look past it with my eccentric vision, eccentric vision, I can see something there.
00:07:36And then everything else, for lack of a better term, is looking through a plastic baggie.
00:07:42Everybody agrees that it's clear, but if you fold it, you really can't see through it at any distance.
00:07:47So for me, I have to be about 8 to 10 inches from a screen with large text for it to be useful to me.
00:07:58In the wild, if I go to a restaurant, let's say a buffet, I really don't want to get my face 8 to 10 inches from the food to see what's there.
00:08:06I think that's kind of disgusting, and there's a reason the sneeze hoods are there, to keep human contact from your food.
00:08:14So having the glasses, if I go into this buffet, I can ask it, what's in front of me?
00:08:20Can you read the sign?
00:08:21It'll tell me, okay, well, you have this option in front of you.
00:08:24So I don't have to rely on people with my residual vision in the wild.
00:08:31I can do things for myself.
00:08:33I travel independently with a guide dog.
00:08:35We have a walking lifestyle in my neighborhood, and I have some great friends which will take me out to do my shopping.
00:08:42That could be at Lowe's to get supplies that I use for my crafting.
00:08:47It could be a trip to Dollar Tree to get some things that I need.
00:08:51It could be to Aldi's.
00:08:54That place is easy to shop for.
00:08:56Everything is the same in all the stores.
00:08:58So getting back to the vision part, these places are a challenge for anybody with any level of vision loss.
00:09:08So having these glasses, having a visual interpreter quietly telling me the things that I need is absolutely a game-changer for independence.
00:09:19What brought me to them is some years ago, I'm on my, I guess, 10th year of vision loss, is there's a device out there called OrCam.
00:09:28That device starts about $3,500.
00:09:32That's rather expensive for a blind person who's on disability.
00:09:38Looking at the price tag for the Meta when they came out last July, I was able to snag a pair of refurbed ones with a double warranty for $200.
00:09:48I was able to slip in some custom lenses for $65, and I have myself a great working pair of Meta glasses.
00:09:56So, you know, that's a huge example of cost and something that I think we don't necessarily think of because you're right, $3,500.
00:10:06That's an Apple Vision Pro level area of price.
00:10:10So, you know, something like the OrCam versus the Metas, like, did you think that this was, how do I say it?
00:10:21Like, was this something that to you was immediately obvious?
00:10:25Like, yes, I need to try this.
00:10:27Yes, this is absolutely worth the investment, no matter, you know, it can be kind of tough with emerging technology where you haven't tried it yourself.
00:10:38You're kind of relying on, you know, for the large part, sighted reviewers' impressions of this technology.
00:10:45And I think one thing that us tech reviewers could do better is to think of accessibility needs when we are, you know, to the ability that we are able to when we are reviewing these products.
00:10:58And I'll admit, that was something that I was not necessarily aware of or thinking of when I first reviewed the Meta Ray-Ban.
00:11:06So, I think if sighted, I mean, if a blind or low vision person was reading my review, they wouldn't get a sense of how this could help them.
00:11:15So, I'm curious, in your process of, you know, knowing what was available to you before and then reading about this new technology that was coming out, was it immediately apparent to you that this could be a game changer?
00:11:29The packaging, I guess you could say, on what was being pitched offered a lot of wonderful.
00:11:37I started doing some research, looking online, looking at the different options that were there, and I realized that this was the first option that actually had the AI enabled on it.
00:11:49And I started before a lot of the current options became available.
00:11:54There's been a lot of updates over the last year that have been amazing.
00:11:58And I'm part of the early access program on that.
00:12:02Part of my research involves social media.
00:12:05I'm part of several blind and visually impaired groups.
00:12:09And these started to show up on the radar, and people started talking about them and what it can do and what it can offer.
00:12:17And I've always been an early adopter.
00:12:19And it's like, okay, this is looking like it really wants to fit what I would like it to do.
00:12:26So, I started making a plan to get it into my budget and got it into my hands in July of last year.
00:12:33And it's been a game changer since then.
00:12:37So, you mentioned that it has enabled you to live and work more independently.
00:12:43I'm curious if you were ever a user of something like Be My Eyes.
00:12:48That's kind of another example that's come up primarily through smartphones.
00:12:52And it does have an integration with the Meta Ray-Bans as a type of technology that also helps other people live and work more independently.
00:13:02So, I'm curious, how does this experience compare to you?
00:13:05Is it complementary?
00:13:06Do you use Be My Eyes on the glasses itself?
00:13:11I use Be My Eyes on my phone.
00:13:13I'm an iOS user.
00:13:15My primary use of Be My Eyes is to describe images that I can't pick out the details on.
00:13:23So, I can see a rough image of something.
00:13:26If you look at it, it's very blurry.
00:13:28So, I can run it through the AI and tell me in detail what it is that I'm looking at and pick up the nuances that are there.
00:13:35However, I haven't really used the volunteer aspect of things to reach out.
00:13:42I'm a little less comfortable talking with somebody about my medications or some of my day-to-day stuff.
00:13:48Having the AI gives me that privacy.
00:13:51So, I'm, I guess you could say a hybrid user of Be My Eyes, but I do have it enabled that if I need to contact a volunteer, I can enable it with a voice command.
00:14:03That's like a really interesting kind of example.
00:14:07Because I think when people talk about AI, from skeptics at least, a common refrain I hear is like, who is this for?
00:14:17Why would you want this?
00:14:18And this idea that maybe the AI can afford you at least a sense, because we don't know what all these companies do with their data per se, but like it can at least afford you a sense in your daily life.
00:14:30A feeling of like, oh, I don't have to have another person see my medication.
00:14:35Is that kind of your main, is that kind of the main appeal for you for AI versus a volunteer?
00:14:45Are there certain situations where you might want to talk to a volunteer more than an AI?
00:14:51And how have you found the AI in general when you are in those situations?
00:14:56AI is a bit more concise with the information that I'd like to get.
00:15:01With the meta AI, I'm not able to do some things because of their safety guardrails that they've put on.
00:15:10So I can't discuss a lot of my medications up close with it.
00:15:15I can simply get information off the bottle of the name, and that's about it.
00:15:19And then it'll tell me I can't proceed with the rest of the information.
00:15:22I can't describe it.
00:15:23So that is a frustrating point.
00:15:25But in other things, if I need to get instructions for something, I guess, less dangerous, if you will, something that doesn't have a life and safety, it really streamlines my day to get through it.
00:15:41So instead of waiting for somebody else to become available to me that can say, hey, Jason, this is your electric bill, and it says that you owe X amount, I can ask meta to, hey, look at this and describe it.
00:15:59And it'll go through and give me the juicy bits.
00:16:01It'll tell me how much I owe, who it's for, and when it's due, rather than giving me a complete spit-out of everything top to bottom and all the information that I don't need.
00:16:12With a human being, I can get that personal touch, but not everybody is capable of describing something in the way that's useful for me.
00:16:23People can do it in a general sense, but if I really need information, some people will say, well, it's over there.
00:16:32To a blind person, over there is a really open concept, especially when somebody is pointing and you can't make use of those hand gestures.
00:16:41So I'll ask somebody, hey, could you be a little more specific?
00:16:47Oh, yeah, it's over to your left, over there.
00:16:49And, you know, sometimes I'm slapping my forehead, you know, in frustration and having an AI, I can kind of drill down on the information that I need.
00:17:01So I can ask it, please look and describe, and it'll tell me, okay, it is such and such.
00:17:08Okay, well, I'd like to know a little bit more.
00:17:10What is the information on it?
00:17:11What text is on there?
00:17:13And it'll get into more detail.
00:17:15And if it's still not enough, I can ask it, and it's significantly more concise.
00:17:21So I'm getting the information that I need.
00:17:24If I'm in a place I'm not familiar with, I can ask it to read the sign on a bathroom and tell me if it's for male or female.
00:17:33That avoids a little bit of embarrassment.
00:17:35If I'm looking at an elevator pad, I can tell which way the buttons are supposed to be going.
00:17:42If I need to go to my doctor's office, I can look at the directory and find out what suite number it is.
00:17:48So having the AI on demand saves me a lot of time and not having to wait.
00:17:57It gives me precise and concise information to get through my day so that I'm not inconveniencing somebody else.
00:18:06And when you're visually impaired, it really feels like you're inconveniencing others to ask.
00:18:11Yeah, you know, I thank you so much because this is kind of a thing that I don't necessarily always think about.
00:18:22But, you know, it hurts my heart a little bit to hear that you feel like you're inconveniencing other people, to be quite honest.
00:18:30But I appreciate that AI is at least giving you a greater sense of independence.
00:18:35You know, when my mom was sick, the thing that she told me all the time was that she just wanted to feel independent and to not have to ask me.
00:18:43Even though, you know, for me, I was like, I would love to help you.
00:18:47I would love to do anything for you.
00:18:48So, you know, it does hurt my heart a little bit to hear that.
00:18:52But I am kind of, you know, sitting with the thought and absorbing the idea that, you know, maybe one really beautiful takeaway is helping people feel independent in this.
00:19:05And, you know, I'm curious what hopes you have for this technology going forward.
00:19:10And, you know, we do have some meta people listening to this podcast.
00:19:14So, you know, if you could speak to them directly and, you know, some improvements that you'd like to see in this text and pain points that you've had, like, would love to hear them.
00:19:25Some of the pain points that I've experienced with the live AI is it's a battery drain.
00:19:31I get about four hours of average use before I have to recharge my battery if I'm not using the live AI.
00:19:39As soon as I kick that in, I can get about 20 minutes to a half hour on a full charge.
00:19:46And then I have to recharge my glasses.
00:19:48Then I have to go analog.
00:19:50And that's kind of an inconvenience.
00:19:52So having a more efficient battery would be nice.
00:19:56I also noticed that unless I'm on a really fast data connection, that sometimes a live AI isn't as fast and responsive.
00:20:07So I don't know if there's a way to kind of tweak that for the information that it's getting from the user.
00:20:16But definitely the battery would be helpful to have something.
00:20:22Along the live AI, something that I've noticed and I've compared notes with some other friends is I'll ask the live AI to please let me know when it notices something that I'm looking for.
00:20:35And in my case, let's say I go to Dollar Tree and I'm looking for a specific item and I know that I'm in the area.
00:20:44I ask Meta to find it and it says, okay, I'll locate that for you.
00:20:48We go past it and I'll say, did you find it?
00:20:51Oh, yes, I found it.
00:20:52And it's like, why didn't you tell me?
00:20:54I asked a friend of mine who's also visually impaired.
00:20:59And in his case, he asked it to tell him, okay, let me know when I'm at my greenhouse.
00:21:05And he knows he's getting close.
00:21:07And he asks again and, oh, yeah, we're there.
00:21:11And it's like, well, you said you were going to tell me, okay, I'll remember that for next time.
00:21:14And it doesn't seem to remember that going forward.
00:21:18So I'm not sure if that's necessarily a pain point or growing pain of training an AI model.
00:21:27You know, it's not necessarily a blind thing, but it would be incredibly helpful for that.
00:21:33Yeah, I would agree with you.
00:21:35I also would like something like that because I'm super ADHD and I constantly forget the things that I'm looking for all the time and then start looking for other stuff.
00:21:46So, you know, I think that would help everybody, not just people who are visually impaired.
00:21:53You mentioned that you are part of a bunch of communities and that part of you finding out about this tech was hearing about it from word of mouth.
00:22:04I'm curious, have you seen the metas getting adopted by your community in ways that maybe were unexpected or, you know, is the community kind of adapting to the fact that this tech is largely accessible, widely available?
00:22:21Yeah, I think you mentioned something in an email about your guide dog and maybe thinking about ways the glasses can interact with guide dogs.
00:22:33In my community, these are spreading like wildfire.
00:22:41One Facebook community that I'm part of is Blind Buddies, and it's like every third post, somebody is asking for impressions of the Meta Ray-Bans, and you read the posts of people responding about how they use them for sorting their mail, sorting their medication, getting through their daily life.
00:22:59It's accessible.
00:23:00People are being gifted pairs.
00:23:02People are saving up to get a pair.
00:23:04These are becoming a must-have in the blind community.
00:23:09When people are responding back, there is so much positive for the visually impaired.
00:23:19It's a wonderful example of how mainstream technology benefits everybody rather than just being a niche group, which is very expensive to develop and work in.
00:23:31We have access to something that is just phenomenal at a very economical price.
00:23:40And something that also is beneficial to our families is reducing visual interpreter fatigue.
00:23:48When you're constantly asking somebody to describe something for you, that creates a lot of stress on family members, whether it's your spouse or your kids or whoever is helping you on your day-to-day.
00:24:01So that's a lot of value that gets put into the day-to-day within our community as well, increasing our contentment and happiness in the family by not being so reliant.
00:24:12So tell me about how your AI glasses kind of integrate with your guide dog school that you're in.
00:24:19I'm a student at the Seeing Eye in Morristown, New Jersey, and I lost my guide dog, Link, back in December, and I was paired with a new guide dog, Noelle.
00:24:31This guide dog school is 100% donation-based, and they've been considering adding a small module to guide dog training in how guide dog handlers integrate their AI.
00:24:44And something that came up was introducing it at the right time.
00:24:51AI for every visually impaired person is not appropriate at every level.
00:24:58It's a great augmentation to get through your day, but it's also important to have the basic skills so that if this technology isn't available to you, you still have your basic skills.
00:25:10And then from there, you can only do better when you have it.
00:25:13One instance we had talked about in the email how it benefited me getting to and from guide dog school, I use an app called Soundscape on my phone.
00:25:26I use that when I travel in unknown locations, and it gives me an audible description of what's around me as far as a GPS.
00:25:35And having the wonderful audio coming above my ear, it's subtle, it doesn't take away from my awareness around me, so I can use that to get into the airport.
00:25:50I can interface with an escort to get me to the ticketing counter.
00:25:54When I landed in Newark, I didn't know who was picking me up, and it was easy just to tap on my glasses and answer the call coming in, rather than put all of my luggage down and answer.
00:26:07I could just accept the call and connect with the people.
00:26:13And when I got on campus, we have the legendary coffee machine.
00:26:17It's in Braille, and it's also in text, and I can't really read either, so using the glasses to find out which brand of caffeine I wanted for the morning made guide dog school a lot easier,
00:26:32because we're 18 days of up at 5 a.m. and going to bed at 10 p.m., and we're walking 5, 6 miles a day.
00:26:41Kind of need that energy.
00:26:42Yeah.
00:26:42Need the right energy.
00:26:43Yeah, you know, caffeine should be accessible to everybody who needs it, regardless.
00:26:49So what would you like for those of us who are sighted or perhaps skeptical of AI to better understand about the experience that you've had?
00:26:59Like, what is something that everyone listening to this show should walk away and maybe file away in their brains to think about?
00:27:07I think with AI, people are looking for the negatives right now, and I think we need to be open-minded to the amazing things that it can offer us to improve the human condition.
00:27:21I mean, if you look at what blind life was like 100 years ago or 150 years ago before Braille really became a thing, being visually impaired meant a life of isolation and loneliness.
00:27:35I mean, having access to this evolving technology allows us to be normal, to live a life like everybody else.
00:27:49Having access to the same tech that everybody else has without it being special or being something that sticks out like a sore thumb, these are amazing things for us.
00:28:02I mean, to have a visual interpreter that sits above my ears, that's kind of a game-changer.
00:28:10I mean, to dismiss that as a bad thing, I'd really have to rethink that.
00:28:18I mean, there's so much wonderful that can be harnessed from AI for those of us that don't have all of our senses.
00:28:25I mean, you take the hearing-impaired community, having somebody or having an AI interpret visually what somebody is saying without sign language, that's also another opportunity for AI to advance.
00:28:43So there's a lot of things for the human condition to improve with AI rather than it being the doomsday device.
00:28:49Yeah, I can also fall a little guilty into the AI doomerism.
00:28:54I think it's good to hear about the positive use cases because one thing that we're all having trouble with is parsing what the AI tech people or the, you know, Andy Jassy out here with Amazon saying like, oh, you know, we're going to reduce the workforce.
00:29:12I mean, that's not a great thing for the CEO of Amazon to come out and say, but I think sometimes it's hard to imagine how it fits into our daily life.
00:29:21So I really want to thank you for taking the time, not only to like call me out, to think in a different way, I deeply appreciated that, but also just to like challenge us all to think about how it can fit into daily life.
00:29:35And I am so deeply thankful that, you know, you were able to share your slice of this experience with us, so that we can maybe think more creatively, because it is very popular and sometimes very funny to make jokes about, hey, how AI is going to kill us all or steal all of our jobs.
00:29:54So, you know, I think this is an important conversation to have too.
00:29:58It's also funny to joke around with the AI.
00:30:00We have two dogs in our home.
00:30:03One is a nine pound black and white chihuahua.
00:30:06And I asked my AI to describe it and it described him as a cat.
00:30:10And I tried to describe to my AI that it is in fact a dog.
00:30:14And my AI was arguing with me, saying it's obviously a cat.
00:30:18You can tell by its ears.
00:30:19Oh, wow.
00:30:21So I'm going round and round with my AI and it took me a minute.
00:30:24And it's like, you're an idiot.
00:30:26You're arguing with a computer.
00:30:29You know, I've had those conversations as well.
00:30:32And, you know, maybe that's something that will improve over time.
00:30:35But, yeah.
00:30:37Oh, my gosh.
00:30:37Jason, thank you so much for taking the time to talk with us.
00:30:42Thank you very much for having me.
00:30:43I appreciate you.
00:30:44So we've got to take another break.
00:30:46But when we come back, we're going to talk to Mike Buckley, CEO of Be My Eyes,
00:30:51which is an accessibility service that uses AI and live volunteers to help blind or low vision users.
00:31:00Support for this show comes from Salesforce.
00:31:03You might remember a time not long ago when AI wasn't all that helpful.
00:31:08But today, Agentforce, the powerful AI from Salesforce, can analyze, decide, and execute tasks autonomously,
00:31:16operating at speeds and scales no human workforce could match.
00:31:20These AI agents represent a new world of digital labor that not only handles monotonous, low-value work,
00:31:26but orchestrates and carries out high-value, multi-step tasks.
00:31:30This isn't just another step forward.
00:31:32It's an enormous leap, redefining how work gets done and what's possible for businesses and their employees.
00:31:39Agentforce is adaptable, autonomous, and proactive, and, of course, totally integrated into Salesforce.
00:31:45So they're truly part of the team.
00:31:47That way, you and your employees can focus on the tasks that actually move your work forward.
00:31:52Agentforce, what AI was meant to be.
00:31:55Learn more at salesforce.com slash agentforce.
00:32:02We're back with Mike Buckley.
00:32:04He's the CEO of Be My Eyes.
00:32:06It's a service that uses AI and live volunteers, along with smart glasses and phones,
00:32:10to provide assistance to users who are blind or visually impaired.
00:32:14Thank you so much for joining us here on The Vergecast.
00:32:18And, you know, one of the reasons why I wanted to talk to you was smart glasses are a huge rising trend in tech,
00:32:27wearable tech especially, and I cover wearable tech.
00:32:29But one of the things we don't talk all that much about is how it could be an assistive technology as well.
00:32:37So I was wondering if you could give our listeners kind of a brief introduction into what Be My Eyes is
00:32:44and how you found yourself with the company.
00:32:48Awesome.
00:32:48Thanks for having me.
00:32:49So Be My Eyes was started by a Danish furniture craftsman who was losing his eyesight.
00:32:55And he was pissed off because there were times when his family and friends weren't around
00:33:00to kind of help him with sight needs as his eyesight was deteriorating.
00:33:04So he invented this app whereby through one touch of a button,
00:33:09someone who's blind or low vision can be seamlessly connected to a sighted volunteer
00:33:15on a one-way video, two-way audio call, right?
00:33:18So the sighted volunteer can see outside the back of your smartphone
00:33:22and that volunteer can help with anything.
00:33:25Like, is this the tomato soup?
00:33:27Does this shirt match my pants?
00:33:29Am I at the right airport gate?
00:33:31Um, and after the first week of the app's launch, there were 10,000 people on it.
00:33:36And if you fast forward to today, there are about 900,000 blind and low vision users
00:33:42and an astonishing 8.8 million volunteers, which I think is the largest online volunteer army
00:33:49in the world and the second largest volunteer army period after the Red Cross.
00:33:54Um, and we operate in 150 countries in 180 languages.
00:33:58And what I like to say about this is, um, Hans, our founder, he successfully merged technology
00:34:05with human kindness, right?
00:34:08Um, to solve a societal need.
00:34:09And, um, if you fast forward from the inception in 2015 to about 2022, we started talking a lot
00:34:18about artificial intelligence under a belief that AI could have very serious, um, benefit
00:34:25for people who are blind or have low vision in terms of visual interpretation.
00:34:29And so we were an alpha launch partner of OpenAI way back in, uh, in the spring of 2023,
00:34:36which seems like 10 years ago at this point, given all the AI developments.
00:34:40Oh, it really does.
00:34:41But now we do more than 3 million AI sessions a month of visual interpretation where the blind
00:34:47or low vision consumer can literally take a picture of just about anything and get,
00:34:51you know, very fast interpretation.
00:34:55Fascinatingly, even though the AI sessions have gone through the roof, we've also seen
00:35:00that the calls to human volunteers have also gone up.
00:35:04And so there is this thirst, I think, still for human connection, as well as a thirst for
00:35:10consumption under an AI model.
00:35:12And, and as long as the user kind of has a choice of either, either kind, um, we like
00:35:17that, but that's what Be My Eyes is.
00:35:18And all of that is free for the blind or low vision consumer.
00:35:22And we pay the bills by using this product and selling it to enterprises for great customer
00:35:29service for blind and low vision consumers and for workplace accessibility.
00:35:32So people can use Be My Eyes at work or people through our app can call a bunch of our customers
00:35:38through one touch of a button and get seamlessly connected to customer service, but with the
00:35:43benefits of AI or enhanced video.
00:35:46So that's how we pay the bills.
00:35:47That's pretty fascinating that, you know, um, users can choose between whether they want
00:35:52to interact with an AI in a particular circumstances or with humans.
00:35:56Have you gotten any feedback about like what scenarios they might prefer one or the other?
00:36:03You know, it's, it, it varies.
00:36:05It's very individual based, but there are a couple of common themes.
00:36:08Like, um, I talked to a woman who, um, said she's using the AI in her house more often
00:36:15when she hasn't cleaned up, right?
00:36:17Cause she doesn't want someone to see like that kitchen's messy.
00:36:20You know, um, um, I think that there's also a report, you know, a group of people who are
00:36:26blind and low vision who just appreciate the human interaction.
00:36:29Um, one of the important, unfortunate realities of people with sight loss, um, is that there
00:36:35are higher rates of loneliness and higher rates of depression and higher rates of mental
00:36:39health.
00:36:40And so having that human connection can be quite meaningful for people sometimes.
00:36:44Um, I think there was also like sometimes, um, you know, people, uh, people can be better
00:36:51than AI and other times AI can be better, better than people, right?
00:36:54Like if you call me up and ask me about, you know, something super technical, you know,
00:37:01a router's broken, I'm probably not going to be very effective, but the AI will probably
00:37:05have an answer really quickly.
00:37:06And so I think that there's a lot of trial and error going on.
00:37:09Um, and most consumers are using both.
00:37:13It's very nice to hear actually that the volunteers are still, uh, requested and wanted, uh, because
00:37:20the narrative with AI often is that it's one scary, two, that it's going to take jobs away.
00:37:26And I mean, these are volunteers, but it's still people who meaningfully are giving up
00:37:31their time to do something nice.
00:37:33So, um, it's very kind of reassuring to hear that this isn't, uh, maybe a case where it's
00:37:41either or it's either and, is that the right way to say it?
00:37:44Like it's, it's an and situation, not an or situation.
00:37:48As of right now, yes.
00:37:49I mean, who knows really what the future holds for us, right?
00:37:52As these systems become more sophisticated.
00:37:55And, um, I think it's just about every day now where we hear about someone interacting
00:38:00with AI as though it was human and that's, you know, good or bad.
00:38:03I'm going to leave the value judgments to somebody else at the moment.
00:38:07Um, but we did, like, I remember telling our board of directors that we may put the kind
00:38:12of volunteer product out of business through the adoption of AI. That hasn't happened.
00:38:16Um, but, um, um, as of right now, it is, it is an and rather than an or.
00:38:22Yeah.
00:38:23I mean, to be fair, if I wanted to ask someone if something matches, like if my shirt matches
00:38:27my pants, I think I would trust a human a little bit more than an AI, but well, I guess we'll
00:38:33see how that evolves.
00:38:34But so one thing I've, I, I just as long as you're not asking me, me about fashion.
00:38:39You know what, fair, but, uh, let's say, you know, this really kind of depends on a camera,
00:38:47right?
00:38:48So I'm just curious, you mentioned that obviously this works with a smartphone.
00:38:52That's a pretty mobile device.
00:38:55That's easy for people to carry around.
00:38:57It's super ubiquitous.
00:38:58Are there any other types of gadgets that this works on?
00:39:01Look, uh, of course, you know, smart glasses, right?
00:39:06It's a huge and growing market for us.
00:39:09And think about this from the perspective of the blind or low vision consumer.
00:39:13If I have to interact with my world just to get basic visual information, that's very
00:39:18different than using glasses.
00:39:20What are the glasses?
00:39:22It's hands-free, right?
00:39:24That's power, right?
00:39:26That's power for the blind or low vision consumer.
00:39:28Imagine if you're going to an airport and you have a piece of luggage in one hand and
00:39:33a guide dog or a cane in the other hand as a blind or low vision person, what the hell do you
00:39:39do with your phone, right?
00:39:41That's not a good experience.
00:39:43So using glasses or a wearable of any kind, but obviously the Ray-Ban meta glasses have been
00:39:48a complete, um, game changer for so many members of the blind and low vision community.
00:39:53Those glasses are power and freedom, independence, speed, right?
00:39:59For, uh, uh, a large section of humanity.
00:40:02And by the way, the second you think about, you know, oh, you hear blind and low vision.
00:40:07It's not about you.
00:40:08It's about all of us, right?
00:40:10There are over 300 million people globally who are blind or have low vision.
00:40:14And the World Health Organization indicates that this number may double by 2050 because
00:40:19of macular degeneration, increased rates of diabetes, aging population, growing populations
00:40:24generally.
00:40:25And so the fact that you can have a wearable that frees up your hands is a true game changing
00:40:32step function, right?
00:40:33So we have a very strong belief that, that, um, smart glasses and, and working with meta
00:40:40will not only help for sort of, um, personal and human use cases, but also for employment
00:40:46use cases.
00:40:47And, uh, we're, we're about to start a pilot with a Fortune 20 company on this front very
00:40:52soon.
00:40:53I cannot use their name yet though.
00:40:54Okay.
00:40:55Well, you know, thank you for the hint.
00:40:57We'll keep an eye out for that.
00:40:59Uh, but you know, I actually am someone with very poor eyesight, like thanks parents and
00:41:05genetics, but like I have extreme astigmatism and even I am at the point where I'm starting
00:41:11to use accessibility settings on my phone just so I can read things easier.
00:41:15So I don't, I don't remember a life without glasses. To me, having smart glasses seems like
00:41:20a very natural, uh, evolution.
00:41:24So I guess one question I have was, was it always obvious to you that smart glasses
00:41:29were kind of the next, uh, kind of frontier for, for your product and your service?
00:41:35Because, you know, we, we tried smart glasses 10 years, 10, 12 years ago with Google Glass.
00:41:40And that was, that was a big no-no for a lot of people.
00:41:43So like, just curious with the meta Ray-Bans, was that a really obvious partnership to you?
00:41:50Were you guys like, Hey, we heard you're making this.
00:41:53Why, why haven't you contacted us?
00:41:55I can't say it was obvious because you and I lived through Glassholes, right?
00:42:00We, we, we remember that like, you know, um, um, at a certain point though, we, we kind
00:42:08of saw people in our community starting to use these things before we even had an integration
00:42:14and there was some hope expressed.
00:42:17So no, it wasn't obvious, but it became apparent over time.
00:42:22Um, and how the partnership came about was, is, you know, I contacted meta and I was not
00:42:28really having success.
00:42:29And then, but then finally I got through to Andrew Bosworth, uh, the meta CTO and he got
00:42:36it immediately.
00:42:37Um, and he helped make it happen.
00:42:39And so, um, we contacted them, but to his credit, Boz, um, said, yeah, we have
00:42:46to do this.
00:42:46And it's, uh, it's been a great partnership ever since.
00:42:49So how does the integration work?
00:42:51So we basically have built things on the backend engineer to engineer at both companies where
00:42:57you can actually call a human volunteer through the meta glasses.
00:43:01And we're talking about more advanced integration now of AI functionality, customer service, group
00:43:07calling, a whole bunch of other things, again, under that belief that this is power for the
00:43:12community, number one, right.
00:43:14And independence.
00:43:14And second, that the consumer needs to be given a choice, um, as to how they want to access
00:43:20visual information that may be AI sometimes, or it may be a human sometimes.
00:43:25So we're, we're looking at enabling both.
00:43:27That's pretty cool.
00:43:28Um, so fun fact, it wasn't necessarily an obvious, uh, use case for me either.
00:43:36I was actually testing the live AI and the multimodal AI, um, beta when it came out earlier
00:43:44this year.
00:43:44And I was kind of, I have to admit, I was a little flippant.
00:43:47I, I said something along the lines in my, uh, write-up about how, like, okay, who, who
00:43:52needs this Captain Obvious description?
00:43:56Uh, cause even though my eyes are bad, I can still see, uh, with corrective, very strong
00:44:01corrective lenses.
00:44:02And I actually heard a lot from our readers and people in the blind and low vision community,
00:44:08which is one reason why I'm reaching out now to be like, actually, this is life-changing
00:44:13for us.
00:44:14This is game-changing for us.
00:44:15Don't discount us here.
00:44:16So this is my penance talking to you guys.
00:44:19And I'm just curious, like, uh, how you view something like multimodal AI from the sense
00:44:28of, like, it creates its own challenges, right?
00:44:30It's very power-intensive.
00:44:33It's very draining on the battery.
00:44:35Have you guys put any thought as to how you're, how, how you're going to adapt your particular
00:44:41services to accommodate something that's so new and so power-hungry and, you know, is
00:44:48possibly prone to hallucinating things?
00:44:51Like, how, how do you approach that challenge as well?
00:44:54Well, I look full speed ahead and you have to give the consumer the choice, right?
00:44:59Like, there's no doubt that live, uh, interpretation of one's environment is powerful, important,
00:45:06and necessary.
00:45:07That doesn't mean it's perfect, right?
00:45:09Uh, and whether it's a hallucination or whether it's, um, my favorite test to do with live
00:45:14AI is if you hold your hand out and then count your fingers down and ask it to tell you what's
00:45:19going on.
00:45:19Um, it doesn't tell you it's counting, so there's still latency, right?
00:45:23There's still, there's a lag between what the camera is able to interpret and what the
00:45:27AI is able to interpret in terms of what's going on.
00:45:31But, um, the future is really, really bright here, right?
00:45:36Real time will eventually become actually real time, right?
00:45:41Which will have transformative power in terms of use cases.
00:45:45Like, think about navigation, right?
00:45:47Think about getting around an airport.
00:45:48Think about obstacles, right?
00:45:50As you walk along in the street, all of that is coming.
00:45:54I mean, Meta has already kind of got, you know, uh, has already demoed it, you know, Orion
00:45:58glasses, which is a whole separate and wild, um, capability.
00:46:03But combining that with kind of live AI, um, I think is, is again, potentially transformative.
00:46:10I would also say that as a, uh, a sighted person, um, I had the same initial reaction you did
00:46:17about, oh, why do I need kind of the live interpretation?
00:46:20And then I was wearing my Meta glasses, um, when I was walking around the city with a bunch
00:46:26of historical landmarks.
00:46:27And I just kept saying, Hey, Meta, what is this?
00:46:31You know, and it told me what it was.
00:46:32And I said, what's the historical significance?
00:46:34Tell me more about the year or the, the architect or the artist.
00:46:38And it was, it was like lovely, right?
00:46:41And it was lovely to have my phone in my pocket and not be searching around for information
00:46:45that way.
00:46:46And so I do think that we're, we're really, literally, at the cusp, the
00:46:52very early stages of exploring the utility of, of, um, this technology.
00:46:58Yeah, for sure.
00:46:59And like, it seems like there's a lot of different takes on smart glasses.
00:47:03Um, obviously I think for your service to work, it requires a camera, but I guess, do you
00:47:11think that smart glasses in general, provided they have a camera in there, could
00:47:16be the kind of ideal form factor for your tech?
00:47:22And are there other companies you're kind of thinking about working with that you can
00:47:26talk about?
00:47:27Yeah.
00:47:28Look, yeah, it's certainly plausible that this will be the form factor of wearable that
00:47:33wins, right?
00:47:35Who knows?
00:47:36But like, you know, what if I have a necklace that can do the same thing and I don't want
00:47:42to wear glasses all the time?
00:47:43Like maybe that's going to win.
00:47:44Like, you know, um, you know, there were all these rumors about, uh, would Apple or
00:47:50somebody else put a camera in the, in the earbuds?
00:47:53I don't think that's happening anytime soon, but it's kind of interesting conceptually.
00:47:57Right.
00:47:57But then I have to have something in my ears all day.
00:47:59Is that what I want to do?
00:48:00So I don't think we really know, you know, the form factor that's going to win out at the
00:48:06end.
00:48:06I will say that in the current market, glasses are the best, right?
00:48:11And, and meta has the best product.
00:48:13Now, is Google going to come?
00:48:15Is Apple going to come?
00:48:17Are there like eight startups that are working on this too?
00:48:20Sure.
00:48:21Right.
00:48:22And who knows what like Jony Ive and Sam are going to build over at OpenAI, right?
00:48:26Although people are suggesting they don't think it's glasses, but who knows?
00:48:30Right.
00:48:30Um, I do think that in the near term, the market for the people in, in the community that we
00:48:37represent, the blind and low vision, it's certainly the best form factor and, um, and
00:48:41has a incredible potential.
00:48:43And, and of course, you know, uh, we're willing to work with just about anybody and everybody,
00:48:48but I, I will always have a soft spot for meta because meta said yes to us and, uh, and
00:48:54they worked with us.
00:48:55And, and by the way, they did that, you know, without asking for a lot of credit and without
00:49:00like, you know, trumpeting it in a way that was kind of artificial or, or, um, um, let's
00:49:07just say meta put in a lot of human hours and doing this while no one was looking.
00:49:13I feel like you are right that it was pretty low-key in terms of the overall overarching
00:49:20messaging, because, uh, for better or worse, we are always kind of catering to the mass
00:49:26market.
00:49:26And this kind of feels a little bit like an oversight, but, uh, I do think, you know,
00:49:33this was kind of a use case that I saw mentioned for Android XR in the Google IO keynote.
00:49:39Was that something that you walked to, um, have you seen like other companies talking about
00:49:46this to you guys?
00:49:47Have you seen like any increase in outreach from people interested in this technology or
00:49:52is it just mostly been this one main partnership so far?
00:49:56No, I mean, it's, it's, it's a growing market and we've been contacted by everybody on some,
00:50:01on some level, you know?
00:50:03And so of course we've talked with Google and we've talked with Apple for years about a
00:50:07whole host of things.
00:50:08And there are some integrations that we have, um, uh, on the Apple Vision Pro and, uh, um,
00:50:15it's suffice it to say, like, we're, we're, um, we're getting more calls than ever, which
00:50:21is, which is great.
00:50:22Um, I think it's going to be interesting to see if, if, um, you know, what company or group
00:50:29of companies solve some of the problems that you're talking about, right?
00:50:32Whether it's battery life, right?
00:50:34Whether it's, you know, small form factor, getting the cost down to a point like, like where
00:50:39it's really great.
00:50:40Like part of the, the remarkable thing about the Meta Ray-Bans is, um, that the price point
00:50:48is not absurd, right?
00:50:50Relative to a phone.
00:50:52And if you look at like other assistive technology, you know, even the Google glasses back in the
00:50:57day, they were, they're often thousands of dollars, you know?
00:51:01And here's the reality.
00:51:0370% of people who are blind or have low vision are either unemployed or underemployed.
00:51:08So when you start talking about $2,000 or $2,500 for a device, it's over, it's not going
00:51:14to work.
00:51:15Right.
00:51:15But if you have a $300 device or a $330 device that also doesn't charge you monthly for kind
00:51:23of AI costs, that's a big deal, right?
00:51:26And I can foresee a future and a not too distant future where a government, right?
00:51:32Or a social service will subsidize the cost of that hardware because they know it provides
00:51:37substantive value to its citizens, right?
00:51:40Or to help facilitate employment and power and independence that we've talked about before.
00:51:45And so, um, I think it's coming.
00:51:47Yeah.
00:51:48I mean, that would be wonderful if that was the case.
00:51:50Like even I know in a bunch of wearable tech, sometimes they are HSA or FSA eligible and
00:51:56that really just helps with the accessibility and for people who might need it.
00:52:00Um, but along, along that, uh, thought process, you know, I always say that accessible design
00:52:06is universal design because like, it's a really important thing.
00:52:10I think we as tech reviewers could be a lot more thoughtful about that when we evaluate
00:52:14products.
00:52:15But, um, from someone who is deeply in that space of, of kind of developing accessible tech,
00:52:22thinking about how accessible tech needs to go into the future as well.
00:52:25I'm wondering, like, what would you like to see improved?
00:52:29We do, we do happen to have a lot of people who are in, are in the, you know, the right
00:52:35rooms and the right spaces to make those improvements listening to this podcast, I should
00:52:39think.
00:52:39So like, what, what do you think would be like massive areas of improvement that could really
00:52:46make a huge benefit?
00:52:47Like whether it be design, engineering, just like how we're thinking about developing like
00:52:52smart glasses or any other type of wearable device with a camera for this particular type
00:52:57of technology.
00:52:58I think it's the thing that you alluded to is that like, um, it has to happen at the design
00:53:05stage, right?
00:53:07You know, giving thought to like, I'll give you a perfect example of this.
00:53:11Um, years ago, whether it was malls or banks or airports, everybody rolled out some like
00:53:18touch screen kiosk, right?
00:53:20Whether it was an ATM or a hotel thing.
00:53:23Do you know who that sucks for?
00:53:26Someone who's blind or low vision.
00:53:28It's completely useless, right?
00:53:30Yeah.
00:53:31Uh, to, to do that.
00:53:32And now we're, we're going to solve a lot of these challenges in the near term through
00:53:37voice interaction, right?
00:53:38Which has huge power and potential, but like the design stage is really important and making
00:53:44that very small, modest investment in the design stage to make sure your product is accessible
00:53:50and adaptable to broader communities, not only pays dividends in capturing the market
00:53:56of disabled consumers, but it also makes your product better for everyone.
00:54:01And I'll give you an example of this.
00:54:03Um, when we were an early partner with OpenAI on the first GPT models, right?
00:54:09We brought 19,000 blind and low vision beta testers to that effort who gave constant feedback
00:54:17and iteration on the descriptions that they were getting, um, from the models, which helped
00:54:23the OpenAI, um, scientists and engineers further refine how those models work, making them better
00:54:31for everybody.
00:54:32So I think those investments pay off not only in trying to capture the market of the disabled
00:54:37consumers, but also making the product better holistically for a broader segment of society.
00:54:42Yeah, I think, um, I was talking with my meta contact recently and they mentioned that one of the,
00:54:47one of the options now that's possible, I still have to test it, uh, with the meta glasses is that
00:54:53you can now opt into much more descriptive, uh, descriptions from the AI.
00:54:58So like maybe me as a sighted person, I don't need, um, super detailed descriptions, but if I am part of
00:55:05the blind or low vision community, I can opt for really detailed descriptions, uh, of that.
00:55:12And I think, you know, that's like a, a really small tweak that I know I wouldn't have thought
00:55:18of if I didn't invite, you know, underserved communities into the room while talking about
00:55:23design.
00:55:24But to think about what you just said, like that could be really useful for a sighted consumer
00:55:29as well.
00:55:29Like having a toggle switch, right.
00:55:32That controls the depth of information that I get, like how much, like if, if I'm out and
00:55:37about, I might want like a three second answer to questions.
00:55:41Right.
00:55:41But if I'm on vacation or I'm touring a museum or whatever else, I might want the switch all
00:55:46the way on the other side.
00:55:46Cause I want robust information.
00:55:48And so that's a really great example of how thinking about the design elements of, um,
00:55:55the robustness of description and characterization can benefit not only the blind or low vision
00:55:59consumer, but anybody.
00:56:01Yeah.
00:56:02I guess I'm curious if there's any parting thoughts from your experience or stories that
00:56:07have been shared with you from actual people using this technology that, you know, I think
00:56:13the vast majority of our listeners, myself included are sighted people.
00:56:17So like, what are some things that we should be thinking about, um, when we are thinking
00:56:22about how we evaluate this tech in our daily lives?
00:56:26Everybody comes at a consumer product or anything from, from their own lens.
00:56:30Right.
00:56:31Um, but I mean, in terms of the, the people, you know, that, that you reach every day, like,
00:56:36I think it's good for everybody to think about, you know, the fact that the world and the
00:56:42people within it and their needs are pretty diverse.
00:56:45Right.
00:56:45And so it's sometimes hard to think outside of our own little bubbles.
00:56:49I know I'm guilty of it, but I can tell you that from the perspective of the community
00:56:54that we work with, the blind and low-vision consumer, I try to avoid hyperbole, right?
00:57:00Because there's plenty of that in the AI world going around right now without
00:57:05me adding to it. But I get two emails a day, right?
00:57:13And anytime I run into someone at a conference, they're talking about the fact that the glasses
00:57:20and the experience of Be My Eyes and AI combined with those Meta glasses is literally life-changing.
00:57:27And don't take my word for that.
00:57:29Talk to the blind or low-vision consumer.
00:57:32But I have stories about people taking the train for the first time on their
00:57:38own, right?
00:57:39Or getting around an airport.
00:57:42There was a story, actually. There's a guy named Robbie, and he posted about it publicly,
00:57:47so I don't feel like I'm outing him. He works for the BBC, and he read a book
00:57:53to his two-year-old daughter for the first time using those glasses and Be My Eyes.
00:57:59Like, how cool is that?
00:58:01Right.
00:58:02So think about the things that we all take for granted, and then think about
00:58:08Robbie, for the first time as a dad, as a blind father, reading to his daughter.
00:58:13I just, it doesn't get any better than that.
00:58:17When you think about like kind of human impact, right.
00:58:21And the benefits of this technology, like there's a million reasons to be wary of AI,
00:58:26right.
00:58:26And be fearful of it.
00:58:28But I'm pretty sure that there are a bunch of reasons also to embrace
00:58:34it and champion it.
00:58:36And that story of Robbie reading to his daughter is probably my favorite.
00:58:41That's a really lovely story.
00:58:43And honestly, I test a lot of AI stuff and some of it can feel very dystopian.
00:58:49So thank you, honestly, for the reminder that, you know, there are creative uses that can
00:58:55help people genuinely lead better lives, uses that, to your point, are not steeped in the
00:59:02hyperbole of "this is going to change everything" without concrete, real-life
00:59:09examples of how it could.
00:59:11So, man, it's really lovely to hear that.
00:59:15So thank you so much for taking the time.
00:59:18Yeah, no worries.
00:59:19Thanks for having me.
00:59:20Okay.
00:59:20So we've got to take one more break and then we're going to take a sexy, spicy question from
00:59:26the Vergecast hotline.
00:59:27Uh, maybe don't listen to this in an office.
00:59:32Anyway, be right back.
00:59:33AI is moving fast.
00:59:37So fast.
00:59:37It's hard to keep up.
00:59:39In fact, in ServiceNow's latest AI maturity index, scores dipped 20% from last year, but
00:59:46that's okay because AI isn't a sprint.
00:59:49It's a marathon.
00:59:50You may be behind today, but tomorrow you could be a pace setter.
00:59:54Dive into ServiceNow's AI maturity index and see how you can innovate as fast as your ambitions.
01:00:00Visit servicenow.com slash AI maturity.
01:00:03All right, we are back.
01:00:11Let's get to the hotline.
01:00:12As always, the number is 866-VERGE-11 and the email is vergecast at theverge.com.
01:00:18We love all your questions and we try to answer at least one on the show every week.
01:00:22So let's get into this week's.
01:00:24And for this week, I have my lovely colleague, Mia Sato, on and we are going to answer a little
01:00:30bit of a spicy question about smart glasses.
01:00:33Roll the clip.
01:00:35Hey, David.
01:00:36I've got a question about smart glasses.
01:00:38I haven't heard y'all talk about this yet in this way.
01:00:41I know that with, like, the wristband or, like, the necklace that, like, listens to you,
01:00:46there's ways to be like, oh, no, I'm going to mute it so that you can hear this private
01:00:50conversation.
01:00:51But, like, glasses see things.
01:00:54But also, I need the glasses to see.
01:00:56So, I don't know.
01:01:03Well, maybe this is too sensitive for the Vergecast, but I have a wife.
01:01:03And sometimes, she and I like doing things.
01:01:06And I like being able to see her when I do those things.
01:01:09But she won't have sex with me if I've got, like, camera glasses on, and if I take
01:01:15them off, then I can't see her.
01:01:17Huh?
01:01:17So, like, are we going to live in a world where, if you have smart glasses, you have
01:01:24to have a pair of normal glasses or your wife won't have sex with you?
01:01:29Like, I don't know what that world looks like.
01:01:32And at that point, why have the glasses in the first place?
01:01:35Anyway, I cannot be the only person that has thought about this.
01:01:40But also, like, when I go to the bathroom, is Meta just going to have, like, a bunch
01:01:45of pictures of, like, me going to the bathroom?
01:01:49I don't like that.
01:01:50And if Nilay Patel hears this, he's going to say, well, you do take four cameras into the
01:01:55bathroom with you every time.
01:01:56And I say, yeah, that's true.
01:01:58But it's not like an AI is training on it, right?
01:02:00Like, my phone camera's not on all the time, right?
01:02:04Anyway, please help me.
01:02:06I want to get smart glasses, but I also like having sex.
01:02:09Rock and roll.
01:02:13Amazing.
01:02:14It's an amazing question.
01:02:16Anyway, I asked you to be my partner answering this question because you write so many smart
01:02:23things about how technology shapes culture.
01:02:26And I feel like this is directly on that intersection.
01:02:30And, you know, this is my beat.
01:02:31And I've written some stuff about wearable etiquette as well.
01:02:34So, yeah, I feel like we are the most equipped people at The Verge to answer this question
01:02:40with loving, kind, empathy.
01:02:44So, yeah.
01:02:45Yeah.
01:02:46Yeah.
01:02:46So, listener, I don't know your name.
01:02:51Maybe that's for the best.
01:02:51But I guess I'm curious, first of all, what your wife thinks.
01:02:55Like, is this a conversation that y'all have had about where in the house is appropriate
01:03:01to wear the glasses versus where isn't?
01:03:04That would be my first thing is just, like, get a temperature check.
01:03:07Because I do think that there are some things that are intimate enough that, like, you might
01:03:13want a disconnected life there, right?
01:03:17Like you said, going to the bathroom, maybe when you have sex with your wife, other times.
01:03:27Because, like, I think, like, the reality is that even when companies say, like, we have
01:03:34these great security protocols, your data is safe, there have been instances where moments
01:03:41that you wouldn't want a stranger to see are leaked or revealed in some way.
01:03:47And we can get into that and talk about those examples.
01:03:49But to me, like, one is, like, ask your wife what she thinks.
01:03:54But also, I feel like smart glasses are not for the bedroom.
01:03:58I think that's point blank, period.
01:03:59Super fair.
01:04:00So, like, last year when the smart glasses and the Vision Pro were getting kind of some
01:04:09buzz, I wrote a piece about, like, a wearables etiquette primer.
01:04:09And I interviewed Emily Post's grandson about it.
01:04:13And one of the things that he mentioned is exactly what you're touching on, which is, like,
01:04:17having these conversations.
01:04:18Like, this is definitely a conversation you need to broach with the wife and, like, talk
01:04:23about what boundaries they're comfortable with.
01:04:25From, like, a gadget perspective, I can tell you there's an off button on the Meta Ray-Bans
01:04:31specifically.
01:04:32So you can just turn it off.
01:04:35But, like, the interesting thing about these glasses and how they signal privacy when there's
01:04:41video recording or photo taking is that there's a little LED light.
01:04:47So you'll see the LED light when that's happening.
01:04:51So that is a visual cue and a signal.
01:04:54But maybe, you know, I don't want to kink shame.
01:04:57People do.
01:04:58True.
01:04:59A number of acrobatics in the bedroom.
01:05:02I don't know where hands are being put.
01:05:04There could ostensibly be a situation, if they're on, where they accidentally take a picture
01:05:11or start video.
01:05:12So, you know, there's...
01:05:14Or, like, you forget to turn them off.
01:05:16Or you forget to turn them off before it's happening.
01:05:20There's Meta AI in there.
01:05:22If you say anything that sounds like, hey, Meta, I don't know what your wife's name is,
01:05:26but what if it's Greta?
01:05:28And you trigger the AI and the AI just starts doing something.
01:05:32Like, you know, one, you need to have the conversation.
01:05:35And two, I also agree that there are just certain spaces, public versus private, where maybe
01:05:41you want to, like, consciously draw a line between the technology and your life.
01:05:48Like, I just think that's healthy digital hygiene in general.
01:05:52And, you know, you don't have to throw away your old glasses when you get the smart glasses.
01:05:57You can just keep them.
01:05:58Well, yeah.
01:05:59Another part of this question that I wanted to pick up on was, like, the idea that you
01:06:02shouldn't have two pairs of glasses, which I completely disagree with.
01:06:06I think you actually actively should have multiple pairs of glasses.
01:06:09Because, like, what if you break some?
01:06:11What if you just want to switch up your look?
01:06:13Like, I am a recent glasses wearer.
01:06:15I'm not wearing glasses right now.
01:06:16But, like, I have, like, four pairs of glasses because sometimes one pair looks better with
01:06:21what I'm wearing.
01:06:22Or, like, some, like, fit on my head differently.
01:06:26And I need something different, you know, depending on the day or depending on the activity.
01:06:30So, I would say, like, it's totally fine to have two pairs of glasses.
01:06:34I get that it might be annoying to switch back and forth, but, like, maybe you just leave
01:06:37a pair of normal glasses on your nightstand or, you know, in a drawer
01:06:42or whatever.
01:06:44I think it's fine to have two pairs of glasses.
01:06:47I am a lifelong glasses wearer.
01:06:49I got my first, again, I'm also wearing contacts right now, but I got my first pair when I was
01:06:53four.
01:06:54And the first thing that I learned was you must always have a backup pair of glasses.
01:06:59Yeah.
01:06:59Because you don't know when one's going to break.
01:07:01And the thing about these Meta glasses is that you do have to put them in a case to
01:07:05charge.
01:07:06So, it's not like you can wear them 24-7 anyway.
01:07:10Not that you would want to.
01:07:11Also, it's just going to affect your performance if you've got them on, because they're
01:07:15heavier than normal glasses as well.
01:07:18So, I...
01:07:18Not as aerodynamic.
01:07:19They're not as aerodynamic.
01:07:21And, again, no kink shaming.
01:07:22I don't know what you're doing in there.
01:07:24I don't know what acrobatics are required or whatnot.
01:07:27So, have a second pair of glasses always.
01:07:31Yeah.
01:07:31And, you know, I would actually say that to anyone who's thinking about smart glasses.
01:07:35Have a pair of dumb glasses on hand.
01:07:36Like, you don't...
01:07:37Yeah.
01:07:38You don't always want to have an AI...
01:07:40Yeah.
01:07:41...coming with you everywhere for everything.
01:07:44Right.
01:07:44Yeah.
01:07:44And I feel like this example and this question is, like, in my opinion, too intimate to even,
01:07:50like, risk it.
01:07:52But there have been cases where, like, Roombas have recorded people in their homes without
01:07:58them realizing, like, on the toilet.
01:08:00Like, that's, like, a real thing that has happened.
01:08:01And then I don't think that, in that case, it was, like, publicly released.
01:08:05But people working for Roomba or working, you know, contractors or whatever did have access
01:08:11to those images.
01:08:12Like, this is a thing that happens.
01:08:13And in my opinion, like, there are places that are just a little too intimate for tech
01:08:20companies to have a seat at the table.
01:08:22Like, you know, for example, like, for me, I don't do any digital journaling.
01:08:28That is, like, pen and paper.
01:08:30100%.
01:08:30I'm with you on that.
01:08:31I also don't do any digital journaling at all.
01:08:33Yeah.
01:08:33I'm not doing the Apple journal thing.
01:08:35I don't want them to know my thoughts.
01:08:37I want there to be one copy of it.
01:08:39And if I want to destroy it, I can burn it in the backyard.
01:08:42Yeah.
01:08:42Done.
01:08:43Not that 100%.
01:08:43I used to do digital journaling, and those have all since been deleted.
01:08:48Yeah.
01:08:48And my pen and paper, like, it's just nice sometimes to do things analog.
01:08:53Not everything has to be this way, but, you know.
01:08:56Why take the risk?
01:08:57Why take the risk?
01:08:58And also, to your point, even if there's an off switch, and even if you're really diligent
01:09:03about the off switch, you are human.
01:09:05You are going to forget.
01:09:06I was testing the Bee, which was a wearable AI thing that listened to everything you did.
01:09:12I went to the bathroom.
01:09:15It heard everything, including me exclaiming, oh, shit, that was a shit.
01:09:21And then the AI recommended that I get back on Lactaid, because it was like, no.
01:09:27Ooh, don't like that.
01:09:28I don't like that.
01:09:28That was the most humiliating thing of my life.
01:09:31And it was literally, I went, oh, shit, that was a shit.
01:09:34Oh, shit, this thing is listening to me.
01:09:36Shit.
01:09:37A lot of shits in a row.
01:09:37A lot of shits.
01:09:38And it recorded all of that.
01:09:40And, you know, that company is like, everything is private.
01:09:43We don't record the audio recordings.
01:09:44But there's a transcript.
01:09:46Yeah.
01:09:46There's like a, there's a trail of a thing that was said.
01:09:49Also, like, terms and conditions are subject to change at any time, right?
01:09:53Like, I feel like we have all learned that lesson many times over to the point where, like, I'm not wearing the smart glasses to have sex.
01:10:02I think that's, yeah.
01:10:04Bedrooms.
01:10:04No.
01:10:05Bathrooms.
01:10:06I'm sure there's another thing that starts with the B.
01:10:08Even just, like, being around your house sometimes, like, that is a private space.
01:10:15And I don't know.
01:10:17I feel like most people, like, you walk around in your underwear.
01:10:19You do.
01:10:20You forget your bathrobe in the bedroom.
01:10:23So, you need to, like, do the naked dash through the house.
01:10:26Like, there are lots of instances where it's not just relegated to, like, only in the bedroom, only in the bathroom, only in the closet.
01:10:34So, yeah.
01:10:35I will also say there's probably, like, zero instances where I could be convinced to get smart glasses, for all the reasons that we've talked about.
01:10:45Like, I'm very forgetful.
01:10:47I kind of just, like, do what I need to do when I need to do it.
01:10:50So, it's a no from me.
01:10:55But maybe you have a happy balance.
01:11:01I'm more of a middle ground, where, obviously, I'm testing these things.
01:11:01Obviously, I'm a gadget girly.
01:11:03But I do think you have to be really thoughtful and intentional about when you're using this tech before this tech, like, rules you.
01:11:10So, I think, have this conversation with your wife and she gets final say.
01:11:15Because the thing about this new technology is that we are navigating new privacy, like, comfort zones and levels with it.
01:11:23And, you know, with the new era of smart glasses, privacy hasn't come up quite as much.
01:11:29Because I think we're a lot more comfortable now, with everyone doing, like, oh, my God, this is my TikTok.
01:11:34I'm public.
01:11:34Wow.
01:11:35So, we're much more comfortable, theoretically, with cameras being around all the time.
01:11:40But this is a very private, intimate space.
01:11:42This is a very private, intimate act.
01:11:44But you've got to have the conversation first.
01:11:48And it has to be something that you both agree on and that you both consent to.
01:11:52Because I can tell you she's not going to be happy if you just buy them and wear them.
01:11:59Don't explain it.
01:12:00Don't explain it.
01:12:01Yeah, that's not good.
01:12:03There will be no sex having if that happens.
01:12:06So, yes.
01:12:09Yeah.
01:12:09Just think of it as an opportunity to kind of, like, expand your glasses look.
01:12:13Yeah, have more than one. Have a couple pairs.
01:12:17The takeaway is have more than one pair of glasses.
01:12:20You can have more than one pair.
01:12:21Treat yourself.
01:12:23Treat yourself.
01:12:24Yeah.
01:12:25So, I think, yeah.
01:12:26Hopefully this is helpful.
01:12:27And have your wife call in, too, if she has additional thoughts.
01:12:30Yes.
01:12:30I would love to hear from her if she has her thoughts.
01:12:32Like, I would love to hear it.
01:12:34We could do a follow-up.
01:12:35Yeah.
01:12:35So, yeah.
01:12:36So, that was this week's Vergecast hotline.
01:12:39That's it for the Vergecast today.
01:12:41Thanks, Mia.
01:12:42Thanks, Anonymous.
01:12:43Thanks for everyone on the show today.
01:12:46And thank you for listening.
01:12:48So, there's a whole lot more from this conversation at theverge.com.
01:12:51We'll put links in the show notes.
01:12:52But also, read theverge.com.
01:12:54Find our bylines.
01:12:55You guys are internet savvy.
01:12:57I know you know how to do it.
01:12:58As always, if you have thoughts, questions, feelings, you can always email us at
01:13:02vergecast at theverge.com.
01:13:04Or keep calling the hotline, 866-VERGE-11.
01:13:07We love hearing from you.
01:13:09Send us all your thoughts.
01:13:10And we do mean all your thoughts and questions and ideas for what we should do on the show.
01:13:14We do a hotline question every week.
01:13:16So, keep them coming.
01:13:18This show is produced by Eric Gomez, Brandon Kiefer, Travis Larchuk, and Andru Marino.
01:13:23The Vergecast is a Verge production and part of the Vox Media Podcast Network.
01:13:27Jake will be back on Friday to discuss all the news from this week.
01:13:30And goodbye.
