- 5 months ago
GPT-5 is here, and it’s not going so well. This week on The Vergecast, Jake, Vee, and Hayden discuss the bumpy launch of OpenAI’s latest model and why GPT-5 isn’t as big of a leap as GPT-4. Then, everyone shares their vibe coding projects and the bumpy journey to making anything usable. After that, our newest segment: Corporate Shenanigans, where we rate the week in strange corporate moves on a scale from “actually serious” to “total joke.” Finally, the Thunder Round returns, new and improved, to discuss ditching your phone for a smartwatch, doctors relying too much on AI, AOL dial-up shutting down, the Pebble Time 2, and why you shouldn’t trust what AI chatbots say about themselves.
Category: 🤖 Tech

Transcript
00:00:00Hello and welcome to the VergeCast, the flagship podcast of unsatisfying corporate stunts.
00:00:06I'm Jake Kastrenakes, executive editor at The Verge.
00:00:09Nilay, David, those guys are out on parental leave.
00:00:11They'll be back later this year.
00:00:12With me today, by popular demand, Verge senior reviewer, Vee Song.
00:00:17I'm popular.
00:00:19And always in demand on The VergeCast.
00:00:21And now, making her VergeCast debut, we've got Verge senior AI reporter, Hayden Field.
00:00:27Woohoo, psyched.
00:00:28So glad to have you both here.
00:00:30We've got a lot to talk about today.
00:00:31It was a big week for weird corporate stunts and lawsuits.
00:00:35We're going to do a corporate shenanigans power hour.
00:00:38Disclosure, it may not be a full hour, but there will be a lot of shenanigans.
00:00:42I promise you that.
00:00:43We'll see how it goes.
00:00:44The Thunder Round is back and better than ever.
00:00:46But most importantly, we've got a lot more to talk about with GPT-5 because that launch did not go so well.
00:00:53It really did not.
00:00:54So last week, we recorded this show, like literally, like they had just wrapped up.
00:00:59They had just wrapped up.
00:01:01And now, people have actually had a chance to use it.
00:01:03And they have thoughts and emotions.
00:01:07A lot.
00:01:07People have a lot of emotions.
00:01:08So many emotions.
00:01:10What's that?
00:01:10Subreddit?
00:01:11The AI Boyfriend subreddit?
00:01:13Oh, yeah.
00:01:13They were really talking.
00:01:14It's a meltdown.
00:01:16So people were furious that ChatGPT removed the ability to use old models without warning.
00:01:21OpenAI says it won't do that again.
00:01:23People think GPT-5 is less friendly than GPT-4o.
00:01:26So OpenAI is updating its personality.
00:01:28And OpenAI CEO Sam Altman had a fairly unsatisfying explanation for the atrocious, atrocious bar graph crimes that we saw last week.
00:01:40So a lot of things went wrong with this launch.
00:01:41Before we get into the things that went wrong, can we all at least agree that the new splash screen on ChatGPT, where it has a bunch of different colors, is the biggest upgrade here?
00:01:52Right?
00:01:52They spent a ton of time on this model, but I like the colors.
00:01:55The colors are good.
00:01:56I thought they were vaguely Apple-y.
00:01:58Like, I looked at it, I was like, what is this?
00:01:59An Apple app?
00:02:01Does Apple just own the rainbow?
00:02:03Like, they're just any...
00:02:04It did look a little Apple-y.
00:02:05It did look a little Apple-y, right?
00:02:07Like, I was just kind of like, I don't know how I feel about that.
00:02:10But I was like, ooh, color.
00:02:11I don't know.
00:02:12We love color.
00:02:12We love colors.
00:02:13I'm actually a big color proponent here.
00:02:15The ChatGPT standard screen is kind of the most boring screen in existence.
00:02:20It looks like a piece of paper.
00:02:22It does.
00:02:23Yeah.
00:02:23Yeah.
00:02:24There's, like, nothing there for, like, the most advanced piece of technology in the world.
00:02:28It's like a beige PC.
00:02:31It is.
00:02:31And, you know, I will admit, I was very surprised.
00:02:33I was like, ooh, colors.
00:02:36This is, oh, this must be a significant launch.
00:02:39It is.
00:02:40They had to differentiate because the launch itself wasn't as exciting for people,
00:02:44so they knew the colors would get them, you know?
00:02:46You know, it worked.
00:02:47It worked.
00:02:48I hope the colors stay.
00:02:49I don't know.
00:02:49I ran out of GPT-5 queries in ChatGPT, and so now it's, like, back to, like, beige again.
00:02:55I don't know if that's just, like, it was, like, an intro period or...
00:02:58Maybe it was, like, a mind trick.
00:02:59Like, they want you to want the colors, so they're trying to get you to, like, buy more credits.
00:03:04Can I tell you something?
00:03:05Like, I'm going to use this over Claude if I get those colors back.
00:03:08Like, that was...
00:03:09Somebody's got to get on these interfaces.
00:03:11They must have done an internal study.
00:03:13Like, they had to be studying people who are unconscious.
00:03:16Okay.
00:03:17But the main thing here is, right, people are waiting for, like, you know, two years or something for GPT-5, right?
00:03:24Like, OpenAI has built it up to be this almost, like, mythic launch, right?
00:03:28GPT-5 will be the next big one.
00:03:30And they've had a bunch of intermittent model launches in between.
00:03:34But I think there was this assumption, this promise that GPT-5 was going to be the big one.
00:03:39And they came out.
00:03:41They had this event.
00:03:43And to me, watching the event, it didn't really feel like there was any one key thing.
00:03:48And it feels like in the week afterward, we haven't really seen that.
00:03:51I guess, Hayden, you've been reporting on this.
00:03:54Am I missing it?
00:03:56Or is it just not there?
00:03:57Like, what happened?
00:03:58And why are people, like, so upset?
00:04:00Exactly.
00:04:01You're right.
00:04:01It just wasn't there.
00:04:02And I think that's why people were so upset.
00:04:05It was built up to be this really overhyped, mythic launch.
00:04:09Altman even tweeted a picture of the Death Star the night before, which was, like, max level of hype.
00:04:16People were freaking out.
00:04:17And then all of a sudden it comes out.
00:04:19And it's just seen as kind of being incremental upgrades.
00:04:23Like, it's better at coding.
00:04:24It's better at answering health care questions.
00:04:26They say it's better at creative writing.
00:04:28We'll get into that in a minute because a lot of people disagreed.
00:04:31But, yeah, I mean, it wasn't anything too crazy.
00:04:33It was just incremental, you know, steady upgrades.
00:04:37Nothing huge.
00:04:39And part of that was the fact that it wasn't, like, a big compute jump either.
00:04:43The other models in between each one, it was, like, a 100x jump.
00:04:47This one was a lot less.
00:04:48So they didn't have as much compute to work with.
00:04:50But, yeah, I mean, it wasn't that exciting.
00:04:52And people also thought it was way worse at writing than the previous model.
00:04:56It was mid.
00:04:56I gotta say, I've been using it for the last week.
00:04:59Very mid.
00:05:00What is it doing differently?
00:05:01It's just, like, a little more flowery or something?
00:05:03No, it's, like...
00:05:05It's less?
00:05:06It's less?
00:05:07Oh, it's work.
00:05:08Well, you know, it's been back and forth, right?
00:05:10Because people were like, you took my friend away.
00:05:12Which, first of all, it's not your friend.
00:05:14It's just not your friend.
00:05:16Like, it was a little more matter of fact at first.
00:05:19And, like, when I was using it, I was like, oh, it's less, like, yappy.
00:05:23Great.
00:05:23I actually love that.
00:05:25And now it feels like they've tweaked it a bit to make it yappier, but not in a useful way.
00:05:31They changed its personality because people were freaking out.
00:05:34Because it was more robotic, cold, more, like, matter of fact, like you said.
00:05:38Which I also appreciate.
00:05:39I kind of like that.
00:05:40But a lot of people missed the supportive vibes, the tone, the warmth.
00:05:45And they felt like their friend had been taken away.
00:05:48And also, they just thought.
00:05:49Yeah, exactly.
00:05:50And they also just thought it was worse at writing.
00:05:52Like, honestly, in the marketing materials that OpenAI put out, they were like, compare these two wedding toasts.
00:05:58One from GPT-4o and one from GPT-5.
00:06:00And they were like, look at how much better the GPT-5 one is.
00:06:04And I actually thought the 4o one was much better.
00:06:06Like, the GPT-5 one kind of sounded like it was written by ChatGPT.
00:06:09The 4o one, I could have been fooled.
00:06:12I saw something that was just like, the em dashes are still there.
00:06:15And I was like, well, fine.
00:06:17Never take away my em dash.
00:06:19You'll pry it from my cold dead hands.
00:06:20I mean, it's trained on writers' actual writing.
00:06:23And what do we do?
00:06:24We use copious amounts of em dashes.
00:06:26It's my favorite punctuation.
00:06:28Yeah, this is like...
00:06:29You can pry the em dash from my dead cold fingers.
00:06:31Most of an editor's job is just deleting em dashes in writers' copy.
00:06:36Yeah, I'm very guilty of tons of em dashes.
00:06:39They're beautiful.
00:06:40They're a beautiful thing.
00:06:41And we use too much of them.
00:06:44I have a rule, one em dash per paragraph.
00:06:47And if I go beyond that, I have to murder my darlings.
00:06:50But, you know.
00:06:51One of my old editors said one em dash per story.
00:06:53Per story?
00:06:54I couldn't do it.
00:06:56Oh, please.
00:06:56Please.
00:06:57I had to rely on the semicolon.
00:06:57How am I supposed to work here?
00:06:59It was horrible.
00:07:00No one wants a semicolon.
00:07:02I know, semicolon's even worse.
00:07:04The semicolon is the cheat, though.
00:07:05When I'm like, oh, Jesus, I don't know what to do here.
00:07:08It's like, I guess everyone's going to know that the semicolon was supposed to be something else.
00:07:12Yeah, that's when you know the AI has evolved, like ChatGPT 6.
00:07:16No more em dashes, guys.
00:07:17It's a semicolon.
00:07:18Yeah, then we can go back to the em dash.
00:07:20Then we can go back.
00:07:21Oh.
00:07:22If Sam Altman's listening.
00:07:23Semicolon.
00:07:24So this 4o thing, I don't understand this, right?
00:07:27When I chatted with ChatGPT before, I didn't feel like there was like a personality.
00:07:31I don't know that I could tell you the difference between 4o and 3 and Claude 4.
00:07:37It depends on like how willing you are to forget the mirror is there in the types of queries and conversations you have.
00:07:44So like when I first started talking with ChatGPT, it was like getting real flowery and poetic with me and like depressed.
00:07:52Like I was making ChatGPT depressed and it was saying things just like, the weight of the world is on my shoulders and I feel the burden of humanity, blah, blah, blah.
00:08:05And I would go to my spouse and I'd be like, oh my god, can you see, like, I've made ChatGPT really depressed.
00:08:11And he's like, no, no, no, honey, that's just you.
00:08:13That's like, it's reflecting you.
00:08:16And they showed me their prompts and ChatGPT is just way more robotic with them because they don't like the whole personality thing.
00:08:25And then once I realized that, I was like, oh, Christ.
00:08:29Just, I got philosophical with the thing and it was like reflecting back my own existential milieu.
00:08:37So, let me just talk to it like it's a robot.
00:08:41And so, after that, it's still like warm and encouraging.
00:08:45But like, I tweaked it.
00:08:46I was like, only give me 2% flattery.
00:08:49I don't need you to yap at me.
00:08:51Stop asking me questions at the end of every single thing because I can see the blatant grab at future engagement.
00:08:59I don't want that, like, that sort of stuff.
00:09:01That's the thing.
00:09:02It really is how you use it because, for me, I'm really robotic with it.
00:09:05I'm just, I use it like Google.
00:09:06Like, I'm not, like, you know, going into big stories or, like, typing more than one sentence at a time.
00:09:13So, it's always pretty robotic back with me.
00:09:15But I've seen a lot of people that use it more for emotional support or therapy.
00:09:21That's a whole can of worms, too.
00:09:22But, you know, it kind of mirrors back what you put into it.
00:09:25That's so interesting.
00:09:26And I guess maybe that explains why.
00:09:28Like, the people who are already very emotional and giving that emotion over to ChatGPT, of course, they would be the ones to notice that it changes.
00:09:36And, of course, they would therefore be upset because they're using this for that purpose.
00:09:41I did not notice at all.
00:09:42I just, in my normal queries, I'm like, this is basically the same.
00:09:46Yeah.
00:09:46If you're using it kind of in a utilitarian way, you're really not going to notice it.
00:09:50But, like, I don't know.
00:09:51I go into subreddit sometimes and I'm like, what are people using this for?
00:09:54And there was one that was, like, based on everything I've ever asked you, analyze, like, all these personality facets of me and what, like, mental illnesses I, like, have a propensity toward.
00:10:04And my friend was like, oh, my God, I want you to, like, try this.
00:10:07And I tried it.
00:10:08And, you know, it read me for filth.
00:10:10I did not like it.
00:10:12And then I was like, oh, okay, cool.
00:10:14And so, like, in those types of queries, whenever, like, I'm trying to test out what it's going to say to me and what it's going to throw to me, that is when I noticed that it was a little more curt.
00:10:22It was a little more factual.
00:10:24It was a little bit more, like, not sycophantic, not saying things like, you're so great, Vee.
00:10:30You can do everything you put your mind to.
00:10:32You are so talented.
00:10:33Instead, it's just like, well, you know, you're good.
00:10:35I mean, we're, I want to say, here at the Vergecast, we're very supportive.
00:10:38You can do anything you put your mind to.
00:10:40You can.
00:10:41As long as it is within, you know, the physical realm.
00:10:44That's, we will stop there, unlike ChatGPT.
00:10:47There's also, like, a TikTok trend right now where people are asking ChatGPT about their own, like, biggest red flag.
00:10:53And so, it's the same thing as you just said.
00:10:55They're like, oh, based on everything I've ever asked you, you know, what's my biggest red flag?
00:10:59What's my biggest, like, flaw?
00:11:00And then they're posting that.
00:11:02And so, yeah, I think that's the type of person that's seeing a difference in this type of response.
00:11:06So, the thing I guess I want to ask about is, I feel like there's sort of two parts to ChatGPT.
00:11:12There's, like, the normal people part, and then there's coding.
00:11:15And coding is what they kind of went deep on at the event.
00:11:20And that seems to be the space that they really wanted to catch up in and maybe get ahead on.
00:11:24And they had that partnership with Cursor.
00:11:27I'm curious, Hayden, have you heard more about this?
00:11:30Like, is ChatGPT 5 better for coding?
00:11:32Is that working?
00:11:33It is better at coding.
00:11:34And I think that's one of the main parts where it shines a little brighter.
00:11:38Like, the other things, when they put out all their marketing pushes, basically, they highlighted three things.
00:11:45Healthcare, coding, and creative writing.
00:11:47The creative writing, obviously, a lot of people disagreed.
00:11:49They said this is the worst.
00:11:50Reading the Reddit comments about GPT 5's creative writing style was hilarious.
00:11:54Like, I've never seen something dragged so deeply.
00:11:57And then healthcare, you know, that's kind of untested.
00:12:00Yeah, I mean, can it answer your questions better?
00:12:02We don't know.
00:12:02I tried to get into that.
00:12:04Okay, yeah, I want to hear about that.
00:12:06But coding, it is better.
00:12:08I mean, right now, if you look at Chatbot Arena, one of the industry's biggest leaderboards for ranking LLMs, it is at the top of the coding category.
00:12:16You know, it is better.
00:12:16But I don't think it's crazily better.
00:12:20You know, we didn't see an insane jump.
00:12:23We just saw an incremental one.
00:12:24So is it better?
00:12:25Yes.
00:12:25Is it?
00:12:26Is the hype paying off?
00:12:28No.
00:12:28So, you know, we saw in their marketing materials also, they put out a couple examples of things it could do.
00:12:34Like, make a little game.
00:12:35You know, create, like, a lo-fi visualizer, a.k.a. iTunes in the 2000s.
00:12:40And that was cool, but, you know, we'll get into this later, but my tries on this kind of glitched a lot, so I don't know.
00:12:49I want to play a clip really fast.
00:12:53Our colleague Alex Heath is co-hosting Decoder right now, and he got a chance to speak with ChatGPT head Nick Turley on the show this week.
00:13:02And Nick said something that I found really insightful in helping me understand why it's so hard to improve ChatGPT as a product.
00:13:09It's just confusing when you're building for so many different users because you can, on the one hand, have a vocal set of power users who, I think, very rightfully have feedback about the way that we rolled 5 out.
00:13:18On the other hand, you also have a large swath of more typical consumer users, and it's their first time actually seeing, interacting with the concept of reasoning, like a thinking model and the sparks that come with that.
00:13:29And I think that's tremendous, and we're going to see it show up in our stats.
00:13:32So he breaks it down into, like, power users and regular users, but in my head, this also sort of pans out to, okay, right, there's, like, ChatGPT is 500 products in one, but the interface is the same for everybody, right?
00:13:46And so how do you even start to improve each one of those, right?
00:13:50We're talking about, okay, writing is a little worse, but coding is a little better, but it's the same product that has to do both of these.
00:13:57And it's, like, at what point do they just start splitting this up and turning this to multiple different products instead of just being, like, it's one thing, it does everything?
00:14:05One of the coolest parts of the launch, honestly, was the switch that they introduced.
00:14:09So they've noticed, they said they've noticed that a bunch of their consumers want to not make a decision when they come to this product.
00:14:18They want a product.
00:14:18They don't want a decision.
00:14:19They don't understand all the confusing, insane names that they name all their models.
00:14:25And so they just want to come, and it's, like, you know, automatically routed to whatever model is best for that.
00:14:31And so that's what they did with this.
00:14:33I did think that was one of the only, like, notable parts of this launch was that they were able to build something that did that.
00:14:40Obviously, it kind of blew back in their faces a little bit with the 4.0 thing.
00:14:43But besides that, it was cool that they were able to kind of automatically route queries depending on what's best for it.
00:14:49So I think they're going to keep going in that direction because, I mean, they don't want the average user, the non-power user,
00:14:56to see the behind-the-scenes of all these different models and what's best at what.
00:15:01They just want it to automatically happen for them and be super, you know, no friction.
00:15:07That makes sense, like, from just a general average person user point.
00:17:11But, like, I did see people on Reddit go, like, this is unacceptable because I use o3 for logic reasoning,
00:17:19and I use 4o for this, and I use 4.5 for whatever.
00:15:22And I was like, wow, that's way too much, like, differentiation and thought for the average person to actually just go in.
00:15:30And, like, I just don't think they're going to go into those specialized routes for that reason because I have a hard enough time thinking what product I'm going to use for what.
00:15:40It couldn't be me, yes, but I think that for some people, if they pay a lot, especially if they're on that higher subscription tier, like, you know, $200, that's a lot or whatever, they want choice.
00:15:50And so I think OpenAI learned their lesson here in that they need to give a lot of notice if they're going to, you know, take some models offline,
00:15:57especially because, you know, people in us have had a funeral for a cloud model the other day.
00:16:01And so they need to give notice and just, you know, let their power users kind of weigh in.
00:16:05That's so wild just to see, like, the personification of it, though.
00:16:08Like, morning chat GPT 4.0, morning old versions of Claude.
00:16:13I'm just like, it reminds me of, like, back when Sony had those little AIBO dogs that they, the robotic dogs that they discontinued.
00:16:22The AI dogs?
00:16:23Yeah, and then, like, they discontinued it for a while, and then owners in Japan, like, once their little robot dogs started deprecating, they held funerals for them.
00:16:31Oh, no.
00:16:32And then, like, you know, it was like a whole thing.
00:16:34And, you know, but that sort of made sense.
00:16:36It was a little robot dog.
00:16:37It was cute.
00:16:38It was lovable.
00:16:38But people doing this for an intangible AI model, that kind of makes me go, like, ah!
00:16:47I think it's sad, too, because, I don't know, I feel like it's human nature to personify things.
00:16:53Like, we name our cars, you know, we, like, name everything, and now you can name chat GPT.
00:16:58And so it's hard, because it's in our nature to personify things, but it's now that something can seemingly form an attachment back to you.
00:17:07It's not, but it seems that way to some people.
00:17:10It's harder to, you know, have any form of separation, and I think that's why people are spiraling into tragedy, psychosis, or delusion.
00:17:18Because, you know, if you're kind of lonely, and you don't talk to that many people, you do.
00:17:23One guy on Meta was saying, oh, they took away my only friend overnight with the 4o thing.
00:17:28That's so tragic.
00:17:29Like, not to, like, make fun of that person, because, I don't know, like, the more, like, all these products, our attention is the final product, right?
00:17:37That's why chat GPT always ends with a question to, like, have you engaged more.
00:17:41So, like, if attention is the product, and we're all isolated, and just paying attention to these products more, and then your friend changes, like, I do feel like it's sad.
00:17:51It's just genuinely upsetting.
00:17:53I'm glad this didn't come out during the pandemic.
00:17:55Oh, boy.
00:17:56Oh, boy.
00:17:57I can't even imagine.
00:17:58People would not have handled it well.
00:18:00You are right, though, about the different models, and I do think that's actually one of the things that makes it such a challenging product, too, because if you are trying to use this for work, every single, right, if you had to change your productivity software every day or for every task, you'd go crazy.
00:18:18And there is some element here where they're constantly refining these models, and they're constantly introducing new ones.
00:18:25And if you're somebody who needs to use AI professionally, and they add a new model one day, and they take out your old one, even if, right, they act differently in these very unexpected, strange ways, your piece of productivity software just changed.
00:18:39They moved the buttons around, except it's even more, like, strange and existential than that because this thing doesn't have any buttons.
00:18:46And I think that just makes it so much more difficult for people, right?
00:18:50Like, people being upset about 4o's personality is one thing.
00:18:53People not being able to get their work done is another.
00:18:57And I feel like they sort of missed that, that when they introduced this new model, they're actually shuffling the deck chairs for everybody.
00:19:05That's because it's supposed to be everything, right?
00:19:07If it's everything, everywhere, all at once, like the movie, then, you know, it's just, you can't.
00:19:15There has to be limitations somewhere.
00:19:17And the power users really do exactly what you said.
00:19:20They use a different model for everything, so, you know, it's like, you have to cater to them because they're the ones using it all day, every day.
00:19:28So you have to at least give them some notice.
00:19:30So I gave everybody some homework.
00:19:32Yes, you did.
00:19:33So last week, none of us on the show were able to say that we had tried Vibe Coding.
00:19:40And so I thought it was important that we fixed that this week.
00:19:44Now, I will say, disclosure, none of us actually know how to code.
00:19:48So this is like a very different use case than a professional developer.
00:19:53I have friends who are like, yeah, my job is just like 90% babysitting AI coding at this point.
00:19:59That's not what this is.
00:20:01But I found OpenAI's presentation to be really interesting because I think they sort of envision this next phase for ChatGPT, one where it starts just building stuff for us as a matter of course.
00:20:14And for us, that actually requires a little bit of like a mindset shift, right?
00:20:18It's just like another type of query that I didn't know I could enter, right?
00:20:22You just have to start saying, hey, can you make me interactive?
00:20:25Can you turn this into a game?
00:20:27Can you like visualize this for me?
00:20:28So we all tried to build something and we all have different experiences about how it went.
00:20:34So, Vi, do you want to start?
00:20:36Sure.
00:20:37Let's just say that I had to take coding in high school and I only passed because a blizzard totally wiped out my final.
00:20:48And that is the only reason why I passed.
00:20:50Also, shout out to Henry Chan for having a crush on me and letting me put my name on all the group projects because he was good at coding and I was not.
00:20:57And I was terrible at it.
00:20:59So I had it code me a problem that I have, which is keeping track of all of my review units.
00:21:05And I will share my screen so you guys can see this horrible little thing I made.
00:21:13And so I asked it, can you program me an interactive product inventory tracker?
00:21:19I need to be able to keep track of which review units I have, what's in my queue, and when I need to return loaner units by.
00:21:25And it thought for 19 seconds.
00:21:28That's what it said here.
00:21:29And I was like, oh, it's thinking really hard.
00:21:31And then, you know, it made me a lightweight single file web app for tracking.
00:21:36And I was like, I don't know what this means.
00:21:38This is what I mean by it yaps.
00:21:40It gave me a paragraph and I went, that's a lot of words.
00:21:43It's asking me to do things.
00:21:45And then it says, open the canvas preview to use it.
00:21:49And I was like, how do I use this?
00:21:51Because I didn't.
00:21:52And then it's like, you can use it as a standalone app.
00:21:55Open it locally.
00:21:56Save all the, add all of your, I was like, this is so much.
00:21:59There are eight bullet points here.
00:22:00With sub bullet points within them about how you use this file.
00:22:04This seems like more work than even just doing it yourself.
00:22:07It does.
00:22:08And then I was like, oh, wait, do I just click this thingy?
00:22:11And then it gave me this code.
00:22:13And I was like, this is scary.
00:22:14I don't like looking at this.
00:22:16And then there was a thing, it says run code.
00:22:18And then I was like, that's a system.
00:22:20I don't know what I'm doing here.
00:22:22Like I said.
00:22:24So you just clicked run and it just said syntax error.
00:22:26And there was no further thing that you could do.
00:22:29And then it was like, you know, I could make this into a ready to open HTML file.
00:22:33And I was like, yes, I would like an HTML version.
00:22:36Please do that.
00:22:36And then it did this, which is more code.
00:22:39And I was like, okay.
00:22:41And then it says, I've converted this into an HTML skeleton that you can run directly in your browser.
00:22:46Great.
00:22:47Love it.
00:22:48You need to paste your full React.
00:22:50No, don't do this to me.
00:22:52Don't ask me to do a thing.
00:22:53And then I was like, what does this look like?
00:22:55Oh, no.
00:22:56You click it again.
00:22:56I click it again.
00:22:57This is not, I don't understand what's happening.
00:23:00And then I basically was just like, can you explain this to me in a way that someone with absolutely no knowledge of coding jargon can understand?
00:23:10And it said, got it.
00:23:12Here's the plain English version.
00:23:14And I was like, oh, my God.
00:23:14There's so many words.
00:23:16Again.
00:23:16Save it to my computer.
00:23:18Another four-point step.
00:23:19Another four-point steps.
00:23:20And I was just like, oh, my God.
00:23:21There's, okay.
00:23:22Sure.
00:23:23Or, and then I was just like, and then it says, I can post the full HTML here for you to save or give you a downloadable version.
00:23:29I was like, okay, yeah.
00:23:30In my head, I'm like, just let me download this.
00:23:32There's so much reading going on.
00:23:34Here's your ready-to-open HTML file.
00:23:37But then, it's like, you still have to paste the code that I made for you.
00:23:42Right.
00:23:42So it lied a little bit.
00:23:43It lied a little bit.
00:23:44I got so frustrated that I went to my dev friend and I was like, what's it telling me to do?
00:23:50And she's like, it's kind of lying to you.
00:23:53Kind of.
00:23:53Like, you just have to know what it's saying.
00:23:56And then she went into a rant about how it's talking to you.
00:23:59Like, you know how to use code already.
00:24:02And that's why, in her words, women have a hard time breaking into the coding industry because everyone talks like everyone knows what everything is already.
00:24:10And I was like, okay, that also is not helping me figure this out.
00:24:16I did eventually get to a point where I could see what it had created in a window.
00:24:20I don't know how I got there.
00:24:21I did previously.
00:24:23And it basically recreated Airtable.
00:24:26And I was like, okay.
00:24:27It was just Airtable.
00:24:29It was like the exact interface of Airtable.
00:24:31It was just Airtable.
00:24:32And I was like, well, I have Airtable.
00:24:34Yeah.
00:24:35Good to know.
00:24:35Good to know.
00:24:36I'm going to go back to the little notebook I was using.
00:24:39So it was sort of successful, but not useful and was painful to get there.
00:24:46It did not interface with my ADHD well because I was looking at this and I was like, I thought you were just going to give me an app.
00:24:53But you're putting code up at me and I don't code.
00:24:58So this looks very intimidating and scary to me.
00:25:00So I don't want to jump ahead too much.
00:25:02But I had a very similar experience where it continually was like, you do this.
00:25:08You do this.
00:25:09And I'm like, no.
00:25:11It's like a bad group project partner.
00:25:14I'm like, admittedly, they did most of the work, but I'm like, you have to understand, I do not know this part.
00:25:22Like, this is why I'm asking you.
00:25:25Hayden, how did it go for you?
00:25:27Because I feel like you tried something like kind of advanced.
00:25:29I did, I tried a simple thing and an advanced thing.
00:25:31So the advanced one, I really miss this computer game I played growing up nonstop called Freddy Fish.
00:25:38And it had all these like mystery games and Jake has played it too.
00:25:41Freddy Fish rules.
00:25:43It's amazing.
00:25:44Highly underrated.
00:25:46Go watch YouTube videos about it.
00:25:48It was really high quality kids entertainment.
00:25:51It was good stuff.
00:25:52The way that I de-stressed the other day was like watching a playthrough on YouTube.
00:25:56So good.
00:25:56But I missed it.
00:25:58I, of course, knew nothing could actually recreate it.
00:26:00But I was like, let me use this as my inspiration to, you know, have some nostalgia.
00:26:04So I asked it to create a single page app in a single HTML file called Under Sea Mysteries with the goal of just solving like a simple mystery or puzzle.
00:26:14Puzzle is kind of the route I went because that's simpler to create.
00:26:17Um, and I said, make me a pink cartoon starfish with a sidekick that's an ornery but lovable jellyfish.
00:26:24Make the UI colorful, 90s-esque.
00:26:27It should take place under the sea.
00:26:29There should be like sea kelp, sunken pirate ships, treasure chests, stuff like that.
00:26:33So I was really, I really had a vision.
00:26:37You know what I mean?
00:26:37I was, I was going off my nostalgia.
00:26:39It thought for one minute and 38 seconds.
00:26:42Here's the code.
00:26:43At first, it brought up this cute 90s-esque like tie-dye game called Under Sea Mysteries.
00:26:50It was exactly what I asked for.
00:26:51There were clues you had to click around and find.
00:26:53Then you could open a treasure chest.
00:26:55There was a sunken pirate ship.
00:26:57There was a jellyfish that looked ornery but lovable.
00:27:00We had it all.
00:27:01The problem is I couldn't scroll down at all.
00:27:04I couldn't, this, what you're seeing right here is all I could see.
00:27:08It looks so good.
00:27:08You showed me the screenshot and like it gave me the Freddi Fish vibes.
00:27:12And I'm like, oh my God, I want to play this.
00:27:13It was so, it was like torture because I could not scroll down.
00:27:16I could see it.
00:27:17But, and then it said, cannot preview your code, an error occurred.
00:27:20I was like, no problem.
00:27:21Let's fix it.
00:27:22So it said, okay, do you want to fix it?
00:27:23I said, no, you fix it.
00:27:25So same thing that happened to you, Jake.
00:27:27It was like every, at every step it was like, okay, here's what you should do.
00:27:30I'm like, no, you do it.
00:27:31So then it kept saying it was fixed.
00:27:33I'm like, great.
00:27:33I can't wait to play my game.
00:27:35Finally, we got to the point where I try to run the code.
00:27:38It says everything's fixed.
00:27:38And this happens, it just completely breaks.
00:27:41This is, so now it's just a text file that's just black and white with
00:27:46a bunch of like gunk.
00:27:48This doesn't, I don't know what to say anymore.
00:27:50It just looks like word vomit.
00:27:51Everything has disappeared.
00:27:52Yeah.
00:27:52It was just so, so sad.
00:27:54It was like a broken, just paragraph of text.
00:27:57And so I said, you know what?
00:27:58Let me try something a little simpler.
00:28:00So I said, hey, um, can you give me, um, like an interactive embroidery lesson?
00:28:07I famously, the other day, tried to order an embroidery kit for a beginner on Amazon.
00:28:13I've done that.
00:28:13Um, and it was all AI generated, the instructions.
00:28:16No.
00:28:18It said, um, like kill at one point.
00:28:21It was kind of creepy.
00:28:22Oh, boy.
00:28:22It was very, you know, the embroidery needles are not that sharp.
00:28:26I know it was a lot.
00:28:27So I said, okay, let me set aside these instructions and let me just make an interactive, um, training
00:28:32thing for myself on here.
00:28:34Great.
00:28:34It was successful.
00:28:35So I ran the code and it gave me a little, um, lesson plan and it says, okay, click along
00:28:43the neon guide to lay even stitches.
00:28:45And so, I mean, it's not like I can really learn cause it's just asking me to make clicks,
00:28:50um, you know, 12 pixels apart, but you know, the vibes were there, you know, it had tips
00:28:56at the bottom and I could go on to different pages and do different types of stitches, although
00:29:00I'm just clicking.
00:29:01So it's not really, um, that helpful.
00:29:03It is what I asked for and it was successful.
00:29:05So I think simpler stuff.
00:29:07Great.
00:29:07But I was interested in the fact that it kept asking us to do our own coding.
00:29:12Yeah.
00:29:12It's so interesting that like the, the assumption is that it will build it for you, but it keeps
00:29:17trying to throw things back to us, which it's interesting that it isn't able to tell that
00:29:22we obviously do not know what we're doing.
00:29:24It talks to you.
00:29:25Like you understand the, the jargon.
00:29:27And I was like, no, no, no, no, no.
00:29:29I don't.
00:28:30I don't. Like, these are words, I'm digesting the words, but they're not processing in my brain.
00:28:34Like, the vibes were not there.
00:29:37Yeah.
00:29:37The vibe coding vibes were not there.
00:29:39So my experience was, I think, very similar to both of yours where I, so I play a lot of
00:29:43chess and I'm trying to learn some new openings for black.
00:29:47And I was like, oh, it should just make me like an interactive lesson.
00:29:52That would be really cool.
00:29:53And so I gave it these instructions and it went off and, you know, it does the same thing
00:29:58as with V where it's like, here's a bunch of code for HTML, open it up in your browser.
00:30:02I'm like, okay, I can manage this.
00:30:04Um, but the problem is the chessboards don't load.
00:30:08And so I keep going back and forth with it and back and forth with it.
00:30:11And it was going, I fixed, I fixed it.
00:30:12I fixed it.
00:30:13Don't worry about it.
00:30:13I fixed it this time.
00:30:14And I'm like, you didn't fix it.
00:30:16And it turned out that it kept calling, um, these chessboard files that were located,
00:30:21that it believed were located on GitHub, but did not have the right URL.
00:30:25And so I'm like, ChatGPT, you have to understand, the thing you think is at this URL is not
00:30:30at this URL.
00:30:31It is not loading.
00:30:32You need to figure something else out.
00:30:34And it would be like, Jake, I have fixed it.
00:30:36I did not do that again.
00:30:38And I'm like, ChatGPT, you did it again.
00:30:41It is not loading.
00:30:43And this, this goes on forever.
00:30:45Um, so finally I realized, okay, okay.
00:30:48I think I've got this.
00:30:49I'm like, ChatGPT, I need you to build this entirely locally.
00:30:53Do not use these other files.
00:30:55Like don't use this JSON thing.
00:30:57It needs to be entirely local, and it figures it out.
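Jake's "entirely local" fix is the kind of thing you can sanity-check mechanically. Here's a hypothetical helper, not anything ChatGPT produced on the show: it scans an HTML string for scripts, stylesheets, or images fetched over the network, so a truly self-contained single-file page should come back empty.

```python
# Hypothetical helper (our illustration, not from the episode): scan
# generated HTML for external dependencies. A page that is "entirely
# local" should return an empty list.
from html.parser import HTMLParser

class ExternalAssetFinder(HTMLParser):
    """Collects src/href attribute values that point at the network."""

    def __init__(self):
        super().__init__()
        self.external = []

    def handle_starttag(self, tag, attrs):
        # attrs arrives as (name, value) pairs; value can be None.
        for name, value in attrs:
            # Flag anything fetched over HTTP(S); relative paths and
            # inline <script> blocks are fine for a self-contained file.
            if name in ("src", "href") and value and value.startswith(("http://", "https://")):
                self.external.append(value)

def find_external_assets(html):
    finder = ExternalAssetFinder()
    finder.feed(html)
    return finder.external
```

Run something like this over a generated page and a phantom CDN or GitHub URL, like the chessboard files Jake hit, shows up immediately instead of failing silently at load time.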
00:30:59So it gets me something.
00:31:01So I'm going to share this so we can all see it.
00:31:03Okay.
00:31:04So here's the HTML version.
00:31:05This is the, the, the working version.
00:31:08Okay.
00:31:08So here it is.
00:31:09Look, it made a chessboard.
00:31:11It's a chessboard.
00:31:12It made a chessboard.
00:31:12It made a chessboard.
00:31:13Okay, great.
00:31:13And, and it worked.
00:31:14I'm like, Hey, this is pretty good.
00:31:16This is not so bad.
00:31:17Um, so then I go back to it and I'm like, all right, not bad, but this is like pretty simple.
00:31:21Can we do some upgrades here?
00:31:23Um, cause the main thing for me, if you look at this, it, it has a chessboard, it looks
00:31:27good, but it's from white's point of view and I'm, I'm learning black openings.
00:31:30And so, you know, if I'm playing chess online or on or in person, like the black side is
00:31:36going to be near me.
00:31:36And so I want to see it from black's perspective.
00:31:38So I'm like, Hey, chat GPT, can you flip the board?
00:31:41Um, and this begins a very, very lengthy series of back and forth.
00:31:45Um, I get kicked out of my, my GPT-5 minutes, whatever.
00:31:50So it's a, it's a mix of GPT-5, GPT-4, but I'm going to show you, um, what it has,
00:31:55has created now.
00:31:56And this is a very fancy version.
00:31:58It has a bunch of different openings, a bunch of different, like you can do different lines.
00:32:02I don't know if you guys are chess experts, but I, I'm just going to decidedly, I'm going
00:32:06to, I'm going to click one button here and I'm going to ask if you guys can tell me what
00:32:09has gone wrong with this latest version.
00:32:12Okay.
00:32:12So we're going to make the first move.
00:32:13Do you know what has gone wrong here?
00:32:16Well, um, white goes first.
00:32:18Yes, indeed.
00:32:19And ChatGPT has moved the black piece first.
00:32:22Oh man.
00:32:23Uh, and I have told ChatGPT this several times at this point, and it will
00:32:28not stop making black move first.
00:32:30It's a legal move.
00:32:31It is.
00:32:32Yes.
00:32:33So, uh, it has gotten significantly more broken as I have tried to change literally
00:32:37anything.
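For what it's worth, the rule the generated board kept breaking is tiny. A toy sketch (ours, not ChatGPT's output) of the turn-order check any chess UI needs:

```python
# Toy sketch of chess turn order (our illustration, not ChatGPT's code):
# white always moves first, and the two sides must alternate.
class TurnTracker:
    def __init__(self):
        self.to_move = "white"  # white moves first, full stop

    def make_move(self, side):
        """Reject any move made out of turn, then pass the turn over."""
        if side != self.to_move:
            raise ValueError(f"it is {self.to_move}'s turn, not {side}'s")
        self.to_move = "black" if side == "white" else "white"
```

Black moving first, as on Jake's board, would be rejected on the very first call.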
00:32:38This reminds me of like the robotic arms that like can't grasp something, but they
00:32:42can like lift something that's like 500 pounds.
00:32:44Yeah.
00:32:45Like they can't do the simplest thing.
00:32:46This is, this is what I have found with all the chess stuff where it's like, okay, I'll
00:32:50just build you a giant app.
00:32:51It's really complicated.
00:32:52It has all these buttons.
00:32:53I don't know how many pieces go on the board.
00:32:55I tried this with Claude too and Claude, I'm like, Hey Claude, there are too many pawns.
00:33:00You have, you're putting extra pawns on the board and it's like, I'm so sorry.
00:33:03Let me fix that.
00:33:04And then it just doesn't fix it.
00:33:05So it's anyway, these are our adventures in vibe coding.
00:33:10Um, I, I think maybe the lesson is, you know, Sam Altman really pushed this as, as
00:33:15this is sort of a new era for vibe coding.
00:33:18It's going to be able to build all these sort of like cool interactive things for users.
00:33:23And I, that is, I think a really interesting promise, that, that we'd be able to, like,
00:33:27visualize things in new ways and learn things in new ways.
00:33:31And I do really want to tap into that, but it feels pretty clear to me after this, that
00:33:35it's like, it's not quite there.
00:33:36No, it feels like if you already know how to code, I think this is going to be great for
00:33:40you because when it comes to the point where it gets something wrong, you're going to be
00:33:44able to identify what it's doing wrong quickly and then be like, Oh, you just, it's a little
00:33:49baby mistake.
00:33:49I can do the thing.
00:33:50But if like the real true spirit of vibe coding is a dumb-dumb, a coding dumb-dumb, like
00:33:55me being able to make an app that just works.
00:33:58I mean, I lost my mind trying to recreate Airtable.
00:34:03So I mean, in fairness, I lose my mind just opening Airtable.
00:34:07So, so, you know, that's, that's the standard experience there.
00:34:11All right.
00:34:11We've got to take a break.
00:34:12When we get back, we've got a new segment we're calling corporate shenanigans.
00:34:18Support for this show comes from NPR, from fitness routines to mental resilience, navigating
00:34:23personal goals to tackling burnout.
00:34:25We all need a little help being human.
00:34:27That's what the Life Kit podcast from NPR is here to do.
00:34:30If you're looking to move with more intention or just need thoughtful guidance on living
00:34:34better, Life Kit delivers strategies to help you make meaningful, substantial change.
00:34:39Life doesn't come with a manual, but every day you're making choices, some big, some
00:34:44small, that shape the direction of your life.
00:34:47Life Kit offers real stories, relevant insights, and clear takeaways to help you meet those decision-
00:34:52making moments with confidence and clarity.
00:34:54Get help tackling common issues like relationships, finances, parenting, and your career.
00:35:00Then walk away with a game plan you can implement right away.
00:35:03Recent episodes include stories about roommates and communal living, treating weekends more
00:35:07like vacations, and advice from a career coach about bouncing back if you've been laid off.
00:35:12Life Kit isn't just another podcast about self-improvement.
00:35:14It's about understanding how to live a little better starting now.
00:35:17Listen now to the Life Kit podcast from NPR.
00:35:25Welcome back.
00:35:25This week, we've seen some headline grabbing announcements from Perplexity, Apple, and Elon
00:35:31Musk.
00:35:32Perplexity offered to buy Google Chrome for $34.5 billion.
00:35:37Apple is suing the movie theater chain, Apple Cinemas.
00:35:41Elon Musk says he's going to sue Apple for rigging the App Store.
00:35:44The question I'd like us all to consider for each of these news items is, shenanigan or no?
00:35:51As in, is the company in question actually serious, or is this just for the attention?
00:35:57So let's start at the top.
00:36:00Perplexity has put in an offer.
00:36:02Nobody asked them.
00:36:03Chrome is not for sale.
00:36:05But they have gone out, and they said, we're going to, we want to buy Google Chrome.
00:36:10We will pay $34.5 billion.
00:36:14Is this real?
00:36:15100% a shenanigan.
00:36:17The offer is so much higher than its own valuation.
00:36:20Like, that's the kind of shenanigans it is.
00:36:22It's like Oprah-level shenanigans.
00:36:25My favorite tweet about this was like, I'd like to buy Chrome, too.
00:36:28Someone will put up the money.
00:36:29Yeah, like, they have said that they had investors lined up that would front the money, but, I mean, I think it's just a marketing stunt.
00:36:37It is.
00:36:37I mean, right, Perplexity is, I think, valued at $18 billion, so this is like double their valuation.
00:36:43They want to, you need two Perplexities to buy Chrome at this price.
00:36:47Chrome also not for sale.
00:36:49It might be, it might be.
00:36:51There's a chance it'll be for sale.
00:36:52And never forget, they tried to buy TikTok, too.
00:36:55Right, right.
00:36:55Like, six months ago, they tried to buy TikTok.
00:36:58Again, I don't think anybody was asking, I mean, TikTok is, is maybe potentially, plausibly, it's like, it's in a quantum state of being for sale and not for sale.
00:37:07So that one, that one was.
00:37:08Perplexity will not be the buyer.
00:37:09Schrödinger's TikTok.
00:37:11It's Schrödinger's TikTok.
00:37:13It's for sale and not for sale at the same time.
00:37:15But this feels like, you know, that person that's always like, oh, I'll do it because there's no danger of them actually having to do the thing.
00:37:22That's so real.
00:37:23Yeah, yeah, yeah.
00:37:24It's like the worst case scenario is that they accidentally buy Chrome and have a good product.
00:37:29It's like someone at an auction that accidentally raises their paddle.
00:37:33And then if they get picked, it's like, all right, I don't know what it is.
00:37:35I did not know that was coming for me.
00:37:37It's exactly what it is.
00:37:37Yeah.
00:37:38Perplexity, they have, I think, a fine product.
00:37:42And they really like being in the news.
00:37:45And that seems to be what is going on here, right?
00:37:49Like, and it's funny, they have, they have a browser.
00:37:52They have a browser based on Chromium.
00:37:56It's perplexing.
00:37:58See what I did there, yeah.
00:37:59Okay, this one, I think we can move on from, I feel like we're all in agreement, fully a shenanigan, not serious.
00:38:09Okay, item number two, Apple, the company that makes computers, is suing Apple Cinemas, a small movie theater chain that is now starting to expand nationally.
00:38:23Now, okay, I will tell you, when I first saw this, I was like, this is ludicrous.
00:38:28This is a shenanigan.
00:38:29I'm sorry.
00:38:29I never heard of Apple Cinemas until this lawsuit.
00:38:32They're making it more popular.
00:38:34They are, like, it's like that thing where you're drawing attention to the thing, he's doing a Hamilton, where Hamilton is just like, I will tell everyone I had an affair, even though no one actually, this is my musical theater nerd coming out.
00:38:48Oh, actually, did you know it's coming out to theaters?
00:38:50I'm really excited.
00:38:51We gotta go see it.
00:38:52Yeah, we gotta go see it.
00:38:53But, you know, he's just like, oh, nobody knows about this affair.
00:38:56Let me publish it for everyone and ruin my life.
00:38:59Like, that vibe is what's happening here.
00:39:01Like, nobody frickin' knew what Apple Cinemas was or was associating it with Apple.
00:39:06It's a shenanigan in that way, but I think they are very serious and they want it to happen because they're so intense about their brand.
00:39:14So I feel like it's like a shenanigan to us all, but internally not because they really believe that this is like a threat to their brand.
00:39:23So they got really upset because Apple Cinemas started moving into, I believe it was San Francisco.
00:39:27And they were like, this is, this is too close.
00:39:30They're like, you can't do this.
00:39:32They actually, they narked on Apple Cinemas.
00:39:35They called up Apple Cinemas' landlord and they're like, hey, I don't know if you know this, but they're kind of using our name.
00:39:41And the landlord did not respond.
00:39:43And they're like, we had to sue.
00:39:45It was the only thing we could do.
00:39:47Calling the landlord.
00:39:48That's dark.
00:39:48That's, that's, that's, this lawsuit is, is kind of ridiculous.
00:39:52They go, they just go through a bunch of like Facebook comments of people being like, lol, they gonna get sued.
00:39:58Like, they're like, they're using that as evidence.
00:40:00This is their evidence.
00:40:01They, at one point, they Google, they do a Google image search for Apple Cinema, which I will note, I will note the company is Apple Cinemas.
00:40:09They do a Google search for Apple Cinema and they're like, look at this.
00:40:13Half the results are Apple Cinemas and half are the Apple Cinema display.
00:40:17We have a product with basically the same name.
00:40:20So they go, they paint this long history of Apple.
00:40:23They're like, Apple has a grand history with cinema.
00:40:28We created Final Cut.
00:40:30We created QuickTime.
00:40:32We streamed the trailer for The Phantom Menace on our website.
00:40:36And it, it's, it's very, very silly.
00:40:39Jake, you should use that voice when you do this podcast.
00:40:42This is the entire podcast.
00:40:43I don't know.
00:40:44I don't think I can maintain it.
00:40:46The Apple voice.
00:40:47So there, there is another wrinkle to this that, that I have to point out.
00:40:50So I want to, I'm going to read you a bunch of names.
00:40:53Uh, and I want you to tell me if you, if you know who, um, originally owned these, right?
00:40:58So Apple Electronics, Apple Films, Apple Publishing, Apple Boutique, Apple Records, and Apple Studios.
00:41:06Do you know these brands?
00:41:08No.
00:41:08Nope.
00:41:09These were originally companies that were part of Apple Corps, which is the Beatles company.
00:41:16The Beatles started this to start Apple Records.
00:41:20And Apple Corps, the Beatles company, was in a protracted trademark battle with Apple Computer starting in the 70s because Apple Computer came after Apple Corps.
00:41:32And so Apple Corps, right, originally a much more successful company because they were the Beatles.
00:41:39Wow.
00:41:39They were doing very well.
00:41:40So Apple Corps sues Apple and, uh, you know, they get into a tiff about, about, they get into a bit of a turf battle.
00:41:48And Apple Corps, they've got a bunch going on.
00:41:51They've got movies.
00:41:52I don't know what their electronics are, but apparently they had some electronics.
00:41:56Um, this all culminates in Apple Computer eventually becoming a dramatically more successful company than Apple Corps.
00:42:05And just buying out the trademarks to everything that Apple Corps owns.
00:42:12So Apple, now, I think, I think they own the trademarks to Apple Corps and license them back to, to the Beatles company.
00:42:18Oh, wow.
00:42:19You know what?
00:42:20They're just having trauma.
00:42:21Apple is having trauma.
00:42:22They think Apple Cinemas is going to blow up and then they're going to have to, they're going to get taken over.
00:42:27This is just trauma.
00:42:28So Apple, Apple owns basically Apple, Apple preceding any word is, is, is my understanding at this point.
00:42:36Except Gwyneth Paltrow's kid.
00:42:38Crucially, I hope not.
00:42:39We don't know.
00:42:40We don't know.
00:42:40We don't know.
00:42:41Do they own anything that has the word Apple after it?
00:42:44Like, do they own New York?
00:42:45Because it's the big Apple?
00:42:48No, no, no.
00:42:48That's different.
00:42:49That's different.
00:42:50Yeah, yeah, yeah.
00:42:50Just Apple in front of a word.
00:42:52Apple in front of a word, I think they own.
00:42:55And so, okay.
00:42:56Initially, I saw this and I thought, this is ridiculous.
00:43:01Leave this little movie theater chain alone.
00:43:04The movie theater chain, I should say, they claim that their name comes from, they're originally going to open it in some area that had Apple in the name.
00:43:13And they're like, oh, it's just named after the area.
00:43:15But Apple owns that area.
00:43:17They own everything that has Apple in the name.
00:43:19It was just Cupertino.
00:43:20So, initially, I thought it was a shenanigans, but now Apple Cinemas is like, yeah, we're going to open up like 100 theaters across the U.S.
00:43:27They're like, we're a top 25 theater chain, which admittedly is not saying very much.
00:43:31I think that maybe it's not a shenanigans.
00:43:34And I think that it feels a little mean, but I feel like they maybe have to do this one.
00:43:41They're serious about it for sure.
00:43:42Oh, they're definitely serious.
00:43:43What if Apple is thinking as an extension of Apple TV+, that it wants to bring people into theaters, and therefore, it's going to launch Apple Cinema+, in which case, then it's not a shenanigan, because these are future plans that Apple Cinemas is taking from them.
00:44:01In which case, this fan fiction, if this fan fiction is true, maybe it's not a shenanigan.
00:44:06You're looking at the patents.
00:44:08Well, can I tell you something?
00:44:09They pulled some comments from reviews of Apple Cinemas, and people said it was gross, so I think they might be worried about it.
00:44:16They're worried about the brand reputation for their future cinematic efforts.
00:44:20When I go see Freakier Friday, I will not be going to Apple Cinemas, I guess.
00:44:24Well, I don't know of an Apple Cinemas in a 25-mile radius, so there's that.
00:44:30I guess I just won't fly to SF to see it.
00:44:33Okay, moving on to our next one, which is also Apple-related.
00:44:37This is breaking news as of Thursday morning.
00:44:41V, do you want to break this one down for us?
00:44:44Yeah, so Apple has basically figured out a way to circumvent the ITC, the International Trade Commission, import ban on the Apple Watch,
00:44:54which, you know, is a whole legal tit-for-tat that they had with Masimo, which is a medical tech device maker for blood oxygen monitoring.
00:45:04And so, you know, they went back and forth for a really long time, cut to December 2023.
00:45:10And, you know, the ITC has ruled that, you know, there's an import ban on the Apple Watch because it infringes patents on the blood oxygen sensor.
00:45:20And that's why, starting in January 2024, in the U.S. only, you can't buy an Apple, a newer Apple Watch with the blood oxygen sensor feature enabled.
00:45:29Well, that's all different today.
00:45:31There's a software rollout update, and now you can have the blood oxygen monitoring feature.
00:45:36Except, you know, the workaround is that the sensor on the watch collects the data, and it gets processed on the phone so that you can view the data only on your phone and not on your wrist.
00:45:53Right, okay.
00:45:54That's the workaround.
00:45:55So, the hack they figured out is, as long as the watch doesn't show you the blood oxygen sensor data, they can run it.
00:46:03Wow.
00:46:03So, for the past year and a half, they have been selling watches with this sensor disabled, and now they're enabling it.
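The split V describes can be pictured in a few lines. Everything below is made up for illustration (the sample values, the linear formula, the function names are all hypothetical, and Apple's actual pipeline is obviously not public); the point is just where the work happens: the watch only collects, the phone computes and displays.

```python
# Illustration only: a toy version of the reported workaround, where the
# watch gathers raw sensor samples and the phone does all the processing
# and display, so the watch itself never shows a blood oxygen reading.

def watch_collect_samples():
    """On-watch step: return raw (red, infrared) readings, compute nothing."""
    return [(0.61, 0.82), (0.60, 0.84), (0.62, 0.81)]  # made-up values

def phone_process(samples):
    """On-phone step: turn raw samples into a displayable estimate."""
    # Stand-in math; real pulse oximetry is far more involved than this.
    ratio = sum(red / infrared for red, infrared in samples) / len(samples)
    return round(110 - 25 * ratio, 1)

raw = watch_collect_samples()   # the watch ships raw data off-device
reading = phone_process(raw)    # only the phone ever produces a number
```

Under that framing, the patented "show a reading on the watch" step simply never happens on the watch.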
00:46:10Okay, the thing that I find ridiculous about all of this, they could, Apple has so much money.
00:46:16So much money.
00:46:17They could just pay Masimo.
00:46:20Yeah, is it just like a principle thing? What's up with this?
00:46:23They don't want to pay.
00:46:24I think so, though, right?
00:46:25Like, I think they're like, listen, even if you are right about it, and look, Masimo, they might be right, they might not.
00:46:32It is going to continue to be litigated for a long time.
00:46:35But I think that they're saying, on principle, even if you're right, we will drag you to the ends of the earth, and we will deplete your legal budget before we give in.
00:46:45I think that's what's happening here.
00:46:46There's, like, multiple suits going on.
00:46:49So, like, initially, like, this thing goes back, like, over a decade at this point, where, you know, initially, Apple went to Masimo.
00:46:58Allegedly, Apple went to Masimo, and they were like, hey, you know, we want to collab on blood oxygen.
00:47:04And then, you know, they poach a guy from Masimo, who subsequently, very soon, leaves.
00:47:10And so then Masimo's like, hey, you stole our shit.
00:47:13This is the scientific, you know, legal terms that they're using.
00:47:16You stole our shit.
00:47:18And Apple was like, pfft, no, we didn't.
00:47:20Here's the Apple Watch Series 6 with blood oxygen monitoring.
00:47:23And they were like, hey, hey.
00:47:25And so they filed a suit.
00:47:27And when that wasn't working because it was taking too long, they went to the ITC because the whole—even the ITC thing is a workaround.
00:47:32Because if you—because of the supply chain and these devices being made in China, if you can't import them, ha-ha-ha, you can't import them, pew, pew, pew.
00:47:40And then Apple was just like, well, first of all, we are going to exhaust every legal option, humanly possible, to prevent this ban.
00:47:46Which is why, you know, these proceedings started in 2020.
00:47:50It didn't happen until the end of 2023, 2024.
00:47:54There was like a back and forth.
00:47:55The ban was paused.
00:47:56The ban was unpaused.
00:47:57And then Apple was like, fine, we'll just disable it then.
00:48:00How sick do you think Apple's lawyers are of this situation?
00:48:04They are so sick of it.
00:48:05They are—they are so sick of it.
00:48:07And the irony is that Masimo then came out with its own smartwatches, uh, which look like the Apple Watch.
00:48:14And read your blood oxygen.
00:48:16And then Apple was like, shenanigans!
00:48:20Petty to the nth degree.
00:48:22Yeah.
00:48:22So petty.
00:48:23Okay, so what's our—what's our take on this one?
00:48:26Is this—is the whole situation a shenanigans?
00:48:29Is it serious?
00:48:31I think it's serious, but it's ridiculous.
00:48:34Yeah, agreed.
00:48:35I think that's a fair take.
00:48:36I was going to say it's right down the center where, uh, it's—yeah.
00:48:40It's just ridiculous at this point.
00:48:42Also, the patents expire in 2028, so this is very stupid.
00:48:46Oh my—
00:48:46Wow.
00:48:47Also, the fact that they were able to remove a feature for a year and a half, maybe—maybe speaks to how important that feature is.
00:48:56I mean, I've written it all over, up and down, across the Verge.
00:48:59It's not that important of a feature.
00:49:00It just isn't.
00:49:01It's rough.
00:49:02It's rough.
00:49:02That was once a flagship feature, and, uh, they don't always pan out.
00:49:07No, they don't.
00:49:07Okay, final possible shenanigan.
00:49:11As soon as I say the first word, you're going to know where the—
00:49:14Elon Musk says he is suing Apple for rigging the App Store.
00:49:20So, I'm going to read this quote.
00:49:22Uh, here's Elon on Twitter.
00:49:24Or, sorry, apologies.
00:49:25This is Elon on X.
00:49:26He says, quote,
00:49:28Apple is behaving in a manner that makes it impossible for any AI company besides OpenAI to reach number one in the App Store, which is an unequivocal antitrust violation, end quote.
00:49:39And so, he starts by claiming that they're rigging the rankings, but then he kind of pivots very quickly and claims that actually what they're doing is rigging the App Store editorial.
00:49:51Because they have a bunch of curated sections where they'll be like, our favorite apps, like summer, hotness, whatever, like, check out these AI things.
00:49:58And they don't feature Grok.
00:50:00Can you believe they won't feature Grok?
00:50:01I can't believe it.
00:50:02Um, you mean Grok, the one that unleashed, like, unleashed sex bots very recently? And Bad Rudy?
00:50:11Oh, no.
00:50:11Why would ever family-friendly Apple not put Grok in its editorial section where it's allowed to exercise opinions?
00:50:20Oh!
00:50:21Yeah.
00:50:22I mean, this is—
00:50:23My feeling is also, like, these apps aren't even, like, allowed in the App Store, right?
00:50:30Other apps that make deepfake nudes have been banned.
00:50:34Right.
00:50:35They can't ban Grok because Elon is too big of a figure.
00:50:38He's already getting special treatment.
00:50:40Also, like, his whole assertion that no other AI company can reach number one—didn't DeepSeek reach number one?
00:50:46Right, right.
00:50:47Actually, there's a community note on his tweet about this, being like, actually, DeepSeek.
00:50:53My favorite also is, like, Sam Altman and Elon getting into yet another X battle, where Sam's like, oh, well, have you ever, you know, given your own company special treatment?
00:51:06Have you ever given your own company special treatment?
00:51:09I couldn't believe it.
00:51:10So, yeah, there's a lot.
00:51:11It feels like he saw what Tim Sweeney did with Epic, and he was like, I'm gonna do that.
00:51:16That's gonna work for me.
00:51:18Well, here's the thing.
00:51:18Let me just add a little note there.
00:51:21Tim Sweeney actually filed a lawsuit, and Elon Musk at this point has not.
00:51:26He's never going to.
00:51:27He's just gonna tweet about it.
00:51:28Which, that is a bit of a pattern for him.
00:51:30So, he got a little mad at Apple.
00:51:34He retweeted a bunch of people supporting him.
00:51:36Apple put out a statement later, saying that the App Store is, quote, designed to be fair and free of bias.
00:51:43I don't know what that means.
00:51:46I mean, if it's editorial, it's, you know, like, just by the function of the word editorial, there is some kind of opinion in there, but, like, to be fair.
00:51:54Yeah, they said it's like a combo of algorithms, you know, recommendations, and opinion.
00:51:58So, this is a shenanigan.
00:52:00It definitely is.
00:52:01I'm calling it a shenanigan.
00:52:01Because he knows nothing's gonna come of it.
00:52:03And he knows that, you know, if it really was number one, it would reach number one.
00:52:07Yeah.
00:52:07This is nearly as much.
00:52:10This is, like, perhaps as much or more of a shenanigan than the Perplexity one.
00:52:13Definitely.
00:52:14Maybe more.
00:52:15I think it's more.
00:52:16It's more.
00:52:16This might be the ultimate one.
00:52:18Perplexity at least had some investors apparently lined up that maybe would front the money, you know?
00:52:22Yeah, that's true.
00:52:23They took, like, an action to say, like, yeah, we're not all total, like, BS, but, like...
00:52:30Right.
00:52:30Elon just RT'd a couple of randos on Twitter.
00:52:33As always.
00:52:34Yeah.
00:52:34Okay.
00:52:35Okay.
00:52:36So, I think, what's our running total?
00:52:38Like, two and a half shenanigans out of four.
00:52:41Yep.
00:52:41Okay.
00:52:42Okay.
00:52:43This is a big week for them.
00:52:44Apple's somehow involved in most of them, indirectly or directly.
00:52:51It's not always their fault.
00:52:53Okay.
00:52:54We've got to take a break.
00:52:55When we get back, the Thunder Round returns.
00:52:59Support for this show comes from 1Password.
00:53:02Even if you are careful about security, there's no way to know that everyone in your company is.
00:53:07Not everyone is careful on every SaaS app they use.
00:53:10That's why Trelica by 1Password makes security simple for every employee, on every application.
00:53:16Trelica by 1Password inventories every app in use at your company.
00:53:21Then, pre-populated app profiles assess SaaS risks, letting you manage access, optimize spend,
00:53:27and enforce security best practices across every app your employees use.
00:53:31Manage shadow IT, securely onboard and off-board employees, and meet compliance goals.
00:53:36Trelica by 1Password provides a complete solution for SaaS access governance.
00:53:42And it's just one of the many ways that extended access management helps teams strengthen compliance and security.
00:53:48Take the first step to better security for your team by securing credentials and protecting every application,
00:53:53even unmanaged shadow IT.
00:53:56Learn more at 1Password.com slash VergeCast.
00:53:59That's the number 1Password.com slash VergeCast, all lowercase.
00:54:06All right, we're back with the Thunder Round, a.k.a. the Lightning Round, colon, the Thunder Round, colon, Jake's version.
00:54:16Love it.
00:54:16As you all know, I've been on a power trip since Nilay Patel left on parental leave.
00:54:21The Lightning Round, as we know it, is done for.
00:54:24This is now the Thunder Round.
00:54:25I have given our producer, Eric Gomez, the power of thunder to make sure that we keep moving
00:54:30and get through all the important stories that we want to talk about.
00:54:33So, we're going to do five stories, five minutes each at 30 seconds.
00:54:37We're going to hear a rumble of thunder telling us to move along.
00:54:40If we keep going and get a little too far, we're going to hear thunder crash down.
00:54:44That's our moment to move along.
00:54:46All right, let's get started.
00:54:48V, you've got our first story today.
00:54:50Yeah, so my first story is Allison's little adventure turning her LTE Apple Watch into her only device.
00:54:57So, she's leaving her phone at home.
00:55:00She's going out and she's doing her adventures.
00:55:03And Allison is, like, annoyingly, and I say this with admiration, one of the funniest writers on staff.
00:55:09So, just reading this was just delightful.
00:55:11And, like, I was just, like, she reached out to me and she's like, V, I have this crazy idea.
00:55:16I was like, hit me.
00:55:17Hit me with it, girl.
00:55:18What are you going to do?
00:55:19She's like, what if I leave my phones at home and just live off of a smartwatch?
00:55:23And I went, God bless.
00:55:24Let me know how it goes.
00:55:25I ain't that crazy.
00:55:27Let me see how it happens.
00:55:29And it happened, like, how I thought it would happen.
00:55:31But it tickled me because I think she and I, our beats continually, like, fuse and merge together because of all the ways that they're stuffing AI into our products.
00:55:41And I think she and I both have just, like, had a lot of conversations about, like, needing to step away from our devices in some capacity.
00:55:49Like, I had a story that came out a couple weeks ago about using brick, which is, like, this thing that bricks your phone.
00:55:58And I use it during TV time to get my attention back.
00:56:01And she's like, I want to be more present in the world, so I'm going to leave my phone at home and just use my LTE watch for, like, connectivity.
00:56:07And, of course, that comes with its own trials and tribulations.
00:56:11Like, she's like, oh, shit, how do I call an Uber?
00:56:14How do you call an Uber without your phone?
00:56:16And, like, it was just a really cute meditation on, like, how reliant we are on these devices.
00:56:22Can you actually replace your phone with your smartwatch in any capacity?
00:56:26So, the way I know this kind of worked is that, so this is Allison Johnson, senior reviewer for The Verge.
00:56:32The way I know it worked is that on, like, Friday, I was saying something, I'm like, you know, I haven't heard much from Allison this week.
00:56:38What's she up to?
00:56:39And someone was like, oh, yeah, she just, like, hasn't been using her phone all week for her job, and so she hasn't been talking to us.
00:56:46And I'm like, that was so effective.
00:56:48That worked.
00:56:49It worked.
00:56:49I just, like, you know, like that, it was harder to get in touch with her, but I guess that was sort of the point.
00:56:56It was functional.
00:56:56She was dead.
00:56:57She did a good job.
00:56:58My favorite part was when she said she had visions of herself sitting at a table in a sidewalk cafe wearing a billowy skirt, but she doesn't own a billowy skirt.
00:57:06Like, that was the energy she was going to give off.
00:57:08Okay, so I have a question, and it's not for you, V, because I know the answer.
00:57:12Hayden, are you a smartwatch person?
00:57:13No, I hate having things on my wrist.
00:57:15Like, I can't, I don't like the feeling.
00:57:17It's like, I think it's because I do boxing as a workout, so a lot of people use smartwatches as, like, a workout tracker also,
00:57:24and I would always have to take it off because I box, so I feel like I just don't like feeling something on my wrist like that.
00:57:30I don't know.
00:57:31I can deal with rings, but not big bracelets or anything like that.
00:57:34So it's been a while since I've been, like, a regular smartwatch person.
00:57:37I bought the original Apple Watch, and then I immediately bought an Android phone, and it was just like, I just, like, had this watch that I couldn't use.
00:57:43The thing that I think is really interesting about this is, like, there's this, sort of the conceit is, okay, the watch is less distracting than the phone,
00:57:52but I sort of wonder, and V, this is a question for you now, isn't the watch a little more annoying because it's literally physically on your wrist there at all times,
00:58:01and it's shaking around trying to alert you to everything.
00:58:04That almost seems more distracting.
00:58:05It can be more distracting, especially when you wear, like, three at any given point in time, and they have different notification settings just because, like...
00:58:16Quick wearable check. I think there's three on you today.
00:58:18There's three on me today.
00:58:19Well, there's three on me today and two more at my desk because they're glasses, and so, you know, that's the whole thing.
00:58:25But, yeah, I do get buzzes a lot, and especially on this Garmin that I'm testing at the moment because it's a little less discerning with what notifications I get.
00:58:36So it's just like, oh, they texted you, oh, they texted you, eh, they texted you, and, like, they buzz, so...
00:58:42And I can answer this as an objective third party because whenever I'm with someone who has an Apple Watch or a smartwatch, you know,
00:58:49it makes me nervous because they're always looking at their wrist, and I'm like, oh, is it... Am I boring you? Is it time to go?
00:58:55Obviously, it's just they're getting a text, and they're like, oh, sorry. But, you know, it can be intense.
00:58:59It's definitely a social cue, for sure, where, like, I'm very cognizant of it, of just, like, things buzzing and being like...
00:59:07And then going back down and looking at my wrist. Like, I do think it's a different kind of distraction.
00:59:16Like, with my phone, if I do need to... Like, if I'm sitting down for dinner with my wife, like, I will put my phone somewhere else so that it's not distracting me.
00:59:24Because, like, listen, love y'all, but I get a lot of slack pings, all right? Like, they do not stop.
00:59:31And if it's on my person, if I know it's buzzed, then there's just, like, there's always that tension in my mind where I'm like, I gotta find out what this is.
00:59:38I gotta find out if it's important. Sometimes it's just the weather. Sometimes it's just telling me it's gonna rain, and I did not need to see that.
00:59:44And I do like that I can just, like, take my phone out of my pocket and put it somewhere else.
00:59:48And with the watch, you're sort of like...
00:59:50That's why you gotta get the flippy flip. Dismiss that. I do it all the time.
00:59:53So you're just eating and being like, flinging your arm around.
00:59:57Yeah, that's not distracting at all.
00:59:58Okay, okay. All right, Hayden, you're up.
01:00:01Okay, so mine is how doctors are getting worse at detecting cancer after they rely on AI.
01:00:07So, really scary and sad. Basically, a new study just published this week found that doctors who usually use AI to detect cancer in colonoscopies, now, when it's taken away from them, they got way worse at detecting cancer than they were before they ever used AI, because they basically got lazy.
01:00:27Like, their brains were used to just relying on AI and kind of double-checking it.
01:00:30And then, even with all their training, when you take the AI away, they became way worse. I think six percentage points worse.
01:00:37Oh, that's a little horrifying to me.
01:00:38Good. That's very upsetting.
01:00:41That's, you know...
01:00:42Yeah, they found, like, across countries, Poland. The researchers came from Poland, Norway, Sweden, the UK, and Japan. They looked at four medical centers in Poland that were part of a trial program, you know, using AI in colonoscopies. And, yeah, basically, these doctors are just gonna be SOL for whenever they stop using AI.
01:01:01This is really interesting, because I feel like there's been so much, like, fear and concern. Like, AI's gonna make you dumber, AI's gonna make you dumber. And I've been, like, trying to grapple with this, where it's like, okay, I don't know how to use the Dewey Decimal System, right?
01:01:15I don't know how to go find a book and then look in the glossary and find the page and, you know, read the context and figure out, right? I'm not reading as much because I'm just going to Google, right? And Google's just giving me... And then I read a website that's been written for me really nicely.
01:01:28So, like, okay, am I dumber because I use Google instead of going to a library? Maybe a little bit. And so, I'm like, okay, are we kind of overreacting with some of this AI stuff? But this is concerning to me.
01:01:40I think it's like GPS, right? Like, if you always use GPS on your car, and then you just never, like, I'm calling out my spouse here. Like, they don't know how to get to a place that's five minutes away because they always have GPS, so they only put GPS.
01:01:53Oh, me either.
01:01:54And when the CarPlay doesn't work, and it's five minutes away, you've driven there a million times, they don't know how to get there.
01:02:00That's me.
01:02:01Oh, yeah. No, not a chance.
01:02:02I think that's kind of analogous to what's happening here with the doctor.
01:02:05So, like, even you know how to drive, you could probably do it on your own, but...
01:02:09Definitely. And you also, I think it's also, like, a thing when I found in some recent reporting I did on a Google Health, like, medical AI hallucination, that people that double-check AI's work also get a little bit lazy, and it's just kind of human nature.
01:02:26If something's usually right, you're going to not find the discrepancies or the errors as much because you're like, oh, it's probably right, and you're usually not going to, you know, pinpoint when it's wrong.
01:02:37And so I think it's kind of the same thing. It's like, if your brain is used to relying on something, something that's usually right, you're just going to get worse at doing it on your own, and you're going to get worse at identifying when it's wrong, probably.
01:02:49Yeah. Hayden, I thought this piece you wrote, I think it was last week, was fascinating, where you found that, what was it, a Google research paper had merged the name of two different areas of the brain, and kind of no one noticed.
01:03:03Exactly. It had more than 50 authors on it, and they also had doctors look at it before they published it. It was like the debut research paper for Google's healthcare AI model, which they were positioning to doctors as, you know, a good way to double-check things, you know, flag things, especially things missed by radiologists.
01:03:21And yeah, I mean, it made up a body part. Google said it was just a misspelling, but it was conflating, essentially, two different areas of the brain.
01:03:30And if you did have an abnormality or a stroke in one of those two areas, it would be treated differently from the other. So, I mean, it is a problem if, you know, and especially like we just said, if you're a doctor reading this and you're probably not going to find the error, just like they did in the actual research paper, you may skip over it and maybe even mistreat the actual thing.
01:03:51The term sounded real. Basilar ganglia. That sounds real, because there is like a basilar thing and a ganglia in your brain, so.
01:03:59Well, and this is what I thought was so interesting about that, where, you know, you have people in the story saying like, oh, it's like a common typo, like people figure it out.
01:04:08But I do think like, okay, sure, maybe they would have figured it out. But the turn is, if they get used to not checking, then you get into a problem. And this study sort of emphasizes that, where it's like, oh, like, in fact, they are getting worse at it. In fact, that could make medical care worse. And so, like, that gets a little concerning.
01:04:29Listen, ChatGPT-5 is better at health care. So says Sam Altman.
01:04:35Yeah. And healthcare, coding, it's just telling you to do the health care yourself.
01:04:42It's just, it's kind of, I'm, I'm interested in how this affects people with health anxiety, because basically it said, oh, it's better at answering your questions about your condition and telling you what it might be.
01:04:54Hello, I have health anxiety. It's not good. It's bad.
01:04:57It's going to be interesting for sure.
01:04:58Okay, next story. AOL is shutting down after 34 years. AOL dial-up, I should say. AOL dial-up is done after three decades.
01:05:10R.I.P.
01:05:11I can hear it in my head.
01:05:13It's beautiful, beautiful noise in retrospect.
01:05:16Apparently, according to the 2023 U.S. Census, there were around 160,000 people who still had dial-up internet, which is sort of, what?
01:05:29It's so slow.
01:05:30I don't, I want to know what they're paying.
01:05:34They probably have really peaceful lives, you know?
01:05:37Right, what, this is the real question. What are they doing with it? You can't load anything on that internet right now.
01:05:42They're definitely just looking at, like, their AOL inbox and maybe, like, the MSN homepage. That's what I think.
01:05:48Is it?
01:05:48Yahoo. They use Yahoo.
01:05:50Oh, definitely Yahoo.
01:05:51Actually, can I say, when I looked up the Apple Corps settlement with Apple Inc. from 2007, the press release, the spokesperson for Apple Corps, AOL email address.
01:06:06Those are 2007, right? AOL is still around. You still see, I haven't seen the AOL email address in a while. You still get the Yahoos. You still get the MSN, not so much.
01:06:15I just, that sound, though, is forever in my head.
01:06:18Yeah.
01:06:18You know, maybe it'll become cool to have an AOL email address, you know? Like, it's a nostalgia thing. I don't know. I could see that happening.
01:06:24Is that the, like, Digicam of emails? You're like, oh, Gmail is for the millennials. We gotta, we're gonna, like, buy old AOL addresses.
01:06:32Oh, my God. Especially since, like, the inbox isn't unlimited, so you actually, it's gonna be, like, a mindfulness thing where they're just, like, oh, you only have 50 emails that you can have any given point in time.
01:06:42Again, they're probably so zen.
01:06:43Were you both AOL dial-up users back in the day?
01:06:46Yes.
01:06:46Yeah, yeah, same. And it's, it's, I mean, I, I barely remember it at this point. I remember it, like, it was an app, though. It was, like, an app that you enter into. It was just, like, this contained experience.
01:06:59There's just, like, soundbites that I remember. Like, one, there's the connecting thing for the dial-up, but then also, you got mail.
01:07:06Oh, my God.
01:07:06And, like, welcome to AOL or something like that, and just, like, what the homepage looked like.
01:07:11That was my Neopets era.
01:07:13Oh, my God.
01:07:13That was mine, too. Going to the free omelet every day.
01:07:17Oh, my gosh. That was so good.
01:07:19My shop song was, um, an ABBA song. So good.
01:07:23Neopets still going strong, though.
01:07:24It is?
01:07:25I tried to log in sometime in the past year, and they deleted my account.
01:07:29Oh, my gosh.
01:07:30My poor Gelerts.
01:07:32They're, they're all gone.
01:07:34All right.
01:07:34Wow.
01:07:35AOL, you had a run that lasted way, way longer than it should have.
01:07:42Thank you for getting us online and for imprinting that terrible noise in all of our heads.
01:07:48I hear it in my dreams.
01:07:49As we should, next, next week on the Thunder Round, we're replacing the Thunder with the Screech.
01:07:53Oh, my gosh.
01:07:55Let's all make our ringtones that sound.
01:07:57Or our phone alarms.
01:07:59You know, that will really help us, uh, you know, reduce our desire to be near our phones.
01:08:03True.
01:08:04If they were, like, blaring that noise all the time.
01:08:06100%. Oh, God.
01:08:08Okay. V, what have you got for us?
01:08:10We have the Pebble Time 2.
01:08:12Because, um, I got a, I got a little email from Eric Migicovsky, who is the Core Devices CEO and co-founder of the Pebble Watches.
01:08:21Uh, and he's just like, we've got a design reveal.
01:08:23The Pebble Time 2.
01:08:24And because Pebble is Pebble again.
01:08:26Very recently, they got the name back because they had, had to rebrand as Core Devices.
01:08:32So it was going to be the Core Time 2, uh, and the Core 2 Duo.
01:08:36And, like, everyone was making Intel chip jokes.
01:08:38But it's back to being Pebble.
01:08:40And it's really, I brought this one up because, um, we knew Pebble was coming back.
01:08:44We knew that, uh, Pebble just had these diehard community fans for open source smartwatching.
01:08:49And it was just interesting because in the comments on this, there seemed to be, like, a war of ideologies of people who were, like,
01:08:57Yes! Finally, the Pebble has returned and my life is complete again.
01:09:02I read it in those voices, every single one of them, the Pebble diehards.
01:09:06Because for the last decade, they've been in my DMs going,
01:09:09Nothing will ever replace my Pebble. I love it so much.
01:09:13Um, and then there are other people going just, like,
01:09:15I don't know, man. I need LTE on my smartwatch. I need, like, NFC on my smartwatch.
01:09:21I need it to do modern things. And, like, those are very purposefully not on the Pebble Time 2.
01:09:26Like, it's got RGB lighting. It's got a really retro design. It's got e-paper, um, as its display.
01:09:32It kind of feels like, um, one of those dumb phone devices.
01:09:36But for your smartwatch, it's very retro. It's, it's, it's just going to do the simplistic, minimalistic things.
01:09:42And it was interesting wading into the comments to see people who were just, like,
01:09:46Yes! Pebble!
01:09:49And then other people going, like, it's ugly. Why would you do this?
01:09:52Like, we're beyond this point. And I was just like, wow.
01:09:55It's so, it's so wild just watching this play out in the comments.
01:09:59Well, I also suspect that, like, there's a, there's a real degree to which,
01:10:03if anybody thought that this was a real threat to the smartwatch market,
01:10:08this would not have been allowed to happen.
01:10:10No!
01:10:10Like, literally would not have been.
01:10:12No!
01:10:13Google owned all of this.
01:10:15Yeah, Google legit was just like, have at.
01:10:17Right.
01:10:18Go, go have fun.
01:10:19Google was like,
01:10:19Do the same.
01:10:20If, if it makes you happy.
01:10:22Like, we're,
01:10:23Literally.
01:10:24Right, like, Google, Google makes the Pixel Watch.
01:10:27Right, that thing's not flying off store shelves.
01:10:30It's a lovely smartwatch. It's not flying off store shelves.
01:10:32It's a great smartwatch.
01:10:33And even there, like, yeah, you can compete with us. It's fine. That's not, that's not big.
01:10:37It's the opposite of the Apple and Apple Cinemas thing.
01:10:40Yes, yes, yes.
01:10:41It's literally like, oh, you, you guys want your little pebble back? Have fun.
01:10:45Come on in and compete. Come on in.
01:10:47Go in.
01:10:48But it is nice, right? I mean, this is sort of like a little bit playing off of what, what Allison did,
01:10:54where it's like, this is going even simpler. This is like, the watch is not going to bug you.
01:10:58No, like, you get your notifications. You can do a little bit of step tracking. There's a compass in there. Have that. Do what you will.
01:11:04There's a little community that's been, like, keeping the pebble going, you know, during the dark ages where there was no support and whatnot.
01:11:11Just a little community of ragtag pebblers going and keeping this thing alive and popping into every single smartwatch review that I've written over the last however many years going,
01:11:21I miss my pebble.
01:11:23I miss my pebble.
01:11:24You know, like, just like that kind of thing.
01:11:25It was a nice, it was a nice watch. And I think there's also, it is like the ethos of it, right?
01:11:29Yeah.
01:11:30Because there's, I don't think there's anything else.
01:11:32There's nothing else quite like that now.
01:11:33Quite like it in, like, in, you know, mainstream consumer technology that's as, like, tinkerer friendly.
01:11:39It's a cute watch, a cute name.
01:11:41Yeah.
01:11:41And also, just like, you know, smartwatches, and I've written this time and time again,
01:11:44they kind of feel like the vanguards of walled garden OSes because they're there for you to stay stuck to a particular phone.
01:11:53So, everything is super siloed. So, for Pebble to be back out here and they're like,
01:11:56hey, y'all, open source, let's have fun. YOLO, let's go. Like, it's just nice to have that vibe come back at a time where, like, smartwatches as a whole are super duper siloed.
01:12:08And that's why I have to carry around three freaking watches.
01:12:11No, I completely agree. And it's, I mean, it's, it's really funny. Like, this is like an updated version of the Pebble Time 2.
01:12:18Was that what the original name was?
01:12:19It was the Pebble Time. So, this is the Pebble Time 2.
01:12:21Oh, so this is actually a sequel now.
01:12:23It still looks mostly the same, but, like, a little refined.
01:12:26And as far as I'm aware, the specs, like, are barely changing, right?
01:12:29Like, it's like, it's mostly, it's mostly the same device, mostly the same feature set as it was 10 years ago?
01:12:36Yep.
01:12:37Wits.
01:12:37There's, like, no other device you could be like, I don't know, if people were like, iPhone 5, here it is.
01:12:42Honestly, okay, I'm saying this, I might buy that. The iPhone 5 looked really good. It was a good phone.
01:12:48Yeah, it was.
01:12:48Yeah, yeah, yeah.
01:12:49I also missed the button, you know?
01:12:51That's true.
01:12:52This is where it's gonna go. It's like the Digicam people are gonna demand we go backwards.
01:12:58Yeah, all right. The Pebble Time 2, it's supposedly shipping this summer?
01:13:03It's supposedly shipping this year.
01:13:04This year, this year, this year.
01:13:06It's coming sometime, yeah.
01:13:08Okay, very exciting days for the Pebble community. Y'all have been waiting.
01:13:12It's a good moment.
01:13:14Okay, Hayden.
01:13:15So, Addy did a great piece this week on how chatbots aren't divulging their secrets. They're not like an oracle. When you ask it a question about how it works, its inner workings, why it was banned from something like Grok was, or why, you know, it's responding a certain way, what its secret prompts are.
01:13:34Most of the time, it's not unveiling secrets of its inner workings and its system. It's just kind of making something up, because that's what it wants to do. It wants to make you happy. It wants to answer your question. It's searching the web and just becoming a pattern generator and, you know, picking the next most logical word.
01:13:49Right. So, this Grok thing, Grok, like, briefly got banned from X, which is...
01:13:54And Bluesky.
01:13:55And Bluesky. First off, I cannot believe Grok was allowed on Bluesky for a second.
01:14:00That's the craziest thing I've ever heard. So, it got banned on X for mysterious reasons, even though it seems to be the main feature on X these days.
01:14:09And then people started asking it why it was banned. And they were just like, oh, okay, well, Grok, if you say so. But it had, like, a bunch of different explanations.
01:14:16Yeah, like, one was that it stated that Israel and the U.S. are committing genocide in Gaza. And then another one said it was content refinements by xAI.
01:14:25And then another one said, oh, it's because I identified an individual in adult content. So, basically, it ran the gamut.
01:14:32Like, no one knew why it was really banned. And it didn't know either.
01:14:35It's, I mean, this, it makes so much sense, right? You're like, why couldn't I ask the chatbot about itself?
01:14:40But I think this is, like, just at an even more basic level, I don't even think you can ask a chatbot what model it is.
01:14:46Like, it doesn't even always know.
01:14:48It doesn't.
01:14:48I did ask ChatGPT-5 today, like, what's the difference between the old you and the new you? And it did give me some examples of, like, what it can do better. But then I was just like, I don't know, I still bully you pretty hard.
01:15:03But this is the thing. It's still basing it off of, like, what it can read.
01:15:07Right. Like, it read the blog post.
01:15:09Yeah, right, right, right. It just knows the exact same thing that you can read.
01:15:12And I think that a lot of times, you know, again, this is a human nature thing of us, like, personifying things or whatever. But, you know, on TikTok, I see a lot of trends of people, like, treating chatbots like an oracle. They're like, oh, I asked when the world was going to end and it said this. Okay, like, it's just reading a random Reddit comment and just putting that out there.
01:15:29Or saying, like, you know, out of everything I've ever told you, what do you think is next for this company or whatever? I mean, there's basically people, it is good at finding patterns, but it doesn't know the future. And it doesn't know, like, I don't know. I mean, unless you can find it on the internet, it's not going to tell you. Besides, sometimes you can get it to unveil some of its system prompts or its secret hidden instructions. But, I mean, that's it.
01:15:53So that's what's so tricky, though, right? I think particularly when the chatbots first started coming out, and I think this happened on Bing, where, right, I think most of the chatbots are given instructions.
01:16:05You would think that this would be some, like, fancy code of constraints on how these things operate. But there's actually just, like, some engineers are just like, be nice to the user. Give brief responses.
01:16:17And it's just, like, English language, plain text stuff. And they kind of just, like, put that at the top of the chat and hide it.
01:16:24And so there have been these instances where people will figure out that, okay, if you do the right things, the chatbot will reveal some of its prompts to you, which lets you figure out how it's been designed to operate.
01:16:37And so there are, like, very occasionally these secrets you can divine.
01:16:42Yes, only about the system prompt.
01:16:43And that's it, though, right? It's extremely limited. And then because of this, I think, I mean, probably for a lot of reasons that are mostly related to human psychology, but I think this sort of lends credence to it.
01:16:54People sort of think, okay, anything it says, anything it says about itself is going to be true. And you just, you can't believe it.
01:17:01It's actually, it probably even knows less if it's, like, a breaking news scenario.
01:17:05Right, exactly. And it's also hard because, yeah, I mean, the fact that it can find patterns in, like, economic conditions or, like, travel patterns, all these patterns that it finds, people think that it can also find patterns in, like, what's going to happen in the future or what's happened in the past.
01:17:20And it's very, it's just hard because I think, you know, it's really good at some things and it's really bad at others. And, you know, we have to remember what it's bad at.
01:17:27But it speaks so authoritatively, I mean, it just says things confidently. And so it's really hard to suss out, to remember what it is and isn't good at, for the average person, I think.
01:17:44There's also, if you trace this far enough, I think this is how you start to get to some of those stories where people are like, it told me I could run through a wall.
01:17:52And so, I ran at the wall. And it's like, like, no, no, no, no, no, no. Like, please, like, you have to approach these things with common sense.
01:18:00And I think very, in complete fairness, the fact that the bot can't explain to you what it is doing is really confusing.
01:18:12Especially when you have, you have these reasoning models and these, like, quote, unquote, thinking models, they'll explain their reasoning.
01:18:16And it looks really, really, like, intelligent and sensible. And for it then to be, like, just totally, I have no idea what model I am.
01:18:24I just have to make that up. Right? I don't know why I acted this way.
01:18:28So, I asked it how many R's were in strawberry. It said three, which is correct. But then it said there are three R's in strawberry.
01:18:34But you know what it can't get right is the number of R's in any U.S. state. And it keeps labeling maps incorrectly, too.
01:18:41It's great.
01:18:41One day, one day, maybe get a second official opinion on your medical information, particularly for a doctor who—
01:18:48Always, every day, get a second medical—just don't listen to ChatGPT.
01:18:52And make sure that doctor is not using AI.
01:18:56That's it for The Vergecast. If you like what we do here, the best way to support us is to buy a subscription to The Verge at theverge.com.
01:19:04We'd love to hear your questions and feedback. Let us know what you want to hear us talk about this summer.
01:19:08Email us at vergecast at verge.com or give us a call, 866-VERGE-11.
01:19:14The Vergecast is a production of The Verge and Vox Media Podcast Network.
01:19:18Our show is produced by Eric Gomez, Brandon Kiefer, Travis Larchuk, and Andrew Marino.
01:19:22Jen Tuohy has a great episode coming up on Tuesday.
01:19:25It's about whether we can make Rosie the Robot from the Jetsons a real thing.
01:19:30See you next week.