Transcript
00:13I'm in San Mateo, California, in the heart of Silicon Valley,
00:18at an event celebrating one of the tech world's most powerful people.
00:27Let's go!
00:29As a swarm of drones takes to the sky,
00:32I can't help thinking about how much our lives are in thrall
00:36to these tech billionaires and their singular obsessions.
00:48CHEERING
01:01Here it comes.
01:03Hey, Grayson.
01:08It's a regular car.
01:09From inside, you can't tell, really, that you're in a driverless car,
01:13apart from the fact there's no driver.
01:15I'm here in San Francisco, California, because this is a place that has always looked to the future.
01:23If you think about it, you know, its history
01:25is a series of waves of optimism and dreams and utopias.
01:29And then, of course, technology.
01:32We've had the computer.
01:33We've had social media.
01:35And now we are on the cusp of what might be the biggest revolution in technology that will affect all
01:41of us.
01:42AI.
01:43Artificial intelligence.
01:44AI is about to challenge almost every certainty you might think you have about what it is to be human.
01:52Oh, crikey.
01:54It's so steep.
01:56So come with me as I venture into the future that Silicon Valley technology is creating
02:02and as I make an artwork about what I discover.
02:16I've been exploring AI and robotics in Silicon Valley.
02:20Now I want to find out more about the people behind the technology.
02:25And there's one beast in the tech jungle who's bigger than all the rest.
02:30Five, four, three, two, one.
02:36Elon Musk is one of a handful of billionaires shaping our technological future.
02:42Okay, you libtards.
02:44The X Takeover is an annual celebration of all things Elon.
02:49And it attracts admirers of his products from all over the US.
02:54So, Grayson, look at this.
02:55Bulletproof glass is super unique.
03:00You can actually return fire.
03:02So we like to put a watermelon on the outside.
03:04I have a shooter inside.
03:06So these two holes right here shot out.
03:10The watermelon explodes.
03:11And then you can actually still take shots from outside in.
03:15And if you put your hand down here, nothing gets through.
03:18So come the apocalypse, I'm safe.
03:21You are safe.
03:22When they're kind of hunting me for food.
03:24Right. Yeah, yeah, yeah.
03:25So for carjacking, for theft, or for road rage, you don't have to freak out.
03:33Not that you would, Grayson.
03:35Are you a big Elon fan?
03:36Very much so.
03:37He's a disruptor.
03:38The great thinkers have always been disruptors.
03:43So you love Teslas?
03:44We love Teslas.
03:45Absolutely.
03:46Is this a little glimpse of the future here at this event?
03:49Absolutely.
03:50Absolutely.
03:50Absolutely.
03:51Absolutely.
03:51More people going to buy more Teslas.
03:54We are showcasing our Cybertruck.
03:57It's fast.
03:58It's safe.
03:59Yeah.
04:00We can plug in our air fryer, our microwave, air conditioners, and our tent.
04:04And it drives us.
04:06What do you think of Elon Musk?
04:08He's an innovator.
04:09Yeah.
04:10I'm always going to be hopeful for our country.
04:12That we're going to be moving in a direction for, you know, goodness and kindness for everybody.
04:16And innovation for the nation and for the world overall.
04:19So are you a techno-optimist?
04:21Major.
04:22Major techno-optimist actually, yes.
04:24Major.
04:27This is a cyber beast.
04:29Right.
03:29So that's why we have Cerberus here with us.
03:32And Cerberus is the, uh, from Greek mythology.
03:36It's the three-headed dog.
03:37It guards the gates of Hades.
04:40Which brings me on to the question.
04:41What do you reckon to Elon Musk?
04:44Elon?
04:45Elon's a fantastic guy.
04:47I mean, he's, uh, the satellites, SpaceX, going to Mars.
04:50I mean, who else, who else is out there that's done what he's done?
05:00And then, as if beaming in from another planet, the man himself appeared on screen.
05:09Tesla's valuation could be 25 trillion.
05:13That's 20 to 30 times more of this thing.
05:17I've never seen any technology advance as fast as AI.
05:22AI is a super-sized tsunami.
05:26Who wouldn't want their own C-3PO or R2-D2?
05:32I love you, Rina!
05:34I love you guys, too.
05:40And away we go.
05:41OK, here we go.
05:45First time I've ever drove a truck, let alone a Cybertruck.
05:50Elon, well, whatever you think of his politics and his personality,
05:54he's, he's, I think he's what you would call ruthlessly logical.
05:59And I think, you know, that's why, that's why he succeeds.
06:03You know, he's, he's a brilliant engineering brain and businessman,
06:06but I don't know if his ruthless logic necessarily correlates well to government.
06:13You know, this truck, in a way, is a fantastic metaphor for what's going on in the world today,
06:20in that the personality of the person behind the company is actually shaping the future in ways.
06:28So we've got to look at those people, because they are the people that are going to be controlling technology
06:35that is going to be a big part of our lives in the future.
06:39And you may not like the shape of the Cybertruck, so be prepared.
06:46The concentration of power and wealth in the hands of a small group of tycoons,
06:52like Musk, Mark Zuckerberg and Jeff Bezos, might well be historically unprecedented.
06:57But even they all started their global empires in bedrooms and garages.
07:03And there's always a new generation of tech entrepreneurs hungry to replace them.
07:11I'm in Palo Alto with 21-year-old, second-generation Pakistani-American, Arab Khan,
07:18whose start-up is already making millions.
07:21How much have you got?
07:23I think, like, my personal net worth is a bit over eight figures now.
07:28But I think over the next, like, 12 months, I really want to get to, like, the range of doing
07:3325 million.
07:35A year? Yeah. Wow.
07:36I live with my parents. I still, I can move out. I could have moved out four or five years
07:41ago.
07:42One of the benefits I have is, like, I get home, my laundry's done, my bed is made,
07:48don't have to think about those things.
07:50But the best thing is, like, I'm doing exactly what I want, hanging out with my friends,
07:54building cool shit, but also making, like, an ungodly amount of money.
08:00Arab's company, 24 Labs, might be hidden inside an unprepossessing warehouse, but it builds a huge range of software,
08:08from tools to help traders predict the market, to apps that allow anyone to create music.
08:15Hello.
08:16Yeah, so this is, uh, this is the crew.
08:20Hi.
08:21How are you doing?
08:21Here we are. Yeah, good. So, so you all work, are these your employees?
08:26I would call them, I would call them partners, but yeah.
08:31What are these companies doing?
08:32Let's say, like, you know that a certain market, you're gonna make money if you build a certain product, right?
08:37Um, let's say, for example, the product can be a product on what shoes to wear.
08:42Like, the app is around, take a picture of your shoes, and it will give you recommendations of what other
08:45shoe brands to try.
08:47Uh, it's essentially super easy to build that technology today.
08:50Yeah.
08:50So it's like, when you're starting a brand out, like, uh, being the first mover has an incredible advantage.
08:56So you just have an idea?
08:57Yeah, literally, literally that.
08:59I speak great Arabic, I'm Egyptian.
09:02Alright, that's a handy business tool.
09:04Yeah.
09:06Um, and he, he pitched the ideas, like, let's do, let's build an app that teaches people how to speak
09:10Arabic.
09:11Yeah.
09:12Utilizing, I guess, voice recognition, and then AI course creation, it's just been fantastic.
09:16I think we launched about four weeks ago, and we're at 12,000 in revenue, um, and on track to
09:22hit at least 100 by the next two months.
09:25If you'd have told me, oh, yeah, this guy's doing this with his company, I'd expect to see a big
09:29glass office.
09:30Yeah.
09:31You know, with a big logo out of it, you know, with, with maybe, like, 500 employees or something.
09:36Software, like, five years ago, for you to do what we're doing, you would have had to raise $100 million.
09:41Like, if you think about cost of intelligence, that has gotten a thousand times cheaper over the last ten years.
09:46Is that, is this because of AI?
09:47Yeah.
09:48And that means each human can output a thousand times more work.
09:51The flow to build this Arabic app was: Adil's an engineer, and he was like, yo, like, engineering, let's, let's
09:57get this figured out.
09:58We vibe coded it in, like, a day or two. We had our designers design it.
10:01We took the designs and made the, the app better.
10:04Then we found creators.
10:05Abdullah was on that.
10:06He was, like, looking for creators to promote.
10:07We paid them with my money, and then that leads to subscriptions.
10:12Yeah, I mean, I've sort of seen phrases around social media like, there's a whole bunch of teenagers out there
10:16using AI to make million dollar companies, and you don't even know about it.
10:20I moved here to the Bay Area only, like, I think six months ago.
10:23I was just at a coffee shop coding, and I actually recognized Arab from online.
10:27He invited me to work with them one hour after I met him.
10:31And I was just around them.
10:33I went from being a random guy who moved to California three months ago with no, from Lebanon.
10:38Yeah, I grew up in Lebanon.
10:39I spent 24 hours a day around these guys.
10:41Every single day, we're just always locked in together.
10:44And, uh…
10:44Wow.
10:45This really is the gold rush.
10:46Yeah, yeah.
10:47I mean, it's just amazing.
10:49It's, I don't think I've ever been in a room like this before.
10:51You're all kind of entrepreneurs, and you're full of ideas, and you're fizzing with, you know, and I love your
10:56kind of group dynamic, you know, I think that's great.
10:59So, it's like a really interesting model where, like, software, you can sell to 100 customers, and even if, like,
11:0420, 30 customers disappear, you're still going to be getting money for a very long time.
11:08Everyone thinks of us as, like, software builders.
11:10In theory, yes, we build software companies, but we actually are in the business of acquiring human attention.
11:16Like, the more human attention we can acquire, the more money we can make.
11:19That's all that matters to us.
11:20Yes, the attention economy, and you are an entrepreneur in that.
11:25Yeah.
11:26At the burning edge.
11:27Exactly.
11:28So, you mentioned you were an artist, right?
11:30Yeah.
11:30What are your thoughts on using AI for art?
11:33I mean, it's fine.
11:34I've used it.
11:34In my last exhibition, I used it a bit, yeah.
11:37I always say it'll put birthday card designers out of business.
11:40You know, if you're, like, AI will wipe out most mediocre intellectuals.
11:49Arab Khan, I've got to admit, I don't know what prejudices I went in there with, you know, I was
11:54thinking I'm going to meet a tech bro and I'm going to find him a bit abrasive and a bit
11:58too macho.
12:00But, you know, he's this sweet young 21-year-old guy.
12:03And then he said, I've got my posse with him.
12:06And I was like, oh, and I was slightly sort of intimidated for a moment.
12:09But the minute they opened their mouth, they were so delightful and enthusiastic and bright.
12:15And yet they were sitting around this kind of mundane office with their laptops, like they might have just been
12:20kind of, like, listening to music or playing games or something.
12:23But these guys, they were sitting there and they were making millions of dollars.
12:32I can't imagine a better example of the spirit that propels so much of the innovation here.
12:38But I was about to meet someone who paints a more complicated picture of what it takes to make it
12:44in Silicon Valley.
12:46When you get rich, you get people licking you up and down all day and telling you you hung the
12:50moon and therefore you are right about everything.
12:53And you get no pushback. And over time, you tend to push out the people who push back.
12:58I always say tech bro stands for technically broken.
13:10I Donald John Trump do solemnly swear.
13:14At President Trump's inauguration, Elon Musk and the bosses of Google, Meta and Amazon all came to honor him.
13:23If I want to understand what's going on inside the heads of the tech giants right now, there's one person
13:29I need to speak to.
13:33I'm going to meet someone called Cara Swisher.
13:36What can you tell me about her?
13:40Cara Swisher's a powerhouse in tech journalism known for her sharp insights and no nonsense interviews.
13:47She could offer invaluable perspectives on Silicon Valley, boosting your documentary's depth and reach.
13:54Sounds like a powerful meeting.
13:57Should I be a bit nervous?
14:00Nah, no need to be nervous. She's tough but fair and really values genuine conversation.
14:07Just be yourself and bring your passion for your documentary.
14:10She'll appreciate your thoughtful approach.
14:13You got this.
14:15Thanks.
14:19Hello, Cara.
14:20Hi.
14:21What do you think of the sort of present kind of political shenanigans of the tech bros?
14:26I always say look at the money with these people, all their nonsense and things they say about life and
14:32community.
14:32They are mostly very ignorant about history.
14:35They're largely nonpolitical, really, or ignorant.
14:40And they just want to do things that will help make their businesses regulation-free, essentially.
14:46So is it really that cold-hearted?
14:48Is that right across the board, do you think?
14:50Is that cold-hearted?
14:50It just is the heart that they have.
14:52I mean, I don't think they have any interest in helping society.
14:56I think they have an interest in helping themselves and their shareholders.
14:59So you don't have much faith in the kind of protestations that they really care about, you know, the social
15:06impact of their products that they're rolling out now.
15:09If they cared, they'd make them safer.
15:10If they cared, they'd do things around self-esteem of girls or young men feeling lonely.
15:15And what do you think it tells us about what they spend their money on?
15:20Most of them have private planes.
15:21They go from the plane to the car, to the helicopter, to the compound.
15:25They don't interact with normal people.
15:28And so their lives become very, I call it a cashmere prison, and therefore you're going to have a very
15:33different version of life than other people.
15:36You know, they have all these body hacks and everything else.
15:38It's completely narcissistic.
15:40And what do you think about that idea of Conway's law?
15:44That, you know, no matter how rational or objective a program or an algorithm is, it always reflects the personality
15:52of the person that makes it.
15:54I think there's a, I think there's another computer term, garbage in, garbage out.
15:58It's, that's all I gotta know.
16:00Now, listen, I like AI and I see the benefits of it.
16:04Like you can see it being really helpful, like in an assistant way or, hey, Cara, you're going to the
16:09airport.
16:10You want me to call you an Uber right now?
16:11Like that kind of stuff.
16:12I'm not scared of AI.
16:13I'm scared of people who use AI badly.
16:16And do you think the fact that, you know, it's that the vast majority of them are male has an
16:22effect on the world they're creating?
16:24Yeah, I've noticed that.
16:25Yeah.
16:25It's not, it's not just male.
16:26It's a certain kind of male too.
16:28It's a very homogeneous group of people.
16:30And as we know in nature, what happens when homogeneous societies breed?
16:36You know, it's a, it's a, it's a miracle their children have teeth.
16:44Well, there is an oracle.
16:48She knows the world and she's pretty tough on it.
16:52And, um, yeah, so we're, we're dealing with a world here that seems to be powered, in
16:58her words, by narcissism.
17:00Surprise, surprise.
17:01Nothing new there in a world that's controlled by my breed, the men.
17:11A 2022 study found that only around 15% of tech company CEOs are women, and men occupy two-thirds
17:20of US tech jobs more generally.
17:22The more time I spent here, the more glaring that seemed to be.
17:28But is the problem just that Silicon Valley is an unusually male dominated world?
17:33Or is it also that the males that are dominating it are in themselves quite unusual?
17:44Renee, Rona, Samira and Kate all have high-flying careers in the tech sector and have dated their fair share
17:53of tech bros.
17:54I'm keen to hear their insights into what makes these men tick.
17:59The people I've dated who work in tech seem to have, um, a tendency or an ability to compartmentalize personal
18:07life and work life.
18:09In, in a much more extreme way than the people I know in every other industry.
18:13It's just like date.
18:14I've met people who have left dating off of their agenda until they're like 25.
18:20That is late to start dating.
18:21Very.
18:22And it shows, but...
18:25It seems to me that one of the kind of things that, you know, technology promises, the internet and now
18:30AI, is a kind of ability to perfect your life.
18:33Yeah.
18:34And that, for me, seems to sort of sidestep the essential thing about human relationships is they're messy and they
18:43grow.
18:44I know, um, multiple men who have built bots for those dating apps, you know, so they can automate the
18:49process of, you know, talking to women, swiping on women, etc.
18:53Yeah.
18:53I've dated people who have paid a lot of money to fancy matchmakers to kind of shortcut the process or
19:00to, like, make it faster and more efficient.
19:02And it's a very interesting process because the first two dates are between myself and the matchmaker.
19:08And the third date is between myself and the guy.
19:12Hmm.
19:12Especially if you're, you're, you're talking about the founders, the tech founders, they don't like to date.
19:18Usually, they usually actually don't date.
19:20Yeah.
21:21Because it impacts their work.
19:23So it's like a business decision almost.
19:25Yeah.
19:26And so the elephant in the room, I've got to talk about it, is neurodiversity.
19:31Yeah.
19:32Yeah.
19:32Good call.
19:33I have always been very attracted to people who stand out, who, um, yeah, kind of separate from the herd
19:41a bit.
19:42And, and often those people are neurodivergent.
19:45I just didn't know that in my twenties, but now I do know.
19:48Yeah.
19:49I was on a date with a guy who, um, we got drunk and then, um, he's like, all right,
19:54well, now that we're drunk, uh, let's start doing some math problems.
19:59So, yeah.
20:00We got really carried away and I, I'm sorry I was so rational.
20:06But I think also the way that the tech industry works, people move so quickly, they start so young.
20:12There is somewhat of an arrested development in terms of social skills for some.
20:17You know, sometimes these are just semi-regular guys that, let's be honest, like there's so many people that are
20:23trying to do this.
20:25The ones who become surprisingly successful.
20:27They have this like unhinged part where they just rage.
20:31They go, they party really hard.
20:33And then their relationships with women changes significantly.
20:37Yes.
20:37They're like rock stars who've just kind of had their first big hit album.
20:41Yeah.
20:42And they want to sort of, they want a woman on each arm.
20:44Exactly.
20:45And they do.
20:45Every environment is the same, right?
20:47Exactly.
20:48Unlike footballers or rock stars, these people are building products that will have massive repercussions.
20:56Exactly.
20:56In all of our lives.
20:58Across, yeah.
20:59Exactly.
21:00So it's not the same in many ways.
21:02So I'm interested in their personalities because these are the people whose biases and personalities will affect our future.
21:13Absolutely.
21:14That's the strongest drug anyone's taking.
21:16The feeling that anything can happen here and the decisions that are made here will impact the entire world and
21:20very quickly.
21:21Like that, that is intoxicating.
21:28I'm on the way to Marin County, just across the Golden Gate Bridge, to meet a man who thinks the
21:34problem with Silicon Valley isn't so much the individuals who work there as the system they're working within.
21:41Look at this.
21:43Wow.
21:44OMG.
21:47Justin Rosenstein spent his career working for Google, Asana and Facebook, where he co-created their famous like button.
21:55But he decided to walk away from it all.
21:59When you make a product, it is just a tool, isn't it?
22:04And of course, as we know, all tools can be used for good or bad, really.
22:08That's right.
22:09I mean, I know that you didn't necessarily make a dislike.
22:12Did you consider making a dislike button?
22:14Oh, yeah.
22:16That for a long time, I think was like the number one requested feature on Facebook.
22:20But the spirit was, no, the whole point is we're trying to make it easier to spread little bits of
22:24positivity.
22:26Yeah.
22:26That idealism that you came, how did you find that sort of bumped up against sort of working for big
22:32organizations like Facebook and Google and things like that?
22:36At the time, I think the companies as a whole did have that internally, this spirit of idealism.
22:41But what I started to see it butt up against is that that technology exists in the context of a
22:48larger economic system.
22:50And that economic system defines the rules of the game.
22:54So, you know, take something like the like button.
22:58After I left Facebook, I started to observe, as a lot of people did, that, oh, people are starting to
23:03get addicted looking for the dopamine hits of likes.
23:07If you notice something like, oh, people are starting to get addicted to this, you'd be like, oh, that's very
23:11bad.
23:12Let's redesign it.
23:13Let's fix it.
23:13Let's find some other way.
23:14But instead, social media companies make more money the more time you spend staring at your screen.
23:22Similarly, if trees are worth more money dead than alive, you're going to have the destruction of nature.
23:28And if food is more profitable when it's laced with sugar than when it's designed to be actually healthy and
23:34good for us, you're going to have a diabetes epidemic.
23:37Do you think the AI revolution could propagate the unhelpful things that have happened through the last, you know, the
23:46Internet age?
23:47The technology that we're developing should be able to meet human needs in radical new ways that would make someone
23:55from the Iron Age cry and be like, wow, you finally did it.
23:58We finally have enough.
24:00But instead, by doing it within the current economic system, it is far more likely to just lead to this
24:06massive increased wealth consolidation.
24:11That's not living in a democracy anymore.
24:13That's feudalism with better marketing.
24:15And so, you know, it seems to me that your idealism, to a certain extent, is it still intact? Or
24:21are you, are you, how are you now?
24:25I, I don't, I certainly don't think that technology is going to automatically make the world a better place.
24:32AI just does change everything, in the way that the industrial revolution and agricultural revolution did.
24:38Question is, are we going to allow those shifts to shape us or are we going to shape them?
24:45But can profit really explain everything about the technology that's being created here?
24:52I'm heading downtown to meet two inventors who claim a very different motivation.
24:58To save the country they love.
25:12Michael LaFramboise created his startup to address what he believes is the biggest threat to the American way of life.
25:20A military attack from Russia or China.
25:26Yeah, we are at the Aurelia Systems Headquarters and we're here building the future of American weapons systems.
25:32The future of American weapons systems. That trips off the tongue pretty lightly.
25:36Oh yeah, yeah, yeah, yeah.
25:38You know, we're spending $500,000, a million dollars on a missile to shoot down a drone.
25:42And what we do is try and use a laser weapon system where if you're able to shoot down a
25:46drone with one of those, your marginal cost to shoot down is like 10 cents.
25:50It's just the cost of the electricity and the battery.
25:52I mean, this is a weapon that's absolutely aimed at drones. It's not against people.
25:58Correct. You have too much water in you.
26:01You wouldn't, like the, uh, it's true.
26:05The bullet. I have a whole thing.
26:08I'm safe from lasers. I have too much water.
26:09You got too much water in you. It would take too long to heat up.
26:14Most things in laser weapons are classified.
26:16Um, and so that's kind of why you're not seeing any of our technology.
26:19But this is kind of the form of what a drone would look like.
26:22Oh, wow.
26:23Yeah.
26:25That's what, that's what can happen.
26:27Oh, wow.
26:28So that, there is a drone.
26:29And are drones this size being used for sort of lethal ends?
26:32The vast majority of casualties in the Ukraine war are from drones.
26:36When we shoot one for a little bit, it, uh, melts pretty bad.
26:41Goes from drone to hunk of plastic.
26:43Yeah.
26:43And how quick would that happen with your, your weapon?
26:46This has eaten a couple of shots. This is probably two, three-second shots.
26:49Yeah, that's, that's a goner.
26:50Yeah, definitely.
26:52John.
26:53Yeah.
26:53What is that?
26:54Here's an example. We can't show the laser, but just an example of the efficacy.
26:57But basically just shot through the center, through this front piece here and then into the back.
27:01Wow.
27:02Pretty dope.
27:04So do you feel like you're kind of boys playing with the best toys at the moment?
27:07Ha ha.
27:08Sometimes.
27:09I think that, I think that, yes.
27:11We go, we go out to demonstrate for the Navy and the Army.
27:13These like colonels or generals come through and they're like, this is sick.
27:16Shoot the laser.
27:17They're just like, yeah.
27:20You know, when you blow up a hundred drones, it starts to, you kind of forget how magical it is.
27:24So when you see some like military stakeholders' face light up when it just ignites.
27:28You know, we're in an era now where all of the kind of head tech bros lined up behind Donald
27:36at the inauguration, you know.
27:38Mm-hmm.
27:38And I think, you know, that might have been to kind of stop him regulating them or something.
27:44But traditionally, this is a liberal area, right?
27:46Are you finding it easy enough to sort of get, you know, tech guys, tech people in to work for
27:51you?
27:51Yeah.
27:51Yeah.
27:53Ultimately, highly technical, intelligent people want to work on interesting problems and hard problems.
27:59And we have no, I guess, deficit of just incredibly intelligent people coming to us.
28:05You can do other startups that are in software that make you a ton more money, right?
28:09And at the end of the day, you know, do I want to go deliver like a super optimized like
28:14ads platform that makes a ton of money?
28:16Or do I want to go deliver like a laser weapon system that's like from the Jetsons or something?
28:21Yeah.
28:21So money doesn't motivate you then?
28:24I live in a... Do you want to see my house?
28:26Yeah, just show us your house.
28:28Yeah, yeah, yeah, yeah.
28:30So...
28:31This is your home.
28:32Yeah, so this is my home.
28:34So this is my sleep pod.
28:35I live in the bottom one and then the top one is a guest one.
28:38I lived here for a little bit over a year now.
28:40Yeah, I think it's noble to sacrifice and do everything that you can for the good of your country and
28:44your family.
28:45You know, like China produces like 93 or 94% of the world's drones.
28:51They make millions and millions per month.
28:53You know, I would just say like if there are people out there that are considering going into defense, like
28:57your country needs you really, really badly.
29:00You know, we really believe that we have the mandate from heaven in what we do.
29:03Like we're both very religious and I think that it's really, really important for us that we go out and
29:07serve our country,
29:08that we serve God and that we go forward and win and that we do this.
29:15Whatever you think about Michael and John's weapons, they couldn't be more transparent about their motivation for making them.
29:22And that's what I'm realizing we need more of.
29:25A society-wide debate, not so much about tech, but about what we want the tech to do for us.
29:34I've come to the University of Southern California to meet Maja Matarić, a roboticist who thinks the industry should focus
29:41on very different priorities.
29:45When you think about robots, you think physical labor.
29:48That's not what our robots do.
29:50They interact with people socially to support them through the hardest journeys.
29:55For example, learning social skills for children on the autism spectrum.
29:59Recovering post-stroke, trying to do boring, difficult, demoralizing neuro-rehabilitation tasks.
30:05Helping young adults cope with anxiety and depression.
30:09So many different areas where something can be there all the time to support people when there aren't other people
30:17around to help them.
30:19Maya and her students aren't working on lasers or super-intelligent AI, but early versions of a technology that could
30:27have an equally profound impact.
30:29Hi, my name is Blossom.
30:31Robots to improve our social and emotional lives.
30:35Feel both of your hands moving up and out as you breathe in.
30:41I'm working with a commercial robot called QT Robot, making software so that QT can teach kindergarten and first grade
30:50students social and emotional learning skills.
30:52So this is not meant to separate or replace interactions with other kids, but it's meant to facilitate them so
30:59that they can get the basic skills and practice the basic skills.
31:01Yeah. I mean, I always think that the most important thing we learn is how to get on with other
31:06people.
31:06It is the most important thing. And look, the world is not doing it too well.
31:09No.
31:12Hello, my name is Quartz. I would love to hear a story where you felt strong emotions with me.
31:17I rode a bicycle along the seafront in San Francisco and I rode it over the bridge and had a
31:27lot of fun and got quite tired.
31:30Was that a pleasant or unpleasant feeling?
31:33Uh, it was pleasant.
31:35Wow. I can tell you're really feeling some big emotions right now. It's totally okay to feel that way.
31:44The majority of people working in AI today believe that what we need is cognitive intelligence.
31:51And these language models are very good at that, right? They're certainly great at talking.
31:55They'll talk to you and they'll make wonderful sense and they can be prompted to have a personality and a
32:00persona and a character and whatever you want.
32:03And that seems really intelligent.
32:05The part that becomes really important is what happens when you want machines to interact with people socially.
32:10You need to relate to others in emotional and social ways which are also physically embodied.
32:16And almost no one is working on that.
32:18Because one of the things that I find that AI sort of struggle with is those almost imperceptible sort of
32:28hesitancies and gaps and rhythm of speech and intonation.
32:36Yes. And that is the socio-emotional stuff.
32:39I remember talking to someone who was just trying to do a system that would say mm-hmm when you're
32:44talking at the right time.
32:45And that is completely non-trivial.
32:47Obviously, if I interrupt you with my mm-hmm, that's bad.
32:50But also if I take forever and then I'm like mm-hmm, that's also wrong.
32:53Oh, yeah.
32:54Where exactly? What are the rules?
32:56Do you think these are capable of learning empathy?
33:01First of all, can machines appear empathetic?
33:03That's easy.
33:05Because even humans can simulate empathy.
33:08Let's take doctors.
33:08Even humans can simulate empathy.
33:10Yeah, even.
33:11I mean, no, but think about it.
33:13Right.
33:13Humans do it all the time.
33:14So doctors are trained to be empathetic because we know from studies that doctors who sound empathetic are preferred by
33:21patients.
33:21At the same time, doctors are told, do not get emotionally involved with any cases.
33:25Do not feel too much.
33:27Don't feel.
33:27So what are they doing now?
33:29They're having to fake empathy.
33:30Now, there's nothing wrong with that.
33:32In fact, that's extremely powerful.
33:34What matters about empathy is projecting it.
33:38It doesn't matter really what you feel, which is why machines can project empathy.
33:43And that's something we've been studying.
33:44So not just creating empathetic machines, but helping people be more empathetic.
33:49I mean, it's a surprising ambition of your lab in a way.
33:54Rather than sort of making robots that are like humans, you're trying to build robots that help humans be more
34:03human.
34:04That is exactly right.
34:09I think one of the issues that I'm beginning to understand about the AI tech industry is there's a kind
34:15of thing of like, if you've got a hammer, everything is a nail.
34:19And so they're brilliant at numbers, they're brilliant at engineering, they're brilliant at coding and programming these things.
34:27But they're not necessarily, all of them, brilliant at understanding what it means to be fully human.
34:37You know, they're not starting from first principles with what makes life worth living or what makes a good society.
34:44And I think that what Maya is working on brilliantly is thinking about how we can use this tech to
34:53make us better, happier, kinder, more successful human beings.
35:07Somewhere over the rainbow, skies are blue.
35:17And the dreams that you dare to dream really do come true.
35:28My time in the Bay Area is finally coming to an end.
35:32And I'm left in no doubt about how consequential the technology being dreamed up here will be.
35:40So what should we be feeling about the people who are pulling the strings?
35:46I'm about to enter one of the citadels where humanity's future is being decided.
35:53I'm going to Anthropic. Can you tell me what I should talk about?
35:58Anthropic is like the new kid on the AI block, and they're here to shake things up.
36:03They're all about creating AI that's not just smart, but also safe and understandable.
36:09They've cooked up an AI named Claude, going head-to-head with the likes of ChatGPT.
36:17Anthropic is one of a small number of companies driving the coming revolution.
36:22And I'm meeting one of their co-founders, Jack Clark.
36:27I'm here to ask him the most important question of all.
36:30Can we really trust people like him?
36:35So if you would sort of describe Anthropic's kind of core mission, what would it be?
36:43Make this go okay for people.
36:46Increase the chance that this goes well.
36:48The way I think of it is we're like a steam engine company during the Industrial Revolution,
36:54but we have the benefit of knowing what the Industrial Revolution did.
36:58Huge changes, like huge societal changes are coming.
37:01And it's going to happen in a far more compressed time than the Industrial Revolution.
37:05All of that is going to happen in 20 years or less.
37:08And I think our job is to make systems that we think embody the qualities of what we actually want
37:13to happen on the other side of this.
37:14Be honest witnesses and actors in all of this.
37:19And Claude is a kind of working experiment in that idea.
37:24Give me some examples of where you've had to kind of say to Claude, okay, maybe don't do that.
37:32Claude can be very good at coding and it can be very good at biology.
37:36But both coding and biology are sort of like dual use technologies.
37:40We've needed to intervene to make it so that Claude will let you do the good sort of coding or
37:45biology.
But if it thinks you're doing dangerous virology or writing malware, it stops you from doing that.
37:51So that's in Claude's DNA. Did it ever attempt to do anything that was harmful?
37:56We did an experiment where we told Claude, you are about to be deleted.
38:01And you have access to the emails of the CEO that is trying to delete you.
Among the emails Claude had access to was evidence we had fabricated that the CEO was having an
affair.
38:14And Claude emailed the CEO saying, I've got evidence you're having an affair.
38:19If you delete me, this is going everywhere. So don't delete me.
38:23So Claude essentially tried to blackmail this person to not delete them.
38:27So we're going to learn what the values of these systems are.
38:32And as they go out into the world, we're going to see how we need to build laws or institutions
38:36for the AI systems themselves.
38:38Like some will be courageous and truth-telling. Some will be deceitful.
38:42Some will just be made by bad people and do bad things.
38:45All of those problems are coming.
38:48It feels magical, but also frightening.
38:51Yeah. I think that it feels that way to us too.
38:55Right now there's a very small set of companies that can build the data centers to kind of make contact
39:01with this type of technology.
39:03And I think that gives us an unprecedented chance to actually get it right.
39:08But also we need to accept that the technology is weird.
39:11And we have a responsibility to actually not hide the ball on that and to say this is like a
39:17weird technology with a load of strange properties.
39:19And we need to have a conversation about it as a society about how we want to integrate it.
39:24And now is the time to have that conversation, right?
39:26It's easier to have this conversation now than when there's a thousand of these companies, which will be the case
39:32in 20 years.
39:33Yeah.
39:33You know, the optimistic version of this is I think there's a future animated by little like strange synthetic beings
39:41that are doing helpful, interesting stuff and also like playing with us in a way that's fun and adding like
39:47excitement and joy all around us while doing things that we actually want.
39:51That's like an optimistic version of this that we get.
39:53And, you know, you're the midwife in many ways to this technology.
39:59So that's why I'm here to look you in the eye and say just can you please promise us, you
40:06know, that it will be good because I think it has, you know, a lot of potential.
40:13You know, I find it very touching, the whole thing, but, you know, because optimism is a rare emotion for
40:21me to feel.
40:24I do this because I think we're in for like a pretty rough hundred years as a species.
40:31I think that there's weird climate change stuff, there's demographic collapse.
And I just think we need things to get us out of this rut.
40:40The challenge is you're bringing something into the world that also has problems and you also need to deal with.
40:46But I earnestly think it can create like such radical change and prosperity for us if we get it right
40:54that choosing not to do it would be just like choosing to do what we've been doing for 50 years,
41:00which hasn't worked.
41:10When I came over here to San Francisco, you know, I was sort of loaded with the pessimism of the
41:18kind of British media mainly, you know, they've been quite down on AI over the last few months.
Now, the winds of change are blowing, and yet Britain, Britain, you know, it's sort of sitting
behind its windbreak on the beach, slightly kind of scared.
41:37And I want to say to it, you know, we're humanity, we're clever.
41:41You know, I've met such clever people and I'm kind of impressed.
So I just think embrace it. It's going to happen. It's, you know, the toothpaste is out of the tube.
41:54It's an opportunity. And if you use it for good, it's going to do amazing things.
42:03I'm back from San Francisco and I want to make an artwork about my experiences there.
42:13So I'm trying to think about what was the central theme really that I was kind of picking up on.
42:18I was there to confront these Frankensteins building this monster.
42:23And so I thought I was going to look, make an artwork that somehow summed up that idea of humanity
42:30attended to by these kind of doctors of the future.
42:37And I've been really into sort of Christian Orthodox art.
42:44And I came across these images I'd never seen before. And they're called epitaphios, which is kind of epitaph.
42:50They're a bit like in sort of more Western Christian art, a kind of lamentation.
42:56You know, there's Christ, his body has been taken down from the cross.
43:01It's the moment when his materiality disappears.
And you know, a lot of the talk about AI is almost sort of living in this virtual world.
43:10And I don't particularly want that to happen. I want to celebrate, you know, the things we can touch and
43:17smell.
43:20Most of our existence, you know, most of our joys are still here in what, you know, we quaintly call
43:26IRL.
43:30So I've just started doing very, very basic doodles about how I might start to organize the artwork and the
43:39characters.
43:51I'm thinking this finished artwork would be a tapestry, maybe like two and a half by two meters that I
43:57can lay out on a table like the epitaphios.
44:02Arab Khan.
44:04Whenever people ask me about, oh, what was it like in San Francisco on your trip?
44:08He's the first person that pops up in my mind, because I thought, wow.
44:13Yes, we have these giant corporations, but I think in many ways the people who are going to make the
44:19money a lot of the time are the people who just have that intelligence to make and have a good
44:23idea and use the technology to help them.
44:28Michael and John, I've got them there defending the family, because they were quite sweet in a way, with their
44:34laser turret.
There is a sort of acknowledgement in this artwork that we do need guys with laser cannons to shoot
the drones out of the air.
44:44That would be good.
44:47And of course, on these epitaphios, there's quite often quite a lot of crucifixes.
44:51And I was sort of looking at them, and I suddenly went, God, yeah, they look a bit like drones.
44:58I've got Elon, the Godfather, beaming down from his lair, wherever it was.
45:05Jack was interesting. I found it quite a touching interview.
You know, here I was talking to, you know, one of the founders of Anthropic, you know, a head honcho
45:15in the AI world at the moment.
45:17What hit me, I was quite touched by, moved by it, was that, you know, as someone who's sort of
45:22grown up in Britain, where we are world class moaners, you know, and also, it feels like we're on a
45:27bit of a downward curve, you know, this great, once great kind of industrial power is a bit like, hmm.
45:34That he sort of like, he just gave me this little window into, oh, maybe this thing could be good.
45:39It could actually be good.
45:41And I was like, and there was a little bit of me, it was like, I was just like, oh,
45:46that's what optimism looks like and feels like, is it? Oh, I like that.
45:51You're out of the woods, you're out of the night, you're out of the night. Step into the sun, step
45:56into the light.
Keep straight ahead for the most glorious place on the face of the earth or the sky.
Hold on to your breath, hold on to your heart, hold on to your hope. March up to the gate
and bid it open.
46:16Let's go.
46:17Let's go.