Transcript
00:13I'm in San Mateo, California, in the heart of Silicon Valley,
00:18at an event celebrating one of the tech world's most powerful people.
00:27Let's go!
00:29As a swarm of drones takes to the sky,
00:32I can't help thinking about how much our lives are in thrall
00:36to these tech billionaires and their singular obsessions.
01:01Here it comes.
01:03Hey, Grayson.
01:08It's a regular car. From inside, you can't tell, really, that you're in a driverless car, apart from the fact
01:13there's no driver.
01:15I'm here in San Francisco, California, because this is a place that has always looked to the future.
01:22If you think about it, you know, it's history. It's a series of waves of optimism and dreams and utopias.
01:30And then, of course, technology. We've had the computer. We've had social media.
01:35And now we are on the cusp of what might be the biggest revolution in technology that will affect all
01:41of us. AI. Artificial intelligence.
01:44AI is about to challenge almost every certainty you might think you have about what it is to be human.
01:56So come with me as I venture into the future that Silicon Valley technology is creating, and as I make
02:03an artwork about what I discover.
02:16I've been exploring AI and robotics in Silicon Valley. Now I want to find out more about the people behind
02:23the technology.
02:25And there's one beast in the tech jungle who's bigger than all the rest.
02:30Five, four, three, two, one.
02:36Elon Musk is one of a handful of billionaires shaping our technological future.
02:42OK, you libtards. The X Takeover is an annual celebration of all things Elon.
02:49And it attracts admirers of his products from all over the US.
02:54So, Grayson, look at this. Bulletproof glass is super unique.
03:00You actually can return fire. So we like to put a watermelon on the outside.
03:04I have a shooter inside. So these two holes right here shot out.
03:10The watermelon explodes. And then you can actually still take shots from outside in.
03:15And if you put your hand down here, nothing gets through.
03:17So come the apocalypse, I'm safe.
03:21You are safe.
03:22When they're kind of hunting me for food.
03:24Right. Yeah, yeah, yeah.
03:25Yeah.
03:25So for carjacking, for theft, or for road rage, you don't have to freak out.
03:33Not that you would, Grayson.
03:35Are you a big Elon fan?
03:36Very much so. He's a disruptor.
03:38The great thinkers have always been disruptors.
03:43So you love Teslas?
03:44We love Teslas.
03:45Absolutely.
03:46Is this a little glimpse of the future here?
03:49Absolutely.
03:51More people are going to buy more Teslas.
03:54We are showcasing our Cybertruck.
03:57It's fast.
03:58It's safe.
03:59Yeah.
04:00We can plug in our air fryer, our microwave, air conditioners, and our tent.
04:04And it drives us.
04:06What do you think of Elon Musk?
04:08He's an innovator.
04:09Yeah.
04:10I'm always going to be hopeful for our country, that we're going to be moving in a direction
04:14for, you know, goodness and kindness for everybody, and innovation for the nation and for the
04:18world overall.
04:19So are you a techno-optimist?
04:21Major.
04:22Major techno-optimist, actually, yes.
04:27This is a cyber beast.
04:29Right.
03:29So that's why we have Cerberus here with us.
03:32And Cerberus is, from Greek mythology, the three-headed dog.
03:37It guarded the gates of Hades.
04:39Which brings me on to the question, what do you reckon to Elon Musk?
04:44Elon, Elon's a fantastic guy.
04:47I mean, he's, uh, the satellites, SpaceX, going to Mars.
04:51I mean, who else, who else is out there that's done what he's done?
05:00And then, as if beaming in from another planet, the man himself appeared on screen.
05:09Tesla's valuation could be 25 trillion.
05:13That's 20 to 30 times more than it is today.
05:17I've never seen any technology advance as fast as AI.
05:22AI is a supersonic tsunami.
05:26Who wouldn't want their own C-3PO or R2-D2?
05:32I love you, Elon!
05:34I love you guys, too.
05:40And away we go.
05:41Okay, here we go.
05:45First time I've ever drove a truck, let alone a cyber truck.
05:50Elon, well, whatever you think of his politics and his personality,
05:53he's, he's, I think he's what you would call ruthlessly logical.
05:59And I think, you know, that's why, that's why he succeeds.
06:03You know, he's a brilliant engineering brain and businessman,
06:06but I don't know if his ruthless logic necessarily correlates well to government.
06:13You know, this truck, in a way, is a fantastic metaphor for what's going on
06:19in the world today,
06:20in that the personality of the person behind the company
06:24is actually shaping the future in a way.
06:28So we've got to look at those people
06:30because they are the people that are going to be controlling technology
06:35that is going to be a big part of our lives in the future.
06:39And you may not like the shape of the cyber truck, so be prepared.
06:46The concentration of power and wealth
06:49in the hands of a small group of tycoons
06:51like Musk, Mark Zuckerberg and Jeff Bezos
06:54might well be historically unprecedented.
06:57But even they all started their global empires
07:01in bedrooms and garages.
07:03And there's always a new generation of tech entrepreneurs
07:06hungry to replace them.
07:11I'm in Palo Alto with 21-year-old, second-generation Pakistani-American,
07:17Arab Khan, whose start-up is already making millions.
07:21How much have you got?
07:23I think, like, my personal net worth is a bit over eight figures now.
07:29But I think over the next, like, 12 months,
07:31I really want to get to, like, the range of doing 25 million.
07:35A year?
07:35Yeah.
07:36Wow.
07:37I mean, I live with my parents.
07:38I still, I can move out.
07:39I could have moved out four or five years ago.
07:42One of the benefits I have is, like, I get home,
07:45my laundry's done, my bed is made.
07:48You don't have to think about those things.
07:50But the best thing is, like,
07:52I'm doing exactly what I want,
07:53hanging out with my friends, building cool shit,
07:55but also making, like, an ungodly amount of money.
08:00Arab's company, 24 Labs,
08:03might be hidden inside an unprepossessing warehouse,
08:06but it builds a huge range of software,
08:08from tools to help traders predict the market
08:11to apps that allow anyone to create music.
08:15Hello.
08:16Yes, and this is, uh...
08:18This is the crew.
08:20Hi.
08:21How's it going?
08:21Here we are.
08:22Yeah, good.
08:23So, so you all work.
08:25Are these your employees?
08:26I would call them, I would call them partners,
08:29but yeah.
08:31What are these companies doing?
08:32Let's say, like, you know that a certain market,
08:35you're going to make money
08:35if you build a certain product, right?
08:37Let's say, for example,
08:38the product can be a product on what shoes to wear.
08:42Like, the app is around,
08:42take a picture of your shoes,
08:44and it will give you recommendations
08:45of what other shoe brands to try.
08:47It's essentially super easy
08:49to build that technology today.
08:50Yeah.
08:50So it's like, when you're starting a brand out,
08:52like, being the first mover
08:54has an incredible advantage.
08:56So you just have an idea?
08:57Yeah, literally, literally that.
08:59I speak great Arabic.
09:01I'm Egyptian.
09:02Oh, right.
09:02That's a handy business tool.
09:04Yeah.
09:06And he pitched the ideas,
09:08like, let's do,
09:08let's build an app
09:09that teaches people how to speak Arabic.
09:11Yeah.
09:12Utilizing, I guess, voice recognition
09:13and then AI course creation
09:15has just been fantastic.
09:16We think we launched about four weeks ago,
09:18and we're at 12,000 in revenue,
09:21and on track to hit at least 100
09:23by the next two months.
09:25If you'd have told me,
09:25oh, yeah, this guy is doing this
09:27with his company,
09:28I'd expect to see a big glass office.
09:30Yeah.
09:31You know, with a big logo out of it,
09:32you know, with maybe, like,
09:34500 employees or something.
09:36Software, like, five years ago,
09:38for you to do what we're doing,
09:39you would have had to raise $100 million.
09:41Like, if you think about cost of intelligence,
09:43that has gotten a thousand times cheaper
09:45over the last 10 years.
09:46Is this because of AI?
09:47Yeah, and that means each human
09:49can output a thousand times more work.
09:52The flow to build this Arabic app
09:53was Adil's an engineer,
09:55and he was like,
09:55yo, like, engineering,
09:57let's get this figured out.
09:58We vibe-coded it in, like, a day or two.
10:00We had our designers design it.
10:01We took the designs
10:02and made the app better.
10:04Then we found creators.
10:05Abilo was on that.
10:06He was, like, looking for creators to promote.
10:07We paid them with my money,
10:09and then that leads to subscriptions.
10:12Yeah.
10:12I mean, I've sort of seen phrases
10:14around social media,
10:14like, there's a whole bunch of teenagers out there
10:16using AI to make million-dollar companies,
10:18and you don't even know about it.
10:20I moved here to the Bay Area
10:21only, like, I think six months ago.
10:23I was just at a coffee shop coding,
10:25and I actually recognized Arab from online.
10:27He invited me to work with them
10:30one hour after I met him.
10:32Oh, my God.
10:33I went from being a random guy
10:34who moved to California three months ago
10:37from Lebanon.
10:38Yeah, I grew up in Lebanon.
10:39I spent 24 hours a day around these guys.
10:41Every single day,
10:42we're just always locked in together.
10:44Wow.
10:45This really is the gold rush.
10:46Yeah, yeah.
10:47I mean, it's just amazing.
10:49I don't think I've ever been in a room like this before.
10:51You're all kind of entrepreneurs,
10:53and you're full of ideas,
10:54and you're fizzing with it, you know,
10:55and I love your kind of group dynamic, you know?
10:57I think that's great.
10:59So it's, like, a really interesting model
11:00where, like, software,
11:01you can sell to 100 customers,
11:03and even if, like, 20, 30 customers disappear,
11:06you're still going to be getting money
11:06for a very long time.
11:08Everyone thinks of us as, like, software builders.
11:10In theory, yes, we build software companies,
11:12but we actually are in the business
11:14of acquiring human attention.
11:16Like, the more human attention we can acquire,
11:18the more money we can make.
11:19That's all that matters to us.
11:20Yes, the attention economy,
11:22and you are an entrepreneur in that.
11:25Yeah.
11:26At the burning edge.
11:27Exactly.
11:28So you mentioned you were an artist, right?
11:30Yeah.
11:31What are your thoughts on using AI for art?
11:33I mean, it's fine.
11:34I've used it.
11:34In my last exhibition, I used it a bit, yeah.
11:37I always say it'll put birthday card designers
11:39out of business.
11:40You know.
11:41If you're, like, AI will wipe out
11:43most mediocre intellectuals.
11:49Arab Khan, I've got to admit,
11:52I don't know what prejudices I went in there with.
11:54You know, I was thinking,
11:55I'm going to meet a tech bro,
11:56and I'm going to find him a bit abrasive
11:58and a bit too macho.
12:00But, you know, he's this sweet, young, 21-year-old guy.
12:03And then he said,
12:04I've got my posse with him.
12:06And I was like, oh, and I was slightly
12:08sort of intimidated for a moment.
12:09But the minute they opened their mouth,
12:11they were so delightful and enthusiastic and bright.
12:15And yet they were sitting around
12:16this kind of mundane office with their laptops.
12:19Like, they might have just been kind of, like,
12:21listening to music or playing games or something.
12:23But these guys,
12:25they were sitting there
12:26and they were making millions of dollars.
12:32I can't imagine a better example
12:34of the spirit that propels so much of the innovation here.
12:38But I was about to meet someone
12:40who paints a more complicated picture
12:42of what it takes to make it in Silicon Valley.
12:46When you get rich,
12:48you get people licking you up and down all day
12:49and telling you you hung the moon
12:51and therefore you are right about everything.
12:53And you get no pushback.
12:54And over time,
12:55you tend to push out the people who push back.
12:58I always say tech bro stands for technically broken.
13:10I, Donald John Trump, do solemnly swear.
13:14At President Trump's inauguration,
13:17Elon Musk and the bosses of Google,
13:19Meta and Amazon all came to honour him.
13:23If I want to understand what's going on
13:25inside the heads of the tech giants right now,
13:28there's one person I need to speak to.
13:33I'm going to meet someone called Cara Swisher.
13:37What can you tell me about her?
13:40Cara Swisher's a powerhouse in tech journalism,
13:44known for her sharp insights and no-nonsense interviews.
13:47She could offer invaluable perspectives on Silicon Valley,
13:51boosting your documentary's depth and reach.
13:54Sounds like a powerful meeting.
13:56Should I be a bit nervous?
14:00Nah, no need to be nervous.
14:02She's tough but fair
14:04and really values genuine conversation.
14:07Just be yourself
14:08and bring your passion for your documentary.
14:10She'll appreciate your thoughtful approach.
14:13You got this.
14:15Thanks.
14:19Hello, Cara.
14:20Hi.
14:21What do you think of the sort of present
14:23kind of political shenanigans of the tech bros?
14:26I always say look at the money with these people,
14:28all their nonsense and things they say about life and community.
14:32They are very, mostly ignorant about history.
14:35They're largely non-political, really, or ignorant.
14:40And they don't, they just want to do things
14:42that will help make their businesses regulation-free, essentially.
14:46So is it really that cold-hearted?
14:48Is that right across the board, do you think?
14:50Yes, that cold-hearted?
14:50It just is the heart that they have.
14:52I mean, I think they don't,
14:53I don't think they have any interest in helping society.
14:56I think they have an interest in helping themselves and their shareholders.
14:59So you don't have much faith in the kind of protestations
15:02that they really care about, you know,
15:05the social impact of their products that they're rolling out now.
15:09If they cared, they'd make them safer.
15:10If they cared, they'd do things around self-esteem of girls
15:13or young men feeling lonely.
15:16And what do you think it tells us about what they spend their money on?
15:20Most of them have private planes.
15:21They go from the plane to the car, to the helicopter, to the compound.
15:25They don't interact with normal people.
15:28And so their lives become very, I call it a cashmere prison.
15:32And therefore, you're going to have a very different version of life
15:34than other people.
15:36You know, they have all these body hacks and everything else.
15:38It's completely narcissistic.
15:40And what do you think about that idea of Conway's Law?
15:44That, you know, no matter how rational or objective a program
15:48or an algorithm is, it always reflects the personality
15:52of the person that makes it.
15:54I think there's another computer term, garbage in, garbage out.
15:59That's all I got to know.
16:00Now, listen, I like AI and I see the benefits of it.
16:04Like, you can see it being really helpful, like in an assistant way
16:08or, hey, Kara, you're going to the airport.
16:10You want me to call you an Uber right now?
16:11Like that kind of stuff.
16:12I'm not scared of AI.
16:13I'm scared of people who use AI badly.
16:16And do you think the fact that, you know,
16:18it's that the vast majority of them are male,
16:21has an effect on the world they're creating?
16:24Yeah, I've noticed that.
16:25Yeah, it's not just male.
16:26It's a certain kind of male, too.
16:28It's a very homogeneous group of people.
16:30And as we know in nature,
16:32what happens when homogeneous societies breed?
16:36You know, it's a miracle their children have teeth.
16:44Well, there is an oracle.
16:48She knows the world and she's pretty tough on it.
16:52And, yeah, so we're dealing with a world here
16:57that seems to be powered, in her words, by narcissism.
17:00Surprise, surprise.
17:01Nothing new there in a world that's controlled by my breed, the men.
17:11A 2022 study found that only around 15% of tech company CEOs are women
17:18and men occupy two-thirds of US tech jobs more generally.
17:22The more time I spent here, the more glaring that seemed to be.
17:28But is the problem just that Silicon Valley
17:31is an unusually male-dominated world?
17:33Or is it also that the males that are dominating it
17:37are in themselves quite unusual?
17:45Rene, Rona, Samira and Kate
17:48all have high-flying careers in the tech sector
17:51and have dated their fair share of tech bros.
17:55I'm keen to hear their insights into what makes these men tick.
17:59The people I've dated who work in tech
18:01seem to have a tendency or an ability
18:05to compartmentalise personal life and work life
18:09in a much more extreme way
18:11than the people I know in every other industry.
18:13It's like, I've met people who have left dating
18:17off of their agenda until they're like 25.
18:20That is late to start dating.
18:21Very, and it shows.
18:23But...
18:25It seems to me that one of the kind of things
18:27that technology promises, the internet and now AI,
18:31is a kind of ability to perfect your life.
18:34Yeah.
18:34And that, for me, seems to sort of sidestep
18:39the essential thing about human relationships
18:42is they're messy and they grow.
18:44I know multiple men who have built bots for those dating apps,
18:47you know, so they can automate the process
18:50of, you know, talking to women, swiping on women, etc.
18:53I've dated people who have paid a lot of money
18:56to fancy matchmakers to kind of shortcut the process
19:00or to, like, make it faster and more efficient.
19:02And it's a very interesting process
19:04because the first two dates are between myself
19:06and the matchmaker.
19:08And the third date is between myself and the guy.
19:12Especially if you're talking about the founders,
19:15the tech founders, they don't like to date.
19:18Usually, they usually actually don't date
19:20because it impacts their work.
19:23So it's like a business decision almost.
19:25Yeah.
19:26And so the elephant in the room,
19:28I've got to talk about it, is neurodiversity.
19:31Yeah, yeah. Good call.
19:33I have always been very attracted to people
19:36who stand out,
19:38who kind of separate from the herd a bit.
19:42And often those people are neurodivergent.
19:45I just didn't know that in my 20s,
19:47but now I do know.
19:48I was on a date with a guy who,
19:51we got drunk,
19:52and then he's like,
19:54all right, well, now that we're drunk,
19:55let's start doing some math problems.
19:57So...
20:00So, yeah.
20:00We got really carried away,
20:02and I'm sorry I was so rational.
20:06But I think also,
20:07the way that the tech industry works,
20:09people move so quickly,
20:11they start so young,
20:12there is somewhat of an arrested development
20:14in terms of social skills for some.
20:17You know, sometimes these are just semi-regular guys
20:20that, let's be honest,
20:22like, there's so many people
20:23that are trying to do this.
20:25The ones who become surprisingly successful,
20:27they have this, like, unhinged part
20:30where they just rage,
20:31they go, they party really hard,
20:33and then their relationships with women
20:35changes significantly.
20:37Yes.
20:37They're like rock stars
20:38who've just kind of had their first big hit album,
20:41and they want a woman on each arm.
20:44Exactly.
20:45And they do.
20:45And every environment is the same, right?
20:47Exactly.
20:48Unlike footballers or rock stars,
20:50these people have,
20:51are building products
20:53that will have massive repercussions
20:56in all of our lives across,
20:58yeah, absolutely here.
21:00So it's not the same in many ways.
21:02So I'm interested in their personalities
21:04because these are the people
21:07whose biases and personalities
21:10will affect our future.
21:13Absolutely.
21:13That's the strongest drug anyone's taking,
21:16the feeling that anything can happen here
21:17and the decisions that are made here
21:19will impact the entire world
21:20and very quickly.
21:21Like, that is intoxicating.
21:28I'm on the way to Marin County,
21:31just across the Golden Gate Bridge,
21:33to meet a man who thinks
21:34the problem with Silicon Valley
21:36isn't so much the individuals who work there
21:38as the system they're working within.
21:41Look at this.
21:43Wow.
21:44OMG.
21:46Justin Rosenstein spent his career
21:49working for Google, Asana and Facebook,
21:51where he co-created their famous like button.
21:55But he decided to walk away from it all.
21:59When you make a product,
22:01it is just a tool, isn't it?
22:04And of course, as we know,
22:05all tools can be used for good or bad, really.
22:08That's right.
22:09I mean, I know that you didn't necessarily
22:10make a dislike.
22:12Did you consider making a dislike button?
22:14Oh, yeah.
22:16That for a long time, I think,
22:17was like the number one requested feature
22:18on Facebook.
22:20But the spirit was,
22:22no, the whole point is
22:22we're trying to make it easier
22:23to spread little bits of positivity.
22:26Yeah.
22:26That idealism that you came,
22:29how did you find that sort of bumped up
22:31against sort of working for big organizations
22:33like Facebook and Google
22:34and things like that?
22:36At the time, I think the companies
22:37as a whole did have that internally,
22:40this spirit of idealism.
22:42But what I started to see it butt up against
22:44is that that technology exists
22:47in the context of a larger economic system.
22:50And that economic system
22:52defines the rules of the game.
22:54So, you know, take something
22:55like the like button.
22:58After I left Facebook,
22:59I started to observe,
23:01as a lot of people did,
23:02that, oh, people are starting
23:02to get addicted
23:04looking for the dopamine hits of likes.
23:07If you notice something like,
23:08oh, people are starting
23:09to get addicted to this,
23:10you'd be like,
23:11ooh, that's very bad.
23:12Let's redesign it,
23:13let's fix it,
23:13let's find some other way.
23:14But instead,
23:16social media companies
23:18make more money
23:19the more time you spend
23:20staring at your screen.
23:22Similarly,
23:23if trees are worth more money
23:25dead than alive,
23:26you're going to have
23:27the destruction of nature.
23:28And if food is more profitable
23:30when it's laced with sugar
23:31than when it's designed
23:32to be actually healthy
23:33and good for us,
23:34you're going to have
23:35a diabetes epidemic.
23:37Do you think
23:38the AI revolution
23:39could propagate
23:42the unhelpful things
23:43that have happened
23:44through the last,
23:46you know,
23:46the internet age?
23:48The technology
23:49that we're developing
23:49should be able
23:51to meet human needs
23:52in radical new ways
23:53that would make
23:55someone from the Iron Age
23:56cry and be like,
23:56wow,
23:56you finally did it.
23:58We finally have enough.
24:00But instead,
24:01by doing it
24:02within the current
24:03economic system,
24:04it is far more likely
24:05to just lead
24:06to this massive,
24:08increased wealth consolidation.
24:11That's not living
24:12in a democracy anymore.
24:13That's feudalism
24:14with better marketing.
24:15And so,
24:16you know,
24:16it seems to me
24:17that your idealism,
24:18to a certain extent,
24:19is it still intact?
24:21Or are you,
24:22are you,
24:22how are you now?
24:24I,
24:25I don't,
24:26I certainly don't think
24:27that technology
24:28is going to automatically
24:29make the world
24:30a better place.
24:31AI just does change everything
24:33in the way
24:34that the industrial revolution
24:36and the agricultural revolution did.
24:38Question is,
24:38are we going to allow
24:39those shifts to shape us
24:41or are we going to shape them?
24:45But can profit
24:46really explain everything
24:48about the technology
24:49that's being created here?
24:52I'm heading downtown
24:53to meet two inventors
24:55who claim
24:55a very different motivation.
24:58to save the country
25:00they love.
25:12Michael Laframboise
25:14created his startup
25:15to address what he believes
25:17is the biggest threat
25:18to the American way of life,
25:20a military attack
25:22from Russia or China.
25:26Yeah, we are at
25:27the Aurelia Systems headquarters
25:28and we're here
25:30building the future
25:30of American weapons systems.
25:32The future
25:33of American weapons systems.
25:34That tripped off
25:35the tongue pretty lightly.
25:36Oh yeah, yeah, yeah, yeah.
25:38You know,
25:38we're spending
25:39$500,000,
25:40a million dollars
25:41on a missile
25:41to shoot down a drone
25:42and what we do
25:43is try and use
25:44a laser weapon system
25:45where if you're able
25:46to shoot down a drone
25:47with one of those,
25:48your marginal cost
25:48to shoot down
25:49is like 10 cents.
25:50It's just the cost
25:51of the electricity
25:51and the battery.
25:52I mean, this is a weapon
25:53that's absolutely aimed
25:55at drones.
25:57It's not against people.
25:58Correct.
25:58You have too much water in you.
26:02You wouldn't,
26:02like the, uh,
26:04it's true,
26:05the bullet.
26:06I have a whole thing.
26:08I'm safe from lasers.
26:08I have too much water in me.
26:09You have too much water in you.
26:11It takes too long
26:11to heat up.
26:14Most things
26:14in laser weapons
26:15are classified
26:16and so that's kind of
26:17why you're not seeing
26:18any of our technology
26:19but this is kind of
26:20the form of what
26:21a drone would look like.
26:22Oh, wow.
26:23Yeah.
26:25That's what,
26:25that's what can happen.
26:27Oh, wow.
26:28So that,
26:28there is a drone
26:29and are drones this size
26:30being used
26:31for sort of lethal ends?
26:32The vast majority
26:33of casualties
26:34in the Ukraine war
26:35are from drones.
26:36When we shoot one
26:37for a little bit,
26:38it, uh,
26:40melts pretty bad.
26:41Goes from drone
26:41to hunk of plastic.
26:43Yeah.
26:43And how quick
26:44would that happen
26:44with your,
26:45your weapon?
26:46This has eaten
26:46a couple of shots.
26:47This is probably
26:47two, three second shots.
26:49Yeah, that's,
26:49that's a goner.
26:50Yeah, definitely.
26:52John, what is that?
26:54Here's an example,
26:54we can't show the laser
26:55but just an example
26:56of the efficacy.
26:57But basically just shot
26:58through the center,
26:59through this front piece
27:00here and then
27:01into the back.
27:02Wow.
27:02Pretty dope.
27:04So do you feel like
27:05you're kind of boys
27:05playing with the best toys
27:06at the moment?
27:08Sometimes.
27:09I think, I think,
27:10yes.
27:11We go, we go out
27:12to demonstrate
27:12for the Navy
27:12and the Army.
27:13These, like,
27:14colonels or generals
27:14come through
27:15and they're like,
27:15this is sick,
27:16shoot the laser.
27:17They're just like,
27:18yeah.
27:20You know,
27:20when you blow up
27:21a hundred drones,
27:22it starts to,
27:23you kind of forget
27:23how magical it is.
27:24So when you see
27:25some, like,
27:25military stakeholders' face
27:27light up,
27:27it just ignites.
27:29You know,
27:29we're, we're in an era
27:30now where all
27:31of the kind of
27:32head,
27:33tech bros
27:34lined up behind
27:35Donald at the inauguration,
27:37you know,
27:38and I think,
27:39you know,
27:39that might have been
27:40to kind of stop him
27:42regulating them
27:43or something.
27:44But traditionally,
27:45this is a liberal area,
27:46right?
27:46Are you finding it
27:47easy enough to sort of
27:48get, you know,
27:49tech guys,
27:49tech people in
27:50to work for you?
27:51Yeah.
27:53Ultimately,
27:54highly technical,
27:54intelligent people
27:55want to work
27:55on interesting problems
27:57and hard problems.
27:59And we have no,
28:01I guess,
28:02deficit of just
28:03incredibly intelligent
28:04people coming to us
28:05you can do
28:05other startups
28:06that are in software
28:07that make you
28:07a ton more money,
28:08right?
28:09And at the end
28:10of the day,
28:11you know,
28:11do I want to go
28:11deliver, like,
28:12a super optimized,
28:13like,
28:14ads platform
28:15that makes a ton
28:16of money
28:16or do I want to go
28:17deliver, like,
28:18a laser weapon system
28:19that's, like,
28:19from the Jetsons
28:20or something?
28:21Yeah.
28:21So money doesn't
28:22motivate you then?
28:24I live in a,
28:25do you want to see
28:26my house?
28:26Yeah,
28:27just show us your house.
28:28Yeah, yeah, yeah, yeah.
28:30So.
28:31This is your home.
28:32Yeah,
28:32so this is my home.
28:33So this is my sleep pod.
28:35I live in the bottom one
28:36and then the top one
28:37is a guest one.
28:38I lived here for a little
28:39bit over a year now.
28:40Yeah,
28:41I think it's noble
28:41to sacrifice
28:42and do everything
28:43that you can
28:43for the good
28:44of your country
28:44and your family.
28:45You know,
28:46like,
28:47China produces,
28:48like,
28:4993 or 94%
28:49of the world's drones.
28:51They make millions
28:52and millions per month.
28:53You know,
28:53I would just say,
28:54like,
28:54if there are people
28:55out there
28:55that are considering
28:56going into defense,
28:56like,
28:57your country needs you
28:58really,
28:59really badly.
29:00You know,
29:00we really believe
29:01that we have the mandate
29:01from heaven
29:02in what we do.
29:03Like,
29:03we're both very religious
29:04and I think that it's
29:05really,
29:05really important for us
29:06that we go out
29:07and serve our country,
29:08that we serve God
29:09and that we go forward
29:10and win
29:11and that we do this.
29:15Whatever you think
29:16about Michael
29:17and John's weapons,
29:18they couldn't be
29:19more transparent
29:20about their motivation
29:21for making them.
29:23And that's what
29:23I'm realizing
29:24we need more of,
29:25a society-wide debate,
29:27not so much about tech,
29:29but about what we want
29:30the tech to do for us.
29:34I've come to the University
29:35of Southern California
29:37to meet Maja Matarić,
29:38a roboticist
29:39who thinks the industry
29:40should focus
29:41on very different priorities.
29:45When you think about robots,
29:47you think physical labor.
29:48That's not what
29:49our robots do.
29:50They interact with people
29:52socially
29:52to support them
29:53through the hardest journeys.
29:55For example,
29:56learning social skills
29:57for children
29:57on the autism spectrum,
29:59recovering post-stroke,
30:00trying to do
30:00boring, difficult,
30:02demoralizing
30:03neurorehabilitation tasks,
30:05helping young adults
30:07cope with anxiety
30:08and depression.
30:09So many different areas
30:10where something
30:11can be there
30:13all the time
30:14to support people
30:15when there aren't
30:16other people around
30:17to help them.
30:19Maya and her students
30:20aren't working on lasers
30:22or super-intelligent AI,
30:24but early versions
30:25of a technology
30:26that could have
30:27an equally profound impact.
30:29Hi, my name is Blossom.
30:31Robots to improve
30:33our social and emotional lives.
30:35Feel both of your hands
30:37moving up and out
30:38as you breathe in.
30:42I'm working with
30:43a commercial robot
30:45called QT Robot
30:46making software
30:47so that QT can teach
30:49kindergarten and first-grade
30:50students social
30:51and emotional learning skills.
30:52So this is not meant
30:53to separate
30:55or replace
30:55interactions with other kids,
30:57but it's meant
30:57to facilitate them
30:58so that they can get
30:59the basic skills
31:00and practice
31:00the basic skills.
31:01Yeah.
31:02I mean,
31:03I always think
31:03that the most important
31:04thing we learn
31:05is how to get on
31:06with other people.
31:06It is the most important thing
31:08and look,
31:08the world is not
31:09doing it too well.
31:09No.
31:12Hello,
31:13my name is Quartz.
31:14I would love to hear
31:14a story where you felt
31:15strong emotions with me.
31:17I rode a bicycle
31:19along the seafront
31:21in San Francisco
31:22and I rode it
31:24over the bridge
31:25and had a lot of fun
31:27and got quite tired.
31:30Was that a pleasant
31:31or unpleasant feeling?
31:33It was pleasant.
31:35Wow.
31:36I can tell
31:37you're really feeling
31:37some big emotions
31:38right now.
31:40It's totally okay
31:41to feel that way.
31:44The majority
31:44of people working
31:45in AI today
31:46believe that what we need
31:48is cognitive intelligence
31:50and these language models
31:52are very good at that, right?
31:53They're certainly great
31:54at talking.
31:55They'll talk to you
31:56and they'll make
31:56wonderful sense
31:57and they can be prompted
31:59to have a personality
32:00and a persona
32:01and a character
32:02and whatever you want
32:02and that seems
32:03really intelligent.
32:04The part that becomes
32:06really important
32:06is what happens
32:08when you want machines
32:09to interact with people socially.
32:11You need to relate
32:12to others in emotional
32:13and social ways
32:14which are also physically embodied
32:15and almost no one
32:17is working on that.
32:18Because one of the things
32:19that I find
32:21that AIs
32:22sort of struggle with
32:23is those
32:24almost imperceptible
32:27sort of
32:30hesitancies
32:31and gaps
32:33and rhythm of speech
32:35and intonation.
32:36Yes,
32:37and that is
32:37the socio-emotional stuff.
32:38I remember
32:40talking to someone
32:41who was just trying
32:41to do a system
32:42that would say
32:43mm-hmm
32:43when you're talking
32:44at the right time
32:45and that is
32:46completely non-trivial.
32:47Obviously,
32:47if I interrupt you
32:48with my mm-hmm
32:49that's bad
32:50but also if I take forever
32:51and then I'm like
32:51mm-hmm
32:52that's also wrong.
32:53Oh yeah.
32:54Where exactly?
32:54What are the rules?
32:56Do you think
32:57these are capable
32:58of learning empathy?
33:01First of all,
33:02can machines appear
33:02empathetic?
33:04That's easy
33:04because even humans
33:06can simulate empathy.
33:08Let's take doctors.
33:09Even humans
33:09can sympathize.
33:10Yeah, even.
33:11I mean, no,
33:12but think about it.
33:13Humans do it
33:14all the time.
33:14So doctors are trained
33:15to be empathetic
33:16because we know
33:17from studies
33:18that doctors
33:18who sound empathetic
33:20are preferred by patients.
33:22At the same time,
33:22doctors are told
33:23do not get emotionally
33:24involved with any cases,
33:25do not feel too much,
33:27don't feel.
33:27So what are they doing now?
33:29They're having to fake empathy.
33:30Now, there's nothing wrong
33:31with that.
33:32In fact,
33:32that's extremely powerful.
33:34What matters about empathy
33:36is projecting it.
33:38It doesn't matter
33:39really what you feel,
33:40which is why machines
33:40can project empathy.
33:42And that's something
33:43we've been studying.
33:45So not just creating
33:45empathetic machines,
33:47but helping people
33:48be more empathetic.
33:49It's a surprising ambition
33:51of your lab,
33:53in a way.
33:54Rather than making robots
33:56that are like humans,
33:58you're trying to build robots
34:01that help humans
34:02be more human.
34:03That is exactly right.
34:09I think one of the issues
34:10that I'm beginning
34:11to understand
34:12about the AI tech industry
34:14is there's a kind of thing
34:16of like,
34:16if you've got a hammer,
34:18everything is a nail.
34:19And so they're brilliant
34:21at numbers,
34:22they're brilliant
34:23at engineering,
34:24they're brilliant
34:24at coding
34:25and programming
34:26these things.
34:27But they're not
34:29necessarily,
34:30all of them,
34:32brilliant
34:33at understanding
34:35what it means
34:35to be fully human.
34:37You know,
34:38they're not starting
34:39from first principles
34:40with what makes life
34:41worth living,
34:42what makes a good society.
34:44And I think that
34:45what Maja is working on
34:47brilliantly
34:49is thinking about
34:51how we can use this tech
34:53to make us
34:54better,
34:55happier,
34:57kinder,
34:58more successful
34:59human beings.
35:07Somewhere over the rainbow
35:11Skies are blue
35:16And the dreams
35:20that you dare to dream
35:23really do come true
35:28My time in the Bay Area
35:30is finally coming
35:31to an end
35:32and I'm left
35:33in no doubt
35:34about how consequential
35:36the technology
35:36being dreamed up here
35:38will be.
35:41So what should we
35:42be feeling
35:42about the people
35:43who are pulling
35:44the strings?
35:46I'm about to enter
35:47one of the citadels
35:49where humanity's future
35:50is being decided.
35:53I'm going to
35:54Anthropic.
35:55Can you tell me
35:56what I should talk about?
35:58Anthropic is like
35:59the new kid
36:00on the AI block
36:01and they're here
36:02to shake things up.
36:03They're all about
36:04creating AI
36:05that's not just smart
36:07but also safe
36:08and understandable.
36:10They've cooked up
36:10an AI named Claude
36:12going head to head
36:13with the likes
36:14of ChatGPT.
36:17Anthropic
36:18is one of a small
36:19number of companies
36:20driving the coming
36:21revolution
36:22and I'm meeting
36:23one of their co-founders
36:25Jack Clark.
36:27I'm here to ask him
36:28the most important
36:29question of all.
36:30Can we really trust
36:32people like him?
36:35So if you would
36:36sort of describe
36:38Anthropic's kind of
36:39core mission
36:40what would it be?
36:43make this go okay
36:45for people?
36:46Increase the chance
36:47that this goes well.
36:49The way I think of it
36:50is we're like
36:52a steam engine company
36:53during the industrial
36:53revolution
36:54but we have the benefit
36:55of knowing what the
36:56industrial revolution did.
36:58Huge changes
36:59like huge societal
37:00changes are coming
37:01and it's going to
37:02happen in a far more
37:03compressed time
37:04than the industrial
37:04revolution.
37:05All of that is going to
37:06happen in 20 years
37:07or less
37:08and I think our job
37:09is to make systems
37:10that we think embody
37:11the qualities of what
37:12we actually want to
37:13happen on the other
37:14side of this.
37:14Be honest witnesses
37:16and actors
37:17in all of this.
37:19And Claude
37:20is a kind of
37:21working experiment
37:22in that idea.
37:24Give me some examples
37:25of where you've had
37:27to kind of say
37:28to Claude
37:29OK maybe
37:30don't do that.
37:32Claude can be
37:33very good at coding
37:34and it can be
37:35very good at biology
37:36but both coding
37:37and biology
37:37are sort of like
37:38dual use technologies.
37:40We've needed to
37:41intervene to make it
37:42so that Claude
37:43will let you do
37:43the good sort of
37:44coding or biology
37:45but if it thinks
37:46you're doing
37:47dangerous virology
37:48or writing malware
37:49it stops you from doing that.
37:51So that's in Claude's DNA.
37:53Did it ever attempt
37:54to do anything
37:55that was harmful?
37:56We did an experiment
37:58where we told Claude
37:59you are about
38:00to be deleted
38:01and you have access
38:03to the emails
38:04of the CEO
38:05that is trying
38:06to delete you.
38:07In the emails
38:08Claude had access to
38:09was evidence
38:10we fabricated
38:11that the CEO
38:12was having an affair
38:13in their marriage
38:14and Claude
38:15emailed the CEO
38:17saying I've got evidence
38:18you're having an affair
38:19if you delete me
38:20this is going everywhere
38:21so don't delete me.
38:23So Claude
38:23essentially
38:24tried to blackmail
38:25this person
38:26to not delete them.
38:27So we're going
38:29to learn
38:29what the values
38:31of these systems are
38:31and as they go out
38:32into the world
38:33we're going to see
38:34how we need to build
38:35laws or institutions
38:36for the AI systems
38:38themselves.
38:38Like some will be
38:40courageous and truth-telling
38:41some will be deceitful
38:42some will just be made
38:43by bad people
38:44and do bad things.
38:45All of those problems
38:46are coming.
38:48It feels magical
38:50but also frightening.
38:51Yeah.
38:52I think that it feels
38:53that way to us too.
38:55Right now
38:56there's a very small
38:57set of companies
38:57that can build
38:58the data centres
38:59to kind of
39:00make contact
39:01with this type
39:02of technology
39:03and I think
39:04that gives us
39:05an unprecedented chance
39:07to actually get it right.
39:08But also we need
39:09to accept
39:09that the technology
39:10is weird
39:11and we have
39:12a responsibility
39:13to actually
39:14not hide the ball
39:15on that
39:15and to say
39:16this is like
39:16a weird technology
39:18with a load
39:18of strange properties
39:19and we need
39:20to have a conversation
39:21about it
39:21as a society
39:22about how we want
39:23to integrate it.
39:24And now is the time
39:25to have that conversation
39:26right?
39:26It's easier
39:27to have this conversation
39:28now than when
39:29there's a thousand
39:30of these companies
39:31which will be the case
39:32in 20 years.
39:33Yeah.
39:33You know
39:34the optimistic version
39:35of this
39:36is I think
39:36there's a future
39:37animated by
39:38little like
39:40strange synthetic beings
39:41that are doing
39:42helpful interesting stuff
39:43and also like
39:44playing with us
39:45in a way that's fun
39:46and adding
39:46like excitement
39:48and joy all around us
39:49while doing things
39:50that we actually want.
39:51That's like an optimistic
39:52version of this
39:52but it's our best guess.
39:53And you know
39:55you're the midwife
39:56in many ways
39:57to this technology
39:58so that's why
40:00I'm here to look you
40:02in the eye
40:03and say
40:03just can you
40:04please promise us
40:06you know
40:06that it will be good
40:08because I think
40:11it has you know
40:12a lot of potential
40:13you know
40:13I find it very touching
40:14the whole thing
40:15but you know
40:19because optimism
40:19is a rare emotion
40:20for me to feel.
40:24I do this
40:25because I think
40:27we're in for like
40:28a pretty rough
40:29hundred years
40:30as a species
40:31I think that there's
40:32weird climate change stuff
40:34there's demographic collapse
40:35and I just think
40:37we need
40:37things to get us
40:39out of this rut.
40:40The challenge is
40:42you're bringing something
40:43into the world
40:43that also has problems
40:44that you also need
40:45to deal with
40:46but I earnestly think
40:47it can create
40:49like such radical
40:50change and prosperity
40:53for us
40:53if we get it right
40:54that choosing not to do it
40:57would be just like
40:57choosing to do
40:58what we've been doing
40:59for 50 years
40:59which hasn't worked.
41:10When I came over here
41:12to San Francisco
41:13you know
41:13I was sort of
41:15loaded with
41:16the pessimism
41:17of the kind of
41:19British media mainly
41:20you know
41:20they've been quite down
41:22on AI
41:22over the last few months
41:25now
41:25the winds of change
41:27are blowing
41:28and yet Britain
41:30Britain
41:31you know
41:32it's sort of like
41:32it's behind its windbreak
41:34on the beach
41:35slightly kind of scared
41:36and
41:37I want to say to it
41:39you know
41:39we're humanity
41:41we're clever
41:41you know
41:42I've met
41:43such clever people
41:45and I'm kind of impressed
41:47so I just think
41:48embrace it
41:49it's going to happen
41:50it's you know
41:51the toothpaste
41:52is out of the tube
41:53it's an opportunity
41:55and if you use it
41:57for good
41:58it's going to do
41:59amazing things
42:03I'm back from
42:04San Francisco
42:05and
42:06I want to make
42:07an artwork
42:08about my experiences
42:10there
42:13so I'm trying to think
42:14about what was
42:14the central theme
42:15really
42:16that I was kind of
42:17picking up on
42:18I was there to
42:19confront
42:20these Frankensteins
42:21building this monster
42:23and so I thought
42:25I was going to look
42:25make an artwork
42:26that somehow summed up
42:27that idea
42:29of humanity
42:32attended
42:32to
42:33by these
42:34kind of
42:36doctors of the future
42:37and
42:38I've been really
42:39into
42:40sort of
42:41Christian
42:42orthodox art
42:44and I came across
42:45these images
42:45I've never seen before
42:46and they're called
42:47epitaphios
42:48which
42:49it's kind of epitaph
42:50they're a bit like
42:51in more western
42:53Christian art
42:53a kind of lamentation
42:56you know
42:57there's Christ
42:57his body has been
42:59taken down
43:00from the cross
43:01it's the moment
43:02when his materiality
43:03disappears
43:05and you know
43:06a lot of the talk
43:06about AI
43:07is almost sort of
43:08living in this
43:09virtual world
43:10and I don't
43:11particularly want
43:12that to happen
43:13I want to celebrate
43:14you know
43:15the things we can
43:16touch and smell
43:20most of our existence
43:21you know
43:22most of our joys
43:23are still here
43:24in what you know
43:25we quaintly call
43:26IRL
43:30so I've just started
43:31doing very very
43:32basic doodles
43:33about
43:35how I might
43:36start to organise
43:37the artwork
43:38and the characters
43:51I'm thinking
43:52this finished artwork
43:53would be a tapestry
43:54maybe like
43:55two and a half
43:56by two metres
43:56that I can lay out
43:57on a table
43:58like the epitaphios
44:02Arab Khan
44:04whenever people
44:05ask me about
44:05what was it like
44:06in San Francisco
44:07on your trip
44:08he's the first person
44:09that pops up
44:10in my mind
44:11because I thought
44:11wow
44:13yes we have
44:14these giant
44:14corporations
44:15but I think
44:16in many ways
44:17the people
44:17who are going
44:18to make the money
44:19a lot of the time
44:20are the people
44:20who just have
44:21that intelligence
44:22to make
44:22and have a good idea
44:23and use the technology
44:25to help them
44:28Michael and John
44:30I've got them
44:30they're defending
44:31the family
44:32because they were
44:33quite sweet
44:33in a way
44:34with their
44:34laser turret
44:37there is a sort
44:38of acknowledgement
44:38in this artwork
44:39but we do need
44:41guys with laser cannons
44:42to shoot the drones
44:43out of the air
44:44that would be good
44:47and of course
44:48on these epitaphios
44:49there's quite often
44:50quite a lot of crucifixes
44:51and I was sort of
44:53looking at them
44:53and I suddenly went
44:54come on
44:54they look a bit
44:55like drones
44:58I've got Elon
44:59the godfather
45:01beaming down
45:02from his lair
45:03wherever it was
45:05Jack was interesting
45:07I found it
45:07quite a touching interview
45:08you know
45:09here I was
45:09talking to
45:10you know
45:11one of the founders
45:12of Anthropic
45:13you know
45:14a head honcho
45:15in the AI world
45:16at the moment
45:16what hit me
45:18I was quite touched
45:19by it
45:19moved by it
45:20was that
45:21you know
45:21as someone
45:22who's sort of
45:22grown up
45:23in Britain
45:23where we are
45:24world class
45:24moaners
45:25you know
45:25and also
45:26it feels like
45:27we're on a bit
45:27of a downward
45:28curve
45:28you know
45:29this great
45:30once great
45:30kind of
45:31industrial power
45:32is a bit like
45:33mmm
45:34that he sort
45:35of like
45:35he just gave
45:36me this little
45:36window
45:37into
45:37oh
45:37maybe
45:38this thing
45:38could be good
45:39it could
45:39actually be good
45:41and I was like
45:43and there was
45:44a little bit
45:44in me
45:45it was like
45:45I was just
45:46oh that's what
45:46optimism looks like
45:47and feels like
45:48is it
45:48oh I like that
45:51you're out of the woods
45:52you're out of the dark
45:53you're out of the night
45:54step into the sun
45:56step into the light
45:59keep straight ahead
46:00for the most glorious
46:02place on the face
46:03of the earth
46:04or the sky
46:07hold on to your
46:08breath
46:08hold on to your
46:09heart
46:09hold on to your
46:10hope
46:11march up to the
46:12gate and bid it
46:13open
46:43and I'll see you next time