- 10 hours ago
Category
📺
TVTranscript
00:129-1-1, what is your emergency?
00:15I hit a bicycle that was in the road.
00:17Can you tell if they're injured at all?
00:18They are injured, they need help.
00:24A car crashes in the middle of the night.
00:31A woman is run down.
00:33What's her name?
00:38It's the first death of its kind in the world.
00:41Are you the driver?
00:45A pedestrian killed by a driverless car.
00:49The car was an auto-drive.
00:51I couldn't see it, and all of a sudden it was just there.
00:54Although the vehicle was driving itself using artificial intelligence, there was a human operator behind the wheel.
01:02Overheader?
01:03Yeah.
01:03Okay.
01:04Do I need a lawyer then? I mean, I can't give legal advice.
01:08For the first time ever, the decision had to be made.
01:12Who was to blame for this crash?
01:14Was it the human or the AI?
01:21Artificial intelligence, a machine beyond the mind of man.
01:25For decades, scientists have dreamed of creating incredible machines that could talk like us, learn like us, think like us.
01:37But what we didn't imagine is the impact they would have on us.
01:42In this series, I'm exploring what happens when AI collides with human lives, unearthing stories far stranger than we could
01:52ever have imagined.
02:19I'm Professor Hannah Fry. I'm a mathematician, and I've spent my career
02:23examining the ways technology can transform our future.
02:28Here, in Phoenix, a multi-billion dollar industry has the potential to save millions of lives.
02:35That is weird.
02:39We're on the cusp of a self-driving revolution, and driverless taxis are now a reality in several countries around
02:47the world.
02:48All right, request it.
02:50You hail them on an app.
02:53Ooh, you have my initials on top.
02:55This is my first time in one of these.
02:58Worked with Google for years.
03:00Never been in a driverless car.
03:02Can you imagine?
03:04Here it is.
03:06Where's it going to stop?
03:10Where's it going to stop?
03:12There?
03:14Ooh!
03:16Ooh, that was quite well done.
03:18Here, look.
03:19Big HF on the top.
03:24No-one in the car.
03:26Start the ride.
03:28Go.
03:33It's quite nerve-wracking.
03:35We'll do all the driving, so please don't touch the steering wheel or pedals during your ride.
03:41It's green, but it's slowed down.
03:43Oh, it's turning right, that's why.
03:47Oh, yeah, I sort of don't trust it.
03:49I don't trust it.
03:51You want me changing lanes?
03:58It's not timid, is it?
04:01The technology needed for a car to drive itself is nothing short of miraculous.
04:07The car uses a combination of sensors to understand what's going on around it.
04:14First, cameras.
04:17Great at identifying road signs, traffic lights, pedestrians and other cars.
04:22But they struggle in bad weather.
04:26So, many driverless cars also have radar, often built into the front bumper, which sends out radio waves that bounce
04:35off objects and measure what comes back.
04:38It works well over long distances, but not so well up close.
04:44That's why some driverless cars also use LiDAR, the spinning cylinders you sometimes see on the roof and side of
04:52the car.
04:53It's like radar, but with lasers, and is able to build up a detailed 3D picture of nearby objects.
05:02Precise, but easily confused by reflective surfaces like windows and shiny buildings.
05:18The AI takes these three imperfect systems, pieces together what's actually out there, tries to predict what will happen next,
05:27and decides how the car should steer, brake and accelerate.
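The fusion step described above can be sketched as a toy in a few lines of Python. Everything here is invented for illustration, not taken from any real vehicle's software: the confidence values, distances, and the simple weighted-average fusion are stand-ins for the far more sophisticated probabilistic fusion a production system would use.

```python
# Toy sensor fusion sketch (illustrative only, not any real AV stack):
# combine three imperfect distance readings into one confidence-weighted
# estimate, then choose an action based on the car's speed.

def fuse(readings):
    """Each reading is (distance_m, confidence 0..1); return weighted mean."""
    total_confidence = sum(conf for _, conf in readings)
    return sum(dist * conf for dist, conf in readings) / total_confidence

def decide(distance_m, speed_mps, reaction_margin_s=2.0):
    """Brake if the obstacle is closer than the distance the car
    would cover during the reaction margin at its current speed."""
    return "brake" if distance_m < speed_mps * reaction_margin_s else "continue"

# Camera degraded by weather (low confidence), radar strong at range,
# lidar precise up close -- three imperfect views of the same obstacle.
readings = [(42.0, 0.3), (40.0, 0.8), (41.0, 0.9)]
estimate = fuse(readings)
print(round(estimate, 1), decide(estimate, speed_mps=25.0))
```

The point of the sketch is only the principle: no single sensor is trusted outright; each contributes in proportion to how reliable it is believed to be at that moment.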
05:32I mean, I was writing about these things 10 years ago, but they were a research project.
05:38They were understanding the environment, advancing the engineering, kind of tweaking the software.
05:43It was only very, very recently that they'd become commercially available, that just anybody could hail one.
05:53And soon they're even coming to the UK, starting with the busy streets of London.
06:01It's good at sticking to the speed limit, I'll tell you that.
06:07I think there is actually quite a lot to look forward to from this future with driverless cars.
06:14Like, humans make terrible drivers.
06:16And these things, they're not falling asleep at the wheel, they're not drink driving, they're not getting road rage.
06:22Like, fine, maybe they're not going to be perfect, but there is a lot of scope for roads to be
06:28much safer than they currently are.
06:30And I think that is a goal that is worth pursuing.
06:35But the story of how we got to this point is marked by tragedy and loss.
06:44There was a pedestrian walking a bicycle.
06:47Once the pedestrian got into the lane of traffic, the vehicle struck the pedestrian.
06:51It was a self-driving vehicle.
06:54It was in the autonomous mode at the time.
06:57In Arizona, in 2018, for the first time, someone was struck and fatally injured by a driverless car.
07:07Rafaela Vasquez was the human backup driver in the self-driving Uber vehicle that hit and killed Elaine Herzberg.
07:15After years of avoiding the limelight, the woman at the center of the story, Rafaela Vasquez, had agreed to meet
07:22me.
07:24Hey.
07:25Hey, how are you?
07:26Good, how are you doing?
07:26Do you want to come in?
07:27Yeah, thank you.
07:28You haven't spoken on camera about this before, have you?
07:30No.
07:31How are you feeling about it?
07:32I don't know.
07:33I have a whole plethora of emotions going through me.
07:35But my biggest issue going through this was not being able to rebut anything or even defend myself.
07:43It's like getting a chance to speak for yourself.
07:44Yeah, because I've been dealing with it now for, what, seven years.
07:48So this is you?
07:50Yes, and that's to get you in and out of the buildings.
07:53Rafaela got a job at Uber in 2017, not long after the company had decided to develop their own
08:00self-driving cars.
08:01I like a lot of technology stuff.
08:03If self-driving vehicles were starting to emerge, I knew that Phoenix was becoming a hotbed for it.
08:11So I would see autonomous vehicles roaming around.
08:15I'm interested in that.
08:16So I went and applied.
08:19I passed everything instantly.
08:20So I was excited.
08:23Her job was to ride in the autonomous cars as Uber began testing their new technology on Arizona's public roads.
08:32There was a lot of engineers, coders, but everybody was super nice.
08:35And you have this chance to see the new technology from the inside?
08:38Not just see it, I'm testing it before it even comes out.
08:40I loved my job.
08:42I absolutely loved my job.
08:44When you were in the cars in those early days, how good did you think they were at driving?
08:49They were better than what I thought they were going to be.
08:52And it also depends.
08:53Uber vehicles, for the longest time, had problems with overreacting to things on the side of the road.
08:59And some builds were great, but then they'd do another update to try to fix something else,
09:03and it makes something else go haywire, and then you just don't know.
09:08So much has happened to you since that, right?
09:11Yeah.
09:19During its first year of testing in Phoenix, Uber assigned two operators to every self-driving car.
09:25All right, and we're engaged.
09:28One rode in the passenger seat to track the car's performance.
09:32On the laptop, I can monitor a lot of the prediction software, so I can see, like, where the car
09:37is going.
09:37Any unexpected behavior had to be logged on a computer.
09:42The second operator sat behind the wheel and kept their eyes on the road.
09:47They were expected to take control if the AI in the car malfunctioned.
10:01But a year into testing, Uber changed the setup.
10:05Now one operator had to watch the road and monitor the car's actions.
10:12Okay, come on then.
10:14This decision to switch to a single human in the car came just a few months before Rafaela's crash.
10:22Even riding in the car, I still get nervous.
10:25Yeah.
10:26That's why I'm hesitating, because I'm just trying to not have a panic attack.
10:29I understand.
10:31Oh, yeah.
10:34Sorry.
10:35No, don't apologize, please.
10:37There is no pressure.
10:40Like, we do.
10:41No, I want to do this.
10:41Are you sure?
10:42Yeah.
10:43Okay, here we go.
10:46I don't want my driving to unnerve you.
10:49No, I'm just worried you're going to get pulled over.
10:52Because you're not driving like a typical person.
10:55Driving like a granny?
10:56Well, you're driving like an autonomous vehicle, actually.
10:59Oh, really?
10:59Right at the speed limit.
11:02Each operator was allocated a route which the car would drive again and again.
11:07It could be monotonous work.
11:10Like, the most boring one I hated was this one through this little neighborhood.
11:13You come out, you go here, then this is all like 20 miles an hour down here, down here,
11:20and then boom, back.
11:22And that's all it is.
11:23Round and round.
11:24Three hours.
11:25With the car driving itself.
11:26Mm-hmm.
11:29Humans are not good at paying attention when things get boring.
11:33And with only one operator monitoring a repetitive route, there was a danger of getting distracted.
11:41But when it wasn't quiet on the streets, there could be a lot for one person to do.
11:48God, there's lots of students out, isn't there?
11:50Yes, and this is what we're testing.
11:53Yeah.
11:53And you can't predict what somebody's going to do.
11:55So I would always take it out of autonomous mode during these areas.
12:00Right.
12:00That's why two people was important, but then all of a sudden they changed it.
12:06But there are several screens that you're continually having to look at.
12:10Yes, we were supposed to push buttons and enter codes any time something happened.
12:14I'm sort of trying to put something into an iPad while you're supposed to be monitoring the road.
12:19Yeah.
12:26On the night of March 18th, 2018, Rafaela was on her usual test route.
12:35The car was in autonomous mode and had been running for 19 minutes without incident.
12:48At 9.58 p.m., it turned right onto Mill Avenue.
12:57At the same time, a pedestrian began walking across Mill Avenue, pushing a bicycle by her side.
13:15Rafaela was looking down as the vehicle approached the woman.
13:20The car should have detected her.
13:24But it didn't.
13:33By the time Rafaela looked up and slammed on the brakes, it was too late.
13:38The vehicle struck the woman at 39 miles per hour.
13:43The pedestrian killed was Elaine Herzberg.
13:47She was often seen on her bike in the area.
13:52She never gave up.
13:54She always helped people.
13:56She was funny, funny, funny, funny.
13:58If you were her friend, you felt true love from her.
14:01She didn't deserve what happened to her.
14:13So this is the park?
14:14That's the park.
14:14I did that venue, that theatre.
14:17Everybody crossed there, including homeless people, because homeless people would come up here to the park.
14:22That's where she was going.
14:27This was the first time Rafaela had returned to the site of the crash.
14:36Do you see that sign on that post, that light?
14:39Yeah.
14:40Yeah.
14:40She was over there.
14:41Oh, gosh.
14:44And she's just screaming the whole time.
14:48Yeah.
14:49The whole time.
14:51I...
14:54The screaming, it was...
14:59It was just terrible to hear it.
15:01And then, but then what was worse is when it stopped.
15:05And then the ambulance showed up.
15:08And then they said she's passed away.
15:10And then I lost it.
15:14Somebody died.
15:24Following the collision, the police launched a criminal investigation into the artificial intelligence car and its human operator.
15:43Building a system that can drive a car involves much more than turning a wheel and pressing pedals.
15:49You also need to teach it to do things that humans do instinctively.
15:54Like spotting pedestrians and recognising road signs.
15:58We do that without even thinking.
16:00But training a machine to do it has proved incredibly difficult.
16:05So, back in the early days of AI, the only real form of intelligence that we knew about was human
16:11intelligence.
16:12And so, people look to the human brain for some inspiration about how to build an electronic brain.
16:21And the thing about the human brain is that it is made up by these billions and billions of neurons
16:27that are connected together.
16:29And as you think, you are essentially sending these little electrical impulses, these little bursts that can be big or
16:36small, through this network in your brain.
16:40In the 1950s, people started trying to construct a much simpler computer version, where a network of artificial neurons would
16:48pass signals between each other.
16:50This concept became what we now call a neural network.
16:56But rather than all of your neurons being kind of intermeshed together, they appear in layers, like a sort of
17:04hierarchy.
17:08Decades later, people realised you could use these neural networks to recognise images, like a stop sign.
17:16Here's a simplified version of how it works.
17:20Each neuron in the hierarchy has its own job.
17:24At the bottom, they're just looking at a single pixel each.
17:27And as you go further up the network, things get more sophisticated.
17:32Maybe there'll be a neuron up there that is checking to see if there's some red.
17:38Maybe there's another one up there that's checking to see if there's an octagonal shape that stands out from the
17:43background behind it.
17:45And all of this information, all of these signals get sent up.
17:48So you eventually get right to the very top, to the big boss, who makes a decision based on all
17:56of the information that has flowed through the network, to finally decide whether it thinks it's a stop sign, yes
18:02or no.
18:05The extraordinary thing about these networks is that if you only show it one picture, you'll just be guessing whether
18:12it's a stop sign or not.
18:14But show it thousands and tell it when it's right or wrong, and the AI learns to recognise the sign
18:21itself, adjusting its network every time it makes a mistake.
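The adjust-on-mistake loop described above can be illustrated with the simplest possible "network": a single artificial neuron, a perceptron. The features and examples below are invented for illustration (amount of red, octagon-ness); a real driving network has millions of neurons and learns its own features from pixels, but the trial-and-error principle is the same.

```python
# A toy perceptron -- a drastically simplified stand-in for the deep
# networks described above. It guesses whether a sign is a stop sign,
# is told right or wrong, and nudges its weights after every mistake.

def predict(weights, bias, features):
    """Return 1 ("stop sign") if the weighted evidence crosses zero."""
    score = bias + sum(w * f for w, f in zip(weights, features))
    return 1 if score > 0 else 0

def train(examples, epochs=50, lr=0.1):
    """Repeatedly show labelled examples; adjust weights on each mistake."""
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for features, label in examples:
            error = label - predict(weights, bias, features)
            if error:  # wrong guess: nudge every weight toward the answer
                weights = [w + lr * error * f for w, f in zip(weights, features)]
                bias += lr * error
    return weights, bias

# (redness, octagon-ness) -> stop sign? Red but triangular is not one, etc.
examples = [
    ((0.9, 0.9), 1), ((0.8, 1.0), 1),            # stop signs
    ((0.9, 0.1), 0), ((0.1, 0.9), 0), ((0.1, 0.1), 0),  # not stop signs
]
w, b = train(examples)
print(all(predict(w, b, f) == y for f, y in examples))
```

After training, the neuron classifies every example correctly, having never been told *how* to recognise a stop sign, only *when* it was wrong.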
18:27In the neural networks you'll find in a car, they'll be classifying not just stop signs, but pedestrians, vehicles, lampposts,
18:36road markings.
18:37They are gigantic and gigantically complex.
18:41But the principle is still the same.
18:44This is a machine built through trial and error.
18:55But if errors happen in the real world, on our roads, the consequences can be fatal.
19:22In April 2019, a Tesla Model S failed to stop at a stop sign and ploughed through a T-junction.
19:31Who else is involved?
19:33For what I understand, he was driving the car.
19:36How are you, sir?
19:37I was driving, I dropped my phone and looked down, and I ran the stop sign and hit the guy's
19:41car.
19:42Sir, which car are you driving?
19:43This car, Tesla right here.
19:44The driver of the Tesla, 42-year-old George McGee, had been on his phone when it fell out of
19:51his hand.
19:52He bent down to pick it up, leaving Tesla's autopilot to drive the car.
19:57Take it back, take it back.
19:59But something went wrong.
20:02Boss?
20:03The Tesla hit 26-year-old Dillon Angulo's truck at 62 miles an hour.
20:09Did you stop at the stop sign?
20:11No, I didn't, sir. I don't think.
20:12I honestly don't know. I looked down. I didn't know how close I was to the intersection.
20:16And I was driving on a cruise, going through it, and I looked down, and to get the phone,
20:20I dropped, and I reached down, and I didn't see it.
20:24What do you do?
20:25I manage a private equity fund out of Boca.
20:28I'll explain the Tesla.
20:30Yes, sir.
20:32Remarkably, Dillon survived the accident.
20:39He got ejected.
20:41Shit.
20:43But he wasn't alone that night.
20:46What do you think?
20:47There's nothing real serious, sir.
20:49Okay, wait a minute, though. There's ladies' flip-flops.
20:52Yeah, but it was pointed.
20:54There's a pair of ladies' flip-flops.
20:56Please tell me this.
20:58Get back.
20:58Shit, I'm sorry, sir.
21:11This picture was actually the day of the crash.
21:20And we were going fishing, and we stopped to get bait at the bait store.
21:27At the time of the accident, Dillon had been with his new girlfriend, 22-year-old Naibel Benavides.
21:34Is this her?
21:37Oh.
21:38The impact from the Tesla killed Naibel instantly.
21:43She's so gorgeous.
21:45Nibel always had this peace and happiness to her.
21:49Just being around her would just...
21:52It would rub off on you, you know?
21:55I was going to meet her mom the next day, you know?
21:57We were going to catch the fish, and then...
22:00I was going to cook lunch for her mom the next day, and, uh...
22:04Was that the first time you were going to meet her mom?
22:06Yeah, it was going to be the first time.
22:07Mm-hmm.
22:08And unfortunately, you know, the first time that I...
22:11I have to meet her mom were under these circumstances.
22:24I'm so sorry this happened to you.
22:31Right away, they started doing an investigation into the accident.
22:35And I finally get my hands on the police body cam video.
22:42And in that police body cam video, the driver, yeah, he's like,
22:45I was driving, I was on the phone, I had the car on autopilot cruise.
22:51I started to do research, and I had no idea this existed, you know?
22:58This thing called autopilot, where the cars can drive themselves, you know?
23:04And, um...
23:06And right then and there, I was like,
23:09this guy was relying on this car to drive itself.
23:14This is why this happened to us.
23:28Autopilot is Tesla's advanced driver-assist function,
23:32highlighted in their slick promotional videos
23:35that sell a vision of technological sophistication,
23:38safety, and convenience,
23:40showing that it can steer, brake, and change lanes
23:43on real roads in the real world.
23:47Tesla car next year will probably be 90% capable of autopilot.
23:53Like, so 90% of your miles could be on auto.
23:55Other car companies will follow.
23:57Elon Musk has even tweeted that his cars
24:00can completely drive themselves.
24:03But they can't.
24:05The driver's manual says a human still has to be
24:07in full control of the car.
24:10This thing is psychotic.
24:13Oh, my God!
24:13In our city, yeah, it's turning.
24:14I'm not involved in this. Watch the road.
24:16What do you think, autopilot?
24:17Am I on the wrong side of the road?
24:18Whoa, Jesus, that was scary.
24:19Wait, there's a double yellow line.
24:21Am I on the right side of the road?
24:22I'm not even...
24:22No, you are not!
24:23Get the fuck over!
24:24Does...
24:25Are you serious?
24:25Yes!
24:26Look!
24:27Don't fucking trust this thing.
24:28And there's serious safety concerns
24:31over the autopilot feature.
24:35This Tesla crashed into a highway divider in California,
24:39killing the driver.
24:40Another slammed into a parked fire truck in Utah.
24:43Both had the autopilot feature on.
24:51This is the bend right before the intersection
24:54where the accident happened.
24:56And at night, from right here,
25:00you could already see the red light blinking from right here.
25:03Oh, yeah.
25:04I see it.
25:13This is where her body ended up laying.
25:18You know, we pulled over to look at the stars.
25:22It's crazy how far her body flew, you know?
25:27That's how fast the car was going,
25:29straight through this intersection.
25:32Six months after the fatal collision,
25:34the man behind the wheel of the Tesla,
25:36George McGee,
25:37was charged with careless driving,
25:40which he didn't contest.
25:44But Dillon wanted Tesla to also be held accountable.
25:49These car manufacturers,
25:50they need to do a better job
25:51with designing these cars.
25:55We want justice.
25:56Naibel's family and I,
25:58we want justice.
26:00These cars were allowed on the road
26:01before they were ready to be on the road.
26:04They were not safe.
26:05And they were advertised as these cars
26:07that can drive themselves.
26:14Tesla offered Dillon and Naibel's family
26:17an undisclosed sum
26:19to draw a line under the case.
26:22But they refused.
26:24And a court date was set.
26:27Tesla would now face a jury trial
26:30over its autopilot system.
26:46First at five,
26:47the video showing the moments
26:48before an Uber self-driving car crashes.
26:51Backup driver, Rafaela Vasquez,
26:53looks down for approximately four seconds.
26:56Back in 2018,
26:58the fallout from Rafaela's crash
26:59had left the whole self-drive industry
27:02hanging in the balance.
27:03If there's no human driver,
27:05who bears the responsibility
27:07when a car makes a misstep?
27:09Is there enough safety built in
27:10before we deploy these vehicles?
27:12Is there any suggestion at the moment
27:15that Uber has done something wrong?
27:17We don't know.
27:20Under intense media
27:22and political scrutiny,
27:24Tempe's police investigation
27:25left no stone unturned.
27:28So, the police have sent over
27:31all of their digital evidence.
27:35And I have not yet watched this.
27:37Let's have a look.
27:40The actual car
27:42in the police compound
27:43being analysed.
27:45Oh, my gosh, look at that.
27:47And then there's also one
27:48that says,
27:48seize phones.
27:53They're knocking on someone's door.
27:54It's the police department.
27:55What was the...
27:56blurred everyone's faces.
28:01Hey.
28:02Hey, Rafaela.
28:04We're coming up to do follow-up
28:06regarding the accident.
28:07We have a seizure warrant
28:09for your cell phone.
28:10They've got a warrant
28:11for her phone.
28:13Which one...
28:14Which number did you want?
28:15Hey, Rafaela,
28:16how many phones do you have?
28:17What?
28:17How many phones do you have?
28:19I have my work phone
28:20and then my personal phone,
28:21but my work phone
28:23is the one I had for work.
28:24So, the warrant is saying
28:24we need both phones.
28:25Well, we need all the phones.
28:26We need how many phones?
28:28All your phones.
28:30Why are they taking our phones?
28:36Thank you, Rafaela.
28:44Hi.
28:45Good morning.
28:46Casey Marsland was the lead detective
28:48on the case.
28:49He agreed to show me the footage
28:51he'd obtained from Uber.
28:53So, this is the footage
28:54with all three of the camera views.
28:57When I first hit play,
28:59we can see the driver
29:00looking down multiple times.
29:05And there didn't seem to be
29:07a logical explanation
29:08as to why she was looking down.
29:10So, we're coming up
29:11to the crash now, are we?
29:12Yes.
29:14And there.
29:17Can you go back a few frames
29:19and I just want to see
29:19what was happening
29:20immediately before?
29:21Sure.
29:24So, looking down,
29:25looking down,
29:25looking down,
29:26looking down,
29:26looking down,
29:27looking up.
29:28Oh, wow.
29:30What did she say she was looking at?
29:32Initially, there was a statement
29:35that it had to do
29:36with the iPad
29:36in the center of the vehicle.
29:40And does that broadly stack up?
29:42Were Uber requiring people
29:43to look at screens
29:45while they were driving?
29:46So, yes.
29:47That was one of the purposes
29:48of her job
29:49was to monitor the vehicle
29:51and if there was
29:52anything abnormal,
29:53she would need to
29:54interact with that screen.
29:55I mean,
29:55the conflicting instructions,
29:57right?
29:57That, like...
29:58Yes.
29:59They're supposed to be
29:59looking at screens
30:00but simultaneously
30:01monitoring the road.
30:02It definitely presents
30:03a bit of a challenge.
30:04Yeah.
30:05However, when we, of course,
30:06looked at what the screen
30:07was actually showing
30:08at the time,
30:09it didn't show
30:10that there was
30:11any sort of alerts
30:12or any sort of interaction
30:13with the screen.
30:14We looked into the phone.
30:15We saw the apps
30:17that were installed
30:18on her personal phone.
30:20We noticed Hulu
30:21was an active app
30:22on her phone
30:22and when we wrote
30:24a search warrant
30:24to the company,
30:25they responded with
30:26a printout of the activity
30:28on the account
30:28and that's what gave us
30:30probably the most
30:31important information
30:32at the time.
30:33Then what did it say?
30:35So, this was the information
30:37that was provided
30:37from Hulu
30:38and it shows
30:39that the show,
30:41the voice,
30:41was being streamed
30:42to her personal phone.
30:44So, what was your take
30:45on seeing all of this?
30:47Essentially, this goes
30:48to the beginning
30:48of her shift.
30:49She started her shift
30:50by driving the vehicle
30:51out of the garage
30:53and it was at that time
30:54before she even got
30:55onto the roadway
30:56that she had set up
30:57and started streaming
30:58the Hulu.
30:59And so, I think
30:59that the conscious decision
31:01to provide yourself
31:03with a likely distraction
31:05before even getting
31:06onto the roadway
31:07is an absolutely reckless
31:10decision to be made.
31:22So, I found
31:23a couple of clips
31:25around the time
31:27and lots of them
31:28are not on your side.
31:30Some of them
31:31were terrible
31:31especially at first.
31:33Take a look at this.
31:33This is Hulu Records.
31:35It turns out
31:36she was streaming
31:37the singing competition
31:38The Voice
31:39at the time
31:40when she should have
31:40been watching the road.
31:42Rafaela Vasquez
31:43was riding
31:44in autonomous mode
31:45at the time
31:46but she was distracted
31:47and looking down
31:48for more than 30%
31:49of the nearly 22 minutes
31:51before the crash.
31:53Right there.
31:54They're right.
31:55I wasn't distracted.
31:57I was doing my job.
31:58I had other things
31:59to do other than
31:59operate the vehicle
32:00because we went
32:01to one person.
32:02So, I had duties assigned.
32:03Like in the video
32:04you see me look away.
32:05I am looking away.
32:06But if you time them
32:07and even the police did
32:08I never looked away
32:10for more than five seconds.
32:11Why?
32:11Because we were trained.
32:12We were trained to look.
32:13Boom.
32:13iPad five seconds
32:14then look back.
32:15Boom.
32:16Over here five seconds
32:17look back.
32:18Boom.
32:18Our cell phones
32:19we have to Bluetooth them
32:20into the vehicle.
32:23It wasn't watching TV.
32:24I was listening to it.
32:26But the police said
32:26I was watching.
32:27That means that's
32:28a definitive statement.
32:29You just told the public
32:30that you have evidence
32:32that I was watching something.
32:33That's bullshit
32:34because you don't.
32:35You have evidence
32:36that I was Bluetooth
32:36streaming something.
32:38But hello
32:38streaming Bluetooth
32:39you can listen to stuff.
32:41People do it all the time
32:42with music.
32:44Everyone
32:45automatically assumed
32:46because the police said it
32:48that I was watching Hulu.
32:51And because of that
32:53I was negligent
32:54therefore I was unable
32:55to prevent the accident
32:56therefore the charge.
32:58The investigation
32:59lasted for two and a half years
33:02before Rafaela
33:03was finally charged.
33:05Where were you
33:05when the indictment
33:06came through?
33:07I was at home
33:08because my attorneys
33:09informed me.
33:10They just told me
33:10that I was going to be indicted
33:11and for negligent homicide.
33:12I said like what?
33:13And they said
33:14negligent homicide.
33:15I said
33:15Negligent homicide?
33:16I said homicide.
33:18I said so murder.
33:20That's just me
33:21how I interpret that.
33:22So I now
33:24I'm like
33:25I didn't know
33:26what to say or do.
33:28Now I'm in shock.
33:30If found guilty
33:32Rafaela faced
33:33up to eight years
33:34in prison.
33:42With conflicting accounts
33:44from both sides
33:45I decided to track down
33:46the official accident report.
33:52The report concluded
33:54that the crash
33:54was probably caused
33:55by Rafaela's
33:56failure to monitor
33:57the driving environment.
34:00But it was also
34:01critical of Uber
34:02uncovering troubling flaws
34:04in the design
34:05of its AI programming.
34:10This table
34:11is particularly interesting
34:12because this is like
34:13seeing inside the brain
34:15of what the driverless car
34:16was thinking
34:17at the time.
34:19The car actually notices
34:20that there's something there
34:225.6 seconds
34:23before the collision
34:24which is
34:26plenty of time
34:27to start braking
34:29or turning.
34:30And initially
34:31it's radar
34:32that detects
34:33there's something
34:33in the distance.
34:34It predicts
34:35it's a vehicle though.
34:36And then 0.4 seconds later
34:39the lidar kicks in
34:40and picks up
34:42that there's something there.
34:44The lidar doesn't know
34:45what it is though.
34:46It just classified it
34:47as other.
34:47Something unknown.
34:49Now the lidar
34:50keeps changing its mind.
34:52So
34:53a full second later
34:55changes its mind
34:56and decides it's a vehicle.
34:57Then 0.3 of a second later
34:59changes its mind again
35:01and then
35:02it keeps switching.
35:04Vehicle, other,
35:05vehicle, other.
35:06The computer
35:07within this driverless car
35:09is
35:09not
35:10connecting the dots.
35:12It's not seeing this
35:13as one object
35:14whose classification
35:15keeps changing
35:17whose path
35:18is slowly moving
35:19over time.
35:21Instead
35:21it is seeing this
35:22as brand new objects
35:24every single time.
35:30In fact it's only
35:311.2 seconds
35:32before impact
35:34that it finally
35:34decides to settle
35:35that it's a bicycle.
35:41Obviously way too late
35:42to actually do anything
35:43about it.
35:46Terrifying.
35:47You design a system
35:48that fails to track
35:50the path of an object
35:51until it's decided
35:52what it is.
35:53If you've got something
35:55that you're going to
35:55collide with
35:56I don't care what it is.
35:58I care about
35:59that it's going to collide.
36:05That's insane.
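The flaw described here can be sketched in a few lines of Python. All names, positions and timings below are invented for illustration; this is not Uber's code. The point is structural: if every reclassification starts a brand-new object, the motion history needed to predict a collision is thrown away.

```python
# Illustrative sketch of the tracking flaw described in the report:
# treating a re-labelled detection as a brand-new object discards the
# motion history needed to see that it is drifting into the car's path.

def predict_drift(history):
    """Given >= 2 (time_s, lateral_pos_m) points, extrapolate sideways
    velocity toward the lane; with fewer points, no prediction is possible."""
    if len(history) < 2:
        return None
    (t0, x0), (t1, x1) = history[-2], history[-1]
    return (x1 - x0) / (t1 - t0)

detections = [  # (time_s, lateral_pos_m, class_label) -- values invented
    (0.0, 6.0, "vehicle"), (0.4, 5.5, "other"),
    (0.7, 5.1, "vehicle"), (1.0, 4.6, "other"),
]

# Flawed design: start a fresh track every time the label flips.
track = []
for t, x, label in detections:
    if track and label != track[-1][2]:
        track = []  # history discarded with every reclassification
    track.append((t, x, label))
print(predict_drift([(t, x) for t, x, _ in track]))  # None: one point left

# Classification-independent tracking: keep one track regardless of label.
full_track = [(t, x) for t, x, _ in detections]
print(predict_drift(full_track))  # negative: moving toward the car's lane
```

With the full track, the steady sideways drift is obvious after two detections; with the resetting track, the system never has enough history to predict it, no matter how early the sensors first saw the object.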
36:07Although the system
36:09sensed the pedestrian
36:09nearly six seconds
36:10before the impact
36:11the system never
36:12classified her
36:13as a pedestrian
36:14or predicted correctly
36:16her goal.
36:18The system design
36:19did not include
36:21a consideration
36:22for jaywalking pedestrians.
36:26It wasn't designed
36:27to recognise people
36:29unless they were
36:31on a crosswalk.
36:36I think this is
36:37pretty damning actually.
36:41What was this thing
36:42doing on the roads?
36:51The report made me
36:53want to know more
36:53about what was happening
36:54inside Uber
36:56around the time
36:57of Rafaela's crash
36:58in 2018.
37:01After some digging
37:02I found Robby Miller.
37:04Robby.
37:05Hi.
37:06An operations manager
37:07who'd been at Uber
37:08at the same time
37:09as Raffaella.
37:10He had quit
37:11on safety grounds
37:12just a few days
37:14before the fatal collision.
37:16I'd been working
37:17in the self-driving space
37:19for four or five years
37:20at this point.
37:21I was running
37:23the self-driving truck fleet
37:24for Uber.
37:25Eventually there was
37:25a move to combine
37:28the operations
37:29of the testing
37:30for self-driving cars
37:32and self-driving trucks.
37:36How did you feel
37:37about that?
37:37I was not at all
37:39comfortable with it.
37:40The self-driving cars
37:41were having
37:41significant issues.
37:43There's a lot
37:44of really scary
37:45incidences
37:47that are occurring
37:47near misses,
37:49near collisions.
37:50We would have
37:51an incident
37:53where the car
37:54was driving
37:55on the sidewalk
37:56in broad daylight.
37:58And you realize
37:59they are headed
38:02on a path
38:03where someone
38:04is going to get
38:05seriously injured
38:06or worse.
38:07And so
38:08I gave notice.
38:11There's this
38:12overarching fear
38:14I would say
38:15at Uber
38:15that Waymo
38:17is about
38:18to release
38:19their self-driving cars.
38:21And it's
38:22I think it's
38:23very scary
38:23for Uber's leadership
38:25to not have
38:26a response.
38:27So it's turned
38:28into a race?
38:28It is absolutely
38:30a race.
38:31Waymo,
38:32owned by Google's
38:33parent company
38:34Alphabet,
38:35was Uber's
38:36arch-rival.
38:38Just five months
38:39after Waymo
38:40launched its
38:40first public trial,
38:42Uber moved
38:43from two operators
38:44in the car
38:45to one.
38:48You need to show
38:49to your investors,
38:50hey, we're making
38:51this progress.
38:52An easy way
38:53to do that
38:53is just take
38:54someone out
38:55of the car.
38:55And this is
38:56something I mentioned,
38:57you need that
38:58second person
38:58in the vehicle.
38:59You're not ready
38:59to take that person
39:01out of the vehicle.
39:02Just a few days
39:03after I left,
39:05the crash occurred.
39:08And...
39:08When did you
39:09first hear about it?
39:11I was driving
39:12and I received
39:15a phone call
39:16from one of my
39:17former co-workers
39:19at Uber.
39:22And I sat
39:23in the parking lot
39:24and cried
39:24for 15 minutes.
39:31Wow.
39:33I'm very passionate
39:35about the technology.
39:36I believe in the technology.
39:37I want the technology
39:40to succeed.
39:42But there's...
39:44There's a way to do it.
39:53A few years
45:54after Rafaela's
39:56fatal crash,
39:57Uber sold off
39:58its self-drive division,
39:59leaving the path
40:00clear for Waymo.
40:05There you go.
40:06Waymo's just crashed.
40:10AI technology
40:10not working.
40:13Why is this
40:13happening to me
40:14on a Monday?
40:15I'm in a Waymo car.
40:16Next up, Rider support.
40:17This call may be recorded
40:19for quality assurance.
40:20This car is just
40:21going in circles.
40:22Either mine,
40:23a scab.
40:23I'm calling
40:24from Waymo support.
40:25Yeah,
40:25I got a flight to catch.
40:27Why is this thing
40:27going in a circle?
40:28I'm getting dizzy.
40:29I understand.
40:30Waymo are now
40:31the dominant force
40:32in driverless taxis
40:34in the US,
40:35with two and a half
40:36thousand of them
40:36on the road.
40:42Experts have praised
40:44their safety record.
40:48But some see these cars
40:50as symbols
40:50of the powerful tech elite
40:52and their disconnection
40:54from the lives
40:54of ordinary people.
40:56Hey.
40:57Hello.
40:58And there's one group
41:00who've decided
41:00to take action.
41:02Hello there.
41:02Nice to meet you.
41:03Nice to meet you.
41:04How are you all doing?
41:05Doing well.
41:06Doing well.
41:06They call themselves
41:07the Safe Street Rebels
41:09and carry out
41:10their operations
41:11incognito.
41:12What is it about
41:13the autonomous driving
41:16that you dislike, though?
41:18They are heavily
41:19promoting themselves
41:20as the future
41:22of public transit.
41:23And we just
41:24fundamentally don't think
41:25they're the future
41:25of public transit.
41:26They're just a taxi
41:28where you don't
41:28talk to someone.
41:29And when that car
41:30is not parked,
41:31it's still driving around
41:32waiting.
41:3350% of the miles
41:34they drive,
41:34there's nobody
41:35in the vehicle.
41:36In addition,
41:37they cannot be ticketed
41:38for any kind
41:39of moving violation
41:40in the city.
41:41We have videos
41:41of them driving
41:4240 miles an hour
41:43on the wrong side
41:43of the road
41:44and nobody
41:45can do anything
41:45about it.
41:46They're completely
41:46immune.
41:47They're above
41:48the kind of rules
41:49that would stand
41:49for an Uber driver
41:50or a Lyft driver.
41:51Yes.
41:51And if they can't
41:52do something about it,
41:53we can.
41:55To express
41:56their frustration,
41:57the group go around
41:58San Francisco
41:59disabling Waymos.
42:03Let me understand
42:04the strategy then.
42:05So how easy
42:06are they to override?
42:08Pretty easy.
42:09Probably pretty easy.
42:11We're not allowed
42:12to show you
42:12how they disable
42:13these cars,
42:14although it's not
42:16exactly high-tech.
42:17So you're not
42:18putting them
42:18out of action
42:19permanently?
42:19No, and we're not
42:20causing any damage
42:21to them either.
42:21It's painter's tape,
42:22so it doesn't leave
42:23any residue
42:23when you remove it.
42:24There's one coming.
42:26All right.
42:27There's one.
42:28There's one.
42:28Check this one.
42:30Someone go, Frank.
42:35Poking up!
42:36Poking up!
42:40Did you hear that?
41:41There were people
42:41on the sidewalk
42:42who were, like,
42:43cheering them on.
42:45The disabled car
42:46was now blocking
42:47the road.
42:53So the one behind
42:54hasn't been taped
42:56but is stuck.
42:57There's another one
42:58coming.
42:59Oh, my gosh.
43:02Three of them.
43:04And they're just stuck.
43:06And then now
43:06there's another car
43:07behind flashing
43:09but it has a human
43:10driver so they can
43:10go around the problem.
43:15Is it actually
43:15illegal,
43:16what you're doing?
43:17We don't think
43:18it's illegal.
43:19We can't find any law
43:19that we're breaking.
43:20It's not vandalism.
43:21So it's hard to point
43:22to any law we're breaking.
43:25Are there not other ways
43:26that you could do this?
43:27I mean, can you...
43:28I don't know.
43:28Is there not, like,
43:29a more
43:29sort of democratic way?
43:31It would be nice
43:32if we could vote,
43:33if we had the opportunity
43:33to vote on them
43:34but we have not been
43:35presented with that
43:35opportunity yet.
43:37Is this...
43:37Does it feel a bit like
43:38this is something
43:39that has happened
43:39to you rather than
43:40with you?
43:41I mean, absolutely.
43:42Hey, there's one
43:43coming on the other side.
43:44Other side.
43:44Other side.
43:45Other side.
43:45It's coming, yeah.
43:49Yeah, you want to get it back?
43:56They are surprisingly
43:57easy to bully,
43:58aren't they?
44:01The other thing
44:02that I think
44:02was really surprising
44:03was just how many
44:05of them were going
44:07past, right?
44:08And maybe it's
44:08the time of day
44:09but, I mean,
44:11hardly any of them
44:12have people inside.
44:17Yeah, it's empty.
44:23They did say something
44:24that I hadn't considered
44:25before which is about
44:26how they are continually
44:28circling when they
44:30don't have people inside them
44:30and that I hadn't
44:32considered because,
44:32of course,
44:33there's a congestion
44:33aspect but there's
44:36an environmental
44:37aspect too, right?
44:38I mean,
44:38they've got to be
44:38powered.
44:41Hmm.
44:42Interesting.
44:44Interesting.
45:08It happened fast.
45:10The hearing for Rafaela
45:11Vasquez was scheduled
45:13as a settlement conference
45:14but it quickly led
45:15to a plea deal.
45:17In July 2023,
45:19after five years
45:20of legal wrangling,
45:22Rafaela accepted
45:22a plea deal
45:23to avoid going to jail.
45:25The agreement indicates
45:26that she wished
45:26to plead guilty
45:27to the crime
45:27of endangerment.
45:28The offence is
45:29non-dangerous,
45:30non-repetitive.
45:31Is that the crime
45:32that she wished
45:32to plead guilty to?
45:34Yes.
45:35She pled guilty
45:36to a reduced charge,
45:39endangerment,
45:39with the guarantee
45:40it would be lowered
45:41to the least serious
45:42category of offence
45:43following three years
45:45of probation.
45:48Hi.
45:49Hey, how are you?
45:50I wanted to meet
45:51Rafaela's lawyers,
45:53Al Morrison
45:54How are you?
45:55and Marcy Crater.
45:56So good to see you.
45:58Why did you take
45:59this case on?
46:01Our job is to fight
46:01for the little guy.
46:02No offence.
46:03But she was
46:05the little guy.
46:07And the more
46:08we got into this case,
46:09the more we realized
46:09just how one-sided it was.
46:12Where do you think
46:13this came from then?
46:14I mean, is this,
46:15is this the police
46:17or is this from Uber?
46:18Uber.
46:19Really?
46:20They did whatever
46:21they could to make it
46:22not their fault.
46:24And it's David and Goliath.
46:26Yeah.
46:27We have this single person
46:29up against a multi-million dollar
46:31company with unlimited
46:33resources.
46:34The problem was
46:35that there were so many
46:37aspects of the way
46:37these cars were programmed.
46:39It didn't account
46:40for real-world situations.
46:41The most obvious one is
46:43the thing wasn't programmed
46:44to deal with jaywalkers.
46:46To base yourself
46:47in a campus,
46:48a college campus,
46:50to not program vehicles
46:51to deal with jaywalking,
46:52I think,
46:53is the height of negligence.
46:54I was scared
46:56to go to trial.
46:57If I lose,
46:59it's prison time,
47:00no ifs, ands or buts.
47:01And the media
47:03already destroyed me
47:04out there.
47:05That's always in the back
47:06of our minds as lawyers
47:07and certainly our clients' minds,
47:09that it's the exposure.
47:10What's going to happen to me
47:12if I go to trial and lose?
47:14But Marcy and I felt like
47:15we had a great case,
47:16but we also understood
47:17the risk was all hers.
47:19Me not going to trial
47:20has no reflection on them.
47:22These two saved my life.
47:23They absolutely saved my life.
47:32No criminal charges
47:33were ever brought
47:34against Uber.
47:45Rafaela got dealt
47:46a really rough hand
47:48and this didn't play out fairly.
47:53Maybe she was
47:54watching her phone.
47:55Maybe she wasn't.
47:56Like, I...
47:58Honestly, I don't know.
47:59But I also...
48:00I don't think
48:01that that's the point of this.
48:03The scandal here
48:05is that you have got
48:06this massive,
48:08corporate,
48:10multi-billion dollar project
48:12which is not prioritising
48:15the safety of the people
48:16who are in the cars
48:18or, like,
48:18the members of the public
48:19who haven't even agreed
48:21to participate
48:21in the experiment.
48:23It is not enough
48:25to say that
48:26all of the responsibility
48:28lies with the person
48:30who was in the car.
48:32That's not enough.
48:34That's not enough for me.
48:47Back in Miami,
48:48Dylan and Nybel's family
48:49were taking Tesla
48:50to court
48:51over the fatal crash
48:52that had killed Nybel.
48:55Nice to meet you, Anna.
48:56Adam,
48:57good to see you.
48:57Great to meet you.
48:59What's up, buddy?
49:03Dylan's lawyer,
49:04Adam Bumel,
49:05brought me up to speed
49:06on the case.
49:07This is a sort of situation
49:09where there's no precedent
49:10at all.
49:11Does that make it
49:12quite difficult
49:12to build a legal case
49:14when you're,
49:14I don't know,
49:15forging a new path?
49:17Extremely.
49:17I mean,
49:18we had to do it
49:18from the start.
49:19This is creating the law.
49:22Obviously,
49:22from day one,
49:23Tesla's position was
49:24this is 100%
49:26driver's fault.
49:28The thing is
49:29that it's difficult
49:29to untangle
49:30the responsibility
49:31in this case
49:32because you're not saying
49:33that the driver
49:34had no responsibility
49:35at all, right?
49:36Of course not.
49:37We 100% acknowledge
49:39that the driver
49:40had responsibility
49:41and our position
49:42from day one
49:43has been
49:43this is a case
49:44of shared responsibility.
49:46Yes,
49:47the driver was at fault.
49:48He was distracted.
49:49He was disengaged
49:50from the driving task
49:51but then you have
49:52to ask yourself
49:53why was he disengaged
49:55from the driving task
49:56and you realize
49:57that Tesla fostered
49:59this belief in him,
50:00this trust of the system
50:02that was unwarranted.
50:04He thought
50:05that the car
50:06would stop
50:07or swerve
50:09or do something
50:10before plowing
50:11into a parked vehicle.
50:12Do you have the data
50:13from inside the car?
50:14Do you know
50:15what the car was seeing?
50:16So we were able
50:17to finally get the data
50:18from inside the car
50:20showing what the autopilot
50:22computer was detecting
50:23and processing.
50:26The computer understood
50:28that the car
50:28was speeding
50:29towards the end
50:30of the road.
50:31It appreciated
50:32the stop sign.
50:34It appreciated
50:35the blinking red light.
50:37It appreciated
50:39Dylan's parked SUV.
50:43So it identified
50:44a lot of things
50:45and it did nothing.
50:55The car knew
50:56what was going on
50:58in front of it
51:01and did nothing
51:01to warn the driver
51:02to stop the car.
51:03So this isn't a case
51:05of misinformation
51:06inside the car's system.
51:08The car has all
51:09of the information
51:10it needed
51:10in order to avoid
51:11the collision
51:12but it was programmed
51:16not to behave
51:17in the way
51:17that the driver
51:18thought it was programmed
51:19to behave.
51:20Did the car behave
51:21as people expected
51:23and believed it should
51:24based off of
51:25the Tesla's marketing
51:27and Elon Musk's
51:28statements
51:29and kind of hyping up
51:31the capabilities
51:32of this car?
51:32When you take
51:35that backdrop
51:35and you put it
51:36against this case
51:37the car did nothing
51:39that anybody
51:40thought it would
51:41or should.
51:42So this is almost
51:44like become a case
51:44of like
51:46misadvertising
51:47or like
51:47misrepresentation then?
51:49They make
51:50the consumers
51:51and drivers
51:52believe that the car
51:53is more capable
51:54creating false
51:56expectations
51:57in their drivers.
51:59In the Uber crash
52:00the blame
52:01had been laid
52:02firmly
52:02on Rafaela's
52:03shoulders.
52:07With Tesla
52:08would this now
52:09be the first time
52:10the car itself
52:11would be considered
52:12at fault?
52:33tonight a Florida jury
52:35forcing Tesla
52:36to pay $243 million
52:38to victims
52:39in a deadly
52:402019 crash.
52:41The jury
52:42finding flaws
52:42in Tesla's
52:43self-driving software
52:44were partly to blame.
52:46We can be proud
52:47that we stood up
52:48and that we did
52:49everything in our power
52:51to help shine light
52:53on what's going on.
52:56Against all odds,
52:58Dylan emerged
52:59victorious in court
53:01forcing Tesla
53:02to take
53:03some responsibility
53:03for the first time
53:05ever
53:05for a crash
53:06involving its
53:07autopilot system.
53:11It was a shocking
53:13landmark verdict
53:15that could well
53:16reshape the future
53:17of self-driving cars.
53:20When they gave
53:21the verdict
53:22how was it?
53:24It was very
53:24emotional
53:25you know
53:26I mean
53:26like me and my dad
53:28we started hugging
53:28each other
53:29and you know
53:29praying that
53:32we would
53:33get justice.
53:34You know
53:34from the beginning
53:35we knew this was
53:36a joint liability case
53:37and the jury
53:39decided to hold
53:40Tesla accountable
53:40and I'm grateful
53:43for that
53:44and I'm grateful
53:45that people
53:47heard all the evidence
53:48and saw
53:49that Tesla
53:50made a mistake.
53:51I really get the impression
53:52that this has never
53:53been about money
53:54for you.
53:55It was more important
53:56for us to
53:58to shine light
53:59and to show
54:00the world
54:01that this technology
54:02is not safe.
54:05People will come
54:07to me and
54:07you know
54:08tell me congratulations
54:09but I don't feel
54:11like it's something
54:13to celebrate.
54:14Like a...
54:15You're not walking
54:16away from this
54:16a winner.
54:17I would say
54:18it's more of just
54:20justice was served.
54:23Yeah.
54:25Tesla planned
54:27to appeal the ruling.
54:43Meanwhile in the UK
54:45British firm Wayve
54:47will launch
54:48their self-driving cars
54:49in London
54:50later this year.
54:54This feels like
54:55so much more
54:56high stakes
54:56than it did
54:57when I was in
54:57Phoenix.
55:00I mean this is
55:01slightly wild for me
55:02right?
55:02Like I've lived
55:02in London
55:03for 20 years
55:04and we're driving
55:05in a driverless car
55:06down Camden
55:07High Street.
55:08This is honestly
55:10something I thought
55:11was much further
55:12in the future.
55:12It never gets old.
55:14Wayve CEO Alex Kendall
55:16points out
55:17that the AI
55:17in driverless cars
55:18has come on
55:19leaps and bounds
55:20in the past few years.
55:22And this is interesting
55:23this person's body
55:24language was sort of
55:25turning backwards
55:25and forwards
55:26away from the crossing.
55:27So that was noticeable
55:28actually the car sort of
55:29changed its mind
55:30a couple of times.
55:31Yeah we saw that person
55:31turn their body
55:32back and forth
55:33the AI has got
55:33quite good at learning
55:34that kind of intent.
55:36So it's like
55:37super super fancy
55:38cruise control.
55:39It's a completely
55:40different experience
55:42from cruise control.
55:43Have I just undercut
55:44your product now?
55:45You're comparing
55:46a floppy disk
55:47with a quantum computer.
55:51Think of AI
55:52as the next evolution
55:54of tooling.
55:55When you think about
55:55the wheel
55:56the calculator
55:57the computer
55:59different tools
55:59that as humanity
56:00we've invented
56:01that push forward
56:02society.
56:03I think intelligent
56:04machines are going
56:05to be the
56:05next evolution
56:06of this.
56:13Whether you like
56:14it or not
56:15driverless cars
56:16are coming.
56:16I mean they're
56:17here already.
56:19And I still think
56:20there's lots of good
56:20to come from that
56:21but to get to this
56:22point we had to go
56:25through that difficult
56:26phase.
56:26We had to go
56:27through the learning
56:28which I mean
56:30by definition
56:31learning involves
56:32making mistakes.
56:34It's just that
56:35these mistakes
56:36you know
56:37these deaths
56:37these lives
56:38that were ruined
56:39I don't think
56:40that they were
56:41inevitable.
56:41I don't think
56:43that they were
56:43an acceptable price
56:45to pay
56:46for a new technology.
56:49I think different
56:49choices could have
56:50been made.
56:51I think there
56:51were different ways
56:53to balance
56:53the rollout
56:54of the new product
56:55with the safety
56:56of the public
56:58and I just hope
56:59now that these
57:00are lessons
57:01from the past
57:02you know
57:02I hope
57:03that the steepest
57:04part of this
57:05learning curve
57:05is now behind
57:07us.
57:54The chief executive of one of the United States' largest health insurance companies has been
57:59shot and killed in New York.
58:01The alleged killer has been identified as Luigi Mangione.
58:05The suspect had something to say about the insurance industry.
58:09The insurance companies are relying on algorithms to make decisions to deny patient care.
58:15Stop denials with AI!
58:17How many people have to die?
58:19The anger and the upset is real.
58:22Because there is a human right!
58:25To discover more about AI and how it can shape our future,
58:29go to connect.open.ac.uk forward slash AI with Hannah Frye
58:36or scan the QR code on the screen now.