00:05Good morning, New Yorkers. Another beautiful day in the city.
00:12The Marriott Hotel, Midtown Manhattan.
00:18The CEO of the largest health insurer in America sets out for a meeting of investors.
00:28But as he reaches the venue, a masked man approaches from behind.
00:35Breaking news. We're just hearing that the CEO of United Healthcare, which is the largest healthcare company in the country,
00:42was shot and killed right here in New York.
00:45Brian Thompson was shot just before 7 a.m. Police believe the shooting was a targeted killing.
00:50The shooting of the father of two led to a nationwide manhunt. Five days later, a suspect was arrested.
00:5926-year-old Luigi Mangione became a controversial media sensation.
01:06Luigi Mangione, who has also been dubbed America's hot assassin.
01:11Instead of denouncing the killing, some seem to cheer it on.
01:15We, the people, will be free!
01:19Furious about the U.S. health insurance industry.
01:23But what few people realize is that this is a story about artificial intelligence.
01:30Stop denials with AI!
01:32How many people have to die?
01:34And the growing use of AI in modern healthcare.
01:40They're using AI to maximize the profits off of their clients.
01:47What happens when life and death decisions are no longer made by doctors, but by machines?
01:57Artificial intelligence. A machine beyond the mind of man.
02:01For decades, scientists have dreamed of creating incredible machines that could talk like us, learn like us, think like us.
02:13But what we didn't imagine is the impact they would have on us.
02:19In this series, I'm exploring what happens when AI collides with human lives,
02:25unearthing stories far stranger than we could ever have imagined.
02:55I'm Professor Hannah Fry, and for years I've been interested in how AI could transform healthcare.
03:01What I couldn't predict is that it would take me to the scene of a murder on the streets of
03:07Manhattan.
03:08Hey, Hannah.
03:09Hi.
03:10How you doing?
03:11Good, nice to meet you.
03:12I'm Hannah, lovely to meet you.
03:14Hannah Parry is a journalist for Newsweek, who lives in New York and was on duty the day Brian Thompson
03:20was shot.
03:21When did you first hear about it?
03:23Well, we first heard the reports of a shooting up in Midtown, but it quickly came through that it was
03:29something more than that.
03:31We were able to see that it had been a targeted attack.
03:35Hi, I'm Brian Thompson, CEO of United Healthcare, and welcome to the attendees of Reuters Total Health.
03:42Our mission and values…
03:43This polished corporate video shows Brian Thompson addressing a conference, 18 months after he became CEO of United Healthcare, America's
03:53largest insurer, with over 50 million customers.
03:57On the morning of his death, he was on his way to the company's annual investor meeting on West 54th
04:04Street.
04:04This is the entrance to the hotel where the conference is being held, and that's a surveillance camera that captures
04:10the shocking footage.
04:14There were multiple bullet casings, and that kind of became one of the key parts of the case.
04:20This suspect had inscribed three words on them, which were the words delay, deny, depose.
04:26Those words are extremely similar to a very well-known critique of the insurance industry.
03:35Rather than approve claims, they would rather delay, deny, defend.
04:37It appeared clear that the suspect had something to say about the insurance industry.
04:43In the months prior to the shooting, reports were circulating about United Healthcare's use of AI in deciding when to
04:51pay out to patients and when to deny claims.
04:57A new investigation alleges one healthcare giant may have been giving the algorithms too much power.
05:04United Health pressured its medical staff to cut off payments in lockstep with a computer algorithm's calculations.
05:13This news report's from 2023, which is a year before Brian Thompson was shot.
05:19I think if you don't live in America, you've probably never even heard of United Healthcare until this Luigi story
05:25surfaced.
05:26But there is a lot going on with them.
05:30And at the heart of it all are algorithms and artificial intelligence.
05:39The potential for AI to transform healthcare for the better is enormous.
05:45And some of it is already here.
05:47A step toward medical superintelligence.
05:52That's what the CEO of Microsoft is calling the company's new artificial intelligence tool.
05:56From accelerating drug development.
05:59A revolution in drug discovery could cure cancer in the next 50 years by bringing new approaches to once impossible
06:07problems.
06:08To reading scans and detecting diseases.
06:11See that little square?
06:13Yeah.
06:13It finds a very subtle polyp that maybe you would have missed.
06:16And assisting doctors during complex operations.
06:20A 3D map of a patient's pelvis is generated from a CAT scan to help guide the surgeon in real
06:26time.
06:26In the UK, AI is already being used in the NHS to detect the risk of leukemia and to identify
06:34early signs of lung cancer.
06:36For our overstretched health service, this technology has huge potential to boost efficiency and bring down costs.
06:43The revolution in artificial intelligence offers a golden opportunity to deliver better care at better value.
06:51And the NHS will usher in a new age of medicine.
06:55Leapfrogging disease.
06:56So we are predicting and preventing it rather than just diagnosing and treating.
07:06In the US, healthcare is a huge business.
07:10I wanted to find out more about the claims that United Healthcare had been using AI to make crucial
07:16life and death decisions.
07:19So I went to meet a doctor working in the oldest homeless shelter in Los Angeles.
07:25Okay, so let me take a look.
07:26Mary Marfasi is a medical director at the Union Rescue Mission.
07:30And then, tell me how you are.
07:33Fine, please.
07:33Okay.
07:35Mary, hi.
07:37Hi.
07:38Hi, nice to meet you.
07:39It's lovely to meet you.
07:40How are you doing?
07:41I want to give you a little tour.
07:43Mary has spent her career providing medical care to the city's most vulnerable.
07:48But in 2023, when her family needed help, she believes they found themselves at the mercy of AI.
07:55Mary, is that your hat?
07:56It's my husband's hat.
07:58Oh, that's lovely.
07:59I thought I'd bring it with me today.
08:03Like, isn't that one of the most handsome older white guys?
08:06Oh, gee.
08:06He looks so English.
08:07He was Welsh.
08:08He was American Welsh.
08:10The story with your husband.
08:12When did this all start?
08:14Going back about three years ago.
08:16I noticed he was having a lot of balance problems, just falling too many times.
08:20And this was a very athletic man.
08:22And then finally, one fall that was so bad, fractured his nose and his face was filled with blood and
08:29had some cranial fractures with it.
08:31Got him in the hospital and finally got a diagnosis, which was accumulation of fluid on the brain.
08:36Oh, my gosh.
08:37Right?
08:38Oh, wow.
08:39That's a proper fall.
08:41Yeah, it was rough.
08:42So what happened?
08:43So we stayed in the hospital and then he got into a rehabilitation center.
08:48And then within just a couple of weeks, I was told by the center, okay, his time is up.
08:54He's no longer in need of services.
08:56And I said, what?
08:57He can't even brush his teeth.
08:58He can't go to the bathroom on his own.
09:00And I thought, something's not right here.
09:02So we had him home and then again, another fall within just a few months.
09:07And the same thing, I was told by the center, okay, his time is up.
09:09And anytime I would complain about it, I'd hear, oh, well, at his age.
09:14And I got tired of hearing that preamble.
09:17So when he was discharged, I mean, if he's not in a fit state to walk out of the hospital,
09:22what happened?
09:23Wheelchair. Just wheelchair to the car.
09:26Staff takes him out, loads him in my car, and I drive away.
09:30And how'd you get him in the house to the other end?
09:32It was a struggle.
09:34One time he fell back on me.
09:35I dislocated my right shoulder.
09:37Because this is just not a man who's strong enough to be back home.
09:39Right.
09:40That wasn't good.
09:42And that's when I started calling UnitedHealthcare, saying, who's making these decisions?
09:46And the response I would get is clinical team.
09:50Right.
09:51And I started to say things like, well, who runs the clinical team?
09:55And then I would just get the runaround.
09:57So I called the ombudsman.
10:00And that's when I was told AI is involved.
10:02Okay.
10:04After each fall, he wasn't getting enough physical therapy or occupational therapy.
10:09You just get weaker and weaker over time?
10:11Yeah.
10:12Yeah.
10:15You think he didn't need to die when he did?
10:18I don't think so.
10:20He had goals to live much longer.
10:23I really thought he could get back to some of his function.
10:27Had he been given the care he needed?
10:29Yes, I think so.
10:32How angry are you about all of this?
10:37It's more I miss him more than anger.
10:39I don't want it to happen to anybody else.
10:47All right.
10:48Thanks for coming.
10:48Thank you so much.
10:49Because it meant so much to have you.
10:51All right.
10:51Stay in touch.
10:52I'll send you some things.
10:58There's a kind of irony to this story.
11:01You have somebody who spent her entire life advocating for people's health,
11:06trying to make sure they get the care that they need.
11:08When she needed it, when her family needed it,
11:12despite having paid for it, it wasn't available.
11:17Frank and Mary's case was not an isolated one.
11:21By early 2023, the story was starting to get out,
11:25alleging the widespread use of AI algorithms by big insurers,
11:31including UnitedHealthcare, to deny elderly patients care.
11:36Later that year, there was a Senate inquiry.
11:39The reason we're here today is that all too often the big insurance companies
11:44have been failing seniors when they need care.
11:47And perhaps most troubling of all, there is growing evidence
11:51that insurance companies are relying on algorithms,
11:54rather than doctors or other clinicians,
11:56to make decisions to deny patient care.
12:02Two months after the Senate committee published its report,
12:06Brian Thompson was shot.
12:09Let's head to New York because police there are continuing their hunt
12:12for the gunman who killed the boss of one of the biggest companies in the world.
12:16The shooter took off down an alleyway around 55th Street
12:19and is currently still at large.
12:21In the days after the shooting,
12:23the hunt for Brian Thompson's killer gripped the nation.
12:27Police are looking for a suspect described as a man about 6'1",
12:31wearing all black.
12:32Police drones, helicopters and thousands of CCTV cameras
12:36are combing the city street by street.
12:38On Thursday, detectives shared new images of the man they want to question.
12:42The image, caught on surveillance camera,
12:45shows him standing at the check-in desk at a hostel.
12:48The man appears relaxed and smiling.
12:51Just a few minutes later, he shot Mr. Thompson dead
12:54before making his escape.
12:57Then, five days after the killing,
12:59police received a tip-off from an employee
13:02at a McDonald's in Pennsylvania.
13:05Um, what's your name?
13:07Uh, Mark.
13:08What is it?
13:09Mark.
13:09Mark?
13:10Yes, sir.
13:10Mark what?
Rosario.
Rosario?
13:12Someone called.
13:13Right, they thought you were suspicious.
13:15Oh, I'm sorry.
13:16Um...
13:16Should I be on you?
13:17Uh, yes, sir.
13:18Thanks.
13:22The alleged killer has been identified as Luigi Mangione.
13:27As Luigi was led into court after his arrest,
13:30it was clear he had something to say.
13:33Dog, dog.
13:36Oh!
13:38It's completely out of touch.
13:40It's an insult to the intelligence of the American people
13:43and their lived experience.
13:47Luigi Mangione, the man accused of killing
13:49the US healthcare insurance chief executive,
13:52Brian Thompson, appeared in New York
13:54to face 11 state criminal counts
13:56that could lead to a death penalty sentence.
13:59Then, something extraordinary happened.
14:02Free Luigi!
14:04Free Luigi!
14:07Almost immediately,
14:09protesters flocked to his court appearances,
14:12hailing him as a folk hero.
14:15I'm here because I support universal healthcare for all,
14:18and I'm here because I believe that Luigi Mangione's
14:21civil rights are being violated.
14:23So what brought you here today?
14:25Um, this case is a case about humanity.
14:27I think everybody in the crowd identifies
14:29with the extortionist nature of our American healthcare.
14:32They implemented like an AI bot to review the claims.
14:35Some of these people had life-threatening illnesses
14:37that were rejected for efficiency.
14:39What's your feeling in general about AI?
14:42I mean, um, technology is going to find its way into everything.
14:45So to stand against it is, you become a fossil.
14:49But we have got to continue to invest into the human components of things
14:53because as we saw, you can't just automate everything, right?
14:55Somebody has to be looking and watching.
14:57People are dying!
14:59People are dying!
15:02People are dying!
15:02People are dying!
15:03I myself have had a terrible experience with American healthcare
15:07despite having one of the better plans.
15:09Both my mom and I have had to battle and fight tooth and nail AI claim denials.
15:16It's really insidious the way that AI is infecting every aspect of our society,
15:20but specifically something so important and crucial such as healthcare
15:24that already shouldn't be for profit.
15:27They're using AI to maximize the profits that they can make off of their clients.
15:35Stop denials with AI!
15:37How many people have to die?
15:39In this case, like, incredible strength of feeling from the protesters.
15:44You can understand, too.
15:45You know, if you've been personally affected by this,
15:47or if someone in your family has,
15:49like, I understand why this is such an emotional moment.
15:53But, you know, I also think that, like, someone was murdered here.
15:58You know, someone with a family, with children.
16:01And I think there's a bit of mental gymnastics going on with these protesters
16:05where they sort of conveniently manage to forget that fact.
16:15In situations like these where emotions are high,
16:19it's sometimes easy to turn on technology,
16:22especially when it feels like it's something we don't fully understand
16:25and can't easily control.
16:29AI didn't create the perceived problems with US healthcare,
16:33but it has supercharged them.
16:36And to understand how, it helps to know what an AI algorithm actually is.
16:43I think the words algorithm and AI get thrown around quite a lot.
16:46It gets quite confusing.
16:47So I thought I would explain the difference between the two
16:50in the most New York way possible
16:52by imagining that you are opening up a new hot dog set.
16:56Now, you've got a couple of options here.
16:58You could use an algorithm.
17:01An algorithm is a series of steps for completing a task.
17:06In the case of our hot dog stand, the task is to sell hot dogs.
17:13The inputs are the sausages, the onions.
17:17The algorithm is the instructions.
17:20Cook 100 hot dogs a day, sell them for $4 each,
17:24stay open till 11 on Friday.
17:27And the output is, hopefully, a tidy profit.
17:33With this traditional type of algorithm,
17:36everything is spelled out in advance.
17:37It means it's very precise, it's very reliable,
17:40but it's also completely inflexible.
17:43Now imagine that you've got a stand that is run by an AI algorithm.
17:48So this time, you don't tell it what to do.
17:50You just give it the inputs, the buns, the sausages, the onions,
17:53and you say,
17:54the only thing I care about is how much money you make.
17:58Now, at first, the AI is just going to watch.
18:02Just going to collect data,
18:03it's going to hunt for patterns,
18:05like when people are buying the most,
18:07where has the most footfall,
18:09how that changes with the weather.
18:11And then, after a while,
18:13it's going to start suggesting things
18:14that you hadn't even thought of,
18:16like pitch up outside a dog park on Sundays,
18:19or start selling vegetarian sausages,
18:21or people tip more when the cart smells like onions.
18:25But the thing is,
18:26you didn't tell the AI any of those rules,
18:28it discovered them for itself.
18:31AI algorithms are capable of crunching huge quantities of data
18:36and analysing complex patterns of behaviour,
18:39all in order to make your hot dog stand profitable.
18:46And that is the really key difference here.
18:49For an AI algorithm,
18:50you don't need to spell out every possible scenario in advance.
18:54You just define the goal
18:56and let the AI learn for itself.
19:01Just because an algorithm uses AI
19:03doesn't mean it's necessarily better,
19:06but it will ruthlessly pursue whatever goal it's been set.
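The hot-dog-stand distinction above can be sketched in code. This is a hypothetical illustration only (the function names, prices, and data are invented, not from the programme): the first stand follows rules spelled out in advance, while the second is given only past observations and a goal, and a tiny stand-in for a learned model picks whatever worked best.

```python
# A traditional algorithm: every rule is spelled out in advance.
def classic_stand(day: str) -> dict:
    price = 4.00                                 # sell hot dogs for $4 each
    dogs = 100                                   # cook 100 a day
    closing = 23 if day == "Friday" else 21      # stay open till 11 on Fridays
    return {"price": price, "dogs": dogs, "closing_hour": closing}

# An "AI" stand: no rules given, just data and a goal (profit).
# A one-line learned rule-of-thumb stands in for a real model here:
# from past (price_charged, profit_made) observations, keep the price
# that historically earned the most.
def learned_stand(history: list[tuple[float, float]]) -> float:
    best_price, _ = max(history, key=lambda pair: pair[1])
    return best_price

print(classic_stand("Friday")["closing_hour"])                     # -> 23
print(learned_stand([(3.5, 120.0), (4.0, 150.0), (4.5, 140.0)]))   # -> 4.0
```

The point of the contrast: nothing in `learned_stand` says "charge $4"; that rule emerges from the data, which is exactly why such systems can surprise the people who deploy them.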
19:13I wanted to find out more about the specific algorithm
19:16being used by United Healthcare.
19:19So, I tracked down the two investigative journalists
19:22who were the first to uncover the story.
19:25Hey.
19:26Hey, how are you?
19:27How are you doing? Good to meet you.
19:28I'm Hannah.
19:29Nice to meet you.
19:30Lovely to meet you.
19:30Hey, Bob.
19:31Lovely to meet you, Bob. How are you doing?
19:32Good.
19:33Casey Ross and Bob Herman
19:35were nominated for a Pulitzer Prize for their investigation.
19:39How big has the story been for you
19:41in terms of your journalistic career?
19:42I think this is probably the biggest story that I've done
19:45in terms of the impact that it's had.
19:47I think that the reaction that we got
19:49and some of the fallout in terms of lawsuits in the U.S. Senate
19:52validate that this was a story that had an extraordinary impact.
19:57What was the first spark of the story?
19:59This person in the nursing home industry sent me an email,
20:02and it was just this visceral reaction that,
20:05hey, the health insurers are issuing a lot of denials
20:08and they're not telling us why.
20:10The people that were in this facility were getting removed
20:14when they were still very sick and were not ready to go home.
20:16And so it was just a signal that, okay,
20:18well, maybe we should ask some more questions about this.
20:20And what we found was that this all centers around an algorithm
20:23called NH Predict, and that algorithm is used on behalf
20:27of insurance companies to reduce the amount of time
20:30that people are in these facilities and to control their cost of care.
20:34How much sight do we have of the actual algorithm
20:37that's going on underneath this?
20:39Yeah, so there are a bunch of pieces of data
20:41that they are feeding into a model.
20:43The age of the person, what was their primary diagnosis,
20:46what other illnesses do they have?
20:48It compares that patient to other patients like them
20:52in a database of six million patients.
20:55And based on this comparison, this is the amount of care you should get.
20:59Once that prediction is made, basically a date is circled on the calendar.
21:03And this is the date that they're trained to push these people toward.
21:07This is the report that's produced by the algorithm.
21:10And it all boils down to this prediction,
21:12which is that the estimated length of stay of the patient,
21:16in this case, is 16.6 days.
21:18Point six?
21:19Yes.
21:20What, so it's like to the hour?
21:21It predicts it down to the decimal point.
21:23You can't predict when somebody's going to be fine to the hour.
21:26No, and it's the type of information
21:28that only an AI algorithm could give you.
21:30It doesn't take into account a lot of things about these people.
21:36Every healthcare journey is different.
21:39Every injury, every recovery,
21:40everything that comes up in the course of your care
21:43causes all sorts of things to happen that can't possibly be predicted.
21:47Literally, they're boiling down these people to numbers.
21:50Oh, you're not just a number.
21:51Yes, you are.
21:52And there it is.
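The kind of prediction described above, comparing a patient's features to similar past patients and reading off an estimated length of stay to one decimal place, can be illustrated with a toy nearest-neighbours sketch. NH Predict's actual model is proprietary; every feature, weight, and number below is invented for illustration.

```python
from math import dist

# Invented historical records: (age, diagnosis_code, n_comorbidities) -> days stayed
past_patients = [
    ((78, 3, 2), 14.0),
    ((81, 3, 3), 19.0),
    ((75, 3, 2), 17.0),
]

def estimate_stay(patient: tuple, k: int = 2) -> float:
    # Rank past patients by similarity (Euclidean distance over features),
    # then average the stays of the k most similar ones.
    ranked = sorted(past_patients, key=lambda p: dist(p[0], patient))
    nearest = [days for _, days in ranked[:k]]
    return round(sum(nearest) / k, 1)

print(estimate_stay((79, 3, 2)))   # -> 16.5, a decimal-point "date on the calendar"
```

Even this toy version shows the problem raised in the interview: the output looks precise to the hour, but it is only an average over people who resembled the patient on a handful of recorded features.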
21:54There is a counterargument to all of this, right?
21:56In that people are quite accustomed to artificial intelligence
21:59that compares you to other people.
22:02Like, you know, the recommendation algorithms that you get on Netflix
22:05or Spotify or Amazon.
22:06It's like, people like you did this.
22:09Therefore, do you want this?
22:11Yeah, and I think the common experience, or at least my experience,
22:15is that most of the time I think the algorithm is wrong.
22:18It's way off.
22:18Just because I watch this cooking show
22:21does not mean I want to watch this other show.
22:23But if you're talking about me in the hospital
22:25or after a serious injury and it gets that wrong,
22:28well, I have a much bigger problem with that.
22:32Casey and Bob's damning reporting concluded in 2024,
22:37throwing UnitedHealthcare into the spotlight,
22:41just months before Brian Thompson was shot dead.
22:46Can you remember when you first heard about Luigi Mangione?
22:49I mean, I remember the morning that it happened,
22:51we started getting texts from people,
22:53and we're like, holy crap, like, is this actually for real?
22:55We don't have any indication that this individual
22:58was inspired by our reporting.
23:01But nonetheless, as a reporter, you're stunned.
23:04You're shocked.
23:04You're like, you don't ever want any of your reporting
23:07to inspire somebody to act that way.
23:20Luigi Mangione is not someone you would expect
23:23to take on the role of outlaw vigilante.
23:30He studied at the prestigious University of Pennsylvania,
23:33one of America's Ivy League.
23:37Luigi Nicholas Mangione.
23:40Luigi.
23:42You can see in this video of his high school graduation
23:45that Luigi already stood out.
23:47He's giving the valedictorian speech,
23:51which means he's being celebrated
23:53as the most academically successful student in his year.
23:57Family, friends, faculty, and fellow students.
24:01Good afternoon.
24:03He was clearly a popular, confident young man.
24:11We spoke to a few of his classmates who didn't want to go on the record
24:14because they have been continually hounded by the press ever since,
24:17which you can understand.
24:19And everyone, almost to a point, say that Luigi was, like, smart,
24:23he was helpful, he was kind, he was friendly,
24:25he was a popular guy from a good family,
24:28who was also really well-educated.
24:31But then I discovered a remarkable detail about Luigi,
24:36one that's been all but overlooked.
24:39He did computer and information science as his major,
24:43but then he also then got a master's degree from here
24:48in computer science
24:50with a concentration in artificial intelligence.
24:54One of the modules that he was taking in here
24:57was exactly the stuff that sits behind this algorithm.
25:01It's called Data Structures and Algorithms.
25:03A very nerdy, very mathematical
25:06course about data structures and computer science.
25:09Like, it's tough.
25:10And the thing is, Luigi wasn't just doing well in this course,
25:13he was doing so well
25:14that he was appointed as an assistant tutor.
25:18Look at this photo.
25:20I mean, he looks so confident.
25:22He was basically teaching other students
25:25who were the same age as him.
25:27This is stuff that he knew incredibly well
25:30and, by the sounds of it,
25:33was also very, very good at.
25:38Most of Luigi's friends have so far refused to go on the record.
25:43But back in the UK,
25:44I tracked down one person who was willing to talk.
25:49Gurwinder Bhogal is a British blogger
25:51who writes about the impact of technology on society.
25:55Hi. Hi, Hannah. How are you doing?
25:57Nice to meet you. Lovely to meet you.
25:59Thank you for that.
26:01Luigi was a subscriber to Gurwinder's blog
26:05and the two of them talked about their shared concerns.
26:10When I spoke to Luigi, he was in Japan
26:12and he was remarking on how sort of lonely it felt in a lot of places.
26:18The streets were empty because everything's just automated.
26:21He was quite concerned about mass automation
26:24and the knock-on effects that this might be having on society.
26:28You can live your entire life without leaving your house.
26:31You know, you can do your shopping online.
26:33You can do your banking online.
26:34You can do your dating online.
26:36You can even have a full relationship online.
26:40There's less human connection
26:41and AI is going to make this a lot worse.
26:44I mean, he himself probably had no problem making friends.
26:47He was a very charismatic individual.
26:48But I think he was worried about other people
26:51and maybe how they were kind of gradually being lost.
26:56The connections that we have, they keep society together.
26:59And he was worried that all of that is unravelling.
27:02Did you talk about healthcare as well?
27:04There was only one brief exchange that we had.
27:08Luigi made a passing remark about how I was lucky
27:11because we had the NHS in the UK, you know, free healthcare.
27:14How did he seem to you?
27:17He actually seemed quite cheerful.
27:20Someone who did not seem particularly pessimistic,
27:24although some of the issues that he raised were quite pessimistic.
27:28But his general demeanour was actually the opposite of that.
27:31He said that he wanted me to focus less on the problems
27:34and more on the solutions.
27:35He would ask me, you know,
27:36so what's the practical takeaway of this?
27:38He wasn't just interested in moaning.
27:41He wanted to find real solutions.
27:47Since his arrest, the Luigi Mangione story
27:50had taken on a life of its own.
27:53Luigi Mangione.
27:55The internet's favourite hot Italian sausage.
27:58And social media had exploded with memes about the alleged killer.
28:02Luigi, you're a brave Italian stallion
28:06whose actions ignited a national dialogue
28:09about the USA's crappy healthcare system.
28:13If you feel the same way about me as I feel about you,
28:17please do not deny, delay or defend your love.
28:21There was also no shortage of theories about what happened and why.
28:26There is a note which was supposedly found in his backpack
28:30when he was arrested.
28:31And the press have decided to call this his manifesto.
28:34In the document, he appeared to talk about the planning
28:37of the alleged crime, but then moved on to health insurance.
28:42There's one bit here that says,
28:43frankly, these parasites had it coming.
28:46Then he specifically name-checks United.
28:49And then he says,
28:51these, indecipherable, have simply gotten too powerful
28:55and they continue to abuse our country for immense profit
28:59because the American public has allowed them to get away with it.
29:04The other thing that the internet is obsessed with
29:07is his own personal story.
29:09It's how he ended up feeling so strongly about this particular issue.
29:14I've got his X profile.
29:17In the middle of his banner, he's got this X-ray.
29:22And, I mean, we know that he had a spinal problem.
29:24We know that he had surgery for it.
29:25And that's...
29:26I mean, that's pretty extreme, that surgery.
29:29But what is interesting is that there's no evidence
29:33that he was actually a client of United Healthcare.
29:36And so far, nobody's found anything to prove
29:39that he had a personal insurance claim denied, even.
29:45As humans, we understand being let down by other humans.
29:50Real doctors often don't have all the answers,
29:53but we forgive them for their very human flaws.
29:58But when it comes to technology,
30:00we have very little tolerance for error.
30:04The flip side is also true.
30:07When AI promises extraordinary medical possibilities,
30:11it's all too easy for us to believe in the impossible.
30:24Just a few miles across the state line,
30:27I've been invited to meet a young tech entrepreneur
30:29who is selling a seemingly incredible new medical breakthrough.
30:33Hey!
30:34Hey!
30:35Using AI.
30:36How are you?
30:37Yeah, very good.
30:3825-year-old Kian Sadeghi runs a tech startup
30:41called Nucleus Genomics.
30:44Are you sequencing the entire genome?
30:46The entire thing.
30:47The whole thing?
30:48The whole thing.
30:51His company uses AI algorithms to analyze the DNA
30:54of his customers' future children like never before.
30:59These are nitrogen tanks, are they?
31:01Yes.
31:03How many embryos do you reckon there are in here?
31:05Like, millions?
31:05If all the samples here were embryos,
31:08yes, there would be millions, yes.
31:10Millions of potential new humans?
31:12Yeah.
31:14Kian uses AI to map each embryo's DNA,
31:18comparing it against huge DNA databases
31:21to try to predict a baby's future risk of disease.
31:25Some diseases can be predicted with certainty.
31:28For others, he can say how people with similar DNA turned out.
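The comparison described above, estimating risk from how people with similar DNA turned out, is usually done with a polygenic score: small statistical effect sizes summed across many genetic variants. This toy sketch uses invented variant names and weights purely to show the mechanics; real scores span thousands of variants and carry wide uncertainty.

```python
# Invented per-variant effect weights (a real score has thousands,
# estimated from large population studies, each with error bars).
effect_weights = {"rsA": 0.12, "rsB": -0.05, "rsC": 0.30}

def risk_score(genotype: dict) -> float:
    # genotype: variant -> copies of the risk allele carried (0, 1 or 2)
    return sum(effect_weights[v] * n for v, n in genotype.items())

embryos = {
    "embryo_1": {"rsA": 2, "rsB": 0, "rsC": 1},   # score 0.54
    "embryo_2": {"rsA": 0, "rsB": 2, "rsC": 0},   # score -0.10
}
ranked = sorted(embryos, key=lambda e: risk_score(embryos[e]))
print(ranked[0])   # -> embryo_2, the lowest toy score
```

The score is a population-level statistical tendency, not a diagnosis, which is why, as the programme notes, such predictions are a best guess about a future that environment and upbringing will also shape.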
31:34When I talk to a couple, they say,
31:36my grandfather had Alzheimer's,
31:38I want to do anything I can to make sure
31:39my son doesn't have Alzheimer's.
31:41And then I think, well, genetics obviously can help with that.
31:44And so I think more people are going to use IVF,
31:46and they're going to be using
31:48genetic optimization technology
31:49to basically pick their child.
31:52Figuring out whether a baby will develop Alzheimer's
31:55later in life is a best guess, not a diagnosis.
31:59But the company goes further,
32:02helping parents choose an embryo
32:04based on eye color, height, or even IQ.
32:09Welcome to the Nucleus office.
32:11Kian had already raised $32 million from investors.
32:16By the way, have you ever seen how many tabs I have
32:18on the right side of the screen?
32:19I don't think you should ever show that on the camera.
32:21It is actually so insane.
32:23I think it also slows down.
32:24I think it uses it for your RAM.
32:26Anyways, let me show this.
32:28He just launched a glossy new ad campaign.
32:31I think it is.
32:31Okay, let's go.
32:33Nucleus embryo is for couples doing IVF
32:36to uncover the full genetic profile of each embryo
32:40in one intuitive platform.
32:42Every parent deserves the power to decide what possibility feels right for their family.
32:51Some people don't think you should have this choice.
32:54But it's not their choice to make.
32:58It's yours.
33:00You are not afraid of controversy, are you?
33:04No, no, no, no.
33:05It's not...
33:06You know, if you were doing IVF and you had five embryos,
33:10would you want to pick your future baby randomly?
33:13Or would you ask the doctor for more information on each,
33:16especially if you have a family history of Alzheimer's, of cancer?
33:19It's my right to know this information.
33:21It's my choice to say I want a baby with lower disease risk,
33:23I want a baby that's slightly taller,
33:25or even with a specific eye color, et cetera.
33:27It's their right, it's their choice.
33:28I think what you're doing,
33:29where people have genetic predispositions for particular diseases,
33:33especially when they're preventable or treatable,
33:35I think it's amazing.
33:36What I don't understand is why you would include something like IQ and eye color.
33:41Yeah.
33:41Because it's so controversial.
33:43And it's like, it's a little bit eugenics-y, you know?
33:46You are implicitly saying that taller is better.
33:49You are reinforcing the ideas of preferences
33:53that some people are more valuable than others.
33:56No, not at all.
33:58I think parents have the right to choose across their embryos, right,
34:03if they want a baby with a lower disease risk, for example,
34:06or if they want a baby that's shorter or taller.
34:08That's absolutely their right to choose.
34:10This is designer babies, though.
34:11I mean, bluntly, this is designer babies.
34:13No, no, it's not designer babies at all, actually.
34:15How is it not designer babies?
34:16It's not designer babies at all.
34:17If a parent wants to give their child the best start in life,
34:19that is like a parent doing the most basic, in my mind,
34:24a human thing.
34:25If a parent, if the moment a child is born,
34:28they will run multiple tests on a baby
34:31to make sure the baby is healthy.
34:32They'll give it vaccines, for example,
34:33to make sure they don't get diseases.
34:35This is just another tool in the toolkit
34:37that helps parents do that.
34:40But there was something else I was worried about.
34:43Even the best scientists don't fully understand
34:46how genes and environment combine to make us who we are.
34:50I was worried that all of this complexity
34:53was being overlooked in favour of tantalisingly simple AI predictions.
34:58Do you think that the technology is mature enough, accurate enough, capable enough
35:08to be giving people this illusion that they have control?
35:12The platform, I think, does an excellent job showing the uncertainty
35:15and showing the fact that, again, DNA is not destiny
35:19and DNA will never be destiny.
35:21People can have certain just genetic dispositions,
35:23but there's the whole thing called life,
35:25which is there's environment, there's how you're nurtured,
35:27how you're raised, nutrition, etc.
35:29We cannot possibly reduce human life to just a DNA strand.
35:40Testing for physical characteristics is banned in the UK.
35:44But there's nothing to stop British couples travelling to the US for it,
35:48as long as they're willing to pay the approximate $40,000 price tag.
35:57Hi.
35:58How are you doing?
35:59Hello, welcome.
36:02Dragos and Laura live in London
36:04and were at the start of their IVF journey with Kian's company.
36:09These are beautiful.
36:10Are these your babies?
36:12Yes, our two children, two boys.
36:15So your two boys, did you have them naturally?
36:18Yes, we did.
36:19Yes.
36:19So why not naturally for the third?
36:22Because Dragos was kind of afraid we are...
36:25I'm 40, he's 43.
36:27When you're over 43,
36:29it was clear that the chances of having issues are very high.
36:33If this wasn't possible, would you still want to have another baby?
36:38Definitely not.
36:39Not?
36:40Not.
36:40Because you don't want to take chances.
36:43Everybody knows how important a healthy child is,
36:45and I think everybody should do the extra mile for that outcome.
36:50Where are you at now, then, in the process?
36:54In less than two months, we are going to New York to start the IVF process.
37:00And then they will test the egg, they will take a few cells,
37:03they will zoom in on anything that can happen,
37:06and then once we have selected the right embryo,
37:09then we will go back to New York for the implantation,
37:11which takes only one or two days.
37:13So in theory, in a few months, you could be pregnant.
37:18Yes.
37:19Fingers crossed.
37:21Yeah.
37:21We both want a healthy child, a healthy embryo.
37:26We are also hoping she's going to be a girl.
37:31Is that the dream? A little girl?
37:33Yes.
37:34This was my dream all along, to have three babies.
37:38So if a girl would come, it would be the perfect picture for myself.
37:43When you picture your daughter, your potential daughter in your head,
37:47what does she look like?
37:52Beautiful.
37:55Brown eyes, brown hair, light skin, big lips, full lips.
38:03I don't know.
38:04So she'll be looking like you.
38:07Not like me.
38:09Hopefully not bald.
38:12With a beard.
38:14Some people worry about the Nucleus stuff,
38:16because it effectively prioritizes some characteristics of the baby over others.
38:23For example, IQ, eye color, height,
38:27is stuff that you can, in theory, at least give a probability to, right?
38:31You can measure.
38:31I would not go through this just for the height and the eye color.
38:36But if height is what we can find out now,
38:39maybe it's important, maybe not.
38:41But, you know, we'll take that, because it's on the table.
38:45They give you a probability of the IQ that he or she might have.
38:51But, yes.
38:52And that's going to be one you'll look at?
38:54Of course.
38:56Of course it will.
38:58Before Dragos and Laura started their IVF cycle,
39:00there was a first step.
39:03Getting their own DNA mapped by Nucleus.
39:06I don't know. Let's see.
39:08To see if they were carriers for any diseases they could pass on to their baby.
39:12The results of these genetic tests had just come in.
39:17The moment of truth.
39:21Feels like unraveling a present.
39:24You don't know exactly what's inside.
39:29So, our family summary.
39:31We have 2,154 rare diseases tested and one detected risk for children.
39:40Fuchs endothelial corneal dystrophy.
39:44Symptoms often begin at 50.
39:47So, imagine our child is born this year.
39:4950 years from now, I'm sure we'll have bionic eyes that we can replace,
39:55hardwired to our brain.
39:56So, I don't think this is serious enough for us to be worried about it.
40:01Of all of the things that could potentially be risky,
40:04Alzheimer's, Parkinson's, breast cancer...
40:06Yeah, this is very...
40:08Low level.
40:08Low level.
40:09Mild.
40:11Don't worry.
40:12You and Laura can still have healthy children.
40:17So, I think this is important.
40:19Mmm.
40:23I think if you're having a baby in your 40s,
40:26I can completely understand why you would want as much information as possible.
40:31If you're...prone to worry,
40:34why wouldn't you try and eliminate the risks where you possibly can?
40:41But at the same time, I'm worried that this technology,
40:46in this way,
40:48gives the illusion of control that you don't actually have.
40:53Gives the illusion of certainty and prediction of the future
40:59that doesn't really exist.
41:14Over in the US, Luigi's case had proceeded through the courts.
41:18Charged with murder in the first degree,
41:20killing as an act of terrorism and criminal possession of a weapon,
41:24he faced his plea hearing.
41:26Do you agree to this indictment, sir? Guilty or not guilty?
41:30Not guilty.
41:32Stoking further outrage on both sides.
41:35One struggle, one fight!
41:37Healthcare is a human right!
41:39We have very disturbed people who somehow think that eliminating a father
41:44who has two young children over some cause is somehow justified.
41:49Who are you? Want to make something?
41:51But on the other side of America, another legal case was quietly gaining its own momentum.
42:03A class action lawsuit against UnitedHealthcare's use of AI had been launched by one plucky public interest firm called Clarkson
42:14Law.
42:15They'd also launched suits against other big insurers for similar claims.
42:20Lawyer Glenn Dannas is leading the charge.
42:25We know for sure that there is an algorithm of some kind that is predicting how long you need in
42:32rehabilitation.
42:32And we also know that some people think that that wasn't long enough. Is that fair?
42:39Yes. I mean, from our perspective, that's a vast understatement.
42:43But that is, in fact, true. Yes.
42:45The length of stay value is consistently too low.
42:49And it seems highly unlikely that this is an accident because it's only ever in one direction.
42:56How does the legal case come in?
42:58One way it's illegal is because there are different states like California that require that it be a human making
43:04decisions.
43:05So that, by law, it cannot be delegated to an AI, an algorithm, anything other than a human medical professional
43:14exercising his or her professional knowledge and specialty.
43:22How many people are involved in this? How many plaintiffs are there?
43:25Because United has such a large market share in America, it's almost certainly in millions.
43:33If you eventually win this case, does that mean that they have to compensate everybody who held that type of
43:40insurance?
43:41What we want, whether it's by a settlement or by going to trial, is to have all the people who
43:46were denied what was owed to them to be paid for that and then to change these practices going forward.
43:55The Clarkson case rests on the evidence of former United Healthcare customers who have come forward to testify, and in
44:04particular, those who have survived to tell their tale.
44:08Hey.
44:09Bill, this is Hannah.
44:11Hi Hannah.
44:11Hi Hannah.
44:12Such a treat to meet you. How are you doing?
44:14Very good, thank you.
44:14Lovely to meet you.
44:16One of those is 86-year-old Bill Hull.
44:20How are you guys doing?
44:21Well, as well as can be expected, I guess.
44:25Your koi are quite the beasts.
44:28I chuck them in.
44:31So tell me, how are you doing now? How's your health at the moment?
44:34About 70% of my heart is shot.
44:39But the worst part has been the paralysis that the stroke caused me.
44:45I used to be very active.
44:47Like we did this thing, did all the bricking here.
44:51Can't do any of that anymore.
44:54In June 2023, he suffered a heart attack on his way to a medical appointment.
45:00There's a park bench, big long park bench, just outside the building.
45:04I just slumped over, apparently.
45:07Two medical technicians that knew CPR laid me down on the bench, and they broke all my ribs.
45:13And I understood later, if you don't break ribs, you ain't doing it right.
45:18I was in the hospital for 25 days.
45:21In intensive care?
45:23Almost all of it was intensive.
45:25I'm there, and I think at the 22nd or 23rd day, I got a notice that you're to be released.
45:32They said, well, we'll assign a case manager to you.
45:35She got a hold of me and said, I have talked with your doctors, and everyone recommends that you go
45:42into skilled nursing.
45:43A day later, she came back and said, well, I don't know.
45:46They said no.
45:48They would not approve it and wouldn't give me a reason.
45:52Clarkson Law argue that this denial was likely the result of UnitedHealthcare's PREDICT algorithm.
45:59They made you feel like you needed to get out, and yet they were giving you no place to go.
46:04And they said, where do you want to go?
46:05I said, I guess I'm going to go home.
46:07Okay.
46:08So we'll get you out of here.
46:10Stuck me in a car.
46:11What physical state were you in at this point?
46:14My wife was very worried because she didn't know if she could take care of me.
46:19She's 85 years old.
46:20She has macular degeneration, has a hard time seeing.
46:23She's using a walker too, like I am.
46:27Anyway, I get home.
46:29I was two and a half, three days out of the hospital.
46:32My daughter came over and I was in bad shape.
46:36I think slurring my words.
46:38I was starting to have a stroke.
46:40I could not move a finger.
46:42Couldn't move my arm.
46:43Couldn't move my foot.
46:44You feel yourself just literally dying.
46:48It's terrifying.
46:50Oh, worst thing that ever happened to me.
46:53Do you think that you would have had the stroke, regardless of where you were?
46:59Well, I think I'd have had the stroke.
47:01But I should have been in either a skilled nursing or in the hospital.
47:07And I would have gotten some attention several hours before.
47:12And what I ended up with would have been much less than what I ended up with.
47:20Bill is now looked after by his daughters, Lisa and Laura.
47:25What's your take on artificial intelligence being used in your dad's healthcare?
47:29Certainly the whole time he was in the hospital, we assumed it was doctors who worked for that hospital
47:37who were making these decisions and who were denying this care.
47:41And it wasn't until he found the class action lawsuit that he believed it was actually the AI system that
47:49was being utilized by United Healthcare.
47:51I guess if a human had been making that decision, you would have been able to have a conversation with
47:55them.
47:55You would think.
47:56If an algorithm says it, you can kind of wash your hands of it.
48:00I feel like that's what they're doing.
48:01If they can make AI do their dirty work, I think they're very happy to do that.
48:06How do you feel about the murder of Brian Thompson?
48:08I think it's indicative of how frustrated human beings can become with huge corporations like United Healthcare.
48:18But I 100% don't condone what happened and wouldn't condone any action like that in the future.
48:25But the anger and the upset is real and it's justified.
48:34The thing is, I'm not anti-AI, right? I'm not anti-algorithms.
48:40I think that there is an incredible amount of waste that happens when humans make decisions.
48:46I think there's incredible efficiencies and therefore better care that you can provide people when you carefully introduce these kind
48:54of systems.
48:56The problem is that once you start automating stuff, it all comes down to exactly what that algorithm was designed
49:03to care about.
49:04You know, was it designed to care about improving the outcomes for the patients in the long term?
49:12Or was it designed to minimise the amount of money that is spent on caring for the patient?
49:19And those are two things that are often at odds with one another.
49:30But the problems with algorithms go deeper.
49:37Here is one thing that you should know about AI.
49:40Sometimes people call it a black box.
49:44And there is a good reason for that.
49:45You have to imagine that at one end, you're putting in some information, your input.
49:50And at the other end, you get some results, your output.
49:53Now, the question is, what is going on in the middle, inside of the algorithm?
50:00Now, there is nothing magical going on here.
50:03There's no voodoo.
50:04It's just loads and loads and loads of calculations.
50:07But these things are so big and unwieldy that it becomes impossible to follow one thread from one end all
50:16the way through to the other.
50:17And that means that artificial intelligence is sometimes finding patterns that we cannot see, are unable to check and might
50:27not like.
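The "loads and loads of calculations" described here can be made concrete with a toy sketch. Everything below is invented for illustration (the layer sizes, the random weights, and the three made-up patient inputs); it bears no relation to any real insurer's model, but it shows why even a tiny network of weighted sums is hard to read from the inside:

```python
import math
import random

random.seed(0)

# A toy "black box": patient features go in one end, a score comes out
# the other, and in between there is nothing but layered arithmetic.
INPUTS, HIDDEN = 3, 4

# Random weights stand in for the millions a real model learns in training.
w1 = [[random.uniform(-1, 1) for _ in range(INPUTS)] for _ in range(HIDDEN)]
w2 = [random.uniform(-1, 1) for _ in range(HIDDEN)]

def layer(rows, values):
    # Each unit takes a weighted sum of every input and squashes it to 0..1,
    # so no single input maps cleanly onto any part of the output.
    return [1 / (1 + math.exp(-sum(w * v for w, v in zip(row, values))))
            for row in rows]

def predict(patient):
    hidden = layer(w1, patient)
    return 1 / (1 + math.exp(-sum(w * h for w, h in zip(w2, hidden))))

# Three hypothetical inputs (say age, mobility, history, scaled to 0..1).
score = predict([0.7, 0.2, 0.9])
print(round(score, 3))  # a single number between 0 and 1, with no visible "why"
```

Even this 16-weight toy resists tracing one thread from input to output; a production model has millions or billions of weights, which is what makes the box black.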
50:30This problem was highlighted in 2019 by a researcher called Ziad Obermeyer.
50:37He looked at an AI algorithm being used in hospitals to identify patients most in need of care and offer
50:45them help on a special program.
50:49But he discovered that the patients the algorithm selected were disproportionately white.
50:55These algorithms should have been a great use case for AI.
50:58But unfortunately, a design choice in building those algorithms made them biased.
51:04Obermeyer couldn't see inside the AI, so he worked backwards from the results and found that the algorithm had used
51:12a shortcut.
51:13It wasn't finding the sickest patients.
51:15It was finding the ones who'd had the most money spent on their care.
51:21Black patients have less money spent on them by our healthcare system today because of barriers to access and because
51:27of discrimination.
51:29And that means that the AI saw that fact clearly.
51:32It predicted the cost accurately.
51:34But instead of undoing that inequality, it reinforced it and enshrined it in policy.
51:40Even though the objective of the algorithm was good, the outcome led to discrimination.
51:46And until Obermeyer, no human had been there to spot the difference.
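The design choice Obermeyer uncovered, training on cost as a stand-in for health, can be sketched in a few lines. The two patients and dollar figures below are invented purely to illustrate the mechanism:

```python
# Two hypothetical patients who are equally sick, but on whom the healthcare
# system has historically spent different amounts (figures invented).
patients = [
    {"id": "patient_1", "chronic_conditions": 4, "past_spend_usd": 12_000},
    {"id": "patient_2", "chronic_conditions": 4, "past_spend_usd": 6_000},
]

def rank_by_cost_proxy(pts):
    # The deployed algorithm's shortcut: treat past spending
    # as a stand-in for how much care someone needs.
    return sorted(pts, key=lambda p: p["past_spend_usd"], reverse=True)

def rank_by_health_need(pts):
    # The alternative: predict health directly (here crudely
    # proxied by a count of chronic conditions).
    return sorted(pts, key=lambda p: p["chronic_conditions"], reverse=True)

# The cost proxy puts the higher-spend patient first for the care programme,
# even though both are equally sick: an accurate cost prediction
# that reproduces the historical inequality.
print(rank_by_cost_proxy(patients)[0]["id"])  # patient_1
```

The cost model is not "wrong" on its own terms: it predicts spending correctly. The harm comes from the gap between the objective it was given and the objective that was intended.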
51:58We may never be able to see inside the algorithm used by United Healthcare to assess claims.
52:05The closest we can get is by talking to those who worked alongside it.
52:11And until now, very few company insiders have ever spoken on the record about this AI.
52:16But one former employee had agreed to meet me.
52:21Lovely to meet you.
52:23Would you like something to drink?
52:24Thank you so much.
52:26Look at this, ice and everything.
52:28Amber Lynch was a care coordinator responsible for entering patient data into the algorithm
52:34and ensuring they were discharged on time.
52:38So tell me your background then.
52:39What did you train as?
52:40My background is I'm an occupational therapist.
52:4320 years I was working in clinics, in hospitals.
52:47I actually did home health, so I went to their home.
52:50I did it all, basically.
52:52And then I had the opportunity to go non-clinical.
52:56So talk me through the process.
52:58Well, when I got a new case, I was given doctor's notes.
53:02I was given admission notes.
53:04I was given therapy evaluations and assessments.
53:07I would put all of that information into a program called the PREDICT,
53:12and it would generate a recommended discharge date.
53:18Having an estimate of when somebody is going to no longer need specialist nursing care,
53:24there's nothing wrong with that in theory there, right?
53:28Patients always do best when they're at home.
53:31But if they're not safe at home, you have to do it in rehab.
53:37How often was that number of days about the right ballpark as you saw it?
53:43Probably 20% of the time.
53:45Really?
53:4620, 25%, yeah.
53:48Was it sometimes more, sometimes less?
53:51Generally, it would be three to four days under.
53:55They liked to say that the PREDICT took in six million patients' experiences.
54:01I still believe that, as humans, we made better decisions.
54:07You've got all this experience.
54:09Mm-hmm.
54:09Can you use your human judgment instead if you saw that it was wrong?
54:14Wouldn't that be great?
54:16Unfortunately, no.
54:18The expectation was I would stay within 3% of that estimated discharge date.
54:25But by the time I stopped working, it was down to 1%.
54:301%.
54:311%.
54:32That means this patient, who is very sick and can't get out of bed,
54:38guess what?
54:3910 days, you're out of here.
54:40That's not okay.
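On one plausible reading of the tolerance Amber describes, applied per patient as a share of the predicted stay, the arithmetic leaves essentially no room for clinical judgment. A minimal sketch, with an assumed 10-day prediction:

```python
def allowed_slack_days(predicted_days: float, tolerance_pct: float) -> float:
    """Leeway (in days) around the algorithm's discharge date, if the
    tolerance is applied per patient as a share of the predicted stay."""
    return predicted_days * tolerance_pct / 100

# For a hypothetical 10-day rehab prediction:
print(allowed_slack_days(10, 3))  # 0.3 days under the earlier 3% target
print(allowed_slack_days(10, 1))  # 0.1 days under the later 1% target
```

At 1%, the permitted deviation rounds to zero whole days: the predicted date is, in effect, the discharge date.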
54:43I always made it very clear that I was just the messenger.
54:47I do not make the decisions.
54:49I had members' families scream at me.
54:53How did that feel?
54:54Awful.
54:55Because I wanted to just say, no, I 100% agree with you.
55:00I don't think that your mother should be discharged right now.
55:04But I wasn't allowed to.
55:06And they expect you to meet these certain goals.
55:10And the problem is, if you don't meet them,
55:12then they're costing the company too much money
55:15and it goes against you as a care coordinator.
55:20So you as an employee have repercussions if you don't...
55:24Yes. If you don't stick with it.
55:25If these patients don't get discharged within 1% of the date
55:29that the algorithm says.
55:31I never met that metric.
55:33Mm-hmm.
55:34That was part of the reason that I was let go.
55:37It was all about the dollar.
55:39And I hated that.
55:45Amber really cares about her patients.
55:48I mean, that is so obvious.
55:50She, like, feels personally affronted by what she was being asked to do.
55:55Where this has fallen down has been in the fact that it is so inflexible.
56:05And it's this.
56:06The idea that we might ultimately give up control from humans to machines
56:11that gets to the heart of the feelings of injustice
56:14that can arise when technology clashes with human pain and suffering.
56:26In New York, jury selection in Luigi Mangione's trial is scheduled to begin later this year.
56:35And it's bound to elicit yet more controversy on all sides.
56:40His supporters hope those jurors will deliver a radical verdict.
56:44We have a feature in the American justice system called jury nullification.
56:48Where if a jury believes that a not guilty verdict would be the best delivery of justice,
56:55they can deliver a not guilty verdict, regardless of whether they think the person actually committed the crime.
57:02The people, united, will never be defeated!
57:14My time in the US had made me think about our fate in the UK.
57:21AI is already being used in the NHS, but it's being done with caution,
57:27and crucially, with human supervision.
57:30A tool to support humans, not to override them.
57:34And for me, that is when AI is at its best.
57:38Not something to be feared, but something to be carefully incorporated into our lives.
57:44AI could achieve extraordinary things.
57:47But this is a revolution that has to happen with us, not to us.
58:21To discover more about AI and how it can shape our future,
58:25go to connect.open.ac.uk
58:31or scan the QR code on the screen now.