AI.Confidential.S01E03
Transcript
00:05Good morning, New Yorkers. Another beautiful day in the city.
00:12The Marriott Hotel, Midtown Manhattan.
00:18The CEO of the largest health insurer in America sets out for a meeting of investors.
00:28But as he reaches the venue, a masked man approaches from behind.
00:35Breaking news. We're just hearing that the CEO of United Healthcare, which is the largest healthcare company in the country,
00:42was shot and killed right here in New York.
00:45Brian Thompson was shot just before 7 a.m. Police believe it was a targeted killing.
00:50The shooting of the father of two led to a nationwide manhunt. Five days later, a suspect was arrested.
00:5926-year-old Luigi Mangione became a controversial media sensation.
01:06Luigi Mangione, who has also been dubbed America's hot assassin.
01:11Instead of denouncing the killing, some seem to cheer it on.
01:19Furious about the U.S. health insurance industry.
01:23But what few people realize is that this is a story about artificial intelligence.
01:30Stop denials with AI!
01:32How many people have to die?
01:34And the growing use of AI in modern healthcare.
01:38People are dying!
01:40They're using AI to maximize the profits off of their clients.
01:47What happens when life and death decisions are no longer made by doctors, but by machines?
01:57Artificial intelligence. A machine beyond the mind of man.
02:01For decades, scientists have dreamed of creating incredible machines that could talk like us, learn like us, think like us.
02:13But what we didn't imagine is the impact they would have on us.
02:19In this series, I'm exploring what happens when AI collides with human lives,
02:25unearthing stories far stranger than we could ever have imagined.
02:59I knew that AI would transform healthcare.
03:02What I couldn't predict is that it would take me to the scene of a murder on the streets of
03:07Manhattan.
03:08Hey, Hannah!
03:09Hi!
03:10How are you doing?
03:11Good, nice to meet you!
03:12I'm Hannah, lovely to meet you!
03:14Hannah Parry is a journalist for Newsweek, who lives in New York and was on duty the day Brian Thompson
03:20was shot.
03:20When did you first hear about it?
03:23Well, we first had the reports of a shooting up in Midtown, but it quickly came through that it was
03:29something more than that.
03:30We were able to see that it had been a targeted attack.
03:35Hi, I'm Brian Thompson, CEO of UnitedHealthcare, and welcome to the attendees of Reuters Total Health.
03:42This polished corporate video shows Brian Thompson addressing a conference, 18 months after he became CEO of UnitedHealthcare, America's largest
03:54insurer, with over 50 million customers.
03:57On the morning of his death, he was on his way to the company's annual investor meeting on West 54th
04:04Street.
04:04This is the entrance to the hotel where the conference is being held, and that's a surveillance camera that captured
04:10the shocking footage.
04:14There were multiple bullet casings, and that kind of became one of the key parts of the case.
04:20The suspect had inscribed three words on them, which were the words, delay, deny, depose.
04:26Those words are extremely similar to a very well-known critique of the insurance industry.
04:32Rather than approve claims, they would rather delay, deny, defend.
04:37It appeared clear that the suspect had something to say about the insurance industry.
04:43In the months prior to the shooting, reports were circulating about UnitedHealthcare's use of AI in deciding when to pay
04:51out to patients, and when to deny claims.
04:57A new investigation alleges one healthcare giant may have been giving the algorithms too much power.
05:04UnitedHealth pressured its medical staff to cut off payments in lockstep with a computer algorithm's calculations.
05:13This news report is from 2023, a year before Brian Thompson was shot.
05:19I think if you don't live in America, you've probably never even heard of UnitedHealthcare until this Luigi story surfaced.
05:26But there is a lot going on with them, and at the heart of it all are algorithms and artificial
05:34intelligence.
05:39The potential for AI to transform healthcare for the better is enormous, and some of it is already here.
05:47A step toward medical super intelligence. That's what the CEO of Microsoft is calling the company's new artificial intelligence tool.
05:56From accelerating drug development.
05:59A revolution in drug discovery could cure cancer in the next 50 years by bringing new approaches to once impossible
06:07problems.
06:08To reading scans and detecting diseases. See that little square? Yeah. It finds a very subtle polyp that maybe you
06:15would have missed.
06:16And assisting doctors during complex operations.
06:20A 3D map of a patient's pelvis is generated from a CAT scan to help guide the surgeon in real
06:26time.
06:26In the UK, AI is already being used in the NHS to detect the risk of leukemia and to identify
06:34early signs of lung cancer.
06:36For our overstretched health service, this technology has huge potential to boost efficiency and bring down costs.
06:44The revolution in artificial intelligence offers a golden opportunity to deliver better care at better value.
06:51And the NHS will usher in a new age of medicine, leapfrogging disease.
06:56So we are predicting and preventing it, rather than just diagnosing and treating.
07:06In the US, healthcare is a huge business.
07:10I wanted to find out more about the claims that UnitedHealthcare had been using AI to make crucial
07:16life and death decisions.
07:19So I went to meet a doctor working in the oldest homeless shelter in Los Angeles.
07:25Okay, so let me take a look.
07:26Mary Marfasee is a medical director at the Union Rescue Mission.
07:30And then tell me how you are.
07:33Okay.
07:35Mary, hi.
07:37Hi.
07:38Hi. Nice to meet you.
07:39It's lovely to meet you. How are you doing?
07:41I want to give you a little tour.
07:43Mary has spent her career providing medical care to the city's most vulnerable.
07:48But in 2023, when her family needed help, she believes they found themselves at the mercy of AI.
07:55Mary, is that your hat?
07:56That's my husband's hat.
07:58Oh, that's lovely.
07:59I thought I'd bring it with me today.
08:03Like, isn't that one of the most handsome older white guys?
08:06Oh, gee.
08:06He looks so English. He was Welsh.
08:08He was American Welsh.
08:10The story with your husband. When did this all start?
08:14Going back about three years ago, I noticed he was having a lot of balance problems.
08:18Just falling too many times. And this was a very athletic man.
08:22And then finally, one fall that was so bad, fractured his nose and his face was filled with blood and
08:29had some cranial fractures with it.
08:31Got him in the hospital and finally got a diagnosis, which was accumulation of fluid on the brain.
08:36Oh, my gosh.
08:37Right?
08:38Oh, wow. That's a proper fall.
08:41Yeah, it was rough.
08:42So what happened?
08:43So we stayed in the hospital and then he got into a rehabilitation center.
08:48And then within just a couple of weeks, I was told by the center, okay, his time is up.
08:53He's no longer in need of services. And I said, what? He can't even brush his teeth.
08:58He can't go to the bathroom on his own. And I thought, something's not right here.
09:02So we had him home and then again, another fall within just a few months. And the same thing.
09:07I was told by the center, okay, his time is up. And anytime I would complain about it, I'd hear,
09:12oh, well, at his age.
09:14And I got tired of hearing that preamble.
09:17So when he was discharged, I mean, if he's not in a fit state to walk out of the hospital,
09:22what happened?
09:23Wheelchair. Just wheelchair to the car. Staff takes him out, loads him in my car and I drive away.
09:30And how'd you get him in the house to the other end?
09:32It was a struggle. One time he fell back on me. I dislocated my right shoulder.
09:37Because this is just not a man who's strong enough to be back home.
09:39Right. That wasn't good.
09:42And that's when I started calling UnitedHealthcare, saying, who's making these decisions?
09:46And the response I would get is clinical team.
09:50Right.
09:51And I started to say things like, well, who runs the clinical team?
09:55And then I would just get the runaround.
09:57So I called the ombudsman. And that's when I was told AI is involved.
10:02OK.
10:04After each fall, he wasn't getting enough physical therapy or occupational therapy.
10:09You just get weaker and weaker over time?
10:11Yeah.
10:12Yeah.
10:15You think he didn't need to die when he did?
10:18I don't think so.
10:19He had goals to live much longer.
10:23I really thought he could get back to some of his function.
10:27Had he been given the care he needed?
10:29Yes, I think so.
10:32How angry are you about all of this?
10:36It's more I miss him more than anger.
10:39I don't want it to happen to anybody else.
10:47All right.
10:48Thanks for coming.
10:48Thank you so much.
10:49I got to give you a hug because it meant so much to have you.
10:51All right.
10:52Stay in touch.
10:52I'll send you some things.
10:59There's a kind of irony to this story.
11:01You have somebody who spent her entire life advocating for people's health,
11:06trying to make sure they get the care that they need.
11:08When she needed it, when her family needed it,
11:12despite having paid for it, it wasn't available.
11:17Frank and Mary's case was not an isolated one.
11:21By early 2023, the story was starting to get out,
11:25alleging the widespread use of AI algorithms by big insurers,
11:30including United Healthcare, to deny elderly patients care.
11:36Later that year, there was a Senate inquiry.
11:39The reason we're here today is that all too often the big insurance companies
11:44have been failing seniors when they need care.
11:47And perhaps most troubling of all, there is growing evidence
11:51that insurance companies are relying on algorithms,
11:54rather than doctors or other clinicians,
11:56to make decisions to deny patient care.
12:02Two months after the Senate committee published its report,
12:06Brian Thompson was shot.
12:09Let's head to New York because police there are continuing their hunt
12:12for the gunman who killed the boss of one of the biggest companies in the world.
12:16The shooter took off down an alleyway around 55th Street
12:19and is currently still at large.
12:21In the days after the shooting,
12:23the hunt for Brian Thompson's killer gripped the nation.
12:27Police are looking for a suspect described as a man about 6'1",
12:31wearing all black.
12:32Police drones, helicopters and thousands of CCTV cameras
12:36are combing the city street by street.
12:38On Thursday, detectives shared new images of the man they want to question.
12:42The image, caught on surveillance camera,
12:45shows him standing at the check-in desk at a hostel.
12:48The man appears relaxed and smiling.
12:51Just a few minutes later, he shot Mr. Thompson dead
12:54before making his escape.
12:57Then, five days after the killing,
12:59police received a tip-off from an employee
13:02at a McDonald's in Pennsylvania.
13:05Um, what's your name?
13:07Uh, Mark.
13:08What is it?
13:09Mark. Mark?
13:10Yes, sir. Mark what?
13:11Azaria.
13:12Azaria?
13:12Someone called.
13:13Right, they thought you were suspicious.
13:15Oh, I'm sorry.
13:16Um...
13:16Is your ID on you?
13:17Yes, sir.
13:18Thanks.
13:22The alleged killer has been identified as Luigi Mangione.
13:27As Luigi was led into court after his arrest,
13:30it was clear he had something to say.
13:33Go, go.
13:37Oh, I got...
13:38It's completely out of touch
13:39and an insult to the intelligence of the American people
13:43and their lived experience.
13:47Luigi Mangione, the man accused of killing
13:49the US healthcare insurance chief executive,
13:51Brian Thompson, appeared in New York
13:54to face 11 state criminal counts
13:56that could lead to a death penalty sentence.
13:59Then, something extraordinary happened.
14:02Free Luigi!
14:04Free Luigi!
14:07Almost immediately,
14:09protesters flocked to his court appearances,
14:12hailing him as a folk hero.
14:15I'm here because I support universal healthcare for all
14:18and I'm here because I believe that Luigi Mangione's
14:21civil rights are being violated.
14:23So what brought you here today?
14:24Um, this case is a case about humanity.
14:27I think everybody in the crowd identifies
14:29with the extortionist nature of our American healthcare.
14:32They implemented like an AI bot to review the claims.
14:35Some of those people had life-threatening illnesses
14:37that were rejected for efficiency.
14:39What's your feeling in general about AI?
14:42I mean, um, technology is gonna find its way into everything.
14:46So to stand against it is...
14:47You become a fossil.
14:49But we have got to continue to invest
14:51into the human components of things
14:53because as we saw,
14:54you can't just automate everything, right?
14:55Somebody has to be looking and watching, you know?
14:57People are dying!
15:00People are dying!
15:02People are dying!
15:03I myself have had a terrible experience
15:06with American healthcare
15:07despite having one of the better plans.
15:09Both my mom and I have had to battle
15:12and fight tooth and nail AI claim denials.
15:16It's really insidious the way that AI is infecting
15:19every aspect of our society,
15:21but specifically something so important and crucial
15:23such as healthcare
15:24that already shouldn't be for profit.
15:27They're using AI to maximize the profits
15:31that they can make off of their clients.
15:35Stop denials with AI!
15:37How many people have to die?
15:39Stop!
15:39I mean, there's this, like, incredible strength of feeling
15:42from the protesters.
15:44You can understand, too.
15:45You know, if you've been personally affected by this
15:47or if someone in your family has,
15:49like, I understand why this is such an emotional moment.
15:53But, you know, I also think that, like,
15:56someone was murdered here.
15:58You know, someone with a family, with children.
16:01And I think there's a bit of mental gymnastics
16:03going on with these protesters
16:05where they sort of conveniently manage to forget that fact.
16:15In situations like these where emotions are high,
16:19it's sometimes easy to turn on technology,
16:22especially when it feels like it's something
16:24we don't fully understand and can't easily control.
16:29AI didn't create the perceived problems with US healthcare,
16:33but it has supercharged them.
16:36And to understand how, it helps to know
16:39what an AI algorithm actually is.
16:43I think the words algorithm and AI get thrown around quite a lot.
16:46It gets quite confusing.
16:47So I thought I would explain the difference between the two
16:50in the most New York way possible
16:51by imagining that you are opening up a new hot dog stand.
16:56Now, you've got a couple of options here.
16:58You could use an algorithm.
17:01An algorithm is a series of steps for completing a task.
17:07In the case of our hot dog stand, the task is to sell hot dogs.
17:13The inputs are the sausages, the onions.
17:17The algorithm is the instructions.
17:20Cook 100 hot dogs a day, sell them for $4 each,
17:24stay open till 11 on Friday.
17:27And the output is, hopefully, a tidy profit.
17:33With this traditional type of algorithm,
17:36everything is spelled out in advance.
17:37It means it's very precise, it's very reliable,
17:40but it's also completely inflexible.
17:43Now imagine that you've got a stand that is run by an AI algorithm.
17:48So this time, you don't tell it what to do.
17:50You just give it the inputs, the buns, the sausages, the onions,
17:53and you say, the only thing I care about
17:56is how much money you make.
17:58Now, at first, the AI is just going to watch.
18:02It's just going to collect data, it's going to hunt for patterns,
18:04like when people are buying the most,
18:07where has the most footfall, how that changes with the weather.
18:10And then, after a while, it's going to start suggesting things
18:14that you hadn't even thought of,
18:16like pitch up outside a dog park on Sundays,
18:19or start selling vegetarian sausages,
18:21or people tip more when the cart smells like onions.
18:25But the thing is, you didn't tell the AI any of those rules,
18:28it discovered them for itself.
18:31AI algorithms are capable of crunching huge quantities of data
18:35and analysing complex patterns of behaviour,
18:39all in order to make your hot dog stand profitable.
18:46And that is the really key difference here.
18:49For an AI algorithm, you don't need to spell out
18:51every possible scenario in advance.
18:54You just define the goal and let the AI learn for itself.
19:01Just because an algorithm uses AI doesn't mean it's necessarily better,
19:06but it will ruthlessly pursue whatever goal it's set.
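The hot dog stand above can be sketched in a few lines of code. This is a toy illustration only: the stand, the prices and the sales figures are all invented for the sake of the example.

```python
# Toy illustration of the hot dog stand above. The stand, prices
# and sales figures are made up for the sake of the example.

# Traditional algorithm: every rule is spelled out in advance.
def traditional_plan(day):
    hot_dogs = 100                                 # cook 100 hot dogs a day
    price = 4.00                                   # sell them for $4 each
    closing_hour = 23 if day == "Friday" else 21   # stay open till 11 on Fridays
    return hot_dogs, price, closing_hour

# AI-style approach: no rules are given, only a goal (revenue).
# It watches past days and picks the strategy that earned the most.
def learned_plan(observations):
    # observations: (location, price, revenue) tuples the stand has seen
    best = max(observations, key=lambda day: day[2])
    location, price, _ = best
    return location, price

history = [
    ("office block", 4.00, 310.0),
    ("dog park",     4.00, 455.0),  # by the dog park sells best
    ("dog park",     4.50, 480.0),
]
print(learned_plan(history))  # the data, not the programmer, picked this
```

The key difference shows up in the code: `traditional_plan` contains the rules, while `learned_plan` contains only a goal and discovers its rule from the data.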
19:12I wanted to find out more about the specific algorithm being used by United Healthcare.
19:19So, I tracked down the two investigative journalists
19:22who were the first to uncover the story.
19:25Hey!
19:26Hey, how are you? How are you doing?
19:27Good to meet you.
19:28I'm Hannah.
19:29Nice to meet you.
19:29Lovely to meet you.
19:30Hey, Bob.
19:31Lovely to meet you, Bob. How are you doing?
19:32Good.
19:33Casey Ross and Bob Herman were nominated for a Pulitzer Prize for their investigation.
19:38How big has this story been for you in terms of your journalistic career?
19:42I think this is probably the biggest story that I've done
19:45in terms of the impact that it's had.
19:47I think that the reaction that we got
19:49and some of the fallout in terms of lawsuits and in the US Senate
19:52validate that this was a story that had an extraordinary impact.
19:57What was the first spark of the story?
19:59This person in the nursing home industry sent me an email
20:02and it was just this visceral reaction that,
20:05hey, health insurers are issuing a lot of denials
20:08and they're not telling us why.
20:10The people that were in this facility were getting removed
20:14when they were still very sick and were not ready to go home.
20:16And so it was just a signal that, okay,
20:18well maybe we should ask some more questions about this.
20:20And what we found was that this all centers around an algorithm
20:23called NH-Predict and that algorithm is used
20:26on behalf of insurance companies to reduce the amount of time
20:30that people are in these facilities and to control their cost of care.
20:34How much sight do we have of the actual algorithm that's going on underneath this?
20:39Yeah, so there are a bunch of pieces of data that they are feeding into a model.
20:43The age of the person, what was their primary diagnosis,
20:46what other illnesses do they have?
20:48It compares that patient to other patients like them
20:52in a database of six million patients.
20:55And based on this comparison, this is the amount of care you should get.
20:59Once that prediction is made, basically a date is circled on the calendar.
21:03And this is the date that they're trained to push these people toward.
21:07This is the report that's produced by the algorithm.
21:10And it all boils down to this prediction,
21:12which is that the estimated length of stay of the patient in this case is 16.6 days.
21:18Point six?
21:19Yes.
21:20What, so it's like to the hour?
21:21It predicts it down to the decimal point.
21:23You can't predict when somebody's going to be fine to the hour.
21:26No, and it's the type of information that only an AI algorithm could give you.
21:31It doesn't take into account a lot of things about these people.
21:36Every healthcare journey is different.
21:39Every injury, every recovery, everything that comes up in the course of your care
21:43causes all sorts of things to happen that can't possibly be predicted.
21:47Literally, they're boiling down these people to numbers.
21:50Oh, you're not just a number.
21:51Yes, you are.
21:52And there it is.
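The patient-comparison approach the journalists describe can be sketched roughly as a nearest-neighbour average. To be clear, this is a guess at the general technique, not NH-Predict's actual model: its real similarity measure and its database of six million patients are not public, and the records below are invented.

```python
# A rough sketch of a patient-comparison prediction, in the spirit of
# what is described above. NOT the real NH-Predict model: the similarity
# measure and the patient records here are invented for illustration.

def predict_stay(patient, database, k=3):
    """Average the stays of the k most similar past patients."""
    def distance(a, b):
        # crude similarity: age gap, plus a penalty for a different diagnosis
        same_dx = a["diagnosis"] == b["diagnosis"]
        return abs(a["age"] - b["age"]) + (0 if same_dx else 10)
    nearest = sorted(database, key=lambda p: distance(patient, p))[:k]
    return sum(p["stay_days"] for p in nearest) / k

past_patients = [
    {"age": 80, "diagnosis": "hip fracture", "stay_days": 18.0},
    {"age": 78, "diagnosis": "hip fracture", "stay_days": 15.0},
    {"age": 82, "diagnosis": "hip fracture", "stay_days": 16.8},
    {"age": 45, "diagnosis": "appendectomy", "stay_days": 3.0},
]
new_patient = {"age": 79, "diagnosis": "hip fracture"}
# averaging across similar patients is why the output lands on a
# decimal like 16.6 rather than a round number of days
print(round(predict_stay(new_patient, past_patients), 1))
```

Averaging over similar cases is what produces an oddly precise figure like 16.6 days: the decimal is an artefact of the arithmetic, not a claim about the patient.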
21:54There is a counterargument to all of this, right?
21:56In that people are quite accustomed to artificial intelligence that compares you to other people.
22:02Like, you know, the recommendation algorithms that you get on Netflix or Spotify or Amazon.
22:06It's like, people like you did this.
22:09Therefore, do you want this?
22:11Yeah, and I think the common experience, or at least my experience, is that most of the time, I think
22:16the algorithm is wrong.
22:18It's way off.
22:18Just because I watch this cooking show does not mean I want to watch this other show.
22:23But if you're talking about me in the hospital or after a serious injury and it gets that wrong, well,
22:29I have a much bigger problem with that.
22:32Casey and Bob's damning reporting concluded in 2024, throwing UnitedHealthcare into the spotlight.
22:41Just months before Brian Thompson was shot dead.
22:46Can you remember when you first heard about Luigi Mangione?
22:49I mean, I remember the morning that it happened, we started getting texts from people.
22:53And we're like, holy crap, like, is this actually for real?
22:56We don't have any indication that this individual was inspired by our reporting.
23:01But nonetheless, as a reporter, you're stunned.
23:04You're shocked.
23:04You're like, you don't ever want any of your reporting to inspire somebody to act that way.
23:20Luigi Mangione is not someone you would expect to take on the role of outlaw vigilante.
23:29He studied at the prestigious University of Pennsylvania, one of America's Ivy League.
23:37Luigi Nicholas Mangione.
23:40Luigi.
23:42You can see in this video of his high school graduation that Luigi already stood out.
23:48He's giving the valedictorian speech, which means he's being celebrated as the most academically successful student in his year.
23:57Family, friends, faculty, and fellow students, good afternoon.
24:04He was clearly a popular, confident young man.
24:10We spoke to a few of his classmates who didn't want to go on the record
24:14because they have been continually hounded by the press ever since, which you can understand.
23:18And everyone, almost to a person, says that Luigi was, like, smart, he was helpful, he was kind, he was
23:25friendly.
24:25He was a popular guy from a good family who was also really well-educated.
24:32But then I discovered a remarkable detail about Luigi.
24:36One that's been all but overlooked.
24:39He did computer and information science as his major, but then he also then got a master's degree from here
24:48in computer science
24:50with a concentration in artificial intelligence.
24:54One of the modules that he was taking in here was exactly the stuff that sits behind this algorithm.
25:01It's called Data Structures and Algorithms.
25:03A very nerdy, very mathematical algorithms course about data structures and computer science.
25:09Like, it's tough.
25:10And the thing is, Luigi wasn't just doing well in this course.
25:13He was doing so well that he was appointed as an assistant tutor.
25:18Look at this photo. I mean, he looks so confident.
25:22He was basically teaching other students who were the same age as him.
25:27This is stuff that he knew incredibly well and, by the sounds of it,
25:33was also very, very good at.
25:38Most of Luigi's friends have so far refused to go on the record.
25:43But back in the UK, I tracked down one person who was willing to talk.
25:49Gawinda Bogal is a British blogger who writes about the impact of technology on society.
25:55Hi.
25:56Hi, Anna. How are you doing?
25:57Nice to meet you.
25:58Lovely to meet you. Thank you for this.
26:01Luigi was a subscriber to Gawinda's blog.
26:05And the two of them talked about their shared concerns.
26:10When I spoke to Luigi, he was in Japan.
26:12And he was remarking on how sort of lonely it felt in a lot of places.
26:18The streets were empty because everything's just automated.
26:21He was quite concerned about mass automation and the knock-on effects that this might be having on society.
26:28You can live your entire life without leaving your house.
26:31You know, you can do your shopping online, you can do your banking online, you can do your dating online.
26:36You can even have a full relationship online.
26:40There's less human connection and AI is going to make this a lot worse.
26:43I mean, he himself probably had no problem making friends.
26:47He was a very charismatic individual.
26:49But I think he was worried about other people and maybe how they were kind of gradually being lost.
26:56The connections that we have, they keep society together.
26:59And he was worried that all of that is unravelling.
27:02Did you talk about healthcare as well?
27:04There was only one brief exchange that we had.
27:08Luigi made a passing remark about how I was lucky because we had the NHS in the UK, you know,
27:14free healthcare.
27:14How did he seem to you?
27:17He actually seemed quite cheerful.
27:19Someone who did not seem particularly pessimistic, although some of the issues that he raised were quite pessimistic.
27:28But his general demeanour was actually the opposite of that.
27:30He said that he wanted me to focus less on the problems and more on the solutions.
27:35He would ask me, you know, so what's the practical takeaway of this?
27:38He wasn't just interested in moaning.
27:41He wanted to find real solutions.
27:47Since his arrest, the Luigi Mangione story had taken on a life of its own.
27:53Luigi Mangione, the internet's favourite hot Italian sausage.
27:57And social media had exploded with memes about the alleged killer.
28:02Luigi, you're a brave Italian stallion whose actions ignited a national dialogue about the USA's crappy healthcare system.
28:13If you feel the same way about me as I feel about you, please do not deny, delay or defend
28:19your love.
28:21There was also no shortage of theories about what happened and why.
28:26There is a note which was supposedly found in his backpack when he was arrested and the press have decided
28:32to call this his manifesto.
28:34In the document, he appeared to talk about the planning of the alleged crime, but then moved on to health
28:40insurance.
28:42There's one bit here that says, frankly, these parasites had it coming.
28:46Then he specifically name checks United and then he says, these [indecipherable] have simply gotten too powerful and they continue
28:56to abuse our country for immense profit because the American public has allowed them to get away with it.
29:04The other thing that the internet is obsessed with is his own personal story.
29:09It's how he ended up feeling so strongly about this particular issue.
29:14I've got his X profile.
29:17In the middle of his banner, he's got this X-ray.
29:22And I mean, we know that he had a spinal problem, we know that he had surgery for it and
29:25that's, I mean, that's pretty extreme, that surgery.
29:29But what is interesting is that there's no evidence that he was actually a client of United Healthcare.
29:36And so far, nobody's found anything to prove that he had a personal insurance claim denied even.
29:45As humans, we understand being let down by other humans.
29:50Real doctors often don't have all the answers, but we forgive them for their very human flaws.
29:58But when it comes to technology, we have very little tolerance for error.
30:04The flip side is also true.
30:07When AI promises extraordinary medical possibilities, it's all too easy for us to believe in the impossible.
30:24Just a few miles across the state line, I've been invited to meet a young tech entrepreneur
30:29who is selling a seemingly incredible new medical breakthrough.
30:34Hey!
30:34Hey!
30:35Using AI.
30:36How are you?
30:37Yeah, very good.
30:3825-year-old Kian Sadehi runs a tech startup called Nucleus Genomics.
30:44Are you sequencing the entire genome?
30:46The entire thing.
30:47The whole thing?
30:48The whole thing.
30:51His company uses AI algorithms to analyze the DNA of his customers' future children like never before.
30:59These are nitrogen tanks, I think?
31:01Yes.
31:03How many embryos do you reckon there are in here? Like millions?
31:06If all the samples here were embryos, yes, there would be millions, yes.
31:10Millions of potential new humans?
31:12Yeah.
31:14Kian uses AI to map each embryo's DNA, comparing it against huge DNA databases to try to predict a baby's
31:23future risk of disease.
31:25Some diseases can be predicted with certainty. For others, he can say how people with similar DNA turned out.
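Predictions like these are typically built on what geneticists call polygenic scores, which add up small effect weights across many DNA variants. The sketch below shows only the general idea, with invented variant names and weights; it is not Nucleus Genomics' actual method or data.

```python
# The general idea of a polygenic score, with invented variant names
# and effect weights. NOT Nucleus Genomics' actual method or data.

# per-variant effect weights, normally estimated from large DNA databases
WEIGHTS = {"variant_A": 0.8, "variant_B": -0.3, "variant_C": 0.5}

def risk_score(genotype):
    """Sum each variant's weight times the number of copies (0-2) carried."""
    return sum(WEIGHTS[v] * copies for v, copies in genotype.items())

embryos = {
    "embryo_1": {"variant_A": 2, "variant_B": 0, "variant_C": 1},
    "embryo_2": {"variant_A": 0, "variant_B": 2, "variant_C": 0},
}
for name, genotype in embryos.items():
    # a higher score means higher estimated risk, not a diagnosis
    print(name, round(risk_score(genotype), 1))
```

The score is just a weighted sum, which is why it can rank embryos against each other but can only ever express a statistical tendency, never a certainty about one child.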
31:34When I talk to a couple, they say my grandfather had Alzheimer's. I want to do anything I can to
31:39make sure my son doesn't have Alzheimer's.
31:41And then I think, well, genetics obviously can help with that. And so I think more people are going to
31:45use IVF and they're going to be using genetic optimization technology to basically pick their child.
31:52Figuring out whether a baby will develop Alzheimer's later in life is a best guess, not a diagnosis. But the
32:00company goes further, helping parents choose an embryo based on eye color, height or even IQ.
32:09Welcome to the Nucleus office.
32:11Kian had already raised $32 million from investors.
32:16By the way, if you ever see how many tabs I have on the right side of the screen, I
32:19don't think you should ever show that on the camera.
32:21It is actually so insane.
32:23I think it also slows things down. I think it uses up your RAM.
32:26Anyways, let me show this.
32:28He just launched a glossy new ad campaign.
32:31I think it is. Okay, let's go.
32:32Nucleus Embryo is for couples doing IVF to uncover the full genetic profile of each embryo in one intuitive platform.
32:43Every parent deserves the power to decide what possibility feels right for their family.
32:51Some people don't think you should have this choice, but it's not their choice to make.
32:58It's yours.
33:00You are not afraid of controversy, are you?
33:04No, no, no, no. It's not.
33:06It's, you know, if you were doing IVF and you had five embryos, would you want to pick your future
33:11baby randomly?
33:13Or would you ask the doctor for more information on each, especially if you have a family history of Alzheimer's,
33:18of cancer?
33:19It's my right to know this information, it's my choice to say I want a baby at lower disease risk,
33:23I want a baby that's slightly taller, or even with a specific eye colour, etc.
33:27It's their right, it's their choice.
33:28I think what you're doing, where people have genetic predispositions for particular diseases, especially when they're preventable or treatable, I
33:35think it's amazing.
33:36What I don't understand is why you would include something like IQ and eye colour, because it's so controversial.
33:43And it's like, it's a little bit eugenics-y, you know?
33:46You are implicitly saying that taller is better.
33:49You are reinforcing the ideas of preferences that some people are more valuable than others.
33:56No, not at all.
33:58I think parents have the right to choose, across their embryos, right, if they want a baby with a lower
34:05disease risk, for example, or if they want a baby that's shorter or taller.
34:08That's absolutely their right to choose.
34:10This is designer babies, though. I mean, bluntly, this is designer babies.
34:12No, no, it's not designer babies at all, actually.
34:15How is it not designer babies?
34:16It's not designer babies at all.
34:16Well, if a parent wants to give their child the best start in life, that is, in my mind,
34:21a parent doing the most basic and human thing.
34:25The moment a child is born, they will run multiple tests on a baby to make
34:31sure the baby is healthy.
34:32They'll give it vaccines, for example, to make sure they don't get diseases.
34:35This is just another tool in the toolkit that helps parents do that.
34:40But there was something else I was worried about.
34:43Even the best scientists don't fully understand how genes and environment combine to make us who we are.
34:50I was worried that all of this complexity was being overlooked in favour of tantalisingly simple AI predictions.
34:59Do you think that the technology is mature enough, accurate enough, capable enough to be giving people this illusion that
35:11they have control?
35:12The platform, I think, does an excellent job showing the uncertainty and showing the fact that, again, DNA is not
35:18destiny, and DNA will never be destiny.
35:21People can have certain genetic predispositions, but there's the whole thing called life, which is there's environment, there's how
35:26you're nurtured, how you're raised, nutrition, etc.
35:29We cannot possibly reduce human life to just a DNA strand.
35:40Testing for physical characteristics is banned in the UK.
35:44But there's nothing to stop British couples travelling to the US for it, as long as they're willing to pay
35:50the roughly $40,000 price tag.
35:57Hi.
35:58How are you doing?
35:59Welcome.
36:00Hello.
36:02Dragos and Laura live in London, and we're at the start of their IVF journey with Kian's company.
36:09These are beautiful.
36:10Are these your babies?
36:12Yes, our two children, two boys.
36:15So your two boys, did you have them naturally?
36:18Yes, we did.
36:19Yes.
36:19So why not naturally for the third?
36:22Because Dragos was kind of afraid of the ages we are. I'm 40, he's 43.
36:27When you're over 43, it was clear that the chances of having issues are very high.
36:33If this wasn't possible, would you still want to have another baby?
36:38Definitely not.
36:39Not?
36:40Not.
36:41Because you don't want to take chances.
36:43Everybody knows how important a healthy child is, and I think everybody should go the extra mile for that outcome.
36:50Where are you at now, then, in the process?
36:53In less than two months, we are going to New York to start the IVF process.
37:00And then they will test the egg, they will take a few cells, they will zoom in on anything that
37:05can happen.
37:06And then once we have selected the right embryo, then we will go back to New York for the implantation,
37:11which takes only one or two days.
37:13So in theory, in a few months, you could be pregnant.
37:18Yes.
37:19Fingers crossed.
37:20Yeah.
37:22We both want a healthy child, a healthy embryo.
37:26We are also hoping she's going to be a girl.
37:31Is that the dream? A little girl?
37:33Yes.
37:34This was my dream all along, to have three babies.
37:38So if a girl would come, would be the perfect picture for myself.
37:43When you picture your daughter, or your potential daughter in your head, what does she look like?
37:51Beautiful.
37:55Brown eyes, brown hair, light skin, big lips, full lips. I don't know.
38:04So she'll be looking like you. Not like me.
38:09Hopefully not bald. With a beard.
38:14Some people worry about the Nucleus stuff because it effectively prioritizes some characteristics of the baby over others.
38:23For example, IQ, eye colour, height is stuff that you can, in theory, at least give a probability to, right?
38:31You can measure.
38:31I would not go through this just for the height and the eye colour. But if height is what we
38:38can find out now, maybe it's important, maybe not. But, you know, we'll take that because it's on the table.
38:45They give you a probability of the IQ that he or she might have.
38:52And that's going to be one you'll look at?
38:54Of course. Of course it will.
38:58Before Dragos and Laura started their IVF cycle, there was a first step: getting their own DNA mapped by Nucleus.
39:06I don't know. Let's see.
39:08To see if they were carriers for any diseases they could pass on to their baby.
39:12The results of these genetic tests had just come in.
39:16The moment of truth.
39:21Feels like unravelling a present. You don't know exactly what's inside.
39:29So, our family summary.
39:32We have 2,154 rare diseases tested and one detected risk for children.
39:40Fuchs endothelial corneal dystrophy.
39:44Symptoms often begin at 50. So, imagine our child is born this year.
39:4950 years from now, I'm sure we'll have bionic eyes that we can replace, hardwired to our brain.
39:56So, I don't think this is serious enough for us to be worried about it.
40:01Of all of the things that could potentially be risky, Alzheimer's, Parkinson's, breast cancer.
40:06Yeah. This is very...
40:08Low level. Low level mild.
40:11Don't worry. You and Laura can still have healthy children.
40:17So, I think this is important.
40:19Hmm.
40:23I think if you're having a baby in your 40s, I can completely understand why you would want as much
40:30information as possible.
40:31If you're prone to worry, why wouldn't you try and eliminate the risks where you possibly can?
40:41But at the same time, I'm worried that this technology, in this way, gives the illusion of control that you
40:50don't actually have.
40:52Gives the illusion of certainty and prediction of the future that doesn't really exist.
41:13Over in the US, Luigi's case had proceeded through the courts.
41:18Charged with murder in the first degree, killing as an act of terrorism and criminal possession of a weapon.
41:24He faced his plea hearing.
43:26How do you plead to the indictment? Guilty or not guilty?
41:30Not guilty.
41:32Stoking further outrage on both sides.
41:35One struggle, one fight. Now there is a human right.
43:39We have very disturbed people who somehow think that eliminating a father who has two young children over some cause
43:48is somehow justified.
41:51But on the other side of America, another legal case was quietly gaining its own momentum.
42:03A class action lawsuit against UnitedHealthcare's use of AI had been launched by one plucky public interest firm called Clarkson
42:14Law.
42:15They'd also launched suits against other big insurers for similar claims.
42:20Lawyer Glenn Dannas is leading the charge.
42:25We know for sure that there is an algorithm of some kind that is predicting how long you need in
42:32rehabilitation.
42:32And we also know that some people think that that wasn't long enough. Is that fair?
42:39Yes. I mean, from our perspective, that's a vast understatement.
42:42But that is, in fact, true. Yes.
42:45The length of stay value is consistently too low.
42:49And it seems highly unlikely that this is an accident because it's only ever in one direction.
42:56How does the legal case come in?
42:58One way it's illegal is because there are different states, like California, that require that it be a human making
43:04decisions.
43:05So that, by law, it cannot be delegated to an AI, an algorithm, anything other than a human medical professional
43:15exercising his or her professional knowledge and specialty.
43:22How many people are involved in this? How many plaintiffs are there?
43:26Because United has such a large market share in America, it's almost certainly in the millions.
43:33If you eventually win this case, does that mean that they have to compensate everybody who held that type of
43:40insurance?
43:41What we want, whether it's by a settlement or by going to trial, is to have all the people who
43:46were denied what was owed to them to be paid for that, and then to change these practices going forward.
43:55The Clarkson case rests on the evidence of former United Healthcare customers who have come forward to testify.
44:03And in particular, those who have survived to tell their tale.
44:08Hey!
44:09Bill, this is Hannah.
44:11Hi Hannah.
44:12Hi Hannah.
44:12One of those is 86-year-old Bill Hull.
44:20How are you guys doing?
44:21Well, as well as can be expected, I guess.
44:25Your koi are quite the beasts.
44:28I chuck them in.
44:31So tell me, how are you doing now? How's your health at the moment?
44:34About 70% of my heart is shot.
44:37Hmm.
44:39But the worst part has been the paralysis that the stroke caused me.
44:45I used to be very active.
44:47Like we did this thing.
44:50Did all the bricking here.
44:51Can't do any of that anymore.
44:54In June 2023, he suffered a heart attack on his way to a medical appointment.
45:00There's a park bench, big long park bench, just outside the building.
45:04I just slumped over apparently.
45:06Two medical technicians who knew CPR laid me down on the bench, and they broke all my ribs.
45:13And I understood later, if you don't break ribs, you ain't doing it right.
45:18I was in the hospital for 25 days.
45:21In intensive care?
45:23Almost all of it was intensive.
45:25I'm there and I think at the 22nd or 23rd day, I got a notice that you're to be released.
45:32They said, well, we'll assign our case manager to you.
45:35She got a hold of me and said, I have talked with your doctors and everyone recommends that you go
45:42into skilled nursing.
45:43A day later, she came back and said, well, I don't know.
45:46They said, no, they would not approve it and wouldn't give me a reason.
45:52Clarkson Law argue that this denial was likely the result of UnitedHealthcare's PREDICT algorithm.
45:59They made you feel like you needed to get out and yet they were giving you no place to go.
46:04They said, where do you want to go?
46:05I said, I guess I'm going to go home.
46:07OK, so we'll get you out of here.
46:10Stuck me in the car.
46:11What physical state were you in at this point?
46:14My wife was very worried because she didn't know if she could take care of me.
46:19She's 85 years old.
46:20She has macular degeneration, has a hard time seeing.
46:23She's using a walker too, like I am.
46:27Anyway, I get home.
46:29I was two and a half, three days out of the hospital.
46:32My daughter came over and I was in bad shape.
46:36I think slurring my words, I was starting to have a stroke.
46:40I could not move a finger, couldn't move my arm, couldn't move my foot.
46:44You feel yourself just literally dying.
46:48It's terrifying.
46:50Oh, worst thing that ever happened to me.
46:53Do you think that you would have had the stroke, regardless of where you were?
47:00Well, I think I would have had the stroke.
47:01But I should have been in either a skilled nursing or in the hospital.
47:07And I would have gotten some attention several hours before.
47:12And what I ended up with would have been much less severe.
47:20Bill is now looked after by his daughters, Lisa and Laura.
47:25What's your take on artificial intelligence being used in your dad's healthcare?
47:29Certainly the whole time he was in the hospital, we assumed it was doctors who worked for that hospital
47:37who were making these decisions and who were denying this care.
47:41And it wasn't until he found the class action lawsuit that he believed it was actually the AI system that
47:49was being utilized by United Healthcare.
47:51I guess if a human had been making that decision, you would have been able to have a conversation with
47:55them.
47:55You would think.
47:56If an algorithm says it, you can kind of wash your hands of it.
48:00I feel like that's what they're doing.
48:01If they can make AI do their dirty work, I think they're very happy to do that.
48:06How do you feel about the murder of Brian Thompson?
48:08I think it's indicative of how frustrated human beings can become with huge corporations like United Healthcare.
48:18But I 100% don't condone what happened and wouldn't condone any action like that in the future.
48:25But the anger and the upset is real and it's justified.
48:35The thing is, I'm not anti-AI, right? I'm not anti-algorithms.
48:40I think that there is an incredible amount of waste that happens when humans make decisions.
48:46I think there's incredible efficiencies and therefore better care that you can provide people when you carefully introduce these kind
48:54of systems.
48:56The problem is that once you start automating stuff, it all comes down to exactly what that algorithm was designed
49:03to care about.
49:04You know, was it designed to care about improving the outcomes for the patients in the long term?
49:11Or was it designed to minimize the amount of money that is spent on caring for the patients?
49:19And those are two things that are often at odds with one another.
49:30But the problems with algorithms go deeper.
49:37Here is one thing that you should know about AI.
49:40Sometimes people call it a black box.
49:44And there is a good reason for that.
49:45You have to imagine that at one end you are putting in some information, your input.
49:50And at the other end you get some results, your output.
49:53Now, the question is, what is going on in the middle, inside of the algorithm?
50:00Now, there is nothing magical going on here.
50:03There's no voodoo.
50:04It's just loads and loads and loads of calculations.
50:07But these things are so big and unwieldy that it becomes impossible to follow one thread from one end all
50:16the way through to the other.
50:17And that means that artificial intelligence is sometimes finding patterns that we cannot see, are unable to check and might
50:27not like.
50:30This problem was highlighted in 2019 by a researcher called Ziad Obermeyer.
50:37He looked at an AI algorithm being used in hospitals to identify patients most in need of care and offer
50:45them help on a special program.
50:49But he discovered that the patients the algorithm selected were disproportionately white.
50:54These algorithms should have been a great use case for AI.
50:58But unfortunately, a design choice in building those algorithms made them biased.
51:04Obermeyer couldn't see inside the AI, so he worked backwards from the results and found that the algorithm had used
51:12a shortcut.
51:12It wasn't finding the sickest patients. It was finding the ones who'd had the most money spent on their care.
51:21Black patients have less money spent on them by our healthcare system today because of barriers to access and because
51:27of discrimination.
51:28And that means that the AI saw that fact clearly. It predicted the cost accurately.
51:34But instead of undoing that inequality, it reinforced it and enshrined it in policy.
51:40Even though the objective of the algorithm was good, the outcome led to discrimination.
51:46And until Obermeyer, no human had been there to spot the difference.
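[Editor's illustration, not part of the broadcast.] The mechanism Obermeyer uncovered can be reproduced in miniature. In this hypothetical sketch (synthetic numbers invented for the example, not the study's data), an algorithm that ranks patients by past spending rather than by actual sickness quietly selects the wrong people:

```python
# Proxy bias in miniature: ranking patients by historical COST
# instead of actual NEED. All data is synthetic and illustrative.
patients = [
    # (id, true_need 0-10, past_spend in $k)
    ("A", 9, 4),   # very sick, but little was spent on their care
    ("B", 8, 3),   # likewise under-spent
    ("C", 5, 9),   # moderately sick, well-funded care history
    ("D", 4, 8),
]

# Who the care programme SHOULD prioritise: the sickest.
by_need  = sorted(patients, key=lambda p: p[1], reverse=True)
# Who the cost-proxy algorithm ACTUALLY prioritises: the costliest.
by_spend = sorted(patients, key=lambda p: p[2], reverse=True)

top2_need  = {p[0] for p in by_need[:2]}
top2_spend = {p[0] for p in by_spend[:2]}

print("should enrol:", top2_need)
print("algorithm enrols:", top2_spend)
```

The cost predictions themselves are accurate; the harm comes entirely from the design choice of using spending as a stand-in for sickness, which bakes historical under-spending on some groups into future decisions.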
51:58We may never be able to see inside the algorithm used by United Healthcare to assess claims.
52:05The closest we can get is by talking to those who worked alongside it.
52:11And until now, very few company insiders have ever spoken on the record about this AI.
52:16But one former employee had agreed to meet me.
52:21Lovely to meet you. Would you like something to drink?
52:24Thank you so much. Look at this ice and everything.
52:28Amber Lynch was a care coordinator responsible for entering patient data into the algorithm and ensuring they were discharged on
52:36time.
52:37So tell me your background then. What did you train as?
52:40My background is I'm an occupational therapist.
52:4320 years I was working in clinics, in hospitals.
52:47I actually did home health, so I went to their home.
52:50I did it all, basically.
52:52And then I had the opportunity to go non-clinical.
52:56So talk me through the process.
52:58Well, when I got a new case, I was given doctor's notes.
53:02I was given admission notes.
53:04I was given therapy evaluations and assessments.
53:06I would put all of that information into a program called the PREDICT.
53:13And it would generate a recommended discharge date.
53:18Having an estimate of when somebody is going to no longer need specialist nursing care,
53:24there's nothing wrong with that in theory there, right?
53:28Patients always do best when they're at home.
53:31But if they're not safe at home, you have to do it in rehab.
53:37How often was that number of days about the right ballpark as you saw it?
53:43Probably 20% of the time.
53:45Really?
53:4620-25%, yeah.
53:48Was it sometimes more, sometimes less?
53:51Generally, it would be three to four days under.
53:55They liked to say that the PREDICT took in six million patients' experiences.
54:01I still believe that as humans, we made better decisions.
54:07You've got all this experience.
54:09Mm-hmm.
54:09Can you use your human judgment instead if you saw that it was wrong?
54:14Wouldn't that be great?
54:16Unfortunately, no.
54:18No.
54:18The expectation was I would stay within 3% of that estimated discharge date.
54:25But by the time I stopped working, it was down to 1%.
54:301%.
54:311%.
54:33That means this patient who is very sick and can't get out of bed.
54:38Guess what? 10 days you're out of here.
54:40That's not okay.
54:43I always made it very clear that I was just the messenger.
54:47I do not make the decisions.
54:49I had members' families scream at me.
54:53How did that feel?
54:54Awful.
54:56Because I wanted to just say, no, I 100% agree with you.
55:00I don't think that your mother should be discharged right now.
55:04But I wasn't allowed to.
55:06And they expect you to meet these certain goals.
55:09And the problem is, if you don't meet them, then you're costing the company too much money
55:15and it goes against you as a care coordinator.
55:20So you as an employee have repercussions if you don't...
55:24Yes. If you don't stick with it.
55:25If these patients don't get discharged within 1% of the date that the algorithm says.
55:31I never met that metric.
55:34That was part of the reason that I was let go.
55:36It was all about the dollar.
55:39And I hated that.
55:45Amber really cares about her patients.
55:48I mean, that is so obvious.
55:50She, like, feels personally affronted by what she was being asked to do.
55:56Where this has fallen down has been in the fact that it is so inflexible.
56:05And it's this, the idea that we might ultimately give up control from humans to machines,
56:11that gets to the heart of the feelings of injustice,
56:14that can arise when technology clashes with human pain and suffering.
56:26In New York, jury selection in Luigi Mangione's trial is scheduled to begin later this year.
56:35And it's bound to elicit yet more controversy on all sides.
56:40His supporters hope those jurors will deliver a radical verdict.
56:44We have a feature in the American justice system called jury nullification,
56:49where if a jury believes that a not guilty verdict would be the best delivery of justice,
56:55they can deliver a not guilty verdict, regardless of whether they think the person actually committed the crime.
57:02The people, united, will never be defeated!
57:14My time in the US had made me think about our fate in the UK.
57:21AI is already being used in the NHS,
57:24but it's being done with caution and, crucially, with human supervision.
57:30A tool to support humans, not to override them.
57:34And for me, that is when AI is at its best.
57:38Not something to be feared, but something to be carefully incorporated into our lives.
57:44AI could achieve extraordinary things.
57:47But this is a revolution that has to happen with us, not to us.
58:21To discover more about AI and how it can shape our future,
58:25go to connect.open.ac.uk forward slash AI with Hannah Fry
58:31or scan the QR code on the screen now.