Transcript
00:05Good morning, New Yorkers. Another beautiful day in the city.
00:12The Marriott Hotel, Midtown Manhattan.
00:18The CEO of the largest health insurer in America sets out for a meeting of investors.
00:28But as he reaches the venue, a masked man approaches from behind.
00:35Breaking news. We're just hearing that the CEO of United Healthcare, which is the largest healthcare company in the country,
00:42was shot and killed right here in New York.
00:45Brian Thompson was shot just before 7 a.m. Police believe the shooting was a targeted killing.
00:50The shooting of the father of two led to a nationwide manhunt.
00:56Five days later, a suspect was arrested.
00:5926-year-old Luigi Mangione became a controversial media sensation.
01:06Luigi Mangione, who has also been dubbed America's hot assassin.
01:11Instead of denouncing the killing, some seemed to cheer it on.
01:16We, the people, will be free!
01:19Furious about the U.S. health insurance industry.
01:23But what few people realize is that this is a story about artificial intelligence.
01:30Stop denials with AI!
01:32How many people have to die?
01:34And the growing use of AI in modern healthcare.
01:38They're dying!
01:38People are dying!
01:40They're using AI to maximize the profits off of their clients.
01:47What happens when life and death decisions are no longer made by doctors, but by machines?
01:57Artificial intelligence, a machine beyond the mind of man.
02:01For decades, scientists have dreamed of creating incredible machines that could talk like us, learn like us, think like us.
02:13But what we didn't imagine is the impact they would have on us.
02:19In this series, I'm exploring what happens when AI collides with human lives, unearthing stories far stranger than we could
02:28ever have imagined.
02:55I'm Professor Hannah Fry, and for years I've been interested in how AI could transform healthcare.
03:01What I couldn't predict is that it would take me to the scene of a murder on the streets of
03:07Manhattan.
03:08Hey Hannah!
03:09Hi!
03:10How are you doing?
03:11Good, nice to meet you.
03:12I'm Hannah, lovely to meet you.
03:14Hannah Parry is a journalist for Newsweek, who lives in New York and was on duty the day Brian Thompson
03:20was shot.
03:21When did you first hear about it?
03:23Well, we first had the reports of a shooting up in Midtown, but it quickly came through that it was
03:29something more than that.
03:31We were able to see that it had been a targeted attack.
03:35Hi, I'm Brian Thompson, CEO of United Healthcare, and welcome to the attendees of Reuters Total Health.
03:42Our mission and values…
03:43This polished corporate video shows Brian Thompson addressing a conference, 18 months after he became CEO of United Healthcare, America's
03:53largest insurer, with over 50 million customers.
03:57On the morning of his death, he was on his way to the company's annual investor meeting on West 54th
04:04Street.
04:04This is the entrance to the hotel where the conference is being held, and that's the surveillance camera that captures
04:10the shocking footage.
04:14There were multiple bullet casings and that kind of became one of the key parts of the case.
04:20The suspect had inscribed three words on them, which were the words delay, deny, depose.
04:26Those words are extremely similar to a very well-known critique of the insurance industry.
04:32Rather than prove claims, they would rather delay, deny, defend.
04:36It appeared clear that the suspect had something to say about the insurance industry.
04:43In the months prior to the shooting, reports were circulating about United Healthcare's use of AI in deciding when to
04:51pay out to patients and when to deny claims.
04:57A new investigation alleges one healthcare giant may have been giving the algorithms too much power.
05:04United Health pressured its medical staff to cut off payments in lockstep with a computer algorithm's calculations.
05:13This news report is from 2023, a year before Brian Thompson was shot.
05:19I think if you don't live in America, you've probably never even heard of United Healthcare until this Luigi story
05:25surfaced.
05:26But there is a lot going on with them.
05:30And at the heart of it all are algorithms and artificial intelligence.
05:39The potential for AI to transform healthcare for the better is enormous.
05:45And some of it is already here.
05:47A step toward medical super intelligence.
05:52That's what the CEO of Microsoft is calling the company's new artificial intelligence tool.
05:56From accelerating drug development.
05:59A revolution in drug discovery could cure cancer in the next 50 years by bringing new approaches to once impossible
06:07problems.
06:08To reading scans and detecting diseases.
06:11See that little square? Yeah.
06:13It finds a very subtle polyp that maybe you would have missed.
06:16And assisting doctors during complex operations.
06:20A 3D map of a patient's pelvis is generated from a CAT scan to help guide the surgeon in real
06:26time.
06:27In the UK, AI is already being used in the NHS to detect the risk of leukemia and to identify
06:34early signs of lung cancer.
06:36For our overstretched health service, this technology has huge potential to boost efficiency and bring down costs.
06:44The revolution in artificial intelligence offers a golden opportunity to deliver better care at better value.
06:51And the NHS will usher in a new age of medicine, leapfrogging disease, so we are predicting and preventing it
06:58rather than just diagnosing and treating.
07:06In the US, healthcare is a huge business.
07:10I wanted to find out more about the claims that United Healthcare had been using an AI to make crucial
07:16life and death decisions.
07:19So I went to meet a doctor working in the oldest homeless shelter in Los Angeles.
07:25Okay, so let me take a look.
07:26Mary Marfasee is a medical director at the Union Rescue Mission.
07:35Mary, hi. Hi. Nice to meet you.
07:39It's lovely to meet you. How are you doing?
07:41I want to give you a little tour.
07:43Mary has spent her career providing medical care to the city's most vulnerable.
07:48But in 2023, when her family needed help, she believes they found themselves at the mercy of AI.
07:55Mary, is that your hat?
07:56That's my husband's hat.
07:58Oh, that's lovely.
07:59I thought I'd bring it with me today.
08:03Like, isn't that one of the most handsome older white guys?
08:06Oh, gee.
08:06He looks so English.
08:07He was Welsh.
08:08He was American Welsh.
08:10The story with your husband.
08:12When did this all start?
08:14Going back about three years ago, I noticed he was having a lot of balance problems, just falling too many
08:19times.
08:20And this was a very athletic man.
08:22And then finally, one fall that was so bad, fractured his nose and his face was filled with blood and
08:29had some cranial fractures with it.
08:31Got him in the hospital and finally got a diagnosis, which was accumulation of fluid on the brain.
08:36Oh, my gosh.
08:37Right?
08:38Oh, wow.
08:39That's a proper fall.
08:41Yeah, it was rough.
08:42So what happened?
08:43So we stayed in the hospital and then he got into a rehabilitation center.
08:48And then within just a couple of weeks, I was told by the center, okay, his time is up.
08:53He's no longer in need of services.
08:56And I said, what?
08:57He can't even brush his teeth.
08:58He can't go to the bathroom on his own.
09:00And I thought, something's not right here.
09:02So we had him home and then again, another fall within just a few months.
09:07And the same thing, I was told by the center, okay, his time is up.
09:09And anytime I would complain about it, I'd hear, oh, well, at his age.
09:14And I got tired of hearing that preamble.
09:17So when he was discharged, I mean, if he's not in a fit state to walk out of the hospital,
09:22what happened?
09:23Wheelchair. Just wheelchair to the car. Staff takes him out, loads him in my car and I drive away.
09:30And how did you get him in the house to the other end?
09:32It was a struggle.
09:34One time he fell back on me. I dislocated my right shoulder.
09:37Because this is just not a man who's strong enough to be back home.
09:39Right. That wasn't good.
09:42And that's when I started calling UnitedHealthcare, saying, who's making these decisions?
09:46And the response I would get is clinical team.
09:50Right.
09:51And I started to say things like, well, who runs the clinical team?
09:55And then I would just get the runaround.
09:57So I called the ombudsman.
10:00And that's when I was told AI is involved.
10:02Okay.
10:04After each fall, he wasn't getting enough physical therapy or occupational therapy.
10:09You just get weaker and weaker over time?
10:11Yeah.
10:12Yeah.
10:15You think he didn't need to die when he did?
10:18I don't think so.
10:20He had goals to live much longer.
10:23I really thought he could get back to some of his function.
10:27Had he been given the care he needed?
10:29Yes.
10:29I think so.
10:32How angry are you about all of this?
10:36It's more I miss him more than anger.
10:39I don't want it to happen to anybody else.
10:47All right.
10:48Thanks for coming.
10:48Thank you so much.
10:49I got to give you a hug because it meant so much to have you.
10:51All right.
10:52Stay in touch.
10:52I'll send you some things.
10:59There's a kind of irony to this story.
11:01You have somebody who spent her entire life advocating for people's health, trying to make sure they get the care
11:08they need.
11:08When she needed it, when her family needed it, despite having paid for it, it wasn't available.
11:17Frank and Mary's case was not an isolated one.
11:21By early 2023, the story was starting to get out, alleging the widespread use of AI algorithms by big insurers,
11:30including United Healthcare, to deny elderly patients care.
11:36Later that year, there was a Senate inquiry.
11:39The reason we're here today is that all too often the big insurance companies have been failing seniors when they
11:46need care.
11:47And perhaps most troubling of all, there is growing evidence that insurance companies are relying on algorithms rather than doctors
11:55or other clinicians to make decisions to deny patient care.
12:02Two months after the Senate committee published its report, Brian Thompson was shot.
12:09Let's head to New York because police there are continuing their hunt for the gunman who killed the boss of
12:15one of the biggest companies in the world.
12:17The shooter took off down an alleyway around 55th Street and is currently still at large.
12:21In the days after the shooting, the hunt for Brian Thompson's killer gripped the nation.
12:27Police are looking for a suspect described as a man about 6'1", wearing all black.
12:32Police drones, helicopters and thousands of CCTV cameras are combing the city street by street.
12:38On Thursday, detectives shared new images of the man they want to question.
12:42The image caught on surveillance camera shows him standing at the check-in desk at a hostel.
12:48The man appears relaxed and smiling.
12:51Just a few minutes later, he shot Mr Thompson dead before making his escape.
12:57Then, five days after the killing, police received a tip-off from an employee at a McDonald's in Pennsylvania.
13:05Um, what's your name?
13:07Uh, Mark.
13:08What is it?
13:09Mark.
13:09Mark?
13:10Yes, sir.
13:10Mark what?
13:11Azaria.
13:12Azaria?
13:12Someone called.
13:13Right, they thought you were suspicious.
13:15Oh, I'm sorry.
13:16Um...
13:16Do you have your ID on you?
13:18Yes, sir.
13:18Thanks.
13:22The alleged killer has been identified as Luigi Mangione.
13:27As Luigi was led into court after his arrest, it was clear he had something to say.
13:33Don't, don't.
13:37It's completely out of touch and an insult to the intelligence of the American people!
13:43It's lived experience!
13:47Luigi Mangione, the man accused of killing the US healthcare insurance chief executive,
13:51Brian Thompson, appeared in New York to face 11 state criminal counts that could lead to
13:57a death penalty sentence.
13:59Then, something extraordinary happened.
14:02Free Luigi!
14:04Free Luigi!
14:07Almost immediately, protesters flocked to his court appearances, hailing him as a folk hero.
14:15I'm here because I support universal healthcare for all, and I'm here because I believe that
14:20Luigi Mangione's civil rights are being violated.
14:23So, what brought you here today?
14:24Um, this case is a case about humanity.
14:27I think everybody in the crowd identifies with the extortionist nature of our American healthcare.
14:32They implemented like an AI bot to review the claims.
14:35Some of these people had life-threatening illnesses that were rejected for efficiency.
14:40What's your feeling in general about AI?
14:42I mean, um, technology is going to find its way into everything.
14:45So, to stand against it is, you become a fossil.
14:49But, we have got to continue to invest into the human components of things.
14:53Because, as we saw, you can't just automate everything, right?
14:55Somebody has to be looking and watching normal.
14:57People are dying!
14:59People are dying!
15:02People are dying!
15:03I, myself, have had a terrible experience with American healthcare,
15:07despite having one of the better plans.
15:09Both my mom and I have had to battle and fight tooth and nail AI claim denials.
15:16It's really insidious the way that AI is infecting every aspect of our society,
15:21but specifically something so important and crucial, such as healthcare,
15:24that already shouldn't be for profit.
15:27They're using AI to maximize the profits that they can make off of their clients.
15:35Stop denials with AI!
15:37How many people have to die?
15:39I mean, there's this, like, incredible strength of feeling from the protesters.
15:44You can understand, too.
15:45You know, if you've been personally affected by this,
15:47or if someone in your family has,
15:49like, I understand why this is such an emotional moment.
15:53But, you know, I also think that, like, someone was murdered here.
15:58You know, someone with a family, with children.
16:01And I think there's a bit of mental gymnastics going on with these protesters,
16:05where they sort of conveniently manage to forget that fact.
16:15In situations like these, where emotions are high,
16:19it's sometimes easy to turn on technology,
16:22especially when it feels like it's something we don't fully understand
16:26and can't easily control.
16:29AI didn't create the perceived problems with US healthcare,
16:33but it has supercharged them.
16:36And to understand how,
16:38it helps to know what an AI algorithm actually is.
16:43I think the words algorithm and AI get thrown around quite a lot.
16:46It gets quite confusing.
16:47So I thought I would explain the difference between the two
16:50in the most New York way possible
16:52by imagining that you are opening up a new hot dog stand.
16:56Now you've got a couple of options here.
16:58You could use an algorithm.
17:01An algorithm is a series of steps for completing a task.
17:07In the case of our hot dog stand, the task is to sell hot dogs.
17:13The inputs are the sausages, the onions.
17:17The algorithm is the instructions.
17:20Cook 100 hot dogs a day, sell them for $4 each,
17:24stay open till 11 on Friday.
17:27And the output is, hopefully, a tidy profit.
17:33With this traditional type of algorithm,
17:36everything is spelled out in advance.
17:37It means it's very precise, it's very reliable,
17:40but it's also completely inflexible.
17:43Now imagine that you've got a stand that is run by an AI algorithm.
17:48So this time, you don't tell it what to do.
17:50You just give it the inputs, the buns, the sausages, the onions,
17:53and you say, the only thing I care about is how much money you make.
17:58Now, at first, the AI is just going to what?
18:02It's just going to collect data, it's going to hunt for patterns,
18:05like when people are buying the most,
18:07where has the most footfall, how that changes with the weather.
18:11And then, after a while, it's going to start suggesting things
18:14that you hadn't even thought of,
18:16like pitch up outside a dog park on Sundays,
18:19or start selling vegetarian sausages,
18:21or people tip more when the cart smells like onions.
18:25But the thing is, you didn't tell the AI any of those rules.
18:28It discovered them for itself.
18:31AI algorithms are capable of crunching huge quantities of data
18:35and analysing complex patterns of behaviour,
18:39all in order to make your hot dog stand profitable.
18:46And that is the really key difference here.
18:49For an AI algorithm,
18:50you don't need to spell out every possible scenario in advance.
18:54You just define the goal and let the AI learn for itself.
19:01Just because an algorithm uses AI doesn't mean it's necessarily better,
19:06but it will ruthlessly pursue whatever goal it's set.
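Hannah's hot-dog-stand distinction can be sketched in code. This is purely an illustrative sketch (nothing here comes from the programme; the rules, prices and sales figures are invented): a traditional algorithm spells out every instruction in advance, while an AI-style one is given only a goal and derives its own rules from data.

```python
# A traditional algorithm: every rule is spelled out in advance.
def fixed_rule_stand(day: str) -> dict:
    """Sell hot dogs by following explicit, hand-written instructions."""
    hot_dogs = 100                                  # cook 100 a day
    price = 4.0                                     # sell for $4 each
    closing_hour = 23 if day == "Friday" else 21    # stay open till 11 on Friday
    return {"hot_dogs": hot_dogs, "price": price, "closes_at": closing_hour}

# An AI-style algorithm: we only state the goal (profit) and let it
# find its own rules by hunting for patterns in the data it collects.
def learned_stand(sales_history: dict) -> str:
    """Pick tomorrow's pitch: whichever spot has earned the most so far."""
    return max(sales_history, key=sales_history.get)

# Invented sales data the stand has "collected".
history = {"office block": 180.0, "dog park": 260.0, "station": 210.0}
print(fixed_rule_stand("Friday")["closes_at"])   # 23
print(learned_stand(history))                    # dog park
```

Nobody told the second stand about dog parks; it simply pursued the goal it was set, which is the key difference Hannah describes.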
19:13I wanted to find out more about the specific algorithm being used by United Healthcare.
19:19So, I tracked down the two investigative journalists
19:22who were the first to uncover the story.
19:25Hey!
19:26Hey, how are you? How are you doing?
19:27Good to meet you.
19:28I'm Hannah.
19:29Nice to meet you.
19:29Lovely to meet you.
19:30Hey, Bob.
19:31Lovely to meet you, Bob. How are you doing?
19:32Good.
19:33Casey Ross and Bob Herman were nominated for a Pulitzer Prize for their investigation.
19:39How big has this story been for you in terms of your journalistic career?
19:42I think this is probably the biggest story that I've done
19:45in terms of the impact that it's had.
19:47I think that the reaction that we got
19:49and some of the fallout in terms of lawsuits in the US Senate
19:52validate that this was a story that had an extraordinary impact.
19:57What was the first spark of the story?
19:59This person in the nursing home industry sent me an email
20:02and it was just this visceral reaction that,
20:05hey, health insurers are issuing a lot of denials
20:08and they're not telling us why.
20:10The people that were in this facility were getting removed
20:14when they were still very sick and were not ready to go home.
20:16And so it was just a signal that,
20:18okay, well, maybe we should ask some more questions about this.
20:20And what we found was that this all centers around an algorithm called nH Predict
20:24and that algorithm is used on behalf of insurance companies
20:28to reduce the amount of time that people are in these facilities
20:31and to control their cost of care.
20:34How much sight do we have of the actual algorithm that's going on underneath this?
20:39Yeah, so there are a bunch of pieces of data
20:41that they are feeding into a model.
20:43The age of the person, what was their primary diagnosis,
20:46what other illnesses do they have?
20:48It compares that patient to other patients like them
20:52in a database of six million patients.
20:55And based on this comparison, this is the amount of care you should get.
20:59Once that prediction is made, basically a date is circled on the calendar.
21:03And this is the date that they're trained to push these people toward.
21:07This is the report that's produced by the algorithm.
21:10And it all boils down to this prediction,
21:12which is that the estimated length of stay of the patient,
21:16in this case, is 16.6 days.
21:18Point six?
21:19Yes.
21:20So it's like to the hour?
21:21It predicts it down to the decimal point.
21:23You can't predict when somebody's going to be fine to the hour.
21:26No, and it's the type of information that only an AI algorithm could give you.
21:31It doesn't take into account a lot of things about these people.
21:36Every healthcare journey is different.
21:39Every injury, every recovery, everything that comes up in the course of your care
21:43causes all sorts of things to happen that can't possibly be predicted.
21:47Literally, they're boiling down these people to numbers.
21:50Oh, you're not just a number.
21:51Yes, you are.
21:52And there it is.
21:54There is a counterargument to all of this, right?
21:56In that people are quite accustomed to artificial intelligence that compares you to other people.
22:02Like, you know, the recommendation algorithms that you get on Netflix or Spotify or Amazon.
22:06It's like, people like you did this, therefore, do you want this?
22:11Yeah, and I think the common experience, or at least my experience, is that most of the time I think
22:16the algorithm is wrong.
22:17It's way off.
22:18Just because I watch this cooking show does not mean I want to watch this other show.
22:23But if you're talking about me in the hospital or after a serious injury and it gets that wrong, well,
22:29I have a much bigger problem with that.
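The journalists describe an algorithm that compares a patient to similar past patients and outputs a stay predicted down to the decimal point. The real model is proprietary, so as a purely hypothetical sketch of that comparison idea, a nearest-neighbour average over invented records produces the same kind of number:

```python
# Hypothetical records only: (age, diagnosis, comorbidity count, days stayed).
PATIENTS = [
    (78, "hip_fracture", 2, 18.0),
    (81, "hip_fracture", 3, 21.5),
    (75, "hip_fracture", 1, 14.0),
    (79, "stroke",       2, 25.0),
]

def predict_stay(age, diagnosis, comorbidities, k=3):
    """Average the stays of the k most similar past patients."""
    def distance(rec):
        rec_age, rec_dx, rec_com, _ = rec
        return (abs(age - rec_age)
                + (0 if diagnosis == rec_dx else 50)   # big penalty for a different diagnosis
                + 2 * abs(comorbidities - rec_com))
    nearest = sorted(PATIENTS, key=distance)[:k]
    return round(sum(r[3] for r in nearest) / len(nearest), 1)

print(predict_stay(80, "hip_fracture", 2))  # 17.8
```

The output looks reassuringly precise, 17.8 days, but it is only an average over people the database considers similar; everything individual about the patient's recovery falls outside the calculation, which is exactly the journalists' objection.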
22:32Casey and Bob's damning reporting concluded in 2024, throwing UnitedHealthcare into the spotlight.
22:41Just months before Brian Thompson was shot dead.
22:46Can you remember when you first heard about Luigi Mangione?
22:49I mean, I remember the morning that it happened.
22:51We started getting texts from people and we're like, holy crap, like, is this actually for real?
22:56We don't have any indication that this individual was inspired by our reporting.
23:01But nonetheless, as a reporter, you're stunned. You're shocked.
23:04You're like, you don't ever want any of your reporting to inspire somebody to act that way.
23:20Luigi Mangione is not someone you would expect to take on the role of outlaw vigilante.
23:30He studied at the prestigious University of Pennsylvania, one of America's Ivy League.
23:37Luigi Nicholas Mangione. Luigi.
23:42You can see in this video of his high school graduation that Luigi already stood out.
23:48He's giving the valedictorian speech, which means he's being celebrated as the most academically successful student in his year.
23:57Family, friends, faculty and fellow students, good afternoon.
24:04He was clearly a popular, confident young man.
24:10We spoke to a few of his classmates who didn't want to go on the record
24:14because they have been continually hounded by the press ever since, which you can understand.
24:19And everyone, almost to a person, says that Luigi was, like, smart, he was helpful, he was kind, he was
friendly.
24:25He was a popular guy from a good family who was also really well educated.
24:31But then I discovered a remarkable detail about Luigi, one that's been all but overlooked.
He did computer and information science as his major, but then he also got a master's degree from here
24:48in computer science with a concentration in artificial intelligence.
24:54One of the modules that he was taking in here was exactly the stuff that sits behind this algorithm.
25:01It's called Data Structures and Algorithms.
25:03Very meaty, very mathematical algorithms course about data structures and computer science.
25:09Like, it's tough.
25:10And the thing is, Luigi wasn't just doing well in this course, he was doing so well that he was
25:16appointed as an assistant tutor.
25:18Look at this photo. I mean, he looked so confident.
25:22He was basically teaching other students who were the same age as him.
25:27This is stuff that he knew incredibly well and, by the sounds of it, was also very, very good at.
25:38Most of Luigi's friends have so far refused to go on the record.
25:43But back in the UK, I tracked down one person who was willing to talk.
25:49Gawinda Bogal is a British blogger who writes about the impact of technology on society.
25:55Hi. Hi, Anna. How are you doing? Nice to meet you.
25:58Lovely to meet you. Thank you for this.
26:01Luigi was a subscriber to Gawinda's blog and the two of them talked about their shared concerns.
26:10When I spoke to Luigi, he was in Japan and he was remarking on how sort of lonely it felt
26:16in a lot of places.
26:18The streets were empty because everything's just automated.
26:21He was quite concerned about mass automation and the knock-on effects that this might be having on society.
26:28You can live your entire life without leaving your house.
26:31You know, you can do your shopping online. You can do your banking online.
26:34You can do your dating online.
26:36You can even have a full relationship online.
26:40There's less human connection and AI is going to make this a lot worse.
26:43I mean, he himself probably had no problem making friends.
26:47He was a very charismatic individual.
26:48But I think he was worried about other people and maybe how they were kind of gradually being lost.
26:56The connections that we have, they keep society together.
26:59And he was worried that all of that is unravelling.
27:02Did you talk about healthcare as well?
27:04There was only one brief exchange that we had.
27:08Luigi made a passing remark about how I was lucky because we had the NHS in the UK, you know,
27:14free healthcare.
27:14How did he seem to you?
27:17He actually seemed quite cheerful.
27:19Someone who did not seem particularly pessimistic, although some of the issues that he raised were quite pessimistic.
27:28But his general demeanour was actually the opposite of that.
27:31He said that he wanted me to focus less on the problems and more on the solutions.
27:35He would ask me, you know, so what's the practical takeaway of this?
27:37He wasn't just interested in moaning. He wanted to find real solutions.
27:47Since his arrest, the Luigi Mangione story had taken on a life of its own.
27:53Luigi Mangione, the internet's favourite hot Italian sausage.
27:57And social media had exploded with memes about the alleged killer.
28:02Luigi, you're a brave Italian stallion whose actions ignited a national dialogue about the USA's crappy healthcare system.
28:13If you feel the same way about me as I feel about you, please do not deny, delay or defend
28:19your love.
28:21There was also no shortage of theories about what happened and why.
28:26There is a note which was supposedly found in his backpack when he was arrested.
28:31And the press have decided to call this his manifesto.
28:34In the document, he appeared to talk about the planning of the alleged crime.
28:38But then moved on to health insurance.
28:42There's one bit here that says, frankly, these parasites had it coming.
28:46Then he specifically name-checks United.
28:49And then he says, these [indecipherable] have simply gotten too powerful and they continue to abuse our country for immense
28:59profit because the American public has allowed them to get away with it.
29:04The other thing that the internet is obsessed with is his own personal story.
29:09It's how he ended up feeling so strongly about this particular issue.
29:14I've got his X profile.
29:17In the middle of his banner, he's got this X-ray.
29:22And, I mean, we know that he had a spinal problem.
29:24We know that he had surgery for it.
29:25And that's...
29:26I mean, that's pretty extreme, that surgery.
29:29But what is interesting is that there's no evidence that he was actually a client of United Healthcare.
29:36And so far, nobody's found anything to prove that he had a personal insurance claim denied, even.
29:46As humans, we understand being let down by other humans.
29:50Real doctors often don't have all the answers, but we forgive them for their very human flaws.
29:58But when it comes to technology, we have very little tolerance for error.
30:04The flip side is also true.
30:07When AI promises extraordinary medical possibilities, it's all too easy for us to believe in the impossible.
30:24Just a few miles across the state line, I've been invited to meet a young tech entrepreneur
30:29who is selling a seemingly incredible new medical breakthrough.
30:34Hey!
30:34Hi!
30:35Using AI.
30:36How are you?
30:37Yeah, very good.
30:3825-year-old Kian Sadeghi runs a tech startup called Nucleus Genomics.
30:44Are you sequencing the entire genome?
30:46The entire thing.
30:47The whole thing?
30:48The whole thing.
30:51His company uses AI algorithms to analyze the DNA of his customers' future children like never before.
30:59These are nitrogen tanks, are they?
31:01Yes.
31:03How many embryos do you reckon there are in here? Like millions?
31:05If all the samples here were embryos, yes, there would be millions, yes.
31:10Millions of potential new humans?
31:12Yeah.
31:14Kian uses AI to map each embryo's DNA, comparing it against huge DNA databases to try to predict a baby's
31:23future risk of disease.
31:25Some diseases can be predicted with certainty. For others, he can say how people with similar DNA turned out.
31:34When I talk to a couple, they say, my grandfather had Alzheimer's, I want to do anything I can to
31:39make sure my son doesn't have Alzheimer's.
31:41And then I think, well, genetics obviously can help with that.
31:44And so I think more people are going to use IVF, and they're going to be using genetic optimization technology
31:49to basically pick their child.
31:52Figuring out whether a baby will develop Alzheimer's later in life is a best guess, not a diagnosis.
31:59But the company goes further, helping parents choose an embryo based on eye color, height, or even IQ.
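Predictions like "future risk of Alzheimer's" from embryo DNA typically rest on a polygenic score: a weighted sum over many genetic variants. The company's actual model isn't public, so this is only a toy sketch with invented variant IDs and weights:

```python
# Entirely made-up effect weights for made-up genetic variants.
# A real score would sum over hundreds of thousands of variants.
EFFECT_WEIGHTS = {"rs0001": 0.12, "rs0002": -0.05, "rs0003": 0.30}

def polygenic_score(genotype: dict) -> float:
    """genotype maps a variant ID to the number of risk alleles (0, 1 or 2)."""
    return sum(EFFECT_WEIGHTS[v] * n for v, n in genotype.items()
               if v in EFFECT_WEIGHTS)

embryo = {"rs0001": 2, "rs0002": 1, "rs0003": 0}
print(round(polygenic_score(embryo), 2))  # 0.19
```

Crucially, the result is a relative estimate against a reference population, not a diagnosis: as Kian himself concedes later, DNA is not destiny, and environment and upbringing sit entirely outside the sum.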
32:09Welcome to the Nucleus office.
32:11Kian had already raised $32 million from investors.
32:16By the way, if you ever see how many tabs I have on the right side of the screen, I
32:19don't think you should ever show that on the camera.
32:21Because it is actually so insane.
32:23I think it also slows down. I think it uses it for your RAM.
32:26Anyways, let me show this.
32:27He just launched a glossy new ad campaign.
32:31I think it is.
32:31Okay, let's go.
32:33Nucleus Embryo is for couples doing IVF to uncover the full genetic profile of each embryo in one intuitive platform.
32:43Every parent deserves the power to decide what possibility feels right for their family.
32:50Some people don't think you should have this choice.
32:54But it's not their choice to make.
32:58It's yours.
32:58It's yours.
33:00You are not afraid of controversy, are you?
33:04No, no, no, no.
33:05It's not.
33:06It's, it's, you know, if you were doing IVF and you had five embryos, would you want to pick your
33:11future baby randomly?
33:13Or would you ask the doctor for more information on each, especially if you have a family history of Alzheimer's,
33:18of cancer?
33:19It's my right to know this information, it's my choice to say I want a baby at lower disease risk,
33:23I want a baby that's slightly taller, or even with a specific eye colour, etc.
33:27It's their right, it's their choice.
33:28I think what you're doing, where people have genetic predispositions for particular diseases, especially when they're preventable or treatable, I
33:35think it's amazing.
33:36What I don't understand is why you would include something like IQ and eye colour.
33:41Yeah.
33:41Because it's so controversial, and it's like, it's a little bit eugenics-y, you know?
33:46You are implicitly saying that taller is better.
33:49You are reinforcing the ideas of preferences that some people are more valuable than others.
33:56No, not at all.
33:58I think parents have the right to choose across their embryos, right, if they want a baby with a lower
34:05disease risk, for example, or if they want a baby that's shorter or taller.
34:08That's absolutely their right to choose.
34:10This is designer babies, though.
34:11I mean, bluntly, this is designer babies.
34:13No, no, it's not designer babies at all, actually.
34:15How is it not designer babies?
34:16It's not designer babies at all.
34:16Well, if a parent wants to give their child the best start in life, that is like a parent doing
34:22the most basic, in my mind, and human thing.
34:25If a parent, if the moment a child is born, they will run multiple tests on a baby to make
34:31sure the baby is healthy.
34:32They'll give it vaccines, for example, to make sure they don't get diseases.
34:34This is just another tool in the toolkit that helps parents do that.
34:40But there was something else I was worried about.
34:43Even the best scientists don't fully understand how genes and environment combine to make us who we are.
34:50I was worried that all of this complexity was being overlooked in favour of tantalisingly simple AI predictions.
34:58Do you think that the technology is mature enough, accurate enough, capable enough to be giving people this illusion that
35:11they have control?
35:12The platform, I think, does an excellent job showing the uncertainty and showing the fact that, again, DNA is not
35:18destiny and DNA will never be destiny.
35:20People can have certain just genetic dispositions, but there's the whole thing called life, which is there's environment, there's how
35:26you're nurtured, how you're raised, your nutrition, etc.
35:29We cannot possibly reduce human life to just a DNA strand.
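That "DNA is not destiny" point can be sketched numerically. The toy polygenic score below is a weighted sum of variant counts; the weights, the genotype, and the size of the environmental noise are all invented for illustration and have nothing to do with Nucleus's actual models. Even for a fixed score, simulated outcomes span a wide range.

```python
import random

random.seed(0)

# Toy polygenic score: each variant contributes a small additive effect.
# These weights are made up for illustration.
weights = [0.3, -0.2, 0.5, 0.1, -0.4]

def polygenic_score(genotype):
    """Weighted sum over 0/1/2 allele counts."""
    return sum(w * g for w, g in zip(weights, genotype))

# Simulate a trait where the score explains only a small share of variance:
# the rest is "the whole thing called life" -- environment, nurture, chance.
def simulated_trait(genotype, env_sd=3.0):
    return polygenic_score(genotype) + random.gauss(0, env_sd)

genotype = [2, 0, 1, 1, 0]
score = polygenic_score(genotype)
outcomes = [simulated_trait(genotype) for _ in range(10_000)]
print(f"score={score:.2f}, outcomes span {min(outcomes):.1f} to {max(outcomes):.1f}")
```

The point of the sketch is the spread: identical genetics, but a wide range of simulated outcomes once environment enters.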
35:40Testing for physical characteristics is banned in the UK.
35:43But there's nothing to stop British couples travelling to the US for it, as long as they're willing to pay
35:50the roughly $40,000 price tag.
35:57Hi.
35:58Hello.
35:59How are you doing?
36:02Dragos and Laura live in London, and we're at the start of their IVF journey with Kian's company.
36:09These are beautiful.
36:10Are these your babies?
36:11Yes, our two children, two boys.
36:15So your two boys, did you have them naturally?
36:18Yes, we did.
36:19Yes.
36:19So why not naturally for the third?
36:22Because Dragos was kind of afraid. We are... I'm 40, he's 43.
36:27When you're over 43, it was clear that the chances of having issues are very high.
36:33If this wasn't possible, would you still want to have another baby?
36:38Definitely not.
36:39Not?
36:39Not.
36:41Because you don't want to take chances.
36:43Everybody knows how important a healthy child is, and I think everybody should go the extra mile for that outcome.
36:50Where are you at now, then, in the process?
36:54In less than two months, we are going to New York to start the IVF process.
37:00And then they will test the egg, they will take a few cells, they will zoom in on anything that
37:05can happen, and then once we have selected the right embryo, then we will go back to New York for
37:10the implantation, which takes only one or two days.
37:13So in theory, in a few months, you could be pregnant.
37:18Yes.
37:19Fingers crossed.
37:21Yeah.
37:21We both want a healthy child, a healthy embryo.
37:26We are also hoping she's going to be a girl.
37:31Is that the dream?
37:32A little girl?
37:33Yes.
37:34This was my dream all along, to have three babies.
37:38So if a girl would come, it would be the perfect picture for myself.
37:44When you picture your daughter, or your potential daughter in your head, what does she look like?
37:51Beautiful.
37:55Brown eyes, brown hair, light skin, big lips, full lips, I don't know.
38:04So she'll be looking like you.
38:07Not like me.
38:09Hopefully not bald, with a beard.
38:14Some people worry about the Nucleus stuff, because it effectively prioritizes some characteristics of the baby over others.
38:23For example, IQ, eye color, height, is stuff that you can, in theory, at least give a probability to,
38:30right? You can measure.
38:31I would not go through this just for the height and the eye color.
38:36But if height is what we can find out now, maybe it's important, maybe not. But, you know, we'll take
38:43that, because it's on the table.
38:45They give you a probability of the IQ that he or she might have. But yes.
38:52And that's going to be one you'll look at?
38:54Of course. Of course it will.
38:57Before Dragos and Laura started their IVF cycle, there was a first step: getting their own DNA mapped by Nucleus.
39:06I don't know. Let's see.
39:08To see if they were carriers for any diseases they could pass on to their baby.
39:12The results of these genetic tests had just come in.
39:16The moment of truth.
39:21It's like unraveling a present. You don't know exactly what's inside.
39:29So, our family summary.
39:33We have 2,154 rare diseases tested and one detected risk for children.
39:40Fuchs endothelial corneal dystrophy.
39:44Symptoms often begin at 50. So, imagine our child is born this year.
39:4950 years from now, I'm sure we'll have bionic eyes that we can replace, hardwired to our brain.
39:56So, I don't think this is serious enough for us to be worried about it.
40:01Of all of the things that could potentially be risky, Alzheimer's, Parkinson's, breast cancer.
40:06Yeah, this is very...
40:08Low level.
40:08Low level mild.
40:17So, I think this is important.
40:19Mmm.
40:23I think if you're having a baby in your 40s, I can completely understand why you would
40:29want as much information as possible.
40:31If you're prone to worry, why wouldn't you try and eliminate the risks where you possibly can?
40:41But at the same time, I'm worried that this technology in this way gives the illusion of control that you
40:50don't actually have.
40:52It gives the illusion of certainty and a prediction of the future that doesn't really exist.
41:13Over in the US, Luigi's case had proceeded through the courts.
41:18Charged with murder in the first degree, killing as an act of terrorism and criminal possession of a weapon, he
41:24faced his plea hearing.
41:26How do you plead, guilty or not guilty?
41:30Not guilty.
41:32Stoking further outrage on both sides.
41:35One struggle, one fight!
41:37Healthcare is a human right!
41:39We have very disturbed people who somehow think that eliminating a father who has two young children over some cause
41:47is somehow justified.
41:51But on the other side of America, another legal case was quietly gaining its own momentum.
42:03A class-action lawsuit against UnitedHealthcare's use of AI had been launched by one plucky public interest firm called Clarkson
42:14Law.
42:15They'd also launched suits against other big insurers for similar claims.
42:20Lawyer Glenn Danas is leading the charge.
42:25We know for sure that there is an algorithm of some kind that is predicting how long you need in
42:32rehabilitation.
42:33And we also know that some people think that that wasn't long enough.
42:38Is that fair?
42:39Yes.
42:40I mean, from our perspective, that's a vast understatement.
42:42But that is, in fact, true.
42:45Yes.
42:45The length of stay value is consistently too low.
42:49And it seems highly unlikely that this is an accident because it's only ever in one direction.
42:56How does the legal case come in?
42:58One way it's illegal is because there are different states, like California, that require that it be a human making
43:04decisions.
43:05So that, by law, it cannot be delegated to an AI, an algorithm, anything other than a human medical professional
43:14exercising his or her professional knowledge and specialty.
43:22How many people are involved in this?
43:24How many plaintiffs are there?
43:26Because United has such a large market share in America, it's almost certainly in the millions.
43:33If you eventually win this case, does that mean that they have to compensate everybody who held that type of
43:40insurance?
43:41What we want, whether it's by a settlement or by going to trial, is to have all the people who
43:46were denied what was owed to them to be paid for that, and then to change these practices going forward.
43:55The Clarkson case rests on the evidence of former UnitedHealthcare customers, who have come forward to testify, and in particular,
44:05those who have survived to tell their tale.
44:08Hey.
44:09Bill, this is Hannah.
44:11Bill, such a treat to meet you. How are you doing?
44:14Very good, thank you.
44:14Lovely to meet you.
44:15One of those is 86-year-old Bill Hull.
44:20How are you guys doing?
44:21Well, as well as can be expected, I guess.
44:25Your koi are quite the beasts.
44:28I chuck them in.
44:31So tell me, how are you doing now?
44:33How's your health at the moment?
44:34About 70% of my heart is shot.
44:38Mm.
44:38But the worst part has been the paralysis that the stroke caused me.
44:45I used to be very active.
44:47Well, like we did, did this thing, did all the bricking here.
44:51Can't do any of that anymore.
44:54In June 2023, he suffered a heart attack on his way to a medical appointment.
45:00There's a park bench, big, long park bench, just outside the building.
45:04I just slumped over, apparently.
45:06Two medical technicians, that knew CPR, laid me down on the bench, and they broke all my ribs.
45:13And I understood later, if you don't break ribs, you ain't doing it right.
45:18Wow.
45:18I was in the hospital for 25 days.
45:21In intensive care?
45:23Almost all of it was intensive.
45:25I'm there, and I think on the 22nd or 23rd day, I got a notice that you're to be released.
45:32They said, well, we'll assign our case manager to you.
45:35She got a hold of me and said, I have talked with your doctors, and everyone recommends that you go
45:42into skilled nursing.
45:43A day later, she came back, and she said, well, I don't know, they said no, they would not approve
45:49it, and wouldn't give me a reason.
45:52Clarkson Law argue that this denial was likely the result of UnitedHealthcare's Predict algorithm.
45:59They made you feel like you needed to get out, and yet they were giving you no place to go.
46:04They said, where do you want to go?
46:05I said, I guess I'm going to go home.
46:07Okay, so we'll get you out of here.
46:10Stuck me in the car.
46:11What physical state were you in at this point?
46:14My wife was very worried because she didn't know if she could take care of me.
46:19She's 85 years old.
46:20She has macular degeneration, has a hard time seeing.
46:23She's using a walker, too, like I am.
46:27Anyway, I get home.
46:29I was two and a half, three days out of the hospital.
46:32My daughter came over, and I was in bad shape.
46:36I think slurring my words, I was starting to have a stroke.
46:41I could not move a finger, couldn't move my arm, couldn't move my foot.
46:44You feel yourself just literally dying.
46:48It's terrifying.
46:50Oh, worst thing that ever happened to me.
46:53Do you think that you would have had the stroke, regardless of where you were?
46:59Well, I think I'd have had the stroke, but I should have been in either a skilled nursing or in
47:06the hospital.
47:07I'd have gotten some attention several hours before, and what I ended up with would have been much less than
47:14what I ended up with.
47:20Bill is now looked after by his daughters, Lisa and Laura.
47:25What's your take on artificial intelligence being used in your dad's health care?
47:29Certainly the whole time he was in the hospital, we assumed it was doctors who worked for that hospital who
47:37were making these decisions and who were denying this care.
47:41And it wasn't until he found the class action lawsuit that he believed it was actually the AI system that
47:49was being utilized by UnitedHealthcare.
47:51I guess if a human had been making that decision, you would have been able to have a conversation with
47:55them.
47:55You would think.
47:56If an algorithm says it, you can kind of wash your hands of it.
48:00I feel like that's what they're doing.
48:01If they can make AI do their dirty work, I think they're very happy to do that.
48:05How do you feel about the murder of Brian Thompson?
48:08I think it's indicative of how frustrated human beings can become with huge corporations like UnitedHealthcare.
48:18But I 100% don't condone what happened and wouldn't condone any action like that in the future.
48:25But the anger and the upset is real and it's justified.
48:35The thing is, I'm not anti-AI, right?
48:39I'm not anti-algorithms.
48:40I think that there is an incredible amount of waste that happens when humans make decisions.
48:46I think there's incredible efficiencies and therefore better care that you can provide people when you carefully introduce these kind
48:54of systems.
48:56The problem is that once you start automating stuff, it all comes down to exactly what that algorithm was designed
49:03to care about.
49:04You know, was it designed to care about improving the outcomes for the patients in the long term?
49:12Or was it designed to minimise the amount of money that is spent on caring for the patient?
49:19And those are two things that are often at odds with one another.
49:30But the problems with algorithms go deeper.
49:37Here is one thing that you should know about AI.
49:40Sometimes people call it a black box and there is a good reason for that.
49:45You have to imagine that at one end, you're putting in some information, your input, and at the other end,
49:51you get some results, your output.
49:54Now, the question is, what is going on in the middle, inside of the algorithm?
50:00Now, there is nothing magical going on here.
50:03There's no voodoo.
50:04It's just loads and loads and loads of calculations.
50:06But these things are so big and unwieldy that it becomes impossible to follow one thread from one end all
50:16the way through to the other.
50:17And that means that artificial intelligence is sometimes finding patterns that we cannot see, are unable to check, and might
50:27not like.
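The black-box picture described here can be made concrete with a toy network. The nine weights below are arbitrary numbers chosen for illustration; real systems have millions, which is exactly why no single thread can be followed from input to output, and why practitioners fall back on probing: nudge one input and watch the output move.

```python
import math

# A tiny two-layer network with fixed, arbitrary illustrative weights.
W1 = [[0.5, -1.2, 0.8],
      [1.1, 0.3, -0.7]]   # 2 inputs -> 3 hidden units
W2 = [0.9, -0.4, 1.5]      # 3 hidden units -> 1 output

def forward(x):
    # Hidden layer: every unit mixes every input, then a nonlinearity.
    hidden = [math.tanh(sum(w * xi for w, xi in zip(col, x)))
              for col in zip(*W1)]
    # Output: every hidden unit contributes again.
    return sum(w * h for w, h in zip(W2, hidden))

# With millions of weights instead of nine, "following one thread" is
# hopeless; instead we probe from outside.
x = [1.0, 2.0]
base = forward(x)
nudged = forward([1.0 + 1e-4, 2.0])
sensitivity = (nudged - base) / 1e-4
print(f"output={base:.4f}, sensitivity to input 0 = {sensitivity:.4f}")
```

Even here, the sensitivity figure only says how much the output moves, not why: the "loads and loads of calculations" in between stay opaque.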
50:30This problem was highlighted in 2019 by a researcher called Ziad Obermeyer.
50:37He looked at an AI algorithm being used in hospitals to identify patients most in need of care and offer
50:45them help on a special programme.
50:49But he discovered that the patients the algorithm selected were disproportionately white.
50:55These algorithms should have been a great use case for AI.
50:58But unfortunately, a design choice in building those algorithms made them biased.
51:04Obermeyer couldn't see inside the AI, so he worked backwards from the results and found that the algorithm had used
51:12a shortcut.
51:12It wasn't finding the sickest patients, it was finding the ones who'd had the most money spent on their care.
51:21Black patients have less money spent on them by our healthcare system today because of barriers to access and because
51:27of discrimination.
51:28And that means that the AI saw that fact clearly.
51:32It predicted the cost accurately.
51:34But instead of undoing that inequality, it reinforced it and enshrined it in policy.
51:40Even though the objective of the algorithm was good, the outcome led to discrimination.
51:46And until Obermeyer, no human had been there to spot the difference.
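The mechanism Obermeyer uncovered is easy to reproduce in miniature. The simulation below invents two groups with identical distributions of true need, but suppresses past spending for one of them; ranking patients by spending, the proxy the real algorithm used, then under-selects that group. All numbers are made up to show the mechanism, not real patient data.

```python
import random

random.seed(1)

# Two groups, identical distribution of true need.
patients = []
for i in range(1000):
    group = "A" if i % 2 == 0 else "B"
    need = random.uniform(0, 10)          # true sickness
    # Group B has less spent on them at the same level of need
    # (barriers to access, discrimination) -- the flawed proxy.
    spending = need * (1.0 if group == "A" else 0.6) + random.uniform(0, 1)
    patients.append((group, need, spending))

# "Algorithm": enrol the top 20% by predicted cost (here, past spending).
patients.sort(key=lambda p: p[2], reverse=True)
enrolled = patients[:200]
share_b = sum(1 for g, _, _ in enrolled if g == "B") / len(enrolled)
print(f"Group B is 50% of patients but {share_b:.0%} of those selected")
```

The model "predicts the cost accurately", exactly as Obermeyer says, and that accuracy is the problem: it faithfully reproduces the spending gap instead of the sickness it was meant to find.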
51:58We may never be able to see inside the algorithm used by UnitedHealthcare to assess claims.
52:05The closest we can get is by talking to those who worked alongside it.
52:10And until now, very few company insiders have ever spoken on the record about this AI.
52:16But one former employee had agreed to meet me.
52:21Lovely to meet you.
52:23Would you like something to drink?
52:24Yeah.
52:25Thank you so much.
52:26It's ice and everything.
52:28Amber Lynch was a care coordinator responsible for entering patient data into the algorithm and ensuring they were discharged on
52:36time.
52:38So tell me your background then.
52:39What did you train as?
52:40My background is I'm an occupational therapist.
52:42For 20 years, I was working in clinics, in hospitals.
52:47I actually did home health, so I went to their home.
52:50I did it all, basically.
52:52And then I had the opportunity to go non-clinical.
52:56So talk me through the process.
52:57Well, when I got a new case, I was given doctor's notes, I was given admission notes, I was given
53:04therapy evaluations and assessments.
53:07I would put all of that information into a program called the PREDICT.
53:12And it would generate a recommended discharge date.
53:18Having an estimate of when somebody is going to no longer need specialist nursing care, there's nothing wrong with that
53:27in theory there, right?
53:28Patients always do best when they're at home.
53:31But if they're not safe at home, you have to do it in rehab.
53:37How often was that number of days about the right ballpark, as you saw it?
53:42Probably 20% of the time.
53:45Really?
53:4620, 25%, yeah.
53:48Was it sometimes more, sometimes less?
53:51Generally, it would be three to four days under.
53:55They liked to say that the PREDICT took in six million patients' experiences.
54:01I still believe that as humans, we made better decisions.
54:07You've got all this experience.
54:09Can you use your human judgment instead if you saw that it was wrong?
54:14Wouldn't that be great?
54:16Unfortunately, no.
54:18The expectation was I would stay within 3% of that estimated discharge date.
54:25But by the time I stopped working, it was down to 1%.
54:301%?
54:321%.
54:33That means this patient, who is very sick and can't get out of bed, guess what?
54:3910 days, you're out of here.
54:40That's not okay.
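The compliance target Amber describes can be made concrete with a small calculation. The case numbers and the exact formula below are illustrative assumptions; the court filings do not spell out precisely how UnitedHealthcare computed the metric. The point is how little room a 1% tolerance leaves compared with 3%.

```python
# Hypothetical length-of-stay records: (predicted_days, actual_days).
cases = [(10, 10), (14, 14), (20, 20), (12, 13)]

def deviation(predicted, actual):
    """Absolute gap between actual and predicted stay, as a share of predicted."""
    return abs(actual - predicted) / predicted

avg_dev = sum(deviation(p, a) for p, a in cases) / len(cases)

# A 3% tolerance leaves a little clinical wiggle room; at 1%, almost any
# extension of care for a sick patient blows the target.
for threshold in (0.03, 0.01):
    ok = avg_dev <= threshold
    print(f"avg deviation {avg_dev:.1%} vs {threshold:.0%} target -> "
          f"{'met' if ok else 'missed'}")
```

Here a single patient kept one extra day out of four cases already breaches the 1% target, which is consistent with Amber's experience of never meeting the metric.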
54:43I always made it very clear that I was just the messenger.
54:47I do not make the decisions.
54:49I had members' families scream at me.
54:53How did that feel?
54:54Awful.
54:56Because I wanted to just say, no, I 100% agree with you.
55:00I don't think that your mother should be discharged right now.
55:04But I wasn't allowed to.
55:06And they expect you to meet these certain goals.
55:09And the problem is, if you don't meet them, then you're costing the company too much money and it goes against
55:16you as a care coordinator.
55:20So you as an employee have repercussions if you don't stick with it, if these patients don't get discharged within
55:281% of the date that the algorithm says?
55:31I never met that metric.
55:33Mm-hmm.
55:34That was part of the reason that I was let go.
55:36It was all about the dollar.
55:39And I hated that.
55:45Amber really cares about her patients.
55:48I mean, that is so obvious.
55:50She, like, feels personally affronted by what she was being asked to do.
55:56Where this has fallen down has been in the fact that it is so inflexible.
56:05And it's this, the idea that we might ultimately give up control from humans to machines, that gets to the
56:12heart of the feelings of injustice, that can arise when technology clashes with human pain and suffering.
56:26In New York, jury selection in Luigi Mangione's trial is scheduled to begin later this year.
56:35And it's bound to elicit yet more controversy on all sides.
56:40His supporters hope those jurors will deliver a radical verdict.
56:44We have a feature in the American justice system called jury nullification.
56:49Where if a jury believes that a not guilty verdict would be the best delivery of justice, they can deliver
56:56a not guilty verdict, regardless of whether they think the person actually committed the crime.
57:03The people, united, will never be defeated!
57:14My time in the U.S. had made me think about our fate in the U.K.
57:21A.I. is already being used in the NHS, but it's being done with caution and crucially with human supervision.
57:30A tool to support humans, not to override them.
57:34And for me, that is when A.I. is at its best.
57:38Not something to be feared, but something to be carefully incorporated into our lives.
57:44A.I. could achieve extraordinary things.
57:47But this is a revolution that has to happen with us, not to us.
57:56A.I. could have been transparent with us.
58:21To discover more about AI and how it can shape our future,
58:25go to connect.open.ac.uk forward slash AI with Hannah Fry
58:31or scan the QR code on the screen now.