Skip to playerSkip to main content
  • 2 days ago
Transcript
00:00AI gives worse care to women and minorities.
00:04AI, touted as a cutting-edge healthcare tool, reveals ingrained biases from historical medical
00:09research continuing to disadvantage women and people of color, as highlighted by a recent
00:14Financial Times report.
00:16For years, clinical trials focused on white male subjects skewing datasets.
00:20Unsurprisingly, AI outputs from these datasets aren't equitable.
00:24Researchers at MIT tested AI models like GPT-4 and LAMA-3, finding them biased towards
00:31women, often suggesting they self-manage at home.
00:34Healthcare-focused Palmyra Med showed similar issues.
00:38Researchers found Google's GEMMA model minimized women's needs, while GPT-4, stereotyped
00:42by race, gender, and ethnicity, recommending costly treatments on a demographic basis.
00:48Compassion for people of color dealing with mental health concerns?
00:52This is more than a technical glitch, lives are at stake, not likes.
00:56Earlier this year, Google's Medgemini even invented a body part, an error that's at least
01:01easy to spot.
01:03Bias however remains elusive within AI systems, raising concerns over unnoticed medical prejudices
01:08during patient care.
Be the first to comment
Add your comment

Recommended

5:43
Up next