This episode explores the rise of fake news, uncovering its history, motives and the stories driving its global impact.
The War on Truth examines the pervasive and transformative impact of misinformation in the digital age. Tracing its roots from propaganda to its rapid spread via social media algorithms, the episode explores the factors driving fake news and its consequences, such as polarisation and erosion of public trust.
Featuring compelling stories, including the United States elections, bird flu and COVID-19 misinformation, with insights from experts, the episode employs dynamic visuals and infographics to dissect this pressing issue and offer solutions.
Transcript
00:01Let's play a quick game.
00:03Three statements.
00:04Two are true, one's a lie.
00:06Ready?
00:08One. Bananas are berries.
00:11Two. Goldfish have a three-second memory.
00:15Three. You can't breathe and swallow at the same time.
00:19Think about it. Which one's the lie?
00:22If you said goldfish having a three-second memory, you're right.
00:26That's a myth we've been fed for years.
00:29In reality, goldfish can remember things for months.
00:32But why did so many of us believe that lie for so long?
00:39The truth isn't always as obvious as we think.
00:43For centuries, philosophers, scientists, and scholars have wrestled with what truth actually is.
00:50Is truth just what matches reality?
00:53Is it what makes sense in a system of beliefs?
00:55Or is it simply whatever works best in a given situation?
00:59Now with former President Obama taking on fake news, except, turns out, it is not really President Obama.
01:05You can claim that truth is being mass-personalized, but most people would live in their own truth bubbles.
01:13Conspiracy theories are so powerful because they also provide a certainty and they provide a narrative.
01:18It's a story.
01:20And as humans, we really want stories.
01:22So they call it free speech.
01:23I call it the free market of speech.
01:24But they're making money off people's speech, whether it's good or bad.
01:27You might be wondering, does this even matter to me?
01:31But in today's world, where fake news spreads faster than facts, deepfakes make people say things they never did,
01:37and social media keeps us in bubbles of our own beliefs,
01:41Truth is up for grabs.
01:57We live in an era where what feels true often wins over what is true, and that's dangerous.
02:16Because if you can't tell the difference between reality and a well-packaged lie,
02:20who's really in control of what you believe?
02:23And how do we even find the truth anymore?
02:27All right, let's be real. Truth has a marketing problem.
02:31Think about it. What spreads faster?
02:34A long, carefully researched article explaining complex scientific data?
02:39Or a 15-second TikTok confidently claiming scientists are lying to you?
02:45Stop believing what teachers, scientists, and the news, and the NASA, and the government tells you.
02:51Yeah, the lie wins. Every. Single. Time.
02:55So what's wrong with the truth? Why is it so bad at keeping up?
03:02Maybe it's because sometimes the truth sounds just as unbelievable as a lie.
03:07Take these headlines. One of them is completely fake. Can you guess which one?
03:10Scientists have trained rats to drive tiny cars for treats.
03:15The Amazon rainforest now produces more carbon dioxide than it absorbs.
03:20Researchers discover a new species of snake that can survive being frozen for weeks.
03:24Well, the problem is, whether it's real or not, it all feels the same.
03:32In an endless stream of headlines, fact and fiction blur together.
03:36And the truth, slow, careful, deliberate, gets drowned out.
03:40So, one of the paradoxes is that false and misleading information is much more likely to be shared
03:50than real and accurate news. And you think, well, why is that?
03:54Well, let's think about gossip for a second. If I'm gossiping with a friend, am I going to gossip
03:59something kind of dry and factual? Or am I going to gossip something sensational?
04:03I mean, it can be the most outrageous, the most exciting, the most interesting bit of information.
04:11And because of that, it will travel quickly because information travels more quickly or it goes viral
04:17because people find it interesting or it has certain emotional triggers. And we know that,
04:23for example, information that's going to make you laugh or information that makes you angry or outraged
04:28or scared is information that travels quickly and is more likely to go viral.
04:37Disinformation isn't a new problem. It just got a tech upgrade.
04:42Back then, all you needed was a printing press and a wild imagination.
04:46Today, AI deepfakes, bot-driven propaganda and misinformation algorithms are doing the heavy lifting.
04:53In a study on explainable agency, researchers found that people overwhelmingly
04:58preferred the simple, clear-cut explanation, even when the complex one was more accurate.
05:04Because, let's be honest, nuance is annoying.
05:09Nobody wakes up in the morning thinking, you know what I want today?
05:12A complex, multi-layered explanation with conflicting evidence and no clear conclusion.
05:17And that's how we get...
05:20Disinformation is false information that is produced on purpose to manipulate or harm someone.
05:27Governments have been telling lies throughout history. And authoritarian regimes do it,
05:32but democratic regimes do it. I would say all governments lie. That's a given.
05:38Obviously, digital technology has changed the way we lie because it's changed the way we consume information.
05:43It's much easier to ask people to be skeptical of everything they see or hear or watch,
05:49rather than asking them to only trust this source and that source. Because people have seen many of these
05:55trustworthy quote-unquote sources being manipulated by their own states to sell them ideas such as the invasion of Iraq.
06:05Their refusal to do so will result in military conflict.
06:09To give you an example, one of the classic elements of propaganda is, for example,
06:14you know, if you have a war, then one side will say that the other side, they want to convince people
06:19that the other side is more brutal, more barbaric than them, right? So they, how do they do that?
06:24Well, they say things like, oh, the German soldiers in World War I were killing babies.
06:28They were beheading babies, right? Or stabbing babies with their bayonets.
06:32The Cybersecurity and Infrastructure Security Agency, or CISA, responsible for safeguarding
06:39election infrastructure stated, there is no evidence that any voting system deleted or lost votes,
06:45changed votes, or was in any way compromised.
06:48You know, when you think they just stole, they stole the election, it was just,
06:52they walked in, they said, you're out.
06:54Were there voting irregularities? Yes. Do voting irregularities happen in every election?
07:00Also yes. But was there widespread coordinated fraud at a scale that would have changed the outcome? No.
07:08Fight for Trump!
07:10But here's the problem. The election was stolen is a simple, emotionally powerful statement.
07:17The ability to produce and consume propaganda and misinformation has increased dramatically in
07:23the past 100 years, but particularly in the past 20 years, and even more so since the beginning of the smartphone.
07:28What spread during that week was actually rumors, false news, and that's why I was really concerned about
07:33the effect social media is having on consumers of news.
07:36A comprehensive study by researchers at the Massachusetts Institute of Technology analyzed
07:41approximately 126,000 stories tweeted by 3 million users over more than a decade.
07:48The findings? False news stories are about 70% more likely to be retweeted than true ones.
07:56Moreover, these falsehoods reach 1,500 people six times faster than accurate stories.
08:03The study found that humans, not bots, are primarily responsible for the rapid spread of false news.
08:10People didn't just believe the lie, they wanted to believe it. Because beyond politics,
08:16beyond the facts and figures, the lie had something the truth often lacks. A story.
08:22I think drama is such an important part of our humanity and how we function, that whenever we hear
08:30stories that are more dramatic than the ones that are boring and predictable, we tend to be affected
08:37or to like and be attracted to the more dramatic ones compared to the other ones.
08:43Unfortunately, that's why conspiracy theories are so powerful, because they also provide a certainty
08:50and they provide a narrative. It's a story. And as humans, we really want stories. The truth is that
08:56in reality, we don't know. And as humans, we hate not knowing. We hate uncertainty. We hate ambiguity.
09:03So when I'm being told, can you just wait a bit longer? We need to do more research on something
09:08versus the government is coming up with a secret plot. I'm much more likely to want to believe that
09:13because that makes me feel like, oh, that's an answer to my queries. That's an answer to my confusion.
09:20Tonight, we're learning more about the bird flu and what it could mean for humans.
09:25The first human bird flu death. In early 2025, as bird flu cases surged across the U.S.,
09:32a viral video began making waves on fringe platforms featuring a man identified as the
09:38inventor of mRNA technology, claiming the outbreak is nothing more than a manufactured hoax.
09:45Engineered, he alleges, to push another round of vaccines.
09:48Using fear of infectious diseases, promoted fear, which is what we're absolutely seeing right now
09:58with bird flu. Despite its realistic appearance, this narrative is false.
10:10People are rightly concerned because it is a terrible disease. It's my job as a farmer
10:15to reassure people that we're doing everything we can and that they are safe from this disease.
10:23A study analyzing tweets from X, formerly Twitter, over five days in June 2024,
10:29found rampant myths and disinformation regarding the causes and prevention
10:34of highly pathogenic avian influenza. Among the loudest narratives, the idea that the bird flu wasn't
10:41real at all, but part of a so-called plandemic allegedly designed to sway the 2024 U.S. presidential election.
10:49It's impossible to know which were the first false claims, right? There's been so many. I would say
10:54that there are certain narratives that are really common, right? The most profound, I think, claim was that
11:02this bird flu is going to be the next COVID. It's going to result in a lockdown definitely. That was one of the big claims.
11:08When we started hearing that there were bird flu cases, there was this kind of moment of panic
11:14because everybody was like, oh my goodness, is this going to be the next pandemic? What does that mean?
11:19Should I be going to farms or not? Or should I kiss a farmer? I don't know. But we also saw people
11:25saying, well, maybe I should stop eating eggs. Maybe I, you know, if I cook chicken, maybe even if I eat
11:31cooked chicken, then I could be at risk.
11:32And then there's the one that bird flu, or even COVID, that bird flu was a hoax.
11:38This follows a very common pattern of conspiracies, which is that everything is a hoax.
11:43The impact of misinformation around the bird flu crisis was immediate and tangible. Shaken
11:49livelihoods, disrupted operations, and mounting stress for workers already under pressure.
11:54Where an avian influenza, bird flu situation is occurring, you do have to make decisions
12:01quite quickly. And sometimes if the farmers don't have the correct information, you know,
12:07the wrong decisions can be made. The poultry industry faced significant losses due to both
12:13the outbreak and the spread of misinformation, leading to increased operational costs and decreased
12:19consumer trust. Farmers and producers, they're very concerned about the bird flu. In fact,
12:26they're saying, this is just a quote, that the virus is a terrorist.
12:30If our chickens catch this disease, we can't process them. So obviously we can't sell them.
12:36And if we're not selling them, we're not making any money. So it's a huge financial cost
12:42to us as a business and to individual farmers that catch it because you've effectively lost
12:51all of your income from that flock. And in some instances, businesses can't recover from that and
12:58will go bankrupt, unfortunately. Panic buying, fueled by misinformation,
13:05led to worsened egg shortages. In Connecticut, for example,
13:09egg prices jumped by $3 from December 2024 to February 2025, reaching a record national average
13:16price of $4.95 per dozen in January 2025.
13:20Egg prices, they're still up 60 percent.
13:24So how did we get here?
13:29For most of history, truth had bouncers. Institutions, journalists, scientists,
13:35historians, educators stood at the door, checking IDs, verifying facts, making sure the information
13:41that reached the public had at least some level of accountability. Sure, the system had flaws,
13:47biases, blind spots, but there was an understanding. News had to be sourced. Stories had to be fact
13:54checked. Claims needed verification.
13:56Then came the internet. The floodgates opened. Suddenly, everyone had a microphone. That anonymous
14:06Twitter account with a frog avatar? Same reach as a major news network. Your uncle with strong opinions
14:12on Facebook, now an authority on epidemiology. Information became instant, unfiltered. And with
14:18that, the battle for the truth was no longer about facts. It was about speed, engagement, and virality.
14:26And nothing spreads faster than fear.
14:30The thing about social media is, as ever, it can be exceptionally positive and powerful for good.
14:36Meta ran an experiment once where they were trying to see, for example, they classified two
14:41groups of content. They had good for the world content, which was stuff that they thought was
14:45like not disinformation, it was truthful, it was interesting, it was important, and bad for the world
14:49content. So like misinformation, disinformation, harmful information, hate speech, whatever.
14:53And they discovered that people were more likely to spend time looking at bad for the world content.
15:00Right? And so they had a choice. Do we alter the algorithm so it shows less bad
15:05for the world content? Or do we, you know, not do that? And they decided not to intervene too much in
15:11negating the bad for the world content. Studies confirm this dynamic. Researchers at USC found that
15:17social media's reward system trains us to share sensational content for likes and shares.
15:23I describe social media algorithms as like the invisible hand. I mean, we heard about the
15:27invisible hand of the market. But algorithms are like the unaccountable, invisible curators of
15:34information. Algorithms decide what we see. They decide what we see, when we see it, for how long we see it,
15:41who we see, what information we see, right?
15:44Misinformation today isn't only a byproduct of confusion. For many, it's a cash crop. Entire
15:51industries and cottage enterprises thrive on disinformation and conspiracy content. It turns
15:57out that lies can be quite profitable. They make money. They like to say that they,
16:03as companies, are the town square, the public square, that they allow freedom of speech.
16:08But you cannot have the town square be the public space if it's not owned by the public.
16:14It's a private company. So the town square can't be privately owned, right? So they call it free
16:20speech. I call it the free market of speech. And what do I mean? Well, they're making money off
16:24people's speech, whether it's good or bad. In fact, the online misinformation industry is booming,
16:30raking in an estimated $2.6 billion per year in ad revenue from unwitting major brands.
16:36He tells us that he's made upwards of $60,000 creating bogus news headlines.
16:41In 2016, teenage trolls in Macedonia famously made tens of thousands of dollars churning out
16:48fake headlines, such as, Pope endorses Trump, because the ad clicks were so lucrative.
16:54Today's misinformation peddlers have only gotten more sophisticated. There's now a commercial
17:00disinformation industry, PR firms for falsehoods.
17:04They have lots of different names, reputation management companies. And the company will come
17:08in and say, okay, we've got some products for you. We can do a number of things. We can
17:13create thousands of fake social media accounts. And we can use these social media accounts to spread
17:19propaganda about your enemy, but also propaganda about how great your country is, right? And we've
17:24seen plenty of examples of this throughout history.
17:27This is Tal Hanan, the mastermind behind Team Jorge, a covert unit that specializes in hacking
17:35and disinformation.
17:36In 2023, an undercover investigation exposed an outfit nicknamed Team Jorge. This secretive group
17:43of Israeli contractors ran a global disinformation for hire operation. For a hefty fee, they boasted they
17:50could manipulate elections, sabotage businesses, you name it. Their tool set? Hacking, digital sabotage,
17:57and bot armies on social media. Team Jorge claimed to have interfered in 30-plus elections worldwide.
18:05In one instance, they created fake whistleblower videos and forged news to smear a political candidate.
18:11Essentially, disinformation has become a global export product.
18:18I call them personalized truth because if we lived in different truths, if everyone in a society lived
18:28in a different sort of truth, people would not feel injustice because they would not have the knowledge
18:35about each other's lives. It's tempting to think misinformation is just bad information that can
18:42be deleted or corrected. But as we've seen, it's woven into the very fabric of how our online ecosystem
18:49functions. What we want is more friction. We want more interventions from platforms which slow us down.
18:56I want to be reminded, are you sure, Claire, that you want to share this? Are you sure this is true? And I'm
19:00probably going to be like, you're right, I don't. And I'm going to move on without sharing it. We want
19:05those kind of interventions. We need more media literacy. We need more fact checkers. We need more
19:09flags and labels. None of that is censorship. For example, the Russian news site. Does anyone have
19:14anything on that? In Finland, media literacy isn't an elective. It's a national defense strategy.
19:21Schools teach kids how to dissect headlines, question sources, and recognize bias.
19:27It's already fake because it says that it was 10 years ago in 2013.
19:32They learn that just because something feels true doesn't mean it is true.
19:37The result? Finland consistently ranks as one of the most misinformation-resistant countries in the
19:43world. Media literacy is actually one of the most exciting ways that we should be thinking about
19:48tackling misinformation. We need to engage young people in the creation of these courses so they
19:54actually stay relevant and they're interesting. During the 2020 pandemic, New Zealand didn't just
20:00tell people, don't spread misinformation. They made hilarious, shareable videos poking fun at
20:06disinformation. And it worked. Instead of lecturing people, they made the truth engaging. And that made
20:14all the difference. One notable initiative was the collaboration with the comedy show Wellington Paranormal
20:24to create a series of short videos addressing COVID-19 guidelines. Episodes featured characters
20:30humorously navigating pandemic-related scenarios, such as reminding citizens about social distancing by
20:36showcasing officers dealing with a zombie who wasn't wearing a seatbelt. This approach not only
20:41entertained but also reinforced public health directives in a memorable way. You don't need
20:46to worry about ghosts. They can't infect you. They don't have bodies. So what we have now is a new
20:50kind of research field around pre-bunking. So pre-bunking is how can you talk to people before they see
20:58the misinformation? And by doing that, does it actually prime them to be ready? And does it make
21:04them more likely to think critically about the misinformation? Researchers have discovered something
21:09fascinating. If you expose people to small doses of misinformation tactics before they encounter the
21:15real thing, they become far less susceptible to falling for it. It's called inoculation theory. The
21:21idea? Teach people the tricks of misinformation before they see them in the wild. That way, when they do,
21:28they're already skeptical. So it's a bit like warning somebody about a financial scam. Say, hey, you know,
21:33if you get an envelope in the mail that says this, you might want to be careful. So it's the same
21:38kind of mechanism. And what the research shows is it's a much more effective way of making people
21:45kind of wake up a little bit and less likely to believe misinformation when they do see those same
21:50rumors. One of the best examples is the online game Bad News, where players take on the role of a
21:57disinformation creator. There's this gamified solution. It's a game where you pretend you're
22:03an influencer, right? And you have options to spread information that's truthful about COVID-19 and
22:10vaccination or information that's false, right? And when you do it in this game, if you spread the
22:17truthful information, you don't get as much engagement or followers. But when you spread the false
22:21information that's harmful, you get lots of engagement and followers. And you play this game and it's
22:25only about 15 minutes, but they've done studies now that show people who just play this game for
22:3015 minutes already demonstrate a more critical and resilient nature to disinformation.
22:38All right, let's put your truth detecting skills to the test. If we're going to rebuild trust in
22:44facts, we first have to train our brains to question what we see. Remember those three headlines in the
22:50beginning? Scientists have trained rats to drive tiny cars for treats. The Amazon rainforest now produces
22:55more carbon dioxide than it absorbs. Researchers discover a new species of snake that can survive
23:00being frozen for weeks. Are you able to guess which one is fake? Before you lock in your answer,
23:08think critically. Got your pick? Ready for the reveal? The answer, headline number three, is fake.
23:15No known species of snake can survive being frozen for weeks, at least not yet. Some reptiles enter a state
23:23called brumation, a hibernation-like slowdown. But full-on cryogenic survival? Still a sci-fi fantasy.
23:31But here's the real twist. Headline number one is true. Researchers at the University of Richmond
23:37actually taught rats to drive tiny cars made from plastic food containers, demonstrating the animal's
23:42ability to learn complex tasks. Headline number two is also true. Recent studies published in the journal
23:49Nature revealed that parts of the Amazon rainforest now emit more CO2 than they absorb due to deforestation
23:56and climate change, a dangerous reversal for the planet's lungs. How did you do? Fake news thrives
24:04on making the unbelievable sound just believable enough, and making the believable seem outrageous.
24:10So if truth is out there, but buried under misinformation, emotional manipulation,
24:16and viral deception, how do we make sure we're actually finding it? Maybe the better question
24:23is this. Are we really seeking the truth? Or are we just after the version of reality that makes us feel
24:30the most comfortable?