The Matter of Facts Season 1 Episode 2
Transcript
00:03Hi, I'm Hamish McDonald and I'm a journalist, which means facts are my business.
00:09Where do you get your information from?
00:11Social media.
00:12Facebook.
00:12Facebook, Instagram and YouTube.
00:15Social media is where so many of us now get our information.
00:19But can we rely on it?
00:22When tech makes it so easy to create a fictional story, which can look and sound real.
00:29I went to the moon last week, don't you know?
00:31Even though it's entirely fake.
00:35Are you like so many people now, struggling to know what's true and what's false?
00:41In this series, I'm going to explore how big tech is transforming our lives and our minds.
00:47We're going through a period of radical change in the way that people consume information.
00:51We've created this monster.
00:53An algorithm is outsourcing human choice into a machine's decision.
00:59The end goal is keep you scrolling.
01:01Hack our biology, change the way we feel to change the way we see the world.
01:06Is anyone else paying attention to how all of this is affecting society?
01:12I want to understand how technology is changing the way we think.
01:16It really does question your reality.
01:19And why we're all susceptible to mis- and disinformation.
01:23We are just as vulnerable, but we think we're not.
01:27And actually, that's what makes us more vulnerable.
01:30Do you worry about what happens if we can't agree on facts?
01:34This has been an information war. Facts stop mattering.
01:37This growing sense that anything could be fake at any time.
01:42Looks like you.
01:44Just not me.
01:45The facts are so polar opposite.
01:47Which ones are the facts?
01:49We can't even agree on how to determine what a fact is.
01:52In this episode, I'm travelling far and wide to understand how disinformation disrupts us.
01:58And democracy.
02:00They're posting on a pattern.
02:01These are bots.
02:03So is this safe for you?
02:05There might be people here hiding from us.
02:08It's part of the propaganda.
02:10It's a biased content production.
02:12The tearing down of democracy began by tearing down facts.
02:17Join me on a journey right around Australia and the world
01:20to meet the people determined to keep facts alive
02:23in this real-time information revolution.
02:38If you wanted to damage someone's reputation, how would you do it?
02:43Maybe the easiest way would be to start a rumour.
02:48And no-one ever needs to know it came from you.
02:54So what if you wanted to damage an entire country's reputation?
02:59It's the same principle.
03:01Let me show you how it's done.
03:04I promise there's no funny business.
03:06I want to invite you right now to come and join me in a hotel room.
03:09In Paris.
03:10Just before the Olympics.
03:22G'day.
03:23Welcome.
03:23Come in.
03:24Do you want to take a seat?
03:25Just come through here.
03:27I might move this guy out of the way for you first.
03:30There you go.
03:31Make yourself comfy.
03:32So listen.
03:33I just want you to imagine I'm one of more than 11 million people in Paris.
03:37Ahead of the 2024 Olympic Games.
03:39I've actually come a bit early.
03:41Because I've got a heap of cash to drop on this.
03:46My phone is running hot.
03:48With advice on great wine bars.
03:50How to use the metro.
03:52And...
03:52Ugh.
03:54Bugs.
03:54There's bed bugs everywhere.
03:55They're on the planes.
03:56They're on the trains.
03:57They're on the buses.
04:00Paris is in the grips of a bed bug apocalypse.
04:04It's an icky situation.
04:05The mayor of Paris is declaring no one is safe from bed bugs.
04:10Madam Prime Minister, these little insects are spreading despair across our country.
04:17Bed bugs in Paris, public enemy number one.
04:21A wave of panic is gripping the country.
04:23Bed bugs are proliferating in every part of daily life.
04:26They're making life hell for the millions of our fellow citizens who are infested.
04:30If you've never met a bed bug, lucky you.
04:33They're blood sucking parasites with a nasty bite.
04:37So, why is it that they're flooding into Paris ahead of the Olympic tourists?
04:44Oh!
04:46So, looking at social media, it's telling me that this influx in bed bugs coincides with
04:51the arrival of thousands of Ukrainian refugees.
04:54It is true, France is accepting a lot of people from Ukraine.
04:58They're fleeing the Russian invasion.
05:00It's also true that every year in France, the number of bed bugs rises and falls, along
05:06with the seasons.
05:10The bed bug crisis was a social media beat-up.
05:14When French authorities traced the bed bug and Ukrainian refugee story, they found it
05:20was a fiction spread by Russian agents to turn public sentiment against the refugees.
05:26Here's the post that's been circulating on a Telegram channel since October 2nd...
05:31Russian sites even created counterfeit versions of respected French publications, mimicking
05:37reports on the bed bug crisis.
05:39...where they usurped Western media logos...
05:43Russia is believed to have stoked and fed this fear online, flooding networks with false
05:48information using bots, accounts created by machines, to destabilise France
05:53because of its strong support for Ukraine.
05:59This whole campaign demonstrates almost perfectly the genius of modern disinformation warfare.
06:06All you need is to isolate an issue with the hint of truth to it, make sure you've got
06:11all the ingredients to make it go viral, and then amplify it for good measure.
06:15You see, by the time France realised it was Russia pushing all of this stuff, refugee tensions
06:20had been stoked, and the host nation, well, they were more than a bit embarrassed.
06:34The goal of Russian disinformation is not to make you believe anything.
06:41It is to break open the fracture lines of society, to weaken from within.
06:54Disinformation, or hybrid, wars are being waged across the world.
07:00If you're on social media, you may well be inadvertently consuming or even sharing this content.
07:07These campaigns have been identified as major global threats to peace and democracy.
07:12And while they're not physically devastating like conventional wars,
07:17the deliberate spread of false information targeting a country's population
07:21is considered a form of warfare.
07:39During the Second World War, this part of London was bombed night after night.
07:44Churchill was hunkered down in his underground war rooms, directly beneath us in fact.
07:49The air raids were ear-piercing, people were shaken from their beds.
07:53And although it's been some 80 years since the air raid sirens have rung out across London,
07:58warning of an attack, that doesn't mean this city's been free from invasion.
08:04Today, though, the threat comes through firewalls, and disinformation is spread
08:09to create confusion and fear.
08:15It's the job of counterintelligence experts scouring online activity to identify and expose disinformation attacks.
08:24Ben Strick does just that.
08:26He's an Australian based in London.
08:29He believes we need to recognise how the mechanics of warfare are fundamentally shifting.
08:35I mean, what do you see when you see this piece of equipment right here?
08:39A very big rocket, that's what I see.
08:42Probably a very expensive rocket too, right?
08:44Yeah, OK.
08:44So I see something that costs a lot of money.
08:47Its intent is to cause harm, maybe to target a military unit or a strategic objective, right?
08:53The research to create this is hundreds of millions.
08:57But a series like a campaign online could be so much cheaper to target the enemy and cause,
09:04you know, damage financially.
09:06It can be done at scale with, you know, a cheap subscription to an AI platform en masse
09:12and a couple of hundred dollars.
09:14The lieutenants in a hybrid war can wage their battle from the comfort of their own homes.
09:19Their soldiers can be anonymous trolls or machine-generated bots.
09:26Bots are given their own accounts and identities which trick us into thinking they're real.
09:32In 2024 alone, Meta identified 4.23 billion fake Facebook accounts.
09:40Every year, thousands of Australians are scammed out of millions of dollars.
09:49I want to show you how a campaign can be waged online technically.
09:54Yeah.
09:55Ben Strick busts disinformation campaigns across the world.
09:59He wants to show me an example from much closer to home.
10:02A campaign generated in Indonesia to undermine the autonomous province of West Papua's bid for independence.
10:11This is in West Papua, so it's an independent area under Indonesian occupation.
10:22So there were large protests that happened in August 2019 on the streets.
10:28These were the sorts of videos that were being uploaded online.
10:31And so the Indonesian government shut off the internet.
10:36They shut off all connectivity for people in West Papua.
10:39And I think they shut it off as a way to stop information flow, right?
10:43So that no one could access social media channels.
10:45But the thing is, when we saw it shut down, we saw Indonesian accounts sharing videos like this.
10:51What is Indonesia trying to hide in West Papua?
10:54It's trying to hide its great budget decisions, its great work with schools and things like that.
10:59And it was actually being shared by accounts like this guy called Marco267.
11:04He's a friendly white guy and he's posting saying,
11:07what are some of the secrets that Indonesia have been hiding in West Papua?
11:10He's using hashtags that are not really relevant, like West Papua Genocide and Free West Papua.
11:16But it's kind of odd for me because it's not only Marco uploading it,
11:21it's hundreds of accounts all posting the exact same message.
11:25And this is not retweets, this is actual individual.
11:28Separate accounts.
11:29So let me take you back to Marco267.
11:33Now I do something like an image reverse search on his profile picture.
11:37Yeah.
11:37Marco looks like a friendly Western white guy, but he's also used in dating forums,
11:42in stock photo websites, in ski resort hotels.
11:46This is a fake account.
11:49And so what I did was I collected time codes of Marco's posting activity and his friends.
11:55And you can see this repeat pattern.
11:57Bella now, one of his friends, is posting at 0:32.56, 9:32.56, 8:32.56.
12:04They're posting on a pattern.
12:06These are bots.
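The posting-time pattern Strick describes can be checked programmatically. A minimal sketch, assuming the post timestamps have already been collected; the account names and times below are illustrative, not from the real dataset:

```python
from collections import defaultdict
from datetime import datetime

def flag_scheduled_accounts(posts, min_repeats=3):
    """Flag accounts whose posts land on the same minute:second offset
    across different hours -- a telltale sign of scheduled bot activity."""
    offsets = defaultdict(list)  # (account, "mm:ss") -> matching timestamps
    for account, ts in posts:
        t = datetime.fromisoformat(ts)
        offsets[(account, f"{t.minute:02d}:{t.second:02d}")].append(ts)
    return {acct for (acct, _), hits in offsets.items() if len(hits) >= min_repeats}

# Illustrative data: "Bella" posts at :32:56 past three different hours.
posts = [
    ("Bella", "2019-08-30T08:32:56"),
    ("Bella", "2019-08-30T09:32:56"),
    ("Bella", "2019-08-30T10:32:56"),
    ("Hamish", "2019-08-30T09:14:03"),
]
print(flag_scheduled_accounts(posts))  # {'Bella'}
```

A human pausing to type will drift by seconds or minutes between posts; only automation lands on the identical offset hour after hour, which is why this simple grouping is enough to surface bot networks.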
12:07And if we have a look at one of Marco's friends, which is West Papua ID, that discusses fact
12:13and data in West Papua.
12:15Sounds like a legitimate website.
12:16You can even see there's articles.
12:18And these articles talk about, again, the great work that Indonesia is doing.
12:23We start to think, well, okay, who made this website?
12:26And one of the tools that we have is metadata.
12:31Who registered the website?
12:33This person left their real name, their phone number, their email address, and everything
12:38like that on this website.
12:40I pop that phone number into WhatsApp and immediately I have someone that owns that website.
12:45And simply doing an image reverse search, I'm able to find that person's LinkedIn account.
12:51And now we've identified a company.
12:54That company owns all of these websites.
12:56It's one marketing firm in Indonesia.
12:59We shared these findings with Facebook and with Twitter.
13:03And Facebook published a huge takedown of a network.
13:07Twitter had taken down 795 accounts as well.
13:11And we noticed that the website had folded up.
13:14And got it.
13:16This small marketing firm in Jakarta was closed overnight after we published that.
13:21Wow.
13:21That's impressive.
13:23Cheers, mate.
13:27These campaigns go by various names.
13:30Hybrid threats.
13:32Cognitive warfare.
13:33Or simply propaganda.
13:36Even when there's a pause in the bombing, there's no lull in the propaganda war.
13:42Propaganda has existed for as long as the powerful have sought to manipulate public thinking.
13:47It's how it can be spread that's changed.
13:50Psychological warfare is a major part of the American and South Vietnamese effort.
13:55With massive leaflet drops urging them to defect and join the government side.
14:01Our information systems were once ruled by the printing presses.
14:06You're tuned to Radio Australia, the overseas service of the Australia...
14:10In the 20th century, radio expanded audiences across the oceans.
14:14Radio Australia.
14:16Overseas service.
14:16This century, anyone with an internet connection can instantly deliver information or disinformation
14:23to billions.
14:26I love the idea of using technology to facilitate consultation, engagement, a greater sense of
14:33democracy.
14:34How social media platforms currently work is to not create those kinds of conditions.
14:39Every technology comes with advantages and disadvantages.
14:43And the disadvantage here is that it allowed a lot of people to advance their own personal,
14:49professional, commercial or political purposes.
14:59Russia leads the world in creating clandestine hybrid threats, targeting its neighbours.
15:06Just like the bedbugs campaign, hybrid threats weaponise existing tensions to fuel public fear.
15:13Now, if you're anything like me at this point, you might be thinking hybrid threats?
15:18Somewhat confusing.
15:19What exactly is the objective?
15:21Where do they come from?
15:23Who and what are they targeting?
15:25And how on earth would you ever know for sure?
15:27And I suppose that's the point.
15:29It's confusion by design.
15:31So, that's why I've come here to the border between Poland and Belarus, where the threat
15:35is a bit more tangible.
15:37It involves real human lives.
15:40And one very big fence.
15:44This deep primeval forest on Poland's eastern border has become a death trap for people who
15:51believe the lies they've been sold online.
15:55Like promises of freedom and opportunity.
16:01In 2021, thousands of migrants travelled from the Middle East and North Africa via Russia
16:07and Belarus to the Polish border.
16:20Polish guards struggled to hold them back.
16:25They kept on coming.
16:27By October, numbers had swelled to tens of thousands of people.
16:33Illegal border crossing is forbidden.
16:37You will face criminal charges.
16:40Despite waves of migrants into Europe, only a trickle ever attempted this crossing.
16:45So, why the sudden surge?
16:49The answer is all over social media.
16:52Messages, many in Arabic, offering flights to Minsk, the capital of Belarus.
16:58And taxis to the border.
17:00Russian travel agents promise Polish visas to migrants across Africa.
17:05According to the Polish government, this is a hybrid threat campaign created by Belarus
17:11and Russia to manufacture a migration crisis.
17:15A claim both countries deny.
17:18The European Union says Russia works with its proxy state Belarus to disrupt democracy in
17:25Poland and broader Europe.
17:27This is a hybrid attack of an authoritarian regime to try to destabilise democratic neighbours
17:33and this will not succeed.
17:41Over the past five years, tens of thousands of migrants have tried to cross this border.
17:47But the Polish government greets them with razor wire, a five-metre fence and border guards,
17:53pushing them back to Belarus.
17:57The hybrid threat has worked just as intended.
18:01Anti-migrant anger is on the rise across Poland.
18:04Far-right groups claim the country is now flooded with unconstrained migration.
18:10But the facts just don't support that.
18:13Only a small proportion of those arriving are granted asylum in Poland.
18:19Among them, a young North African who's agreed with the promise of anonymity to tell me how he got here.
18:25How did you know that you could get to Europe by coming to Russia and then Belarus?
18:31Who told you that this was a pathway?
18:40Which social media?
18:41TikTok, Facebook, Instagram?
18:43I use only Facebook.
18:45Facebook, okay.
18:46Yes.
18:46The visa that you had for Russia, was it a tourist visa or a business visa?
18:52Tourist like that.
18:53Tourist visa, for three months?
18:55It was three months, yes.
18:56And when you were in Belarus, did any of the border guards or police tell you that it's safe to
19:02go through to Poland?
19:03Did they encourage you to go to Poland?
19:08Go to Europe.
19:10Go to Poland.
19:11Go to Europe.
19:12Go to Europe, yes.
19:12That was what they said?
19:13Yes.
19:15Did you think it would be safe to travel this way?
19:19We cross the jungle.
19:21No safety.
19:27Migrants can spend days, weeks, even months in the forest and swamplands on both sides of the border.
19:35The border with Belarus is in about eight kilometres distance.
19:40All right.
19:40Shall we walk?
19:41Yeah, we can walk.
19:44I got a message from Telstra welcoming me to Belarus.
19:49Alexandra Kushinovska and Anna Albot have allowed me to join them in a section of forest where migrants often hide
19:55after crossing the border.
19:58So it's pretty rough.
19:59You're not really following trails.
20:00You're just literally walking off into the bush.
20:02No.
20:02Yes.
20:04These women are part of a network of volunteers who can't offer people safe passage, but they
20:10can deliver life-saving supplies.
20:13We would come here with dry clothes, shoes, with food.
20:17Do you find that you get lost?
20:19We all the time use GPS.
20:21Yeah, right.
20:22So we do get lost.
20:23OK.
20:24But not so bad.
20:25Yeah, right.
20:28There might be people here hiding from us.
20:30Really?
20:32Mm-hmm.
20:33Oh, those boots.
20:34Yeah, it's a boot.
20:35There's a jacket just back there.
20:37Yeah.
20:37Yeah, and here.
20:38More there.
20:39It's unbelievable how much there is.
20:41Everywhere in front of us here.
20:42And a jacket here.
20:45And a jacket here.
20:45Or actually a sleeping bag.
20:47Oh, and a backpack.
20:48Oh, wow.
20:49Sometimes if you run away, you leave everything behind.
20:53It must be so bewildering arriving into this.
20:57Like, I just don't know what you'd make of it if you'd arrived over a fence from North
21:01Africa or the Middle East and landed here, in the middle of a swamp.
21:07Unbelievable.
21:12Oh, sorry.
21:14Our alarm phone.
21:15OK.
21:16Hello?
21:17Hello, hello.
21:22So, this is an alert phone call they're getting from the other activists.
21:27Mm-hmm.
21:31They've just received a request for help from two people from Afghanistan.
21:36And they're waiting right now for their individual needs.
21:40So, is this safe for you?
21:41Is it safe for the migrants to be in contact with you?
21:46For us, it is safe, and for them, it's an additional risk, because more people around
21:51them, the more chance that the drones or Polish border guards will see them.
21:56So, they wouldn't contact us unless they really need it.
21:59And what would happen to them if Polish border guards find them?
22:02They'd be sent back?
22:03They would be immediately pushed through to the other side.
22:05OK.
22:06Brutally.
22:08Every day, these women come face to face with the human cost of hybrid warfare.
22:13This is the wound of the...
22:15Oh, wow.
22:16So, it's really...
22:16And that's on the heel?
22:17On the heel, yes.
22:18So, it's quite a big...
22:19Oh, they've got a huge cut in the heel.
22:20This is really big.
22:21Yeah.
22:22They've met with hundreds of people fleeing authoritarian regimes in search of asylum.
22:29But less than 10% of migrants on this border have been officially accepted in Poland.
22:34The majority have been pushed back to Belarus.
22:38Poland has suspended the right to apply for international protection at the border with Belarus.
22:44So, they have to do everything possible to avoid the contact with Polish border guards or soldiers.
22:50Let's just go.
22:52And maybe let's just give up...
22:53Other volunteers meet us.
22:55Let's put back store and garbage here.
22:58They've packed supplies, including medical gear, to treat the men's injuries.
23:05So that we can move quickly, my crew has stayed behind.
23:08I'm now doing camera and sound.
23:12This is funny.
23:14It's a camera.
23:15360 degrees.
23:17Oh.
23:18So, front and back.
23:19And it has audio as well.
23:22So, we just need to behave very carefully.
23:26Not making too much noise and not being seen.
23:30We should even avoid being seen by random people.
23:33Because you never know who will be willing to help and who will call the border guard.
24:25Anna shares clothes, shoes, hot soup and water.
24:36That's it. It's a very big wound.
24:41It's very deep, very deep.
24:43He needs proper medical attention.
24:45This will look like easy working.
24:51His hands.
24:54Can I see your hands?
25:00We use a translation app to ask about their journey.
25:05Tell me. Tell me. Tell me. Tell me.
25:08I should. I should.
25:11So I came via Russia to Belarus six months ago.
25:20The men are nervous and want to keep moving.
25:23Nice to meet you.
25:28Anna and Alexandra deliberately don't ask where the men are headed.
25:34They're unlikely to ever meet again.
25:39She wrote in a translator that they are very grateful for our help.
25:44That's it.
25:46Yeah, it's always difficult to receive these words of thankfulness because this situation shouldn't happen.
25:56Yeah, but we can't do a lot.
26:05The death of democracy happens by a thousand cuts where you're bleeding from every cut and you don't know how
26:12weak you are until you die.
26:16It's happening everywhere.
26:19Every place where the tech is ruling us.
26:25Hybrid threats and disinformation campaigns don't just target foreign countries.
26:31Most of it is actually designed to work internally to manipulate opinion and maintain power at home.
26:44We used to think revolutions happened when people took to the streets or that democracies were toppled through bloody military
26:51coup.
26:52Today we can all sense that those lines are blurring between democracy and autocracy and dictatorship.
26:58And for anyone that wants to influence what you think, there's a pretty good opportunity every time you reach into
27:04your pocket.
27:13To understand disinformation's impact on democracy, I've come to the Philippines.
27:19It's an Indo-Pacific neighbour of ours, offering a cautionary tale.
27:26The Philippines has been crowned the social media capital of the world.
27:30People here devote an average of three and a half hours a day to scrolling.
27:34That's an hour longer than the world average.
27:38But Facebook is where most people get their news.
27:42So the end goal is keep you scrolling.
27:44Hack our biology, change the way we feel, to change the way we see the world, to change the way
27:49we act, to change the way we vote.
27:57In just a couple of days, more than 60 million Filipinos are going to cast their vote.
28:02It's the nation's midterm elections.
28:06So you're voting, where do you get your information from?
28:09Social media is much easier when we select the candidates.
28:12Do you think that facts matter when it comes to elections in the Philippines?
28:18Well, facts matter for me. It's hard to find a fact.
28:21They are already running. Oh, the same face, the same family, the same people.
28:31Politics here is dominated by two warring families, Marcos and Duterte.
28:38Once allies, they now work tirelessly to backstab and discredit each other.
28:44Their weapon of choice is the modern-day dagger, online disinformation.
28:55Former President Rodrigo Duterte long ago recognised the role of technology to help maintain authoritarian power.
29:03If you destroy my country, I will kill you.
29:07Over his six-year term, he ran an indiscriminate war on drugs, which left thousands dead.
29:15Including 20 journalists.
29:18Just because you are a journalist, you are not exempted from assassination.
29:23If you are the son of a bitch.
29:27Independently reporting facts in the Philippines can be dangerous.
29:32There's one news organisation here that will not be silenced.
29:36Founded back in 2011, it is small, independent, online.
29:41Rappler is a digital news outlet which began its life on Facebook.
29:46What are we fighting? First, tech, social media, corruption.
29:52Rappler's co-founder, Maria Ressa, has won a swag of awards.
29:56Including the Nobel Peace Prize for her commitment to press freedom.
30:01Let's have fun. Please, let's keep having hope.
30:05Rappler has created and now publishes on its own innovative app.
30:10It works much like a chat app and has freed the news outlet from a reliance on big tech platforms.
30:16I drank the Kool-Aid and believed that social media could be an enabling factor for better governance.
30:22And so we thought there was tremendous potential and it worked for a while.
30:27Are you talking about a moment in time where there was just this great hope?
30:31Yes.
30:31Like the Arab Spring.
30:33There was this idea that social media was going to democratise the message.
30:39Give the people more power, right?
30:41That was the idea, that it was an enabler for democracy.
30:47In the early 2010s, the Arab Spring began with demonstrations across the Arab world to oust repressive leaders.
30:55And I was there reporting at the time.
30:57This is a revolution.
30:59Yeah, this is an Egyptian revolution.
31:01We don't want this government.
31:03The crowds organised themselves via social media.
31:06They called them Twitter revolutions.
31:09And it felt like a powerful tool, creating real political change.
31:14Egypt was so incredible because they were leaderless networks, right?
31:19And it was so exciting that the people could work together to gain more power.
31:27These leaderless networks were able to share information online.
31:31But then, from 2014, algorithms began promoting content that would boost engagement.
31:38The messaging from grassroots campaigns was eclipsed by more sensational content.
31:45The public information ecosystem literally was rewarding lies, spreading lies six times faster than facts.
31:53If you lace it with fear, anger and hate, it spreads even faster.
31:58For Duterte, algorithms presented an opportunity.
32:02He could now flood social media with disinformation that justified his actions and discredited his critics.
32:10I repeat it.
32:11F*** you.
32:13Oh.
32:18He also encouraged an orchestrated hate campaign against Maria Ressa.
32:23In 2016, exponential lies on social media attacking us.
32:30I was getting an average of 90, 9-0 hate messages per hour.
32:36It's dehumanising.
32:37And then 2019 was the beginning of the arrests.
32:42The chief executive of one of the Philippines' most popular news sites has been arrested for the seventh time.
32:49Maria Ressa has been found guilty of cyber libel charges and faces up to six years in prison.
32:54I'd say the tipping point when I realised the world had really changed was when I was arrested in this
33:00newsroom.
33:02It's a shock.
33:03It's a shock.
33:03It's a shock, but we're going.
33:05Press freedom is the foundation of every single right of every Filipino to the truth.
33:12Ten arrest warrants in a little over a year.
33:152020, I was convicted of cyber libel.
33:18We're still fighting this at the Supreme Court.
33:20Can democracy exist without shared facts?
33:25Absolutely no.
33:27No.
33:28The tearing down of democracy began by tearing down facts.
33:32Democracy is taking that shared reality where we agree on what the facts are so that we can mobilise, move
33:40together.
33:40In the past, it used to be journalists.
33:43We were the gatekeepers.
33:46Maria Ressa's clarion call is echoed by journalists across the world.
33:51I don't think democracy can survive without a free and independent press.
33:54It never has.
33:56Marty Baron is a celebrated American newspaper editor.
34:00He's been pivotal to the success of papers like The Washington Post and The Boston Globe.
34:06He led The Globe's investigation into clerical sex abuse, immortalised in the Oscar-winning film Spotlight.
34:14Journalism can influence ordinary people's lives for the better.
34:18It is essential to democracy.
34:20Just telling the truth in too many countries, it will get you killed.
34:26Our mission is holding powerful individuals and institutions accountable, particularly those who are in government.
34:35When you have aspiring autocrats, one of the first things they do is try to crush the free and independent
34:39press.
34:40We are fighting the fake news.
34:42It's fake.
34:43Phony.
34:43Fake.
34:44CNN is scum.
34:45It's your fake news.
34:46ABC is one of the worst.
34:47I hope to be honest with you.
34:49They go after the press first and there will be nobody to hold them to account.
34:53And when you look around the world at authoritarian governments and how they came into existence,
34:57you see that one of the first things they did was try to destroy a free and independent press.
35:03In my opinion, you are hurting Australia very much right now.
35:06They want to get along with me.
35:14Traditional media organisations bear responsibility for what they publish.
35:18There's at least a group of people who are professionally trained.
35:22They're not perfect, but they're dedicated to actually verifying the facts.
35:26And it's not just because of the laws.
35:27It's because we actually feel morally responsible for what we publish.
35:33I don't think it's enough now to just say you should trust us because we're journalists.
35:37But I think it's also fair to say that there are a set of people that exploit the trust gap
35:41with mainstream media organisations for more nefarious purposes.
35:46Something's wrong with the press, guys.
35:48Journalism is dead.
35:49Like, what the heck is going on?
35:51Citizen journalism is the future.
35:53By the people, for the people.
36:01The social media platforms that do have a very significant dominance in the information ecosystem
36:08have a business model that prioritises polarising content.
36:13They're not designed to facilitate quality information distribution.
36:18X or Twitter is now just probably the largest disseminator of false information in the world right now.
36:28And that's why we have a society where we can't agree on a common set of facts.
36:32That's why we have a society where we can't even agree on how to determine what a fact is.
36:36And just remember, what you're seeing and what you're reading is not what's happening.
36:44I don't really want government to be in the business of dictating what's true and what's false.
36:50That's a very dangerous place to be.
36:57If you're watching this and you're thinking to yourself,
37:00journalists are the problem, or at least part of it, you may well be right.
37:03There are plenty of legitimate criticisms of us, the role we've played, in diminishing trust.
37:11The question is whether journalism can rise to meet the challenge to help fight against this assault on facts.
37:28With only 24 hours to go, election fever is building on the streets of Manila.
37:34As the vote gets closer, they stage these huge political rallies.
37:38So that's where I'm heading now.
37:54You know, when you're in the middle of this, there's something quite captivating about it.
37:57It's kind of hard to know where the politics actually is, but there's a heck of a lot of people
38:03engaged.
38:03Let's have a look.
38:06This is a pro-Marcos rally.
38:09Ferdinand "Bongbong" Marcos is the nation's current president.
38:12So tell me, where do you get your information about politics?
38:15I watch on Facebook.
38:19And do you believe everything you see and hear on Facebook?
38:22Yes.
38:22So you can spot the fake news, can you?
38:25OK.
38:26In the media zone, there's over 50 social media influencers, known here as vloggers, live streaming.
38:34The Independent Electoral Commission says it's fair to assume they're all being paid for their support.
38:39I know lots of people in the Philippines use social media.
38:43Is that where most people get their politics from?
38:46Five years ago, they shifted from mainstream media to blogging.
38:51It becomes an industry.
38:52Yeah.
38:53 3.2 million views is pretty good.
38:56Influencer or vlogger Karkofi hasn't confirmed if he's paid for his devoted support of the Marcos family.
39:03People believe vloggers as compared to mainstream media.
39:09That's my observation because they look at mainstream media as biased.
39:12Which I agree.
39:13All media organisations are biased.
39:16I am pro Marcos.
39:17So I produce content with leanings, with bias toward the government.
39:21Thank you, ma'am.
39:23Thank you, Paul.
39:24So I focus on topics that I think are good,
39:26that will promote the administration that I do content for.
39:30Do you think what you do is propaganda?
39:35By definition of propaganda, I think yes.
39:39It's part of the propaganda.
39:41It's a biased content production.
39:44Yes.
39:45If the propaganda doesn't harm anybody, it doesn't harm the society, it doesn't harm the democracy,
39:51I don't see there's a problem with the propaganda.
39:56You know, historically, countries where the media disappears, the democracy disappears.
40:03But there are vloggers who are also sort of media.
40:11Prime Minister Anthony Albanese.
40:13Welcome to Big Small Talk.
40:14Of course, an alliance between influencers and politicians isn't unique to the Philippines.
40:19And they have this nuclear fantasy.
40:21We'll get into that.
40:23They're opposed to renewables.
40:25In Australia, politicians must disclose paid support from influencers.
40:30But they now often appear as guests on their high-rating shows.
40:34I can't dive.
40:35Were you one of those people that dove and did the belly flop straight in?
40:38There was a little bit of that.
40:39Yeah, yeah.
40:40Delulu.
40:41I dare you to drop that in a speech.
40:43Do you reckon I can do that?
40:43Delulu with no solulu.
40:45Can you drop that in a speech?
40:45They are delulu with no solulu, Mr Speaker.
40:49The 2025 ballot has been dubbed Australia's first social media-led election.
40:55Have you hit your gardening phase yet?
40:57No, I'm not a gardener.
40:58No, me either.
40:59Yeah.
40:59You just have to observe what podcasters are doing and what influencers are doing.
41:04They're speaking in a much more informal way, as if they were talking to their buddies.
41:08He's scared!
41:10I'm joking.
41:11Yep tends to stop shitting himself.
41:13Okay.
41:14They're much more relatable than traditional journalists tend to be, and people can connect
41:19with them.
41:21I found often audiences would say, you journalists in the mainstream media, you've brought this
41:28on yourself.
41:29I don't like your facts.
41:30There's other facts.
41:32There are no alternative facts.
41:34Because, let's say right here.
41:35But why do so many people believe there are?
41:37I guess that's what I'm getting at.
41:38Because it's easy to insidiously manipulate people today.
41:41What used to take intelligence agencies a lot more money can now be done cheaply, easily.
41:49Silicon Valley opening the gates to this kind of insidious manipulation.
41:54And the crazy thing is, people don't believe they can be manipulated.
42:00Again, it goes back to our cognitive biases.
42:03And until we realize that, we are Pavlov's dogs.
42:09The media environment, the technology environment, has allowed for and encouraged greater tribalism.
42:16We feel more comfortable with our tribe.
42:18We trust them.
42:19And so, it just reinforces an instinct that's been part of society forever.
42:33Across town, Duterte's supporters are enjoying what feels more like a pop concert than a political rally.
42:41Vloggers are out in force here too.
42:43And they're upset that Rodrigo Duterte isn't here.
42:47He's in The Hague.
42:48The former Filipino leader, Rodrigo Duterte, is now in the custody of the International Criminal Court
42:54after being arrested on charges linked to his war on drugs.
42:58Thousands of small-time dealers, users, activists and others were killed without trial.
43:06Despite incarceration, Duterte is actually vying for a mayoral seat in this election.
43:12And his daughter is Vice President.
43:14Well, this is where they're all here to see.
43:17The Vice President, Sarah Duterte.
43:19Her dad was the President.
43:20He's in The Hague.
43:22And she's at war with the current President.
41:24In the past months, my family's name has been dragged through the mud.
43:30I am not the problem of this country.
43:42Of course, the Duterte camp has its own army of full-time vloggers.
43:52Between 10 and 30 million US dollars gets funneled to vloggers and influencers here each election cycle.
44:00Before the platforms reward them, they're rewarded by money here.
44:05People can make money from disinformation.
44:07People see the opportunity to make money.
44:09They see the opportunity to acquire power by exploiting the media environment as it exists today.
44:14People are using it for their own commercial ends.
44:17They're using it for their own political ends.
44:18We don't have a counterweight to that.
44:20And that's, that's where we are right now.
44:25Stand by, 10 seconds.
44:2710, 9, 8, 7, 6, 5, 4, 3.
44:33Hello and welcome.
44:35I'm Maria Ressa.
44:36And you're watching Rappler's special coverage of the 2025 national and local elections.
44:42And the polls have officially opened.
44:4568.4 million Filipinos are registered to vote.
44:48President Ferdinand Marcos declared today a holiday.
44:52So, Filipinos, go out and vote.
44:56Voters don't disappoint.
44:58It's the highest turnout in the history of midterm elections.
45:02More than 80%.
45:06Joining us now is Jairo Balendo.
45:15As results start rolling in, former President Rodrigo Duterte is delivered a sweeping victory.
45:23The Dutertes have emerged as the big winners in this year's elections in Davao City.
45:28Former President Rodrigo Duterte set to be elected mayor of Davao City by a landslide, despite being detained at The
45:35Hague.
45:38It's a symbolic win.
45:40A mayoral seat Duterte can't reach from his prison cell.
45:44Social media has been flooded with revisionist versions of Duterte's brutal leadership.
45:51And the voters seem to have bought it.
45:55What's happened in the Philippines has happened in many other parts of the world now.
46:04The White House has banned journalists from entering the Pentagon if they haven't signed a legal document agreeing to only
46:11report information that's been authorised by the Department of Defence.
46:14I think that the environment now is a dangerous one for our democracy.
46:20I think we have an administration that's pushing our democracy to the limits in many ways, aiming to break it.
46:2972% of the world is now under authoritarian rule.
46:33We are electing illiberal leaders democratically.
46:38All because of what's on that device.
46:40I think this is the accelerant and an enabler for digital authoritarians.
46:47Every day that democratic nations do not reclaim their rights from big tech is a day they get weaker.
46:55And a government that wins when there's no trust, the only government that survives in that environment is a dictatorship.
47:04Do you think we can win this fight?
47:06We have to wake up.
47:10We have to look at the tech.
47:13We have to admit that we have been manipulated.
47:18And we have to demand better.
47:21This is a thinking fast world.
47:24That's where the big tech companies want you to stay because that's where they make the most profit.
47:29But you need to move into the thinking slow world and we need to solve it at scale or we
47:36will lose our agency and our humanity.
47:44Join me next time.
47:45This is very new and very weird.
47:47Are these people real?
47:48Are we diving head first into AI?
47:51Was there anything about it that looks weird to you?
47:54No.
47:55Where is AI taking us?
47:56So you reckon he's real?
47:58That looks fake.
48:00I was really easily tricked by it.
48:02And do we even want to go there?
48:03We don't want to sleepwalk into a future that nobody wants.
48:06They've invested hundreds of billions of dollars into this industry and they want to seek a return.
48:11Are you scared?
48:12I'm very scared.
48:18Don't wait a week when all episodes of The Matter of Facts are streaming now on ABC iview.