The Matter Of Facts S01E02
00:03Hi, I'm Hamish McDonald and I'm a journalist, which means facts are my business.
00:09Where do you get your information from?
00:11Social media.
00:12Facebook.
00:13Facebook, Instagram and YouTube.
00:15Social media is where so many of us now get our information.
00:20But can we rely on it, when tech makes it so easy to create a fictional story, which can
00:28look and sound real?
00:29I went to the moon last week, don't you know?
00:31Even though it's entirely fake.
00:35Are you like so many people now, struggling to know what's true and what's false?
00:41In this series, I'm going to explore how big tech is transforming our lives and our minds.
00:47We're going through a period of radical change in the way that people consume information.
00:51We've created this monster, an algorithm is outsourcing human choice into a
00:58machine's decision.
00:59The end goal is keep you scrolling, hack our biology, change the way we feel to change
01:04the way we see the world.
01:06Is anyone else paying attention to how all of this is affecting society?
01:12I want to understand how technology is changing the way we think.
01:17It really does question your reality.
01:19And why we're all susceptible to mis- and disinformation.
01:23We are just as vulnerable, but we think we're not, and actually that's what makes us more
01:29vulnerable.
01:30Do you worry about what happens if we can't agree on facts?
01:34This has been an information war.
01:36Facts stop mattering.
01:37This growing sense that anything could be fake at any time.
01:42Looks like you.
01:44Just not me.
01:45The facts are so polar opposite.
01:47Which ones are the facts?
01:49We can't even agree on how to determine what a fact is.
01:52In this episode, I'm traveling far and wide to understand how disinformation disrupts us
01:58and democracy.
02:00They're posting on a pattern.
02:01Hmm, these are bots.
02:03So is this safe for you?
02:05There might be people here hiding from us.
02:08It's part of the propaganda.
02:10It's a biased content production.
02:12The tearing down of democracy began by tearing down facts.
02:17Join me on a journey right around Australia and the world to meet people determined to
02:22keep facts alive in this real-time information revolution.
02:38If you wanted to damage someone's reputation, how would you do it?
02:43Maybe the easiest way would be to start a rumor.
02:48And no one ever needs to know it came from you.
02:54So what if you wanted to damage an entire country's reputation?
02:59It's the same principle.
03:01Let me show you how it's done.
03:04I promise there's no funny business.
03:06I want to invite you right now to come and join me in a hotel room in Paris, just before
03:11the Olympics.
03:22G'day.
03:23Welcome.
03:24Come in.
03:24Do you want to take a seat?
03:25Just come through here.
03:27I might move this guy out of the way for you first.
03:30There you go.
03:31Make yourself comfy.
03:32So listen.
03:33I just want you to imagine I'm one of more than 11 million people in Paris ahead of the
03:382024 Olympic Games.
03:39I've actually come a bit early, because I've got a heap of cash to drop on this.
03:46My phone is running hot, with advice on great wine bars, how to use the metro, and, uh, ugh.
03:54Bugs.
03:54There's bed bugs everywhere.
03:55They're on the planes, they're on the trains, they're on the buses.
04:00Paris is in the grips of a bed bug apocalypse.
04:04It's an icky situation.
04:05The mayor of Paris is declaring no one is safe from bed bugs.
04:09Madam Prime Minister, these little insects are spreading despair across our
04:14country.
04:17Bed bugs in Paris, public enemy number one.
04:21A wave of panic is gripping the country.
04:23Bed bugs are proliferating everywhere in daily life.
04:26They are making life hell for the millions of our fellow citizens who are infested.
04:30If you've never met a bed bug, lucky you.
04:33They're blood sucking parasites with a nasty bite.
04:37So, why is it that they're flooding into Paris ahead of the Olympic tourists?
04:46So, looking at social media, it's telling me that this influx in bed bugs coincides with
04:51the arrival of thousands of Ukrainian refugees.
04:54It is true, France is accepting a lot of people from Ukraine.
04:58They're fleeing the Russian invasion.
05:00It's also true that every year in France, the number of bed bugs rises and falls along
05:06with the seasons.
05:10The bed bug crisis was a social media beat up.
05:14When French authorities traced the bed bug and Ukrainian refugees' story, they found
05:20it was a fiction spread by Russian agents to turn public sentiment against the refugees.
05:25Here's the post that's been circulating on a telegram channel since October 2nd.
05:31Russian sites even created counterfeit versions of respected French publications, mimicking reports
05:38on the bed bug crisis.
05:43Russia is alleged to have invented and fed this fear online, flooding social
05:47networks with false information through bots, accounts created by machines,
05:52to destabilise France because of its strong support for Ukraine.
06:00This whole campaign demonstrates almost perfectly the genius of modern disinformation warfare.
06:06All you need is to isolate an issue with a hint of truth to it.
06:10Make sure you've got all the ingredients to make it go viral and then amplify it for
06:14good measure.
06:15You see, by the time France realised it was Russia pushing all of this stuff, refugee tensions
06:20had been stoked, and the host nation, well, they were more than a bit embarrassed.
06:34The goal of Russian disinformation is not to make you believe anything.
06:41It is to break open the fracture lines of society to weaken from within.
06:54These disinformation, or hybrid, wars are being waged across the world.
07:00If you're on social media, you may well be inadvertently consuming or even sharing this content.
07:07These campaigns have been identified as major global threats to peace and democracy.
07:13And while they're not physically devastating like conventional wars,
07:17the deliberate spread of false information targeting a country's population is considered a form of warfare.
07:39During the Second World War, this part of London was bombed night after night.
07:43Churchill was hunkered down in his underground war rooms, directly beneath us, in fact.
07:49The air raids were ear-piercing, people were shaken from their beds.
07:54And although it's been some 80 years since the air raid sirens have rung out across London,
07:58warning of an attack, that doesn't mean this city's been free from invasion.
08:03Today, though, the threat comes through firewalls and disinformation is spread to create confusion and fear.
08:15It's the job of counterintelligence experts scouring online activity to identify and expose disinformation attacks.
08:24Ben Strick does just that. He's an Australian based in London.
08:29He believes we need to recognise how the mechanics of warfare are fundamentally shifting.
08:35I mean, what do you see when you see this piece of equipment right here?
08:39A very big rocket, that's what I see.
08:42Probably a very expensive rocket too, right?
08:44So I see something that costs a lot of money.
08:47Its intent is to cause harm, maybe to target a military unit or a strategic objective, right?
08:53The research to create this is hundreds of millions.
08:57But a series like a campaign online could be so much cheaper to target the enemy and cause, you know,
09:05damage financially.
09:06It can be done at scale with, you know, a cheap subscription to an AI platform en masse and a
09:13couple of hundred dollars.
09:14The lieutenants in a hybrid war can wage their battle from the comfort of their own homes.
09:20Their soldiers can be anonymous trolls or machine-generated bots.
09:26Bots are given their own accounts and identities which trick us into thinking they're real.
09:32In 2024 alone, Meta identified 4.23 billion fake Facebook accounts.
09:40Every year, thousands of Australians are scammed out of millions of dollars.
09:49I want to show you how a campaign can be waged online technically.
09:54Ben Strick busts disinformation campaigns across the world.
09:59He wants to show me an example from much closer to home.
10:02A campaign generated in Indonesia to undermine the autonomous province of West Papua's bid for independence.
10:11This is in West Papua.
10:13So it's an independent area under Indonesian occupation.
10:22So there were large protests that happened in August 2019 on the streets.
10:28These were the sorts of videos that were being uploaded online.
10:31And so the Indonesian government shut off the internet.
10:36They shut off all connectivity for people in West Papua.
10:39And I think they shut it off as a way to stop information flow, right?
10:43So that no one could access social media channels.
10:45But the thing is, when we saw it shut down, we saw Indonesian accounts sharing videos like this.
10:51What is Indonesia trying to hide in West Papua?
10:54It's trying to hide its great budget decisions,
10:56its great work with schools and things like that.
10:59And it was actually being shared by accounts like this guy called Marko267.
11:03He's a friendly white guy and he's posting saying what are some of the secrets that Indonesia
11:08have been hiding in West Papua.
11:10He's using hashtags that are not really relevant like West Papua Genocide and Free West Papua.
11:16But it's kind of odd for me because it's not only Marko uploading it, it's hundreds of accounts
11:22all posting the exact same message.
11:25And this is not retweets, this is actual individual?
11:28No.
11:28Separate accounts.
11:30So let me take you back to Marko267.
11:33Now I do something like an image reverse search on his profile picture.
11:37Yeah.
11:37Marko looks like a friendly Western white guy, but he's also used in dating forums,
11:42in stock photo websites, in ski resort hotels.
11:46This is a fake account.
11:49And so what I did was I collected time codes of Marko's posting activity and his friends.
11:55And you can see this repeat pattern.
11:57Bella now, one of his friends, is posting at 0.3256, 9.3256, 8.3256.
12:04They're posting on a pattern.
12:06These are bots.
12:07And if we have a look at one of Marko's friends, which is West PapuaID,
12:11that discusses fact and data in West Papua.
12:15Sounds like a legitimate website.
12:16You can even see there's articles.
12:18And these articles talk about, again, the great work that Indonesia is doing.
12:23We start to think, well, okay, who made this website?
12:26And one of the tools that we have is metadata.
12:31Who registered the website?
12:33This person left their real name, their phone number, their email address
12:38and everything like that on this website.
12:40I pop that phone number into WhatsApp
12:42and immediately I have someone that owns that website.
12:45And simply doing an image reverse search, I'm able to find that person's LinkedIn account.
12:51And now we've identified a company.
12:54That company owns all of these websites.
12:56It's one marketing firm in Indonesia.
12:59We shared these findings with Facebook and with Twitter.
13:03And Facebook published a huge takedown of a network.
13:08Twitter had taken down 795 accounts as well.
13:11And we noticed that the website had folded up.
13:15Thank God.
13:15This small marketing firm in Jakarta was closed overnight after we published that.
13:21Wow.
13:22That's impressive.
13:23Cheers, mate.
13:27These campaigns go by various names.
13:30Hybrid threats, cognitive warfare, or simply propaganda.
13:36Even when there's a pause in the bombing, there's no lull in the propaganda war.
13:42Propaganda has existed for as long as the powerful have sought to manipulate public thinking.
13:47It's how it can be spread that's changed.
13:50Psychological warfare is a major part of the American and South Vietnamese effort.
13:55With massive leaflet drops urging them to defect and join the government side.
14:01Our information systems were once ruled by the printing presses.
14:06You're tuned to Radio Australia, the overseas service of the Australia...
14:10In the 20th century, radio expanded audiences across the oceans.
14:14Radio Australia, the overseas service...
14:16This century, anyone with an internet connection can instantly deliver information or disinformation to billions.
14:26I love the idea of using technology to facilitate consultation, engagement, a greater sense of democracy.
14:34How social media platforms currently work is to not create those kinds of conditions.
14:39Every technology comes with advantages and disadvantages.
14:43And the disadvantage here is that it allowed a lot of people to advance their own personal, professional, commercial or
14:51political purposes.
14:59Russia leads the world in creating clandestine hybrid threats, targeting its neighbours.
15:06Just like the bedbugs campaign, hybrid threats weaponise existing tensions to fuel public fear.
15:14Now, if you're anything like me at this point, you might be thinking, hybrid threats? Somewhat confusing.
15:19What exactly is the objective? Where do they come from? Who and what are they targeting?
15:25And how on earth would you ever know for sure?
15:27And I suppose that's the point. It's confusion by design.
15:31So, that's why I've come here to the border between Poland and Belarus,
15:35where the threat is a bit more tangible.
15:37It involves real human lives and one very big fence.
15:44This deep primeval forest on Poland's eastern border has become a death trap for people who believe the lies they've
15:53been sold online.
15:55Like promises of freedom and opportunity.
16:01In 2021, thousands of migrants travelled from the Middle East and North Africa via Russia and Belarus to the Polish
16:09border.
16:20Polish guards struggled to hold them back.
16:25They kept on coming.
16:28By October, numbers had swelled to tens of thousands of people.
16:32Illegal border crossing is forbidden. You will face criminal charges.
16:40Despite waves of migrants into Europe, only a trickle ever attempted this crossing.
16:46So, why the sudden surge?
16:49The answer is all over social media.
16:52Messages, many in Arabic, offering flights to Minsk, the capital of Belarus, and taxis to the border.
16:59Russian travel agents promise Polish visas to migrants across Africa.
17:05According to the Polish government, this is a hybrid threat campaign created by Belarus and Russia to manufacture a migration
17:14crisis.
17:15A claim both countries deny.
17:18The European Union says Russia works with its proxy state Belarus to disrupt democracy in Poland and broader Europe.
17:26This is a hybrid attack of an authoritarian regime to try to destabilise democratic neighbours and this will not succeed.
17:41Over the past five years, tens of thousands of migrants have tried to cross this border.
17:47But the Polish government greets them with razor wire, a five-metre fence, and border guards pushing them back to
17:54Belarus.
17:57The hybrid threat has worked just as intended.
18:01Anti-migrant anger is on the rise across Poland.
18:04Far-right groups claim the country is now flooded with unconstrained migration.
18:10But the facts just don't support that.
18:13Only a small proportion of those arriving are granted asylum in Poland.
18:19Among them, a young North African who's agreed with the promise of anonymity to tell me how he got here.
18:25How did you know that you could get to Europe by coming to Russia and then Belarus?
18:31Who told you that this was a pathway?
18:40Which social media? TikTok, Facebook, Instagram?
18:43I use only Facebook.
18:45Facebook, OK.
18:46The visa that you had for Russia, was it a tourist visa or a business visa?
18:52Tourist like that.
18:53Tourist visa, for three months?
18:55It was three months, yes.
18:56And when you were in Belarus, did any of the border guards or police tell you that it's safe to
19:02go through to Poland?
19:03Did they encourage you to go to Poland?
19:07Go to Poland, go to Europe.
19:10Go to Europe, yes.
19:12That was what they said?
19:13Yes.
19:15Did you think it would be safe to travel this way?
19:19We cross the jungle.
19:21No safety.
19:27Migrants can spend days, weeks, even months in the forest and swamplands on both sides of the border.
19:35The border with Belarus is in about eight kilometres distance.
19:40All right, shall we walk?
19:41Yeah, we can walk.
19:44I got a message from Telstra welcoming me to Belarus.
19:49Alexandra Kushinovska and Anna Albot have allowed me to join them in a section of forest where migrants often hide
19:55after crossing the border.
19:58So it's pretty rough. You're not really following trails. You're just literally walking off into the bush.
20:02No. Yes.
20:04These women are part of a network of volunteers who can't offer people safe passage, but they can deliver life
20:11-saving supplies.
20:13We would come here with dry clothes, shoes, with food.
20:17Do you find that you get lost?
20:19We all the time use GPS.
20:21Yeah, right.
20:22So we do get lost, but not so bad.
20:25Yeah, right.
20:28There might be people here hiding from us.
20:30Really?
20:32Mm-hmm.
20:33Oh, those boots.
20:34Yeah, it's a boot.
20:35There's a jacket just back there.
20:37Yeah, and here.
20:38More there.
20:39It's unbelievable how much there is.
20:41Everywhere in front of us here.
20:43And a jacket here, or actually a sleeping bag.
20:47Oh, and a backpack.
20:48Oh, wow.
20:49Sometimes if you run away, you leave everything behind.
20:53It must be so bewildering arriving into this.
20:57Like, I just don't know what you'd make of it if you'd arrived over a fence from North Africa or
21:02the Middle East and landed here in the middle of a swamp.
21:07Unbelievable.
21:12Oh, sorry.
21:14Our alarm phone.
21:15OK.
21:16Hello.
21:17Hello, hello.
21:19Yes.
21:19Yes.
21:20Yes.
21:21Mm-hmm.
21:22Mm-hmm.
21:23So this is an alert phone call they're getting from the other activists.
21:27Mm-hmm.
21:30Thanks.
21:31They've just received a request for help from two people from Afghanistan.
21:35And they're waiting right now for their individual needs.
21:40So is this safe for you?
21:42Is it safe for the migrants to be in contact with you?
21:46For us it is safe and for them it's an additional risk because more people around them, the more chance
21:53that the drones or Polish border guards will see them.
21:56So they wouldn't contact us if they didn't need us.
21:59And what would happen to them if Polish border guards find them?
22:02They'd be sent back?
22:03They would be immediately pushed through to the other side.
22:05OK.
22:06Brutally.
22:08Every day these women come face to face with the human cost of hybrid warfare.
22:13This is the wound of the...
22:15Oh, wow.
22:16So it's really...
22:16And that's on the heel?
22:17On the heel, yes.
22:18So it's quite a huge cut in the heel.
22:20This is really big.
22:21Yeah.
22:22They've met with hundreds of people fleeing authoritarian regimes in search of asylum.
22:29But less than 10% of migrants on this border have been officially accepted in Poland.
22:34The majority have been pushed back to Belarus.
22:38Poland has suspended the right to apply for international protection at the border with Belarus.
22:44So they have to do everything possible to avoid the contact with Polish border guards or soldiers.
22:51Let's just go.
22:52And maybe let's just give up...
22:53Other volunteers' meters.
22:58They've packed supplies, including medical gear, to treat the men's injuries.
23:05So that we can move quickly, my crew has stayed behind.
23:08I'm now on camera and sound.
23:12This is funny.
23:14It's a camera.
23:15360 degrees.
23:17So front and back.
23:19And it has audio as well.
23:22So we just need to behave very carefully, not making too much noise and not being seen.
23:30We should even avoid being seen by random people.
23:33Because you never know who will be willing to help and who will call the border guard.
23:38We'll hop.
24:25Anna shares clothes, shoes, hot soup and water.
24:39It's a very big wound, it's very deep, he needs proper medical attention.
25:00We use a translation app to ask about their journey.
25:20The men are nervous and want to keep moving.
25:28Anna and Alexandra deliberately don't ask where the men are headed.
25:34They are unlikely to ever meet again.
25:39She wrote in a translator that they are very grateful for our help.
25:47It's always difficult to receive these words of thankfulness because this situation shouldn't happen and we can't do a lot.
26:05The death of democracy happens by a thousand cuts where you're bleeding from every cut and you don't know how
26:12weak you are until you die.
26:16It's happening everywhere, every place where the tech is ruling us.
26:25Hybrid threats and disinformation campaigns don't just target foreign countries.
26:31Most of it is actually designed to work internally to manipulate opinion and maintain power at home.
26:44We used to think revolutions happened when people took to the streets or that democracies were toppled through bloody military
26:51coup.
26:52Today we can all sense that those lines are blurring between democracy and autocracy and dictatorship.
26:58And for anyone that wants to influence what you think there's a pretty good opportunity every time you reach into
27:04your pocket.
27:12To understand disinformation's impact on democracy, I've come to the Philippines.
27:20It's an Indo-Pacific neighbour of ours, offering a cautionary tale.
27:26The Philippines has been crowned the social media capital of the world.
27:30People here devote an average of three and a half hours a day to scrolling.
27:34That's an hour longer than the world average.
27:38And Facebook is where most people get their news.
27:42So the end goal is keep you scrolling.
27:44Hack our biology, change the way we feel to change our lives.
27:47Change the way we see the world, to change the way we act, to change the way we vote.
27:58In just a couple of days, more than 60 million Filipinos are going to cast their vote.
28:02It's the nation's midterm elections.
28:06So you're voting, where do you get your information from?
28:09Social media is much easier when we select the candidates.
28:12Do you think that facts matter when it comes to elections in the Philippines?
28:18Well, facts matter for me.
28:19It's hard to find the facts.
28:21They are already running.
28:23Oh, the same face, the same family, the same people.
28:31Politics here is dominated by two warring families, Marcos and Duterte.
28:39Once allies, they now work tirelessly to backstab and discredit each other.
28:44Their weapon of choice is the modern day dagger, online disinformation.
28:55Former President Rodrigo Duterte long ago recognised the role of technology to help maintain authoritarian power.
29:03If you destroy my country, I will kill you.
29:07Over his six-year term, he ran an indiscriminate war on drugs, which left thousands dead.
29:15Including 20 journalists.
29:18Just because you are a journalist, you are not exempt from the assassination.
29:23If you are the son of a bitch.
29:27Independently, reporting facts in the Philippines can be dangerous.
29:32There's one news organisation here that will not be silenced.
29:36Founded back in 2011, it is small, independent, online.
29:41Rappler is a digital news outlet which began its life on Facebook.
29:46What are we fighting?
29:47First, tech, social media, corruption.
29:52Rappler's co-founder, Maria Ressa, has won a swag of awards, including the Nobel Peace Prize,
29:59for her commitment to press freedom.
30:02Let's have fun.
30:03Please, let's keep having hope.
30:05Rappler has created and now publishes on its own innovative app.
30:10It works much like a chat app and has freed the news outlet from a reliance on big tech platforms.
30:16I drank the Kool-Aid and believed that social media could be an enabling factor for better governance.
30:23And so we thought there was tremendous potential and it worked for a while.
30:27Are you talking about a moment in time where there was just this great hope?
30:31Yes.
30:31Like the Arab Spring, there was this idea that social media was going to democratise the message.
30:39Give the people more power, right?
30:41That was the idea, that it was an enabler for democracy.
30:47In the early 2010s, the Arab Spring began with demonstrations across the Arab world to oust repressive leaders.
30:55And I was there, reporting at the time.
30:57This is a revolution.
30:59Yeah, this is an Egyptian revolution.
31:01We don't want this government.
31:03The crowds organised themselves via social media.
31:06They called them Twitter revolutions.
31:09And it felt like a powerful tool, creating real political change.
31:14Egypt was so incredible because they were leaderless networks, right?
31:19And it was so exciting that the people could work together to gain more power.
31:27These leaderless networks were able to share information online.
31:31But then from 2014, algorithms began promoting content that would boost engagement.
31:38The messaging from grassroots campaigns was eclipsed by more sensational content.
31:44The public information ecosystem literally was rewarding lies, spreading lies six times faster than facts.
31:53If you lace it with fear, anger and hate, it spreads even faster.
31:57Aguai!
31:59For Duterte, algorithms presented an opportunity.
32:02He could now flood social media with disinformation that justified his actions and discredited his critics.
32:10I repeated, f*** you.
32:18He also encouraged an orchestrated hate campaign against Maria Ressa.
32:23In 2016, exponential lies on social media attacking us.
32:30I was getting an average of 90, 9-0 hate messages per hour.
32:36It's dehumanising.
32:37And then 2019 was the beginning of the arrests.
32:42The chief executive of one of the Philippines most popular news sites has been arrested for the seventh time.
32:49Maria Ressa has been found guilty of cyber libel charges and faces up to six years in prison.
32:54I'd say the tipping point when I realised the world had really changed was when I was arrested in this
33:00newsroom.
33:01It's a shock.
33:03It's a shock.
33:03It's a shock, but we're going.
33:05Press freedom is the foundation of every single right of every Filipino to the truth.
33:1210 arrest warrants in a little over a year.
33:15In 2020, I was convicted of cyber libel.
33:18We're still fighting this at the Supreme Court.
33:21Can democracy exist without shared facts?
33:25Absolutely not.
33:27No.
33:28The tearing down of democracy began by tearing down facts.
33:32Democracy is taking that shared reality where we agree on what the facts are so that we can mobilise, move
33:40together.
33:40In the past, it used to be journalists. We were the gatekeepers.
33:46Maria Ressa's clarion call is echoed by journalists across the world.
33:51I don't think democracy can survive without a free and independent press. It never has.
33:56Marty Baron is a celebrated American newspaper editor.
34:00He's been pivotal to the success of papers like The Washington Post and The Boston Globe.
34:06He led The Globe's investigation into clerical sex abuse, immortalised in the Oscar-winning film Spotlight.
34:14Journalism can influence ordinary people's lives for the better.
34:18It is essential to democracy.
34:20Just telling the truth, in too many countries, it will get you killed.
34:26Our mission is holding powerful individuals and institutions accountable, particularly those who are in government.
34:35When you have aspiring autocrats, one of the first things they do is try to crush the free and independent
34:39press.
34:40We are fighting the fake news.
34:42It's fake.
34:43Phony.
34:43Fake.
34:44CNN is scum.
34:45It's your fake news.
34:46ABC is one of the worst.
34:47I have to be honest with you.
34:49They go after the press first, and there will be nobody to hold them to account.
34:53And when you look around the world at authoritarian governments and how they came into existence, you see that one
34:59of the first things they did was try to destroy a free and independent press.
35:03In my opinion, you are hurting Australia very much right now.
35:06They want to get along with me.
35:14Traditional media organisations bear responsibility for what they publish.
35:18There's at least a group of people who are professionally trained, they're not perfect, but they're dedicated to actually verifying
35:25the facts.
35:26And it's not just because of the laws, it's because we actually feel morally responsible for what we publish.
35:33I don't think it's enough now to just say, you should trust us because we're journalists.
35:37But I think it's also fair to say that there are a set of people that exploit the trust gap
35:42with mainstream media organisations for more nefarious purposes.
35:46Something's wrong with the press, guys.
35:48Journalism is dead.
35:49Like, what the heck is going on?
35:51Citizen journalism is the future.
35:53By the people, for the people.
36:01The social media platforms that do have a very significant dominance in the information ecosystem have a business model that
36:09prioritises polarising content.
36:13They're not designed to facilitate quality information distribution.
36:18X or Twitter is now just probably the largest disseminator of false information in the world right now.
36:28And that's why we have a society where we can't agree on a common set of facts.
36:32That's why we have a society where we can't even agree on how to determine what a fact is.
36:36And just remember, what you're seeing and what you're reading is not what's happening.
36:44I don't really want government to be in the business of dictating what's true and what's false.
36:50That's a very dangerous place to be.
36:57If you're watching this and you're thinking to yourself journalists are the problem, or at least part of it, you
37:02may well be right.
37:03There are plenty of legitimate criticisms of us, the role we've played, in diminishing trust.
37:11The question is whether journalism can rise to meet the challenge to help fight against this assault on facts.
37:28With only 24 hours to go, election fever is building on the streets of Manila.
37:34As the vote gets closer, they stage these huge political rallies.
37:38So that's where I'm heading now.
37:54You know, when you're in the middle of this, there's something quite captivating about it.
37:57It's kind of hard to know where the politics actually is, but there's a heck of a lot of people
38:03engaged.
38:04Let's have a look.
38:06This is a pro-Marcos rally.
38:09Ferdinand Bongbong-Marcos is the nation's current president.
38:12So tell me, where do you get your information about politics?
38:15I watch on Facebook.
38:19And do you believe everything you see and hear on Facebook?
38:22Yes.
38:23So you can spot the fake news, can you?
38:24Yes.
38:25OK.
38:26In the media zone, there's over 50 social media influencers, known here as vloggers, live streaming.
38:34The Independent Electoral Commission says it's fair to assume they're all being paid for their support.
38:40I know lots of people in Philippines use social media.
38:43Is that where most people get their politics from?
38:46Five years ago, they shifted from mainstream media to vlogging.
38:51It becomes an industry.
38:52Yeah.
38:53That 3.2 million views is pretty good.
38:56Influencer, or vlogger, Karkofi hasn't confirmed whether he's paid for his devoted support of the Marcos family.
39:03People believe vloggers as compared to mainstream media.
39:09That's my observation because they look at mainstream media as biased.
39:12Which I agree.
39:13All media organizations are biased.
39:16Um...
39:16I'm pro-Marcos.
39:17So I produce content with leanings with bias to the government.
39:21Thank you, ma'am.
39:24So I focus on topics that I think this is good.
39:26This will promote the administration that I do content for that.
39:30Do you think what you do is propaganda?
39:35By definition of propaganda, I think yes.
39:39It's part of the propaganda.
39:41It's a biased content production.
39:44Yes.
39:45If the propaganda doesn't harm anybody, it doesn't harm the society, it doesn't harm the democracy,
39:51I don't see there's a problem with the propaganda.
39:57You know, historically, countries where the media disappears, the democracy disappears.
40:03But there are vloggers who are also sort of media.
40:11Prime Minister Anthony Albanese, welcome to Big Small Talk.
40:14Of course, an alliance between influencers and politicians isn't unique to the Philippines.
40:19And they have this nuclear fantasy.
40:21We'll get into that.
40:23They're opposed to renewables.
40:25In Australia, politicians must disclose paid support from influencers.
40:30But they now often appear as guests on their high-rating shows.
40:34I can't dive.
40:35Were you one of those people that dove and did the belly flop straight in?
40:38There was a little bit of that.
40:39Yeah, yeah.
40:40Delulu.
40:41I dare you to drop that in a speech.
40:43Do you reckon I could do that?
40:43Delulu with no solulu.
40:44Can you drop that in a speech?
40:45They are delulu with no solulu, Mr Speaker.
40:49The 2025 ballot has been dubbed Australia's first social media-led election.
40:55Have you hit your gardening phase yet?
40:57No, I'm not a gardener.
40:58No, me either.
40:59Yeah.
40:59You just have to observe what podcasters are doing or what influencers are doing.
41:04They're speaking in a much more informal way as if they were talking to their buddies.
41:08He's scared!
41:10I'm joking.
41:10Yep.
41:11Yep.
41:11Tenses up, shitting himself.
41:13Okay.
41:14They're much more relatable than traditional journalists tend to be and people can connect
41:19with them.
41:21I found often audiences would say, you journalists in the mainstream media, you've brought this
41:28on yourself.
41:29I don't like your facts.
41:30There's other facts.
41:31There are no alternative facts.
41:33There are no alternative facts.
41:34Because, let's say right here.
41:35But why do so many people believe there are?
41:37I guess that's what I'm getting at.
41:38Because it's easy to insidiously manipulate people today.
41:41What used to take intelligence agencies a lot more money can now be done cheaply, easily.
41:49Silicon Valley opening the gates to this kind of insidious manipulation.
41:53And the crazy thing is, people don't believe they can be manipulated.
42:00Again, it goes back to our cognitive biases.
42:03And until we realize that, we are Pavlov's dogs.
42:09The media environment, the technology environment has allowed for and encouraged greater tribalism.
42:16We feel more comfortable with our tribe.
42:18We trust them.
42:19And so, it just reinforces an instinct.
42:23It's been part of society forever.
42:33Across town, Duterte's supporters are enjoying what feels more like a pop concert than a political rally.
42:41Vloggers are out in force here too.
42:43And they're upset that Rodrigo Duterte isn't here.
42:47He's in The Hague.
42:48The former Filipino leader, Rodrigo Duterte, is now in the custody of the International Criminal Court
42:52after being arrested on charges linked to his war on drugs.
42:58Thousands of small-time dealers, users, activists and others were killed without trial.
43:04Duterte! Duterte! Duterte!
43:07Despite incarceration, Duterte is actually vying for a mayoral seat in this election.
43:12And his daughter is Vice President.
43:14Well, this is who they're all here to see.
43:17The Vice President, Sarah Duterte.
43:19Her dad was the President.
43:21He's in The Hague.
43:22And she's at war with the current President.
43:24In the past months, my family's name has been dragged through the mud.
43:30I am not the problem of this country.
43:42Of course, the Duterte camp has its own army of full-time vloggers.
43:49Duterte! Duterte!
43:51Between 10 and 30 million US dollars is funneled to vloggers and influencers here each election cycle.
44:00Before the platforms reward them, they're rewarded by money here.
44:05People who can make money from disinformation.
44:07People see the opportunity to make money.
44:09They see the opportunity to acquire power by exploiting the media environment as it exists today.
44:14People are using it for their own commercial ends.
44:17They're using it for their own political ends.
44:18We don't have a counterweight to that.
44:21And that's where we are right now.
44:25Stand by 10 seconds.
44:2710, 9, 8, 7, 6, 5, 4, 3...
44:33Hello and welcome. I'm Maria Ressa.
44:36And you're watching Rappler's special coverage of the 2025 national and local elections.
44:42And the polls have officially opened.
44:4568.4 million Filipinos are registered to vote.
44:49President Ferdinand Marcos declared today a holiday.
44:52So Filipinos, go out and vote!
44:56Voters don't disappoint.
44:58It's the highest turnout in the history of midterm elections.
45:01More than 80%.
45:06Joining us now is Jairo Bolledo.
45:10Hi, Maria. I'm reporting live from Davao City.
45:12And all eyes on Davao City.
45:14Because of many...
45:15As results start rolling in,
45:18former President Rodrigo Duterte is delivered a sweeping victory.
45:23The Dutertes have emerged as big winners in this year's elections in Davao City.
45:28Former President Rodrigo Duterte set to be elected mayor of Davao City by a landslide despite being detained at The
45:35Hague.
45:38It's a symbolic win.
45:40A mayoral seat Duterte can't reach from his prison cell.
45:44Social media has been flooded with revisionist versions of Duterte's brutal leadership.
45:51And the voters seem to have bought it.
45:55What's happened in the Philippines has happened in many other parts of the world now.
46:04The White House has banned journalists from entering the Pentagon if they haven't signed a legal document agreeing to only
46:11report information that's been authorised by the Department of Defence.
46:14I think that the environment now is a dangerous one for our democracy.
46:20I think we have an administration that's pushing our democracy to the limits in many ways, aiming to break it.
46:2972% of the world is now under authoritarian rule.
46:33We are electing illiberal leaders democratically.
46:38All because of what's on that device.
46:40I think this is the accelerant and an enabler for digital authoritarians.
46:47Every day that democratic nations do not reclaim their rights from big tech is a day they get weaker.
46:56And a government that wins when there's no trust, the only government that survives in that environment is a dictatorship.
47:04Do you think we can win this fight?
47:06We have to wake up.
47:10We have to look at the tech.
47:13We have to admit that we have been manipulated.
47:17And we have to demand better.
47:21This is a thinking fast world.
47:24That's where the big tech companies want you to stay because that's where they make the most profit.
47:29But you need to move into the thinking slow world.
47:33And we need to solve it at scale or we will lose our agency and our humanity.
47:43Join me next time.
47:45This is very new and very weird.
47:47Are these people real?
47:48I'll be diving headfirst into AI.
47:51Is there anything about it that looks weird to you?
47:54No.
47:55Where is AI taking us?
47:56So you reckon these real?
47:58That looks fake.
48:00I was really easily tricked by it.
48:02And do we even want to go there?
48:03We don't want to sleepwalk into a future that nobody wants.
48:06They've invested hundreds of billions of dollars into this industry and they want to seek a return.
48:11Are you scared?
48:12I'm very scared.
48:18Don't wait a week when all episodes of The Matter of Facts are streaming now on ABC iview.
48:25The Matter of Facts