Transcript
00:03Hi, I'm Hamish McDonald and I'm a journalist, which means facts are my business.
00:09Where do you get your information from?
00:11Social media.
00:12Facebook.
00:12Facebook, Instagram and YouTube.
00:15Social media is where so many of us now get our information.
00:19But can we rely on it?
00:22When tech makes it so easy to create a fictional story, which can look and sound real.
00:29I went to the moon last week, don't you know?
00:31Even though it's entirely fake.
00:35Are you like so many people now, struggling to know what's true and what's false?
00:41In this series, I'm going to explore how big tech is transforming our lives and our minds.
00:47We're going through a period of radical change in the way that people consume information.
00:51We've created this monster.
00:53An algorithm is outsourcing human choice into a machine's decision.
00:59The end goal is keep you scrolling.
01:01Hack our biology, change the way we feel to change the way we see the world.
01:06Is anyone else paying attention to how all of this is affecting society?
01:12I want to understand how technology is changing the way we think.
01:16It really does question your reality.
01:20And why we're all susceptible to mis- and disinformation.
01:23We are just as vulnerable, but we think we're not.
01:27And actually, that's what makes us more vulnerable.
01:30Do you worry about what happens if we can't agree on facts?
01:34This has been an information war. Facts stop mattering.
01:37This growing sense that anything could be fake at any time.
01:42Looks like you.
01:43Just not me.
01:45The facts are so polar opposite.
01:47Which ones are the facts?
01:49We can't even agree on how to determine what a fact is.
01:52In this episode, I'm travelling far and wide to understand how disinformation disrupts us.
01:58And democracy.
01:59They're posting on a pattern.
02:01These are bots.
02:03So is this safe for you?
02:05There might be people here hiding from us.
02:08It's part of the propaganda.
02:10It's a biased content production.
02:12The tearing down of democracy began by tearing down facts.
02:17Join me on a journey right around Australia and the world
02:20to meet people determined to keep facts alive
02:23in this real-time information revolution.
02:38If you wanted to damage someone's reputation, how would you do it?
02:43Maybe the easiest way would be to start a rumour.
02:48And no-one ever needs to know it came from you.
02:54So what if you wanted to damage an entire country's reputation?
02:59It's the same principle.
03:01Let me show you how it's done.
03:04I promise there's no funny business.
03:06I want to invite you right now to come and join me in a hotel room.
03:09In Paris.
03:10Just before the Olympics.
03:22G'day.
03:23Welcome.
03:23Come in.
03:24Do you want to take a seat?
03:25Just come through here.
03:27I might move this guy out of the way for you first.
03:30There you go.
03:31Make yourself comfy.
03:32So listen.
03:33I just want you to imagine I'm one of more than 11 million people in Paris.
03:37Ahead of the 2024 Olympic Games.
03:39I've actually come a bit early.
03:41Because I've got a heap of cash to drop on this.
03:46My phone is running hot.
03:48With advice on great wine bars.
03:50How to use the metro.
03:52And uh...
03:52Ugh.
03:54Bugs.
03:54There's bed bugs everywhere.
03:55They're on the planes.
03:56They're on the trains.
03:57They're on the buses.
04:00Paris is in the grips of a bed bug apocalypse.
04:04It's an icky situation.
04:05The mayor of Paris is declaring no one is safe from bed bugs.
04:10Madam Prime Minister,
04:12these little insects are spreading despair across our country.
04:17Bed bugs in Paris, public enemy number one.
04:21A wave of panic is gripping the country.
04:23Bed bugs are proliferating in all the places of everyday life.
04:26They are making life a misery for the millions of our fellow citizens who are infested.
04:30If you've never met a bed bug, lucky you.
04:33They're blood-sucking parasites with a nasty bite.
04:37So why is it that they're flooding into Paris ahead of the Olympic tourists?
04:46So, looking at social media, it's telling me that this influx in bed bugs coincides with
04:51the arrival of thousands of Ukrainian refugees.
04:54It is true France is accepting a lot of people from Ukraine.
04:58They're fleeing the Russian invasion.
04:59It's also true that every year in France, the number of bed bugs rises and falls, along with the seasons.
05:10The bed bug crisis was a social media beat-up.
05:14When French authorities traced the bed bug and Ukrainian refugees' story,
05:19they found it was a fiction spread by Russian agents
05:22to turn public sentiment against the refugees.
05:25Here's the post that's been circulating on a Telegram channel since October 2nd.
05:31Russian sites even created counterfeit versions of respected French publications,
05:36mimicking reports on the bed bug crisis.
05:39They even usurped Western media logos.
05:43The Russians allegedly fostered and fed this fear online
05:46by flooding networks with false information
05:49through bots, accounts created by machines, to destabilise France
05:53because of its strong support for Ukraine.
05:59This whole campaign demonstrates almost perfectly the genius of modern disinformation warfare.
06:06All you need is to isolate an issue with a hint of truth to it.
06:10Make sure you've got all the ingredients to make it go viral
06:13and then amplify it for good measure.
06:15You see, by the time France realised it was Russia pushing all of this stuff,
06:19refugee tensions had been stoked and the host nation, well,
06:23they were more than a bit embarrassed.
06:34The goal of Russian disinformation is not to make you believe anything.
06:40It is to break open the fracture lines of society to weaken from within.
06:54Disinformation, or hybrid wars, are being waged across the world.
06:59If you're on social media, you may well be inadvertently consuming
07:03or even sharing this content.
07:07These campaigns have been identified as major global threats
07:11to peace and democracy.
07:13And while they're not physically devastating like conventional wars,
07:17the deliberate spread of false information
07:19targeting a country's population is considered a form of warfare.
07:39During the Second World War, this part of London was bombed night after night.
07:44Churchill was hunkered down in his underground war rooms,
07:47directly beneath us, in fact.
07:49The air raids were ear-piercing.
07:51People were shaken from their beds.
07:53And although it's been some 80 years since the air raid sirens
07:57have rung out across London warning of an attack,
08:00that doesn't mean this city's been free from invasion.
08:03Today, though, the threat comes through firewalls
08:07and disinformation is spread to create confusion and fear.
08:15It's the job of counterintelligence experts scouring online activity
08:19to identify and expose disinformation attacks.
08:24Ben Strick does just that.
08:26He's an Australian based in London.
08:29He believes we need to recognise how the mechanics of warfare
08:33are fundamentally shifting.
08:35I mean, what do you see when you see this piece of equipment right here?
08:39A very big rocket, that's what I see.
08:42Probably a very expensive rocket too, right?
08:44So I see something that costs a lot of money.
08:47Its intent is to cause harm,
08:49maybe to target a military unit or a strategic objective, right?
08:53The research to create this is hundreds of millions,
08:57but a series like a campaign online
09:01could be so much cheaper to target the enemy
09:03and cause, you know, damage financially.
09:06It can be done at scale with, you know,
09:09a cheap subscription to an AI platform en masse
09:12and a couple of hundred dollars.
09:14The lieutenants in a hybrid war can wage their battle
09:17from the comfort of their own homes.
09:20Their soldiers can be anonymous trolls
09:22or machine-generated bots.
09:26Bots are given their own accounts and identities
09:29which trick us into thinking they're real.
09:33In 2024 alone,
09:35Meta identified 4.23 billion fake Facebook accounts.
09:40Every year, thousands of Australians are scammed
09:43out of millions of dollars.
09:50I want to show you how a campaign can be waged online technically.
09:55Ben Strick busts disinformation campaigns across the world.
09:59He wants to show me an example from much closer to home.
10:03A campaign generated in Indonesia
10:05to undermine the autonomous province of West Papua's bid
10:08for independence.
10:11This is in West Papua, so it's an independent area
10:15under Indonesian occupation.
10:22So there were large protests that happened in August 2019 on the streets.
10:28These were the sorts of videos that were being uploaded online.
10:31And so the Indonesian government shut off the internet.
10:36They shut off all connectivity for people in West Papua.
10:39And I think they shut it off as a way to stop information flow, right?
10:43So that no one could access social media channels.
10:45But the thing is, when we saw it shut down,
10:48we saw Indonesian accounts sharing videos like this.
10:51What is Indonesia trying to hide in West Papua?
10:54It's trying to hide its great budget decisions.
10:56It's great work with schools and things like that.
10:59And it was actually being shared by accounts like this guy called Marko267.
11:04He's a friendly white guy and he's posting saying,
11:07what are some of the secrets that Indonesia have been hiding in West Papua?
11:10Mm-hm.
11:10He's using hashtags that are not really relevant,
11:13like West Papua Genocide and Free West Papua.
11:16But it's kind of odd for me because it's not only Marko uploading it,
11:21it's hundreds of accounts all posting the exact same message.
11:25And this is not retweets, this is actual individual.
11:28No, separate accounts.
11:29So let me take you back to Marko267.
11:33Now I do something like an image reverse search on his profile picture.
11:37Yeah.
11:37Marko looks like a friendly Western white guy,
11:39but he's also used in dating forums, in stock photo websites,
11:44in ski resort hotels.
11:46This is a fake account.
11:49And so what I did was I collected time codes of Marko's posting activity
11:54and his friends.
11:55And you can see this repeat pattern.
11:57Bella now, one of his friends, is posting at 0:32:56, 9:32:56, 8:32:56.
12:04They're posting on a pattern.
12:05These are bots.
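The time-code pattern Ben describes can be sketched in a few lines: collect each account's posting timestamps and flag any account that repeatedly posts at the exact same minute-and-second mark across different hours. This is a hypothetical heuristic for illustration, not Ben Strick's actual tooling; the account names and timestamps below are invented.

```python
from collections import Counter
from datetime import datetime

def botlike_accounts(posts, min_repeats=3):
    """Flag accounts whose posts repeatedly land on the same
    minute-and-second mark across different hours.
    (Illustrative heuristic only; invented threshold.)"""
    per_account = {}
    for account, ts in posts:
        # Keep only the MM:SS portion of each timestamp.
        mmss = datetime.fromisoformat(ts).strftime("%M:%S")
        per_account.setdefault(account, []).append(mmss)
    flagged = set()
    for account, marks in per_account.items():
        # How often does this account's most common MM:SS mark recur?
        most_common_count = Counter(marks).most_common(1)[0][1]
        if most_common_count >= min_repeats:
            flagged.add(account)
    return flagged

# Invented example data: one scheduled bot, one organic poster.
posts = [
    ("Marko267", "2019-08-21T00:32:56"),
    ("Marko267", "2019-08-21T08:32:56"),
    ("Marko267", "2019-08-21T09:32:56"),
    ("organic_user", "2019-08-21T07:14:02"),
    ("organic_user", "2019-08-21T19:48:31"),
]
print(botlike_accounts(posts))  # → {'Marko267'}
```

Real investigations combine many such signals (shared profile photos, identical text, registration dates) rather than relying on timing alone.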
12:07And if we have a look at one of Marko's friends, which is West Papua ID,
12:11that discusses fact and data in West Papua.
12:15Sounds like a legitimate website.
12:16You can even see there's articles.
12:18And these articles talk about, again, the great work that Indonesia is doing.
12:23We start to think, well, okay, who made this website?
12:26And one of the tools that we have is metadata.
12:30Who registered the website?
12:33This person left their real name, their phone number, their email address,
12:38and everything like that on this website.
12:40I pop that phone number into WhatsApp and immediately I have someone that owns that website.
12:45And simply doing an image reverse search, I'm able to find that person's LinkedIn account.
12:51And now we've identified a company.
12:53That company owns all of these websites, this one marketing firm in Indonesia.
12:59We shared these findings with Facebook and with Twitter.
13:03And Facebook published a huge takedown of a network.
13:07Twitter had taken down 795 accounts as well.
13:11And we noticed that the website had folded up.
13:15It had gone.
13:16This small marketing firm in Jakarta was closed overnight after we published that.
13:21Wow. That's impressive.
13:23Cheers, mate.
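The metadata step Ben describes, pulling a registrant's name, phone, and email out of a domain's WHOIS record, can be sketched like this. The raw record below is entirely made up; real WHOIS responses vary by registrar and are often privacy-redacted.

```python
import re

# Hypothetical raw WHOIS record for a suspicious domain.
# All values here are invented placeholders.
RAW_WHOIS = """\
Domain Name: example-westpapua-site.com
Registrant Name: J. Doe
Registrant Phone: +62.000000000
Registrant Email: owner@example.com
"""

def registrant_details(raw):
    """Extract registrant name/phone/email fields from WHOIS text."""
    fields = {}
    for key in ("Name", "Phone", "Email"):
        m = re.search(rf"Registrant {key}:\s*(.+)", raw)
        if m:
            fields[key.lower()] = m.group(1).strip()
    return fields

print(registrant_details(RAW_WHOIS))
```

From details like these, an investigator can pivot to messaging apps, social profiles, and company registries, as Ben did with WhatsApp and LinkedIn.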
13:28These campaigns go by various names.
13:30Hybrid threats, cognitive warfare, or simply propaganda.
13:37Even when there's a pause in the bombing, there's no lull in the propaganda war.
13:42Propaganda has existed for as long as the powerful have sought to manipulate public thinking.
13:47It's how it can be spread that's changed.
13:50Psychological warfare is a major part of the American and South Vietnamese effort.
13:55With massive leaflet drops urging them to defect and join the government side.
14:01Our information systems were once ruled by the printing presses.
14:06You're tuned to Radio Australia, the overseas service of the Australia...
14:10In the 20th century, radio expanded audiences across the oceans.
14:16This century, anyone with an internet connection can instantly deliver information or disinformation to billions.
14:26I love the idea of using technology to facilitate consultation, engagement, a greater sense of democracy.
14:34How social media platforms currently work is to not create those kinds of conditions.
14:38Every technology comes with advantages and disadvantages.
14:43And the disadvantage here is that it allowed a lot of people to advance their own personal, professional, commercial or
14:51political purposes.
14:58Russia leads the world in creating clandestine hybrid threats targeting its neighbours.
15:06Just like the bedbugs campaign, hybrid threats weaponise existing tensions to fuel public fear.
15:14Now, if you're anything like me at this point, you might be thinking, hybrid threats? Somewhat confusing.
15:19What exactly is the objective? Where do they come from? Who and what are they targeting?
15:25And how on earth would you ever know for sure?
15:27And I suppose that's the point. It's confusion by design.
15:31So, that's why I've come here to the border between Poland and Belarus, where the threat is a bit more
15:36tangible.
15:37It involves real human lives and one very big fence.
15:44This deep primeval forest on Poland's eastern border has become a death trap for people who believe the lies they've
15:53been sold online.
15:55Like promises of freedom and opportunity.
16:01In 2021, thousands of migrants travelled from the Middle East and North Africa via Russia and Belarus to the Polish
16:09border.
16:20Polish guards struggled to hold them back.
16:25They kept on coming.
16:27By October, numbers had swelled to tens of thousands of people.
16:33Illegal border crossing is forbidden.
16:37You will face criminal charges.
16:40Despite waves of migrants into Europe, only a trickle ever attempted this crossing.
16:46So, why the sudden surge?
16:49The answer is all over social media.
16:52Messages, many in Arabic, offering flights to Minsk, the capital of Belarus, and taxis to the border.
17:00Russian travel agents promise Polish visas to migrants across Africa.
17:05According to the Polish government, this is a hybrid threat campaign created by Belarus and Russia to manufacture a migration
17:14crisis.
17:15A claim both countries deny.
17:18The European Union says Russia works with its proxy state Belarus to disrupt democracy in Poland and broader Europe.
17:26This is a hybrid attack of an authoritarian regime to try to destabilise democratic neighbours and this will not succeed.
17:41Over the past five years, tens of thousands of migrants have tried to cross this border.
17:47But the Polish government greets them with razor wire, a five-metre fence, and border guards pushing them back to
17:54Belarus.
17:57The hybrid threat has worked just as intended.
18:01Anti-migrant anger is on the rise across Poland.
18:04Far-right groups claim the country is now flooded with unconstrained migration.
18:10But the facts just don't support that.
18:13Only a small proportion of those arriving are granted asylum in Poland.
18:19Among them, a young North African who's agreed with the promise of anonymity to tell me how he got here.
18:25How did you know that you could get to Europe by coming to Russia and then Belarus?
18:31Who told you that this was a pathway?
18:40Which social media? TikTok, Facebook, Instagram?
18:43I use only Facebook.
18:45Facebook, okay.
18:46The visa that you had for Russia, was it a tourist visa or a business visa?
18:52Tourist visa for three months?
18:55It was three months, yes.
18:56And when you were in Belarus, did any of the border guards or police tell you that it's safe to
19:02go through to Poland?
19:03Did they encourage you to go to Poland?
19:07Go to Poland, go to Europe.
19:10Go to Poland, go to Europe.
19:12That was what they said?
19:13Yes.
19:15Did you think it would be safe to travel this way?
19:19Across the jungle, no safety.
19:27Migrants can spend days, weeks, even months in the forest and swamplands on both sides of the border.
19:35The border with Belarus is in about eight kilometres distance.
19:40All right, shall we walk?
19:41Yeah, we can walk.
19:44I got a message from Telstra welcoming me to Belarus.
19:49Alexandra Kershinovska and Anna Albot have allowed me to join them in a section of forest where migrants often hide
19:55after crossing the border.
19:58So it's pretty rough, you're not really following trails, you're just literally walking off into the bush.
20:02No. Yes.
20:04These women are part of a network of volunteers who can't offer people safe passage, but they can deliver life
20:11-saving supplies.
20:13We would come here with dry clothes, shoes, with food.
20:17Do you find that you get lost?
20:19We all the time use GPS.
20:21Yeah, right.
20:22So we do get lost, but not so bad.
20:25Yeah, right.
20:28There might be people here hiding from us.
20:32Mm-hmm.
20:33Oh, there's boots.
20:34Yeah, it's a boot.
20:35There's a jacket just back there.
20:37Yeah, and here.
20:38More there.
20:39It's unbelievable how much there is.
20:41Everywhere in front of us here.
20:43And a jacket here, or actually a sleeping bag.
20:47Oh, and a backpack.
20:48Oh, wow.
20:49Sometimes if you run away, you leave everything behind.
20:53It must be so bewildering arriving into this.
20:57Like, I just don't know what you'd make of it if you'd arrived over a fence from North Africa or
21:02the Middle East and landed here in the middle of a swamp.
21:07Unbelievable.
21:12Oh, sorry.
21:14Our alarm phone.
21:15Okay.
21:16Hello?
21:18Hello, hello.
21:18There.
21:19There.
21:19There.
21:21Mm-hmm.
21:22Mm-hmm.
21:23Mm-hmm.
21:23So this is an alert phone call they're getting from the other activists.
21:27Mm-hmm.
21:31They've just received a request for help from two people from Afghanistan, and they're waiting right now for their individual
21:39needs.
21:39So is this safe for you?
21:41Is it safe for the migrants to be in contact with you?
21:46For us it is safe, and for them it's an additional risk, because more people around them, the more chance
21:53that the drones or Polish border guards will see them.
21:56So they wouldn't contact us if they don't meet.
21:58Right.
21:59And what would happen to them if Polish border guards find them?
22:02They'd be sent back?
22:03They would be immediately thrown to the other side.
22:05Okay.
22:06Brutally.
22:08Every day, these women come face to face with the human cost of hybrid warfare.
22:13This is the wound of the...
22:15Oh, wow.
22:16So it's really...
22:16And that's on the heel?
22:17On the heel, yes.
22:18So it's quite a huge cut.
22:20This is really big.
22:21Yeah.
22:22They've met with hundreds of people fleeing authoritarian regimes in search of asylum.
22:29But less than 10% of migrants on this border have been officially accepted in Poland.
22:34The majority have been pushed back to Belarus.
22:38Poland has suspended the right to apply for international protection at the border with Belarus.
22:44So they have to do everything possible to avoid the contact with Polish border guards or soldiers.
22:50Let's just go.
22:52And maybe let's just give up...
22:53Other volunteers meet us.
22:55Let's put back store and garbage here.
22:58They've packed supplies, including medical gear, to treat the men's injuries.
23:05So that we can move quickly, my crew has stayed behind.
23:08I'm now on camera and sound.
23:12This is funny.
23:14It's a camera.
23:15360 degrees.
23:17Oh.
23:18So front and back.
23:19And it has audio as well.
23:22So we just need to behave very carefully, not making too much noise and not being seen.
23:30We should even avoid being seen by random people.
23:33Because you never know who will be willing to help and who will call the border guard.
23:38You never see any at its roots.
23:42It's good.
23:43It's better that.
23:55(Indistinct conversation, partly in another language, not reliably transcribed)
24:25Anna shares clothes, shoes, hot soup, and water.
24:32It's here, it's here, it's here.
24:36It's a very big wound.
24:41It's very deep.
24:41Very deep.
24:43He needs proper medical attention.
24:45That's what you look like.
24:47Easy looking.
24:51His hands.
24:54Can I see your hands?
25:00We use a translation app to ask about their journey.
25:05(Indistinct speech in another language)
25:11So I came via Russia to Belarus six months ago.
25:20The men are nervous and want to keep moving.
25:23Nice to meet you.
25:28Anna and Alexandra deliberately don't ask where the men are headed.
25:34They're unlikely to ever meet again.
25:39She wrote in a translator that they are very grateful for our help.
25:47Yeah, it's always difficult to receive these words of thankfulness.
25:53because this situation shouldn't happen here, but we can't do a lot.
26:05The death of democracy happens by a thousand cuts, where you're bleeding from every cut and you don't know how
26:12weak you are until you die.
26:16It's happening everywhere, every place where the tech is ruling us.
26:26Hybrid threats and disinformation campaigns don't just target foreign countries.
26:31Most of it is actually designed to work internally to manipulate opinion and maintain power at home.
26:44We used to think revolutions happened when people took to the streets or that democracies were toppled through bloody military
26:51coups.
26:52Today we can all sense that those lines are blurring between democracy and autocracy and dictatorship.
26:58And for anyone that wants to influence what you think, there's a pretty good opportunity every time you reach into
27:04your pocket.
27:12To understand disinformation's impact on democracy, I've come to the Philippines.
27:19It's an Indo-Pacific neighbour of ours, offering a cautionary tale.
27:26The Philippines has been crowned the social media capital of the world.
27:30People here devote an average of three and a half hours a day to scrolling.
27:34That's an hour longer than the world average.
27:38And Facebook is where most people get their news.
27:42So the end goal is keep you scrolling, hack our biology, change the way we feel,
27:47to change the way we see the world, to change the way we act, to change the way we vote.
27:57In just a couple of days, more than 60 million Filipinos are going to cast their vote.
28:02It's the nation's midterm elections.
28:06So you're voting, where do you get your information from?
28:09Social media is much easier when we select the candidates.
28:13Do you think that facts matter when it comes to elections in the Philippines?
28:17Well, facts matter for me. It's hard to find facts.
28:21They are already running. Oh, the same face, the same family, the same people.
28:31Politics here is dominated by two warring families, Marcos and Duterte.
28:38Once allies, they now work tirelessly to backstab and discredit each other.
28:45Their weapon of choice is the modern-day dagger, online disinformation.
28:55Former President Rodrigo Duterte long ago recognised the role of technology
29:00to help maintain authoritarian power.
29:02If you destroy my country, I will kill you.
29:07Over his six-year term, he ran an indiscriminate war on drugs,
29:12which left thousands dead.
29:15Including 20 journalists.
29:18Just because you are a journalist,
29:20you are not exempted from assassination.
29:23If you are the son of a bitch.
29:27Independently reporting facts in the Philippines can be dangerous.
29:32There's one news organisation here that will not be silenced.
29:36Founded back in 2011, it is small, independent, online.
29:41Rappler is a digital news outlet which began its life on Facebook.
29:46What are we fighting?
29:48First, tech, social media, corruption.
29:51Congratulations.
29:52Congratulations.
29:52Rappler's co-founder, Maria Ressa, has won a swag of awards,
29:56including the Nobel Peace Prize, for her commitment to press freedom.
30:02Let's have fun.
30:03Please, let's keep having hope.
30:05Rappler has created and now publishes on its own innovative app.
30:10It works much like a chat app and has freed the news outlet
30:14from a reliance on big tech platforms.
30:16I drank the Kool-Aid and believed that social media
30:20could be an enabling factor for better governance.
30:22And so we thought there was tremendous potential
30:25and it worked for a while.
30:27Are you talking about a moment in time
30:29where there was just this great hope?
30:31Yes.
30:31Like the Arab Spring.
30:32There was this idea that social media was going to democratise the message.
30:39Give the people more power, right?
30:41That was the idea, that it was an enabler for democracy.
30:47In the early 2010s, the Arab Spring began with demonstrations
30:51across the Arab world to oust repressive leaders.
30:55And I was there, reporting at the time.
30:57This is a revolution.
30:59Yeah, this is an Egyptian revolution.
31:01We don't want this government.
31:03The crowds organised themselves via social media.
31:06They called them Twitter revolutions.
31:09And it felt like a powerful tool, creating real political change.
31:14Egypt was so incredible because they were leaderless networks, right?
31:19And it was so exciting.
31:22That the people could work together to gain more power.
31:27These leaderless networks were able to share information
31:30online.
31:31But then from 2014, algorithms began promoting content
31:36that would boost engagement.
31:38The messaging from grassroots campaigns
31:40was eclipsed by more sensational content.
31:45The public information ecosystem literally was rewarding lies,
31:50spreading lies six times faster than facts.
31:52If you lace it with fear, anger and hate, it spreads even faster.
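The ranking shift Maria Ressa describes can be illustrated with a toy feed scorer: if a platform orders posts purely by predicted engagement, and outrage-laden posts reliably draw more reactions and shares, they float to the top regardless of accuracy. The weights and example posts below are invented purely for illustration and don't represent any real platform's algorithm.

```python
def engagement_score(post):
    """Toy ranking signal: reactions, shares and comments,
    with shares weighted most heavily (invented weights)."""
    return post["reactions"] + 3 * post["shares"] + 2 * post["comments"]

# Invented example posts: sensational content typically attracts
# far more engagement than a sober fact-check.
posts = [
    {"title": "Dry fact-check of a viral claim",
     "reactions": 120, "shares": 10, "comments": 15},
    {"title": "Outrage-bait conspiracy post",
     "reactions": 300, "shares": 200, "comments": 180},
]

feed = sorted(posts, key=engagement_score, reverse=True)
print(feed[0]["title"])  # the outrage-bait post ranks first
```

Nothing in the scoring function knows or cares whether a post is true; accuracy simply isn't an input, which is the structural problem the interviewees are pointing at.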
31:58For Duterte, algorithms presented an opportunity.
32:02He could now flood social media with disinformation
32:05that justified his actions and discredited his critics.
32:18He also encouraged an orchestrated hate campaign against Maria Ressa.
32:24In 2016, exponential lies on social media attacking us.
32:30I was getting an average of 90,
32:33nine zero hate messages per hour.
32:35It's dehumanising.
32:37And then 2019 was the beginning of the arrests.
32:42The chief executive of one of the Philippines' most popular news sites
32:46has been arrested for the seventh time.
32:49Maria Ressa has been found guilty of cyber libel charges
32:52and faces up to six years in prison.
32:54I'd say the tipping point when I realised the world had really changed
32:58was when I was arrested in this newsroom.
33:02It's a shock, but we're going.
33:05Press freedom is the foundation of every single right
33:09of every Filipino to the truth.
33:12Ten arrest warrants in a little over a year.
33:15In 2020, I was convicted of cyber libel.
33:18We're still fighting this at the Supreme Court.
33:21Can democracy exist without shared facts?
33:25Absolutely not. No.
33:28The tearing down of democracy began by tearing down facts.
33:32Democracy is taking that shared reality
33:35where we agree on what the facts are
33:38so that we can mobilise, move together.
33:40In the past, it used to be journalists.
33:43We were the gatekeepers.
33:46Maria Ressa's clarion call is echoed by journalists across the world.
33:51I don't think democracy can survive without a free and independent press.
33:54It never has.
33:56Marty Baron is a celebrated American newspaper editor.
34:00He's been pivotal to the success of papers like
34:02The Washington Post and The Boston Globe.
34:06He led the Globe's investigation into clerical sex abuse,
34:10immortalised in the Oscar-winning film Spotlight.
34:14Journalism can influence ordinary people's lives for the better.
34:17It is essential to democracy.
34:20Just telling the truth in too many countries
34:24they will get you killed.
34:26Our mission is holding powerful individuals and institutions accountable,
34:31particularly those who are in government.
34:35When you have aspiring autocrats,
34:37one of the first things they do is try to crush the free and independent press.
34:40We are fighting the fake news.
34:42It's fake, phony, fake.
34:44CNN is scum to fake news.
34:46ABC is one of the worst.
34:49They go after the press first, and there will be nobody to hold them to account.
34:53And when you look around the world at authoritarian governments
34:56and how they came into existence,
34:58you see that one of the first things they did was try to destroy a free and independent press.
35:03In my opinion, you are hurting Australia very much right now.
35:06They want to get along with me.
35:14Traditional media organisations bear responsibility for what they publish.
35:18There's at least a group of people who are professionally trained.
35:22They're not perfect, but they're dedicated to actually verifying the facts.
35:26And it's not just because of the laws, it's because we actually feel morally responsible for what we publish.
35:33I don't think it's enough now to just say,
35:35you should trust us because we're journalists.
35:37But I think it's also fair to say that there are a set of people that exploit the trust gap
35:41with mainstream media organisations for more nefarious purposes.
35:46Something's wrong with the press, guys.
35:48Journalism is dead. Like, what the heck is going on?
35:51Citizen journalism is the future.
35:53By the people, for the people.
36:01The social media platforms that do have a very significant dominance in the information ecosystem
36:08have a business model that prioritises polarising content.
36:13They're not designed to facilitate quality information distribution.
36:18X or Twitter is now just probably the largest disseminator of false information in the world.
36:26Right now.
36:28And that's why we have a society where we can't agree on a common set of facts.
36:32That's why we have a society where we can't even agree on how to determine what a fact is.
36:36And just remember, what you're seeing and what you're reading is not what's happening.
36:44I don't really want government to be in the business of dictating what's true and what's false.
36:50That's a very dangerous place to be.
36:57If you're watching this and you're thinking to yourself, journalists are the problem, or at least part of it, you
37:02may well be right.
37:03There are plenty of legitimate criticisms of us, the role we've played in diminishing trust.
37:11The question is whether journalism can rise to meet the challenge to help fight against this assault on facts.
37:28With only 24 hours to go, election fever is building on the streets of Manila.
37:33As the vote gets closer, they stage these huge political rallies.
37:37So that's where I'm heading now.
37:54You know, when you're in the middle of this, there's something quite captivating about it.
37:58It's kind of hard to know where the politics actually is, but there's a heck of a lot of people
38:02engaged.
38:03Let's have a look.
38:06This is a pro Marcos rally.
38:09Ferdinand Bongbong Marcos is the nation's current president.
38:12So tell me, where do you get your information about politics?
38:15I watch on Facebook.
38:19And do you believe everything you see and hear on Facebook?
38:22Yes.
38:23So you can spot the fake news, can you?
38:25Yes.
38:25Okay.
38:25In the media zone, there are over 50 social media influencers, known here as vloggers, live streaming.
38:34The Independent Electoral Commission says it's fair to assume they're all being paid for their support.
38:40I know lots of people in the Philippines use social media.
38:43Is that where most people get their politics from?
38:46Five years ago, they shifted from mainstream media to blogging.
38:51It becomes an industry.
38:52Yeah.
38:53 3.2 million views is pretty good.
38:56Influencer or vlogger Karkofi hasn't confirmed if he's paid for his devoted support of the Marcos family.
39:04People believe vloggers as compared to mainstream media.
39:09That's my observation because they look at mainstream media as biased, which I agree.
39:13All media organisations are biased.
39:16I'm pro-Marcos.
39:17So I produce content with leanings with bias to the government.
39:21Thank you, ma'am.
39:24So I focus on topics that I think this is good, this will promote the administration, that I do content
39:28for that.
39:30Do you think what you do is propaganda?
39:35By definition of propaganda, I think yes.
39:39It's part of the propaganda.
39:41It's a biased content production.
39:44Yes.
39:45If the propaganda doesn't harm anybody, it doesn't harm the society, it doesn't harm the democracy,
39:51I don't see there's a problem with the propaganda.
39:56You know, historically, countries where the media disappears, the democracy disappears.
40:03But there are vloggers who are also sort of media.
40:11Prime Minister Anthony Albanese, welcome to Big Small Talk.
40:14Of course, an alliance between influencers and politicians isn't unique to the Philippines.
40:19And they have this nuclear fantasy.
40:21We'll get into that.
40:23They're opposed to renewables.
40:25In Australia, politicians must disclose paid support from influencers.
40:29But they now often appear as guests on their high-rating shows.
40:34I can't dive.
40:35Were you one of those people that dove and did the belly flop straight in?
40:38There was a little bit of that.
40:39Yeah, yeah.
40:40Delulu.
40:41I dare you to drop that in a speech.
40:43Do you reckon I could do that?
40:43Delulu with no solulu.
40:45Can you drop that in a speech?
40:45They are delulu with no solulu, Mr Speaker.
40:49The 2025 ballot has been dubbed Australia's first social media-led election.
40:55Have you hit your gardening phase yet?
40:57No, I'm not a gardener.
40:58No, me either.
40:59Yeah.
40:59You just have to observe what podcasters are doing and what influencers are doing.
41:04They're speaking in a much more informal way, as if they were talking to their buddies.
41:08He's scared!
41:10I'm joking.
41:10Yep.
41:11Yep.
41:11Tenses up shitting himself.
41:13Okay.
41:14They're much more relatable than traditional journalists tend to be, and people can connect
41:19with them.
41:21I found often audiences would say, you journalists in the mainstream media, you've brought this
41:28on yourself.
41:29I don't like your facts.
41:30There's other facts.
41:32There are no alternative facts.
41:33Because, let's say right here.
41:35But why do so many people believe there are?
41:37I guess that's what I'm getting at.
41:38Because it's easy to insidiously manipulate people today.
41:41What used to take intelligence agencies a lot more money can now be done cheaply, easily.
41:49Silicon Valley opening the gates to this kind of insidious manipulation.
41:54And the crazy thing is, people don't believe they can be manipulated.
42:00Again, it goes back to our cognitive biases.
42:03And until we realize that, we are Pavlov's dogs.
42:09The media environment, the technology environment, has allowed for and encouraged greater tribalism.
42:16We feel more comfortable with our tribe.
42:18We trust them.
42:19And so, it just reinforces an instinct that's been part of society forever.
42:33Across town, Duterte's supporters are enjoying what feels more like a pop concert than a political rally.
42:41Vloggers are out in force here, too.
42:43And they're upset that Rodrigo Duterte isn't here.
42:46He's in The Hague.
42:48The former Filipino leader, Rodrigo Duterte, is now in the custody of the International Criminal Court
42:54after being arrested on charges linked to his war on drugs.
42:58Thousands of small-time dealers, users, activists and others were killed without trial.
43:06Despite incarceration, Duterte is actually vying for a mayoral seat in this election.
43:12And his daughter is vice-president.
43:14Well, this is where they're all here to see, the vice-president, Sarah Duterte.
43:19Her dad was the president, he's in The Hague, and she's at war with the current president.
43:24In the past months, my family's name has been dragged through the mud.
43:30I am not the problem of this country.
43:42Of course, the Duterte camp has its own army of full-time vloggers.
43:52Between 10 and 30 million US dollars gets funneled to vloggers and influencers here each election cycle.
44:00Before the platforms reward them, they're rewarded by money here.
44:05People can make money from disinformation.
44:07People see the opportunity to make money.
44:09They see the opportunity to acquire power by exploiting the media environment as it exists today.
44:14People are using it for their own commercial ends.
44:17They're using it for their own political ends.
44:18We don't have a counterweight to that.
44:21And that's where we are right now.
44:25It's not like 10 seconds.
44:2710, 9, 8, 7, 6, 5, 4, 3...
44:33Hello and welcome. I'm Maria Ressa.
44:36And you're watching Rappler's special coverage of the 2025 national and local elections.
44:41And the polls have officially opened.
44:4568.4 million Filipinos are registered to vote.
44:49President Ferdinand Marcos declared today a holiday.
44:52So, Filipinos, go out and vote!
44:56Voters don't disappoint.
44:58It's the highest turnout in the history of midterm elections.
45:02More than 80%.
45:06Joining us now is Jairo Balendo.
45:10Hi, I'm Maria. I'm reporting live from Davao City and all eyes on Davao City.
45:14Because of many...
45:16As results start rolling in,
45:18former President Rodrigo Duterte is delivered a sweeping victory.
45:23The Dutertes emerged as big winners in this year's elections in Davao City.
45:28Former President Rodrigo Duterte is set to be elected mayor of Davao City
45:32by a landslide despite being detained at The Hague.
45:37It's a symbolic win.
45:40A mayoral seat Duterte can't reach from his prison cell.
45:44Social media has been flooded with revisionist versions of Duterte's brutal leadership.
45:51And the voters seem to have bought it.
45:55What's happened in the Philippines has happened in many other parts of the world now.
46:04The White House has banned journalists from entering the Pentagon if they haven't signed a legal document agreeing to only
46:11report information that's been authorised by the Department of Defence.
46:14I think that the environment now is a dangerous one for our democracy.
46:20I think we have an administration that's pushing our democracy to the limits in many ways, aiming to break it.
46:2872% of the world is now under authoritarian rule.
46:33We are electing illiberal leaders democratically.
46:37All because of what's on that device.
46:40I think this is the accelerant and an enabler for digital authoritarians.
46:47Every day that democratic nations do not reclaim their rights from big tech is a day they get weaker.
46:56And a government that wins when there's no trust, the only government that survives in that environment is a dictatorship.
47:04Do you think we can win this fight?
47:06We have to wake up.
47:10We have to look at the tech.
47:13We have to admit that we have been manipulated.
47:17And we have to demand better.
47:21This is a thinking fast world.
47:24That's where the big tech companies want you to stay because that's where they make the most profit.
47:29But you need to move into the thinking slow world.
47:33And we need to solve it at scale or we will lose our agency and our humanity.
47:43Join me next time.
47:45This is very new and very weird.
47:47Are these people real?
47:48Are we diving headfirst into AI?
47:51Is there anything about it that looks weird to you?
47:54No.
47:55Where is AI taking us?
47:56So you reckon he's real?
47:58That looks fake.
48:00I was really easily tricked by it.
48:01And do we even want to go there?
48:03We don't want to sleepwalk into a future that nobody wants.
48:06They've invested hundreds of billions of dollars into this industry and they want to seek a return.
48:11Are you scared?
48:12I'm very scared.
48:18Don't wait a week when all episodes of The Matter of Fact are streaming now on ABC iview.