Transcript
00:03Hi, I'm Hamish MacDonald, and I'm a journalist, which means facts are my business.
00:09Where do you get your information from?
00:11Social media.
00:12Facebook.
00:12Facebook, Instagram and YouTube.
00:15Social media is where so many of us now get our information.
00:19But can we rely on it?
00:22Where tech makes it so easy to create a fictional story, which can look and sound real.
00:29I went to the moon last week, don't you know?
00:31Even though it's entirely fake.
00:35Are you like so many people now, struggling to know what's true and what's false?
00:41In this series, I'm going to explore how big tech is transforming our lives and our minds.
00:47We're going through a period of radical change in the way that people consume information.
00:51We've created this monster.
00:53An algorithm is outsourcing human choice into a machine's decision.
00:59The end goal is keep you scrolling.
01:01Hack our biology, change the way we feel to change the way we see the world.
01:06Is anyone else paying attention to how all of this is affecting society?
01:12I want to understand how technology is changing the way we think.
01:16It really does question your reality.
01:23And why we're all susceptible to mis- and disinformation.
01:23We are just as vulnerable, but we think we're not.
01:27And actually, that's what makes us more vulnerable.
01:30Do you worry about what happens if we can't agree on facts?
01:34This has been an information war. Facts stop mattering.
01:37This growing sense that anything could be fake at any time.
01:42Looks like you.
01:44Just not me.
01:45The facts are so polar opposite.
01:47Which ones are the facts?
01:49We can't even agree on how to determine what a fact is.
01:52In this episode, I'm travelling far and wide to understand how disinformation disrupts us.
01:58And democracy.
01:59They're posting on a pattern.
02:01Hmm, these are bots.
02:03So is this safe for you?
02:05There might be people here hiding from us.
02:08It's part of the propaganda.
02:10It's a biased content production.
02:12The tearing down of democracy began by tearing down facts.
02:17Join me on a journey right around Australia and the world
02:20to meet people determined to keep facts alive
02:23in this real-time information revolution.
02:38If you wanted to damage someone's reputation, how would you do it?
02:43Maybe the easiest way would be to start a rumour.
02:48And no-one ever needs to know it came from you.
02:54So what if you wanted to damage an entire country's reputation?
02:59It's the same principle.
03:01Let me show you how it's done.
03:04I promise there's no funny business.
03:06I want to invite you right now to come and join me in a hotel room in Paris.
03:10Just before the Olympics.
03:22G'day.
03:23Welcome.
03:24Come in.
03:24Do you want to take a seat?
03:25Just come through here.
03:27I might move this guy out of the way for you first.
03:30There you go.
03:31Make yourself comfy.
03:32So, listen.
03:33I just want you to imagine I'm one of more than 11 million people in Paris
03:37ahead of the 2024 Olympic Games.
03:39I've actually come a bit early
03:41because I've got a heap of cash to drop on this.
03:46My phone is running hot
03:48with advice on great wine bars,
03:50how to use the metro,
03:51and, uh, ugh.
03:54Bugs.
03:54There's bed bugs everywhere.
03:55They're on the planes.
03:56They're on the trains.
03:57They're on the buses.
04:00Paris is in the grips of a bed bug apocalypse.
04:04It's an icky situation.
04:05The mayor of Paris is declaring no one is safe from bed bugs.
04:10Madam Prime Minister,
04:12these little insects are spreading despair across our country.
04:17Bed bugs in Paris, public enemy number one.
04:20A wave of panic is gripping the country.
04:23Bed bugs are proliferating in every corner of daily life.
04:26They're putting millions of our infested fellow citizens through an ordeal.
04:30If you've never met a bed bug,
04:32lucky you.
04:33They're blood-sucking parasites with a nasty bite.
04:37So, why is it that they're flooding into Paris
04:40ahead of the Olympic tourists?
04:46So, looking at social media,
04:48it's telling me that this influx of bed bugs
04:51coincides with the arrival of thousands of Ukrainian refugees.
04:54It is true France is accepting a lot of people from Ukraine.
04:58They're fleeing the Russian invasion.
05:00It's also true that every year in France
05:03the number of bed bugs rises and falls,
05:05along with the seasons.
05:10The bed bug crisis was a social media beat-up.
05:14When French authorities traced the bed bug and Ukrainian refugee story,
05:19they found it was a fiction spread by Russian agents
05:22to turn public sentiment against the refugees.
05:25Here's the post that's been circulating on a Telegram channel
05:30since October 2nd...
05:31Russian sites even created counterfeit versions
05:34of respected French publications,
05:36mimicking reports on the bed bug crisis.
05:39...where they hijack Western media logos...
05:43Russia is believed to have encouraged and fuelled this fear
05:45on the internet
05:46by flooding networks with false information
05:49through bots,
05:50accounts created by machines,
05:51to destabilise France
05:53because of its strong support for Ukraine.
05:59This whole campaign demonstrates almost perfectly
06:02the genius of modern disinformation warfare.
06:06All you need is to isolate an issue
06:08with a hint of truth to it.
06:10Make sure you've got all the ingredients to make it go viral
06:12and then amplify it for good measure.
06:15You see, by the time France realised it was Russia
06:18pushing all of this stuff,
06:19refugee tensions had been stoked
06:21and the host nation, well,
06:23they were more than a bit embarrassed.
06:34The goal of Russian disinformation
06:37is not to make you believe anything.
06:41It is to break open the fracture lines of society
06:45to weaken from within.
06:54Disinformation, or hybrid wars,
06:57are being waged across the world.
06:59If you're on social media,
07:01you may well be inadvertently consuming
07:03or even sharing this content.
07:07These campaigns have been identified
07:09as major global threats
07:11to peace and democracy.
07:13And while they're not physically devastating
07:15like conventional wars,
07:17the deliberate spread of false information
07:19targeting a country's population
07:21is considered a form of warfare.
07:39During the Second World War,
07:41this part of London was bombed night after night.
07:43Churchill was hunkered down in his underground war rooms,
07:47directly beneath us, in fact.
07:49The air raids were ear-piercing.
07:51People were shaken from their beds.
07:54And although it's been some 80 years
07:56since the air raid sirens have rung out across London,
07:58warning of an attack,
08:00that doesn't mean this city's been free from invasion.
08:04Today, though, the threat comes through firewalls
08:07and disinformation is spread
08:09to create confusion and fear.
08:15It's the job of counterintelligence experts
08:17scouring online activity
08:19to identify and expose disinformation attacks.
08:24Ben Strick does just that.
08:26He's an Australian based in London.
08:29He believes we need to recognise
08:31how the mechanics of warfare are fundamentally shifting.
08:34I mean, what do you see
08:37when you see this piece of equipment right here?
08:40A very big rocket, that's what I see.
08:42Probably a very expensive rocket too, right?
08:44Yeah, OK.
08:44So I see something that costs a lot of money.
08:47Its intent is to cause harm,
08:49maybe to target a military unit
08:52or a strategic objective, right?
08:53The research to create this is hundreds of millions,
08:57but something like a campaign online
09:00could be so much cheaper to target the enemy
09:04and cause, you know, damage financially.
09:06It can be done at scale with, you know,
09:09a cheap subscription to an AI platform en masse
09:12and a couple of hundred dollars.
09:14The lieutenants in a hybrid war
09:16can wage their battle from the comfort of their own homes.
09:20Their soldiers can be anonymous trolls
09:22or machine-generated bots.
09:26Bots are given their own accounts and identities,
09:29which trick us into thinking they're real.
09:32In 2024 alone,
09:35Meta identified 4.23 billion fake Facebook accounts.
09:40Every year, thousands of Australians are scammed
09:43out of millions of dollars.
09:49I want to show you how a campaign can be waged online technically.
09:54Yeah.
09:55Ben Strick busts disinformation campaigns across the world.
09:59He wants to show me an example from much closer to home.
10:02A campaign generated in Indonesia
10:05to undermine the autonomous province of West Papua's bid for independence.
10:11This is in West Papua,
10:13so it's an independent area under Indonesian occupation.
10:22So there were large protests that happened in August 2019 on the streets.
10:28These were the sorts of videos that were being uploaded online.
10:31And so the Indonesian government shut off the internet.
10:36They shut off all connectivity for people in West Papua.
10:39And I think they shut it off as a way to stop information flow, right?
10:43So that no one could access social media channels.
10:45But the thing is, when we saw it shut down,
10:48we saw Indonesian accounts sharing videos like this.
10:51What is Indonesia trying to hide in West Papua?
10:54It's trying to hide its great budget decisions.
10:56It's great work with schools and things like that.
10:59And it was actually being shared by accounts like this guy called Marco267.
11:03Yeah.
11:04He's a friendly white guy and he's posting saying,
11:07what are some of the secrets that Indonesia have been hiding in West Papua?
11:10Mm-hm.
11:10He's using hashtags that are not really relevant,
11:13like West Papua Genocide and Free West Papua.
11:16But it's kind of odd for me because it's not only Marco uploading it,
11:21it's hundreds of accounts all posting the exact same message.
11:25And this is not retweets, this is actual individual.
11:28No.
11:28Separate accounts.
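The copy-paste amplification Ben describes — hundreds of separate accounts posting the exact same message — is one of the simplest coordination signals to check for mechanically. A minimal sketch, with invented account names and message text standing in for real scraped data:

```python
from collections import defaultdict

# Hypothetical sample posts (account, text) — names and wording invented for illustration.
posts = [
    ("Marco267", "What secrets is Indonesia hiding in West Papua?"),
    ("BellaNow", "What secrets is Indonesia hiding in West Papua?"),
    ("PapuaFan1", "What secrets is Indonesia hiding in West Papua?"),
    ("RealUser", "Lovely weather in Jakarta today."),
]

def flag_copy_paste(posts, min_accounts=3):
    """Return each text posted verbatim by at least `min_accounts` distinct accounts."""
    by_text = defaultdict(set)
    for account, text in posts:
        by_text[text].add(account)
    return {text: accounts for text, accounts in by_text.items()
            if len(accounts) >= min_accounts}

suspicious = flag_copy_paste(posts)
# One message flagged: shared verbatim by three separate accounts.
```

Real investigations add fuzzier matching (near-duplicate text, shared links, shared hashtags), but verbatim repetition across distinct accounts is the first red flag.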
11:29So let me take you back to Marco267.
11:33Now I do something like an image reverse search on his profile picture.
11:37Yeah.
11:37Marco looks like a friendly Western white guy,
11:39but he's also used in dating forums, in stock photo websites,
11:44in ski resort hotels.
11:46This is a fake account.
11:49And so what I did was I collected time codes of Marco's posting activity
11:54and his friends.
11:55And you can see this repeat pattern.
11:57Bella Now, one of his friends, is posting at 10:32:56,
12:019:32:56, 8:32:56.
12:04They're posting on a pattern.
12:06Mm-hm.
12:06These are bots.
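The timing fingerprint Ben points to — the same minute-and-second offset recurring hour after hour — is characteristic of automated scheduling, and can be tested in a few lines. A sketch under the same assumption, with hypothetical timestamps invented for illustration:

```python
from collections import Counter
from datetime import datetime

# Hypothetical posting times for one account — invented for illustration.
timestamps = [
    "2019-08-21 08:32:56",
    "2019-08-21 09:32:56",
    "2019-08-21 10:32:56",
    "2019-08-21 11:05:12",
]

def looks_scheduled(timestamps, threshold=3):
    """Flag an account whose posts repeat the same minute:second offset within the hour."""
    offsets = Counter(
        datetime.strptime(ts, "%Y-%m-%d %H:%M:%S").strftime("%M:%S")
        for ts in timestamps
    )
    offset, count = offsets.most_common(1)[0]
    return count >= threshold, offset

flagged, offset = looks_scheduled(timestamps)
# flagged → True, offset → "32:56": three posts land at exactly :32:56 past the hour.
```

A human posts at irregular times; a scheduler fires on the clock. Collecting these offsets across an account's friends, as Ben did, exposes whole networks posting on one pattern.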
12:07And if we have a look at one of Marco's friends,
12:10which is West Papua ID,
12:11that discusses fact and data in West Papua.
12:15Sounds like a legitimate website.
12:17You can even see there's articles.
12:18And these articles talk about, again,
12:20the great work that Indonesia is doing.
12:23Mm-hm.
12:23We start to think, well, okay, who made this website?
12:26And one of the tools that we have is metadata.
12:31Mm-hm.
12:31Who registered the website?
12:33This person left their real name, their phone number,
12:37their email address and everything like that on this website.
12:40I pop that phone number into WhatsApp
12:42and immediately I have someone that owns that website.
12:45And simply doing an image reverse search,
12:48I'm able to find that person's LinkedIn account.
12:51And now we've identified a company.
12:54That company owns all of these websites.
12:56It's one marketing firm in Indonesia.
12:59We shared these findings with Facebook and with Twitter
13:03and Facebook published a huge takedown of a network.
13:07Twitter had taken down 795 accounts as well.
13:11And we noticed that the website had folded up.
13:15Thank God.
13:16This small marketing firm in Jakarta was closed overnight
13:19after we published that.
13:21Wow.
13:21That's impressive.
13:23Cheers, mate.
13:27These campaigns go by various names.
13:30Hybrid threats, cognitive warfare, or simply propaganda.
13:36Even when there's a pause in the bombing,
13:39there's no lull in the propaganda war.
13:42Propaganda has existed for as long as the powerful
13:45have sought to manipulate public thinking.
13:47It's how it can be spread that's changed.
13:50Psychological warfare is a major part
13:53of the American and South Vietnamese effort.
13:55With massive leaflet drops urging them to defect
13:58and join the government side.
14:01Our information systems were once ruled
14:04by the printing presses.
14:06You're tuned to Radio Australia,
14:08the overseas service of the Australian...
14:10In the 20th century, radio expanded audiences across the oceans.
14:16This century, anyone with an internet connection
14:19can instantly deliver information or disinformation to billions.
14:26I love the idea of using technology
14:29to facilitate consultation, engagement,
14:32a greater sense of democracy.
14:34How social media platforms currently work
14:36is to not create those kinds of conditions.
14:39Every technology comes with advantages and disadvantages,
14:42and the disadvantage here is that it allowed a lot of people
14:46to advance their own personal, professional,
14:50commercial or political purposes.
14:59Russia leads the world in creating clandestine hybrid threats,
15:03targeting its neighbours.
15:06Just like the bedbugs campaign,
15:08hybrid threats weaponise existing tensions
15:11to fuel public fear.
15:14Now, if you're anything like me at this point,
15:16you might be thinking,
15:17hybrid threats?
15:18Somewhat confusing.
15:19What exactly is the objective?
15:22Where do they come from?
15:23Who and what are they targeting?
15:25And how on earth would you ever know for sure?
15:27And I suppose that's the point.
15:29It's confusion by design.
15:31So, that's why I've come here to the border between Poland and Belarus,
15:35where the threat is a bit more tangible.
15:37It involves real human lives and one very big fence.
15:44This deep primeval forest on Poland's eastern border
15:48has become a death trap for people who believe the lies
15:52they've been sold online.
15:55Like promises of freedom and opportunity.
16:01In 2021, thousands of migrants travelled from the Middle East
16:05and North Africa via Russia and Belarus to the Polish border.
16:20Polish guards struggled to hold them back.
16:25They kept on coming.
16:28By October, numbers had swelled to tens of thousands of people.
16:32Illegal border crossing is forbidden.
16:37You will face criminal charges.
16:40Despite waves of migrants into Europe,
16:43only a trickle ever attempted this crossing.
16:46So, why the sudden surge?
16:49The answer is all over social media.
16:52Messages, many in Arabic, offering flights to Minsk,
16:56the capital of Belarus, and taxis to the border.
16:59Russian travel agents promise Polish visas to migrants across Africa.
17:05According to the Polish government,
17:08this is a hybrid threat campaign created by Belarus and Russia
17:12to manufacture a migration crisis,
17:15a claim both countries deny.
17:18The European Union says Russia works with its proxy state Belarus
17:22to disrupt democracy in Poland and broader Europe.
17:26This is a hybrid attack of an authoritarian regime
17:30to try to destabilise democratic neighbours,
17:33and this will not succeed.
17:41Over the past five years,
17:43tens of thousands of migrants have tried to cross this border.
17:47But the Polish government greets them with razor wire,
17:50a five-metre fence,
17:52and border guards pushing them back to Belarus.
17:57The hybrid threat has worked just as intended.
18:00Anti-migrant anger is on the rise across Poland.
18:04Far-right groups claim the country is now flooded
18:07with unconstrained migration.
18:10But the facts just don't support that.
18:13Only a small proportion of those arriving
18:16are granted asylum in Poland.
18:19Among them, a young North African
18:21who's agreed with the promise of anonymity
18:23to tell me how he got here.
18:25How did you know that you could get to Europe
18:29by coming to Russia and then Belarus?
18:31Who told you that this was a pathway?
18:40Which social media?
18:41TikTok, Facebook, Instagram?
18:43I use only Facebook.
18:45Facebook, okay.
18:46The visa that you had for Russia,
18:49was it a tourist visa or a business visa?
18:52Tourist visa for three months?
18:55It was three months, yes.
18:56And when you were in Belarus,
18:58did any of the border guards or police
19:01tell you that it's safe to go through to Poland?
19:03Did they encourage you to go to Poland?
19:07Go to Poland, go to Europe.
19:10Go to Europe, yes.
19:12That was what they said?
19:13Yes.
19:15Did you think it would be safe to travel this way?
19:19Across the jungle, no safety.
19:27Migrants can spend days, weeks, even months in the forest and swamplands
19:32on both sides of the border.
19:35The border with Belarus is in about eight kilometres distance.
19:40All right, shall we walk?
19:41Yeah, we can walk.
19:44I got a message from Telstra welcoming me to Belarus.
19:49Alexandra Kirshinovska and Anna Albot have allowed me to join them
19:52in a section of forest where migrants often hide after crossing the border.
19:58So it's pretty rough, you're not really following trails,
20:00you're just literally walking off into the bush.
20:02No, yes.
20:04These women are part of a network of volunteers
20:07who can't offer people safe passage,
20:10but they can deliver life-saving supplies.
20:13We would come here with dry clothes, shoes, with food.
20:17Do you find that you get lost?
20:19We all the time use GPS.
20:21Yeah, right.
20:22So we do get lost, but not so bad.
20:25Yeah, right.
20:28There might be people here hiding from us.
20:30Really?
20:32Oh, those boots.
20:34Yeah, it's a boot.
20:35There's a jacket just back there.
20:37Yeah, and here...
20:39It's unbelievable how much there is.
20:41Everywhere in front of us here.
20:43And the jacket here,
20:45or actually a sleeping bag.
20:47Oh, and the backpack.
20:48Oh, wow.
20:49Sometimes if you run away, you leave everything behind.
20:53It must be so bewildering arriving into this.
20:57Like, I just don't know what you'd make of it
20:58if you'd arrived over a fence from North Africa or the Middle East
21:03and landed here in the middle of a swamp.
21:07Unbelievable.
21:12Oh, sorry.
21:14Our alarm phone.
21:15OK.
21:16Hello.
21:17Hello, hello.
21:19Tak. Tak. Tak. Tak. (Yes. Yes. Yes. Yes.)
21:21Mm-hmm. Mm-hmm.
21:23So this is an alert phone call they're getting from the other activists.
21:27Mm-hmm.
21:31They've just received a request for help from two people from Afghanistan,
21:36and they're waiting right now for their individual needs.
21:39So is this safe for you?
21:42Is it safe for the migrants to be in contact with you?
21:46For us it is safe, and for them it's an additional risk,
21:50because more people around them, the more chance that the drones
21:54or Polish border guards will see them.
21:56So they wouldn't contact us unless they really need to.
21:59And what would happen to them if Polish border guards find them?
22:02They'd be sent back?
22:03They would be immediately pushed through to the other side.
22:05OK.
22:06Brutally.
22:08Every day, these women come face-to-face
22:10with the human cost of hybrid warfare.
22:13This is the wound of the...
22:15Oh, wow.
22:16And that's on the heel?
22:17On the heel, yes.
22:18Oh, they've got a huge cut in the heel.
22:20This is really big.
22:21Yep.
22:22They've met with hundreds of people fleeing authoritarian regimes
22:26in search of asylum.
22:29But less than 10% of migrants on this border
22:32have been officially accepted in Poland.
22:34The majority have been pushed back to Belarus.
22:38Poland has suspended the right
22:40to apply for international protection at the border with Belarus.
22:44So they have to do everything possible
22:46to avoid the contact with Polish border guards or soldiers.
22:51Let's just go.
22:52And maybe let's just give up...
22:53Other volunteers meet us.
22:55Let's put bags and garbage here.
22:58They've packed supplies, including medical gear,
23:01to treat the men's injuries.
23:05So that we can move quickly, my crew has stayed behind.
23:08I'm now on camera and sound.
23:12This is funny.
23:14It's a camera.
23:15360 degrees.
23:17So front and back.
23:19And it has audio as well.
23:22So we just need to behave very carefully,
23:26not making too much noise and not being seen.
23:30We should even avoid being seen by random people,
23:33because you never know who will be willing to help
23:36and who will call the border guard.
23:38We should be careful.
23:51We should be careful.
23:52I can go to the border guard.
23:53I'm ready.
23:54I'm ready.
24:06I can go to the border guard.
24:25Anna shares clothes, shoes, hot soup and water.
24:39It's a very big wound.
24:41It's very deep, very deep.
24:43He needs proper medical attention.
24:45It doesn't look like easy looking.
24:51His hands.
24:54Can I see your hands?
25:00We use a translation app to ask about their journey.
25:05Jump me, jump me, jump me.
25:07I should.
25:09I should.
25:09I should.
25:11So I came via Russia to Belarus.
25:14Six months ago.
25:20The men are nervous and want to keep moving.
25:23Hi.
25:25Maasalamu.
25:27Hi.
25:28Anna and Alexandra deliberately don't ask where the men are headed.
25:34They're unlikely to ever meet again.
25:39She wrote in a translator that they are very grateful for our help.
25:43Yeah.
25:44That's it.
25:46Yeah.
25:47It's always difficult to receive these words of thankfulness.
25:53Because this situation shouldn't happen.
25:57Yeah.
25:57And we can't do a lot.
26:05The death of democracy happens by a thousand cuts, where you're bleeding from every cut and you don't know how
26:12weak you are until you die.
26:16It's happening everywhere.
26:19Every place where the tech is ruling us.
26:25Hybrid threats and disinformation campaigns don't just target foreign countries.
26:31Most of it is actually designed to work internally, to manipulate opinion and maintain power at home.
26:44We used to think revolutions happened when people took to the streets or that democracies were toppled through bloody military
26:51coups.
26:52Today we can all sense that those lines are blurring between democracy and autocracy and dictatorship.
26:58And for anyone that wants to influence what you think, there's a pretty good opportunity every time you reach into
27:04your pocket.
27:12To understand disinformation's impact on democracy, I've come to the Philippines.
27:19It's an Indo-Pacific neighbour of ours, offering a cautionary tale.
27:26The Philippines has been crowned the social media capital of the world.
27:30People here devote an average of three and a half hours a day to scrolling.
27:34That's an hour longer than the world average.
27:38But Facebook is where most people get their news.
27:42So the end goal is keep you scrolling.
27:44Hack our biology, change the way we feel, to change the way we see the world, to change the way
27:49we act, to change the way we vote.
27:58In just a couple of days, more than 60 million Filipinos are going to cast their vote.
28:02It's the nation's midterm elections.
28:06So you're voting, where do you get your information from?
28:09Social media is much easier when we select the candidates.
28:13Do you think that facts matter when it comes to elections in the Philippines?
28:17Well, facts matter for me. It's hard to find facts.
28:21They are already running, all the same face, the same family, the same people.
28:31Politics here is dominated by two warring families, Marcos and Duterte.
28:38Once allies, they now work tirelessly to backstab and discredit each other.
28:45Their weapon of choice is the modern day dagger, online disinformation.
28:55Former President Rodrigo Duterte long ago recognised the role of technology to help maintain authoritarian power.
29:03If you destroy my country, I will kill you.
29:07Over his six year term, he ran an indiscriminate war on drugs, which left thousands dead.
29:15Including 20 journalists.
29:18Just because you are a journalist, you are not exempted from assassination,
29:23if you are a son of a bitch.
29:27Independently reporting facts in the Philippines can be dangerous.
29:32There's one news organisation here that will not be silenced.
29:36Founded back in 2011, it is small, independent, online.
29:41Rappler is a digital news outlet which began its life on Facebook.
29:46What are we fighting?
29:47First, tech, social media, corruption.
29:51Congratulations.
29:52Congratulations.
29:52Rappler's co-founder, Maria Ressa, has won a swag of awards, including the Nobel Peace Prize, for her commitment to
30:00press freedom.
30:02Let's have fun, please. Let's keep having hope.
30:05Rappler has created and now publishes on its own innovative app.
30:10It works much like a chat app and has freed the news outlet from a reliance on big tech platforms.
30:17I drank the Kool-Aid and believed that social media could be an enabling factor for better governance.
30:22And so we thought there was tremendous potential and it worked for a while.
30:27Are you talking about a moment in time where there was just this great hope?
30:31Yes.
30:31Like the Arab Spring, there was this idea that social media was going to democratise the message.
30:39Give the people more power, right? That was the idea, that it was an enabler for democracy.
30:47In the early 2010s, the Arab Spring began with demonstrations across the Arab world to oust repressive leaders.
30:55And I was there, reporting at the time.
30:57This is a revolution.
30:59Yeah, this is an Egyptian revolution. We don't want this government.
31:03The crowds organised themselves via social media.
31:06They called them Twitter revolutions.
31:09And it felt like a powerful tool, creating real political change.
31:13Egypt was so incredible because they were leaderless networks, right?
31:19And it was so exciting that the people could work together to gain more power.
31:27These leaderless networks were able to share information online.
31:31But then, from 2014, algorithms began promoting content that would boost engagement.
31:38The messaging from grassroots campaigns was eclipsed by more sensational content.
31:44The public information ecosystem literally was rewarding lies, spreading lies six times faster than facts.
31:53If you lace it with fear, anger and hate, it spreads even faster.
31:58For Duterte, algorithms presented an opportunity.
32:02He could now flood social media with disinformation that justified his actions and discredited his critics.
32:18He also encouraged an orchestrated hate campaign against Maria Ressa.
32:23In 2016, exponential lies on social media attacking us.
32:30I was getting an average of 90, 9-0 hate messages per hour.
32:36It's dehumanising.
32:37And then, 2019 was the beginning of the arrests.
32:42The chief executive of one of the Philippines' most popular news sites has been arrested for the seventh time.
32:49Maria Ressa has been found guilty of cyber libel charges and faces up to six years in prison.
32:55I'd say the tipping point when I realised the world had really changed was when I was arrested in this
33:00newsroom.
33:02It's a shock. It's a shock, but we're going.
33:05Press freedom is the foundation of every single right of every Filipino to the truth.
33:12Ten arrest warrants in a little over a year.
33:15In 2020, I was convicted of cyber libel.
33:18We're still fighting this at the Supreme Court.
33:21Can democracy exist without shared facts?
33:26Absolutely not. No.
33:28The tearing down of democracy began by tearing down facts.
33:32Democracy is taking that shared reality where we agree on what the facts are so that we can mobilise, move
33:40together.
33:40In the past, it used to be journalists. We were the gatekeepers.
33:46Maria Ressa's clarion call is echoed by journalists across the world.
33:51I don't think democracy can survive without a free and independent press. It never has.
33:56Marty Baron is a celebrated American newspaper editor.
34:00He's been pivotal to the success of papers like The Washington Post and The Boston Globe.
34:06He led the Globe's investigation into clerical sex abuse, immortalised in the Oscar-winning film Spotlight.
34:14Journalism can influence ordinary people's lives for the better.
34:17It is essential to democracy.
34:20Just telling the truth in too many countries, it will get you killed.
34:26Our mission is holding powerful individuals and institutions accountable, particularly those who are in government.
34:35When you have aspiring autocrats, one of the first things they do is try to crush the free and independent
34:39press.
34:40We are fighting the fake news. It's fake, phony, fake.
34:44CNN is scum.
34:45It's your fake news.
34:46ABC is one of the worst.
34:47I have to be honest with you.
34:49They go after the press first, and there will be nobody to hold them to account.
34:53And when you look around the world at authoritarian governments and how they came into existence, you see that one
34:59of the first things they did was try to destroy a free and independent press.
35:03In my opinion, you are hurting Australia very much right now.
35:06They want to get along with me.
35:14Traditional media organisations bear responsibility for what they publish.
35:18There's at least a group of people who are professionally trained, they're not perfect, but they're dedicated to actually verifying
35:25the facts.
35:26And it's not just because of the laws, it's because we actually feel morally responsible for what we publish.
35:33I don't think it's enough now to just say, you should trust us because we're journalists.
35:37But I think it's also fair to say that there are a set of people that exploit the trust gap
35:41with mainstream media organisations for more nefarious purposes.
35:45Something's wrong with the press, guys.
35:48Journalism is dead. Like, what the heck is going on?
35:51Citizen journalism is the future. By the people, for the people.
36:01The social media platforms that do have a very significant dominance in the information ecosystem have a business model that
36:09prioritises polarising content.
36:14They're not designed to facilitate quality information distribution.
36:18X or Twitter is now just probably the largest disseminator of false information in the world right now.
36:28And that's why we have a society where we can't agree on a common set of facts.
36:32That's why we have a society where we can't even agree on how to determine what a fact is.
36:36And just remember, what you're seeing and what you're reading is not what's happening.
36:44I don't really want government to be in the business of dictating what's true and what's false.
36:49That's a very dangerous place to be.
36:57If you're watching this and you're thinking to yourself, journalists are the problem, or at least part of it, you
37:02may well be right.
37:03There are plenty of legitimate criticisms of us, the role we've played, in diminishing trust.
37:11The question is whether journalism can rise to meet the challenge to help fight against this assault on facts.
37:28With only 24 hours to go, election fever is building on the streets of Manila.
37:34As the vote gets closer, they stage these huge political rallies.
37:38So, that's where I'm heading now.
37:53You know, when you're in the middle of this, there's something quite captivating about it.
37:57It's kind of hard to know where the politics actually is, but there's a heck of a lot of people
38:03engaged.
38:03Let's have a look.
38:06This is a pro-Marcos rally.
38:09Ferdinand Bongbong-Marcos is the nation's current president.
38:12So, tell me, where do you get your information about politics?
38:15I watch on Facebook.
38:19And do you believe everything you see and hear on Facebook?
38:22Yes.
38:23So, you can spot the fake news, can you?
38:24Yes.
38:25OK.
38:26In the media zone, there's over 50 social media influencers, known here as vloggers, live streaming.
38:34The Independent Electoral Commission says it's fair to assume they're all being paid for their support.
38:40I know lots of people in Philippines use social media.
38:43Is that where most people get their politics from?
38:46Five years ago, they shifted from mainstream media to vlogging.
38:51It becomes an industry.
38:52Yeah.
38:53That 3.2 million views is pretty good.
38:56Influencer, or vlogger, Karkofi hasn't confirmed whether he's paid for his devoted support of the Marcos family.
39:03People believe vloggers as compared to mainstream media.
39:09That's my observation.
39:10Because they look at mainstream media as biased.
39:12Which I agree.
39:13All media organisations are biased.
39:16Um...
39:16I'm pro-Marcos.
39:17So, I produce content with leanings, with bias to the government.
39:22Thank you, ma'am.
39:23Thank you, Paul.
39:24So, I focus on topics that I think this is good, this will promote the administration, that I do content
39:28for that.
39:30Do you think what you do is propaganda?
39:35By definition of propaganda, I think yes.
39:39It's part of the propaganda.
39:41It's a biased content production.
39:44Yes.
39:45If the propaganda doesn't harm anybody, it doesn't harm the society, it doesn't harm the democracy, I don't see there's
39:52a problem with the propaganda.
39:57You know, historically, countries where the media disappears, the democracy disappears.
40:03But there are vloggers who are also sort of media.
40:11Prime Minister Anthony Albanese, welcome to Big Small Top.
40:14Of course, an alliance between influencers and politicians isn't unique to the Philippines.
40:19And they have this nuclear fantasy.
40:21We'll get into that.
40:23They're opposed to renewables.
40:25In Australia, politicians must disclose paid support from influencers.
40:30But they now often appear as guests on their high-rating shows.
40:34I can't dive.
40:35Were you one of those people that dove and did the belly flop straight in?
40:38There was a little bit of that, yeah, yeah.
40:40Delulu.
40:41I dare you to drop that in a speech.
40:43Do you reckon I could do that?
40:43Delulu with no salulu.
40:45Can you drop that in a speech?
40:45They are Delulu with no salulu, Mr Speaker.
40:49The 2025 ballot has been dubbed Australia's first social media-led election.
40:55Have you hit your gardening phase yet?
40:57No, I'm not a gardener.
40:58No, me either.
40:59Yeah.
40:59You just have to observe what podcasters are doing and what influencers are doing.
41:04They're speaking in a much more informal way, as if they were talking to their buddies.
41:08He's scared!
41:09And I'm joking.
41:11Yep, tenters up, shitting himself.
41:13Okay.
41:13They're much more relatable than traditional journalists tend to be, and people can connect
41:19with them.
41:21I found often audiences would say, you journalists in the mainstream media, you've brought this
41:28on yourself.
41:29I don't like your facts.
41:30There's other facts.
41:32There are no alternative facts.
41:34Because, let's say right here.
41:35But why do so many people believe there are?
41:37I guess that's what I'm getting at.
41:38Because it's easy to insidiously manipulate people today.
41:41What used to take intelligence agencies a lot more money can now be done cheaply, easily.
41:49Silicon Valley opening the gates to this kind of insidious manipulation.
41:54And the crazy thing is, people don't believe they can be manipulated.
42:00Again, it goes back to our cognitive biases.
42:03And until we realize that, we are Pavlov's dogs.
42:09The media environment, the technology environment, has allowed for and encouraged greater tribalism.
42:16We feel more comfortable with our tribe.
42:18We trust them.
42:19And so, it just reinforces an instinct that's been part of society forever.
42:33Across town, Duterte supporters are enjoying what feels more like a pop concert than a political rally.
42:41Vloggers are out in force here too.
42:43And they're upset that Rodrigo Duterte isn't here.
42:47He's in The Hague.
42:48The former Filipino leader, Rodrigo Duterte, is now in the custody of the International Criminal Court
42:54after being arrested on charges linked to his war on drugs.
42:58Thousands of small-time dealers, users, activists and others were killed without trial.
43:06Despite incarceration, Duterte is actually vying for a mayoral seat in this election.
43:12And his daughter is vice-president.
43:15Well, this is where they're all here to see, the vice-president, Sarah Duterte.
43:19Her dad was the president, he's in The Hague, and she's at war with the current president.
43:24In the past months, my family's name has been dragged through the mud.
43:30I am not the problem of this country.
43:43Of course, the Duterte camp has its own army of full-time vloggers.
43:52Between 10 and 30 million US dollars gets funneled to vloggers and influencers here each election cycle.
44:00Before the platforms reward them, they're rewarded by money here.
44:05People can make money from disinformation.
44:07People see the opportunity to make money.
44:09They see the opportunity to acquire power by exploiting the media environment as it exists today.
44:14People are using it for their own commercial ends.
44:17They're using it for their own political ends.
44:18We don't have a counterweight to that.
44:21And that's where we are right now.
44:25Standby. 10 seconds.
44:2710, 9, 8, 7, 6, 5, 4, 3...
44:33Hello and welcome.
44:35I'm Maria Ressa, and you're watching Rappler's special coverage of the 2025 national and local elections.
44:41And the polls have officially opened.
44:4568.4 million Filipinos are registered to vote.
44:49President Ferdinand Marcos declared today a holiday.
44:52So, Filipinos, go out and vote!
44:56Voters don't disappoint.
44:58It's the highest turnout in the history of midterm elections.
45:02More than 80%.
45:06Joining us now is Jairo Balendo.
45:10Hi, I'm Maria. I'm reporting live from Davao City and all eyes on Davao City because of many...
45:15As results start rolling in, former President Rodrigo Duterte is delivered a sweeping victory.
45:23The Dutertes emerged as big winners in this year's elections in Davao City.
45:28Former President Rodrigo Duterte set to be elected mayor of Davao City by a landslide despite being detained at The
45:35Hague.
45:38It's a symbolic win.
45:40A mayoral seat Duterte can't reach from his prison cell.
45:44Social media has been flooded with revisionist versions of Duterte's brutal leadership.
45:51And the voters seem to have bought it.
45:55What's happened in the Philippines has happened in many other parts of the world now.
46:04The White House has banned journalists from entering the Pentagon
46:07if they haven't signed a legal document agreeing to only report information that's been authorised by the Department of Defence.
46:14I think that the environment now is a dangerous one for our democracy.
46:20I think we have an administration that's pushing our democracy to the limits in many ways, aiming to break it.
46:2972% of the world is now under authoritarian rule.
46:33We are electing illiberal leaders democratically.
46:38All because of what's on that device.
46:40I think this is the accelerant and an enabler for digital authoritarians.
46:47Every day that democratic nations do not reclaim their rights from big tech is a day they get weaker.
46:55And a government that wins when there's no trust, the only government that survives in that environment is a dictatorship.
47:04Do you think we can win this fight?
47:07We have to wake up.
47:10We have to look at the tech.
47:13We have to admit that we have been manipulated.
47:18And we have to demand better.
47:21This is a thinking fast world.
47:24That's where the big tech companies want you to stay because that's where they make the most profit.
47:29But you need to move into the thinking slow world.
47:33And we need to solve it at scale or we will lose our agency and our humanity.
47:44Join me next time.
47:45This is very new and very weird.
47:47Are these people real?
47:48I'll be diving head first into AI.
47:51Was there anything about it that looks weird to you?
47:54No.
47:55Where is AI taking us?
47:56So you reckon he's real?
47:58That looks fake.
48:00I was really easily tricked by it.
48:02And do we even want to go there?
48:03We don't want to sleepwalk into a future that nobody wants.
48:06They've invested hundreds of billions of dollars into this industry and they want to seek a return.
48:11Are you scared?
48:12I'm very scared.
48:18Don't wait a week when all episodes of The Matter of Facts are streaming now on ABC iview.