00:03Hi, I'm Hamish Macdonald, and I'm a journalist, which means facts are my business.
00:09Where do you get your information from?
00:11Social media.
00:12Facebook.
00:12Facebook, Instagram, and YouTube.
00:15Social media is where so many of us now get our information.
00:20But can we rely on it when tech makes it so easy to create a fictional story which can
00:28look and sound real?
00:29I went to the moon last week, don't you know?
00:31Even though it's entirely fake.
00:35Are you like so many people now, struggling to know what's true and what's false?
00:41In this series, I'm going to explore how big tech is transforming our lives and our minds.
00:47We're going through a period of radical change in the way that people consume information.
00:51We've created this monster.
00:53An algorithm is outsourcing human choice into a machine's decision.
00:59The end goal is to keep you scrolling, hack our biology, change the way we feel to change
01:04the way we see the world.
01:06Is anyone else paying attention to how all of this is affecting society?
01:12I want to understand how technology is changing the way we think.
01:16It really does question your reality.
01:20And why we're all susceptible to mis- and disinformation.
01:23We are just as vulnerable but we think we're not and actually that's what makes us more vulnerable.
01:30Do you worry about what happens if we can't agree on facts?
01:34This has been an information war.
01:35Facts stop mattering.
01:37This growing sense that anything could be fake at any time.
01:41It looks like you.
01:43It does not mean.
01:45The facts are so polar opposite.
01:47Which ones are the facts?
01:49We can't even agree on how to determine what a fact is.
01:52Is distrust creating fracture lines in your community?
01:56I've never seen so much anger and hate in my lifetime.
02:00I was getting an average of 90 hate messages per hour.
02:04Are you scared?
02:05I'm very scared.
02:08Is disinformation really a threat to democracy?
02:12A world where people don't trust anything.
02:14Like who is that good for?
02:16It's part of the propaganda.
02:17It's a biased content production.
02:19The tearing down of democracy began by tearing down facts.
02:24Join me on a journey right around Australia and the world
02:27to meet people determined to keep facts alive
02:30in this real-time information revolution.
02:50Facts are being eroded in communities all over Australia,
02:55including where I grew up in the Snowy Mountains High Country.
03:01It's still the same small, tight-knit community,
03:06with one fundamental difference.
03:09Everyone is now online.
03:13And on the local Facebook forum,
03:15things are getting increasingly spicy.
03:24The single biggest debate that plays out here on social media
03:28involves horses and this national park.
03:31The community is divided, deeply divided,
03:34between those who believe the Brumbies, as they're known,
03:37should roam freely in these mountains
03:39and those who see them as pests,
03:42damaging this pristine environment.
03:49The horses have adapted and multiplied
03:51since they were introduced here over 160 years ago.
03:56For decades, the fight to protect Brumbies
03:59has been waged by a well-known local,
04:03former Nationals politician Peter Cochran.
04:06We will have 1,000 horses on Macquarie Street.
04:09Now, you'll need a shovel after that,
04:10let me tell you.
04:17Now, at 81, Peter is still wrangling horses on his farm.
04:22The great advantage of getting out of politics
04:24was I got decontaminated early.
04:26What do you mean by decontaminated?
04:27Don't tell lies anymore.
04:31Peter rides every day
04:33and takes horse treks up through the mountains.
04:36I know, so if you follow me...
04:40Peter's family has farmed here for generations.
04:44I don't think I could be much happier than on a horse.
04:47It's that spiritual connection with man, animal and the land.
04:52But as the National Park expanded,
04:55their grazing rights contracted.
04:58Progressive removal of the livestock grazing from the High Country
05:02was to the detriment of the income or the livelihood
05:05of a lot of the people surrounding the park.
05:07So that created a massive political conflict within the community.
05:12But if we wanted to resolve an issue with government,
05:16a meeting would be called in one of the town halls
05:19and they'd thrash it out.
05:21There wasn't the opportunity for people to tell as many lies.
05:25Today, the community is divided around one central fact.
05:30How many Brumbies actually exist in the National Park?
05:35The numbers are far too high
05:37and the damage to the landscape is too great.
05:39They are not the right animal in the right place right now.
05:41Thousands of horses have already been shot.
05:45Save our Brumbies!
05:47Brumbies supporters are dismayed and angry.
05:51And that anger is red-hot across social media.
05:57Do you think there's more friction in the community now
06:00than there was before?
06:01Like, I observe it as someone that grew up here,
06:04I don't live here now,
06:06but it seems like it's become more fractious.
06:09Like, there's more hate, there's more anger.
06:11I've never seen so much anger and hate in the community in my lifetime.
06:16The social media has played a major part in the division within the community.
06:21But I guess these social media pages are meant to bring us together.
06:24Exactly.
06:25And it's done exactly the opposite.
06:27Yeah.
06:27It's done exactly the opposite.
06:28Could you still get the community together in a town hall
06:31and nut this out in a respectful way?
06:33The risk you'd have of having a town hall meeting now
06:36is that conflict that had arisen over social media
06:38would emerge at the town hall meeting.
06:41Yeah.
06:41And there are people getting very angry.
06:43That's what worries me.
06:44I just...
06:45There's incredible anger in the place.
06:57Social media has no boundaries, you know.
06:59People are jumping on and making statements like they're absolute fact,
07:03and they live on the north coast and never been to the park.
07:08Rocky Harvey runs a deli in Jindabyne.
07:11He's convinced authorities exaggerate the horse numbers
07:15to justify slaughtering them in their thousands.
07:19Would it have helped in this whole debate
07:22if there was an agreed set of facts between everyone?
07:26No doubt.
07:27No doubt.
07:28But the facts are so polar opposite, you know.
07:31Well, everybody says...
07:32Can they all be facts?
07:33No, they're facts.
07:34I can promise you they're facts.
07:36It seems like everyone in this whole argument...
07:40Yeah.
07:40..has their own individual set of facts.
07:42Yeah.
07:43That's social media, though, isn't it?
07:48Maybe one thing we can all agree on
07:50is that when it comes to communication between humans,
07:55social media is now a very Wild West.
08:01So, how did it all get so feral?
08:06On the 14th of June, 1995,
08:10a US senator tried to do something
08:11that no-one had ever done before.
08:16Regulate the internet.
08:19I invite some of you to take a look
08:20at what is on the internet system today.
08:24We have not only the right,
08:26but the responsibility to protect our children.
08:29Senator James Exon was deeply concerned
08:32about the terrible things
08:33young people might see on the internet.
08:35He wanted to protect them.
08:36So, he introduced the Communications Decency Act to Congress.
08:44The senator successfully got his bill through Congress,
08:47but with a clause that's been described
08:50as the 26 words that created the internet.
09:00These words effectively freed the online platforms
09:03from liability around what they were publishing.
09:06They were just providing a space for free speech.
09:10This 1996 law gave social media companies
09:15a total immunity shield.
09:17They're not responsible for any consequence,
09:19let alone whether it's harmful content.
09:22Section 230 helped create the internet as we know it.
09:25It has helped new ideas get built,
09:27and our companies to spread American values around the world,
09:30and we should maintain this advantage.
09:33Section 230 has helped make US tech companies so powerful
09:37they're difficult to regulate in Australia
09:39or anywhere else for that matter.
09:42Social media has placed fact and fiction on a level playing field.
09:47Who would know whether something has emerged
09:49from a credible source or just from a conspiracy theory?
09:52You might think, well, there's professional fact-checkers for that.
09:56But in this new digital reality, they're gone too.
10:00We're gonna get rid of fact-checkers
10:01and replace them with community notes.
10:03The fact-checkers have just been too politically biased
10:06and have destroyed more trust than they've created.
10:09And all of that means that it's the users of social media
10:12left with the job of figuring out what's true and what's false.
10:17That means you.
10:19So how do you reckon you're going with that?
10:23A study by four leading universities
10:26has found that 97% of Australians
10:29struggle to fact-check online information.
10:32Do you know what the definition of misinformation is?
10:35Um, putting out information that's completely false,
10:37knowingly knowing it's false.
10:39So that's actually disinformation?
10:41Oh, is it? Oh.
10:43Do you know what the definition of disinformation is?
10:47Um, no, I don't.
10:49I think it's just outright false information.
10:53I'm not sure.
10:54Disinformation looks accurate,
10:56but is deliberately delivered to you to deceive you.
10:59So, to be clear, misinformation is false information,
11:03but the person sharing it might believe it's true.
11:07Disinformation is malicious.
11:10False information that's designed and shared to deceive you.
11:15Both catch us out online more than we think.
11:19I don't think I'm ever tricked by false information,
11:23but how would I know if I'm being tricked by false information?
11:28With the help of ten intrepid participants
11:30and two cognitive psychologists,
11:33we're about to put Australia's online lie detector skills to the test.
11:37We have been aware for well over a hundred years
11:40that perhaps we're not perfect in our perceptions,
11:44that we are vulnerable to being tricked.
11:48Over the course of the series,
11:50our psychologists will put these participants
11:52through a range of tests,
11:54revealing how our brains decipher fact from fiction.
11:58Your task is pretty simple.
12:01There'll be three different pictures that come up on the page
12:04and I just want you to respond based on what you see.
12:10You might be tempted to try and guess what the correct answer is,
12:14but what we're interested in for this one is what you see.
12:21I was a detective for 22 years.
12:24I just had the mantra I'd say to myself
12:27that not all is as it appears.
12:42OK.
12:44Oh, fail.
12:45We all did.
12:47So, what you just saw was a bunch of what we would call in psychology
12:52visual illusions.
12:55They're particular scenarios that trick our brains
12:59into seeing things not as they are.
13:03Our first one, these are exactly the same size.
13:08What this one is telling us about how our brains work
13:12is that what we see is not exactly what's out there in the world,
13:16but it's influenced by context.
13:20This example is an example of visual context,
13:23but all sorts of things can influence how we interpret the world.
13:28The next one asks which line is bigger.
13:31The typical answer is the top line, but again,
13:35it's the context that is changing what we see here.
13:38These are exactly the same length.
13:41Oh, yeah, I didn't think like that.
13:43And then our last one.
13:46There's actually no triangles or circles in there.
13:49There's just contours that are set up to make it look like there is.
13:54But this is actually functional.
13:56Our brains trick us in this way for really important reasons.
13:59When we're looking at the world around us, like a tree obscuring a river,
14:03we still want to know it's the same river.
14:04So it's the thing that makes us see continuity there.
14:07You're experiencing this kind of filling in information every day as you look around.
14:14The brain has developed to be quite clever in navigating its environment efficiently,
14:19but those conditions that we face in the modern information environment
14:24are very different to the conditions in which we evolved,
14:27and so we're seeing it being tricked and harnessed
14:30in very unusual and problematic ways
14:33that are driving some of these problems around misinformation.
14:38Big tech platforms have tricks to exploit our psychology too.
14:44There's something in the computer.
14:46I don't know how it gets there.
14:48Their number one tool is the algorithm.
14:51Algorithm? Algorithm? Algorithm?
14:54I put my teeth back in.
14:59It learns what we click on
15:01and keeps feeding us variations of that content
15:03so that we stay engaged.
15:05When you're leaving your digital footprint online,
15:08you're letting the internet know about your gender,
15:11your age, your political preferences,
15:14and it's using that information to tailor what's coming your way.
15:20An algorithm is outsourcing the human choice
15:23into a machine's decision.
15:25If you...
15:26You have to convene us here to try the behaviour...
15:30We've created these supercomputers pointed at people's brains
15:33that's trying to get them to do the next thing.
15:35If you...
15:36...say the truth...
15:37This race to the bottom of the brainstem for attention
15:40pulls out the worst in society.
15:47Algorithms rely on emotional content to get our attention,
15:52especially anger and outrage.
15:58I have a lot of sympathy for people who see a horse
16:01and have a really emotional reaction,
16:04but the issue is we've got species here
16:07that are found nowhere else on Earth.
16:09Jack Gough is the CEO of the Invasive Species Council.
16:13He's fighting to get feral horses out of the park.
16:17Do you have any sense of how many Brumbies there are today?
16:21We rely on what the scientists have said.
16:24I mean, helicopter aerial distance sampling suggests
16:26that there's somewhere between about 2,500
16:29and about 5,500 horses.
16:31Obviously, it's pretty difficult to get a precise number.
16:36How wrong were the numbers in your mind?
16:38Oh, probably about 80% wrong.
16:40We ride up there up to five days a week.
16:43We know how many horses were there.
16:45It's just so polarised, the Brumby world, you know.
16:48You've got some real hardcore pro Brumbies
16:51who were underestimating the numbers
16:53and the other end of the spectrum
16:55where, you know, you've got to exterminate them all.
16:58This has been an information war, absolutely.
17:01The reality of social media is that people
17:03get into their own bubbles
17:05and it becomes very difficult to talk to each other
17:07and also there's a degree of egging each other on.
17:11It's sort of, they're no longer about the facts on the ground.
17:16When it comes to herbivores causing damage,
17:19people don't intrinsically understand that.
17:22And so it's very easy to paint a picture
17:25of killing a horse as an evil act.
17:30Those things go to people's deep worldviews
17:32and it seems to then pull out a level of aggression
17:35when they feel threatened.
17:36Sadly, the online space has allowed those people
17:40to escalate in their rhetoric
17:41and that's actually then translated offline.
17:44Groups intimidating National Parks staff
17:47or sending threats to firebomb National Parks officers.
17:52One of the greatest risks that I see in the organisation
17:55is personal safety of some of our staff.
17:58Rocky Harvey isn't accused of making such threats
18:01but the discovery of dead horses did inspire him to act.
18:06Snowy Plain, they shot 69 horses
18:08and left the carcasses in a really pristine environment
18:13and that to me pushed me over the edge.
18:22So then I sort of took a horse head and threw it in the doors
18:26at the National Parks office and filmed myself doing it.
18:32I had a clear message with my post online.
18:37I said, nobody is cleaning up at Snowy Plains.
18:40They need to get this shit sorted out.
18:50I reckon what Rocky Harvey did was the best thing that's happened for years.
18:54I loved every bit of it.
18:56It's challenging authority.
18:57You know, there's a bit of humour in it, related to The Godfather.
19:02The National Parks and Wildlife Service deserved everything they got out of that.
19:06Some of them got personal threats.
19:08Who?
19:08National Parks workers.
19:09Well, why wouldn't they?
19:11I mean, they bloody well deserved it.
19:12We can have disagreements over things.
19:15They don't need to escalate into threatening and violent behaviour.
19:19I mean, you can't lay down every time somebody confronts you
19:23with something you don't like and say, put the white flag up.
19:25Oh, the hell with it.
19:27Do you regret doing it?
19:29No, not at all.
19:30I probably would have done a whole horse, but it was just a bit heavier.
19:36Look, I see it as a clash of world views that's going on here.
19:40Unfortunately, what we tend to see is facts stop mattering.
19:45Why do you think anyone would want to lie about this stuff?
19:49They don't want us in the mountains at all.
19:51Why would the scientists be dishonest?
19:55Follow the money.
19:57What do you mean?
19:59I think you can get scientists to sort of produce stuff that's in your favour
20:04if you encourage it that way.
20:07You know, when you've got groups saying, we know for certain and we've brought out this
20:12expert who said there's only 900 horses in the park and then the New South Wales government
20:17is able to remove 9,000 horses over the next 12 months, you'd have to say that that's approaching a
20:24lie.
20:25Everybody tells lies.
20:26I mean, and they do.
20:27That's a fact.
20:28Have you told lies about this stuff?
20:30I don't need to because I'm on the truthful side.
20:32I don't need to tell lies.
20:33But everyone says that, right?
20:35Well, I can justify everything I've said.
20:37You don't need to tell lies.
20:39And the facts are there.
20:40There's a lot of information that is disinformation in the entire social media platform.
20:48That's what's changed the world.
20:53So look, I know I'm talking to you about where I come from, but I reckon there's a chance that
20:58some of this feels familiar.
21:00There are versions of these arguments going on in communities all over the country, all over the world.
21:05Where you live, it might be about wind turbines or housing developments or maybe even immigration.
21:11And the thing is, none of this is happening by coincidence.
21:21Algorithms have created an online world where facts and truth are receding.
21:27In the public information ecosystem, lies literally spread six times faster than facts.
21:33When you don't have facts, you can't have truth.
21:37Without truth, you can't have trust.
21:40Without these three, we have no shared reality.
21:43We can't begin to solve any problem.
21:49Once trust is fractured, a community might then struggle to believe anything they're told.
21:54By media, by science, by government.
21:57And this creates a perfect target for disinformation.
22:05Go! Go!
22:10On the last day of 2019, I was among the thousands of Australians on holiday, caught in the black summer
22:16bushfires.
22:19While some people fled towns by road, for thousands of others, it was too late to leave.
22:25Oh my God!
22:26Wow!
22:27They gathered on beaches for safety.
22:29An unprecedented emergency situation.
22:32It's like a war zone.
22:36Millions of hectares were burnt across multiple states.
22:40Right now, we've got just under 100 fires burning.
22:43That's uncharted territory for here in New South Wales.
22:46In your car and evacuate!
22:48Immediately!
22:51I reported from the New South Wales coast through January as the fires spread.
22:56And now we're facing multiple new significant fire fronts.
22:59It's felt like night all day.
23:01There's not a lot left.
23:02Nothing.
23:03We've got nothing.
23:07Emergency workers were overwhelmed.
23:09I haven't seen anything like this in my lifetime.
23:11My house is burning down!
23:14What are you doing?
23:16Nothing!
23:17People were traumatised.
23:19Communities were confused and angry.
23:23They demanded answers.
23:25How does this happen?
23:27How can we stop it?
23:29The facts are that Australia is burning, while we turn a blind eye to the driving force,
23:35which is climate change and a warming planet.
23:37The world watched as the fires chewed through the landscape.
23:42And as that happened, something started to trend on social media globally.
23:47Hashtag arson emergency.
23:51It was messaging that the majority of the bushfires had been caused by arsonists.
24:00By January 7th, the arson emergency narrative was being repeated around the world.
24:07Australia has an arson problem.
24:09You can't pin on global warming, climate change, or whatever title you're giving your environmental
24:14boogeyman.
24:15So, at this point in the season, 24 people had been arrested in New South Wales on charges
24:21of arson.
24:21But only 1% of the land burned across the state during that summer could be officially attributed
24:27to arson.
24:29Lightning sparked the vast majority of the fires.
24:35Now, researchers couldn't identify the origins of the hashtag, but they found arson emergency
24:40could be a disinformation campaign designed to undermine the link between fires and climate
24:47change.
24:48There used to be something called consensus reality, that mostly we are seeing a shared
24:54set of stories, a shared set of events.
24:57But if you look at a big moment in culture, you'll see how different tribes or political
25:04groups who live in different information bubbles, when they see this story, they see specific evidence
25:10that is different than the evidence that other tribes see.
25:13Dr Timothy Graham from Queensland University of Technology analysed more than 300 accounts
25:18using the arson emergency hashtag, and a third of them could be from bots.
25:23Not only individuals, but also media, as well as potentially malicious actors, are picking
25:28up on this content and spreading it.
25:30It spreads like wildfire.
25:32So, whatever is maximally inflammatory is what the algorithms are sorting for, and
25:39then making that the centre of attention for every topic that is inflammatory.
25:44And we can't even see it because it has literally been our 24-7 reality for the last 12-15
25:50years.
25:54Australia's not alone in this.
25:56All over the world, when disasters happen, floods, bushfires, disinformation follows.
26:01It's entirely human that, in those moments of panic and confusion, we want to find ways
26:07of understanding what's going on, even if there aren't clear or obvious answers.
26:12It's that very vulnerability that disinformation attaches itself to and feasts upon.
26:23If you're feeling overwhelmed, there's a reason.
26:27On average, we're devoting more than six hours a day to the consumption of digital information.
26:34And scientists are only just discovering what this is doing to our brains.
26:43I reckon since I woke up this morning, I've probably checked my phone a couple of hundred times.
26:48Emails, text messages, newspapers, all the usual stuff.
26:52On average, it says I use the phone about four hours a day.
26:56Sorry.
26:59It says I've actually used it, on average, four hours a day.
27:03Sorry, just checking a message.
27:08Anyway, that's why I've come here to Melbourne University, to the School of Psychological Sciences.
27:13Because I've been thinking, maybe it's actually doing something to my brain.
27:17What's it doing to your brain?
27:22I'm getting nervous now.
27:23Don't get nervous. It's OK. Just come and sit here.
27:25Professor Katherine Johnson is going to test how distraction affects my comprehension.
27:31I'm taking the phone.
27:32OK.
27:33Sorry.
27:34That's all right.
27:35You might get a sense of anxiety in your heart.
27:37I'll cope.
27:38So we've designed this new experiment, trying to simulate more of an everyday life situation,
27:44but still test your attention control.
27:47So this kind of goes to the avalanche of information that we're all taking in constantly now.
27:52That multitasking approach that we have to life now.
27:56So the task you'll do is a proofreading task.
27:58OK.
27:59OK.
28:00Whew.
28:00Crazy scientific experiments win prizes at leading US university.
28:05Katherine's given me a text with spelling and grammar mistakes that I'm meant to correct.
28:11Through their bottoms in emergencies.
28:14While I'm being distracted by other prompts.
28:17Project Pigeon was called off in 1944 despite a seemingly successful...
28:23If I don't do my correction fast enough...
28:26That's gone.
28:27..the prompt disappears.
28:29The botany prize was awarded for research which found that...
28:36OK.
28:38Real plants...
28:38I also have to press specific buttons when they light up.
28:41Tin tan is part of the...
28:42It feels a bit like multi-multi-tasking.
28:47Ah, I've pushed the wrong thing.
28:50Good.
28:51Well done.
28:53How did you feel you went with that?
28:56Um...
28:56I mean, I know I made mistakes.
28:58Yeah.
28:58I mean, it's interesting, isn't it?
29:00Because we're thinking about whether we...
29:02have the capacity to critically assess information that's in front of us.
29:07Yeah.
29:07If we're trying to consume a huge amount of information...
29:11Yep.
29:12..at rapid speed...
29:13And if it keeps coming...
29:14You're going to make mistakes and you won't even realise you've made a mistake.
29:17So, um...
29:18So, how many mistakes did I make?
29:21Oh...
29:21Sorry.
29:22What?
29:24I'm just seeing all the errors.
29:26Oh, right.
29:27Sorry.
29:28It's not looking good.
29:30You made ten incorrect responses and you missed seven.
29:34So, is that a bad average response or...?
29:37On average, they didn't miss any words.
29:41Really?
29:41They made no mistakes?
29:43Yep.
29:44OK.
29:45So, they got the whole thing, right?
29:47So, I'm really below par.
29:49Well, you know, it's, um...
29:51I sort of sub-edit work as a career.
29:56But maybe I just am used to the mistakes.
29:59I don't know.
29:59Oh, do you use, like, autocorrect?
30:01No.
30:02No.
30:05What I've just very clearly demonstrated
30:07is that our brains aren't wired for multitasking.
30:11Research shows it stretches us thin,
30:13reducing our IQ by up to ten points
30:16and rendering us 40% less productive.
30:20Even simple tasks require the brain to transfer attention,
30:24and every time that happens,
30:26we risk missing something important.
30:29Just go and look at the camera.
30:31Yep.
30:31Walk at your normal pace.
30:32I'll be shooting you from over here, so I'll cue you.
30:35So, if you just wait there...
30:36Sure.
30:36No worries.
30:38Our participants don't realise they're players
30:40in one of psychology's most instructive tests
30:43of human attention.
30:46Hey, sorry, mate.
30:47Hey, bud.
30:48Can you tell me where we are at the moment?
30:50I'm about to get filmed on something,
30:51but I'll give you the game.
30:52Oh, sorry.
30:52Where are we?
30:53Yeah.
30:53You're asking someone that doesn't know that well,
30:55but we're here.
30:56Where are you trying to get to?
30:57I'm trying to get to Circular Quay.
30:59Alright.
31:00Just walk up those stairs.
31:01See that stair where those people are?
31:03Oh, yeah.
31:03Yeah.
31:04Up and over, and that'll take you straight.
31:05That'll take you...
31:06Oh, okay.
31:12We're here.
31:13Yeah.
31:13Yeah.
31:15And then...
31:16You go up the stairs, and you follow it down.
31:18Oh, George Street.
31:19Yeah, okay.
31:20Here we are.
31:20No worries.
31:21Alright.
31:25This is the bridge there,
31:27and then so we're on this pier here.
31:29How do I get to...
31:30Oh, sorry.
31:31Sorry.
31:33So, see those stairs over there?
31:35Yeah.
31:40Oh, this is part of it.
31:42You're tricky.
31:44I'm highly observant here.
31:46Good on you.
31:48Did I wreck it?
31:52You're my hero, Shelley.
31:55I'm highly observant.
31:57Hey, could you tell me where we are at the moment?
32:00We are in Sydney.
32:01Yeah.
32:02Just see this bridge here?
32:04Yeah.
32:04That's that bridge there.
32:05Okay.
32:05I'm looking for...
32:06Oh, sorry.
32:12Sorry.
32:13What are you looking for?
32:14Circular Quay.
32:14Circular Quay, yes.
32:15You can get there quite easily.
32:17Damn.
32:18It's actually cut across to...
32:19See these stairs here?
32:21Look behind you.
32:22See those stairs that go up through the middle?
32:23Oh, okay.
32:24Yeah, yeah.
32:24You want to go through those?
32:25Because those stairs are...
32:26Oh, wow.
32:27Yeah, that's amazing.
32:28So, if you go around, that's crazy.
32:30But it's a nice walk, but...
32:32I'd go the stairs.
32:33Wow.
32:33This is something we see all the time, people being fooled by it.
32:39So, that's what attention does to our world.
32:42When we're really focused on something else, we are filtering out a bunch of other information
32:48that seems less relevant for us.
32:50And it goes back to things like visual illusions, all the cognitive biases.
32:56Our brains are looking to make sense of the world in the most efficient way possible.
33:01I didn't look.
33:02So, like, I'm not surprised at all, but it is very hard to watch when you see yourself get
33:09duped and just, like, oh, you just continue on and, like, yeah, it's not a problem.
33:13It really does question your reality.
33:16How do I then know when I'm not being tricked?
33:19Or how else do I let myself down and not see what's right in front of my very eyes?
33:26How?
33:27Like, I have a completely different view now of how people feel if they've been swindled,
33:33or they've been hacked, or they've been in some way fooled online.
33:36How that can have a really adverse effect on them.
33:40I wouldn't have thought I would ever have that enlightening kind of moment,
33:45but I really do now.
33:48We're really susceptible in ways that I don't think people understand.
33:51They think that, well, if I got a PhD or went to college, I'm not vulnerable to this.
33:56But actually, these are psychological biases that are deeper than that,
34:00and that we're all vulnerable to.
34:03This vulnerability isn't a personal failing.
34:07It's how evolution has wired us.
34:10The reality is we are so distracted, we cannot give the same amount of attention to comprehension.
34:19According to Professor Maryanne Wolf, who's spent decades studying how the brain takes in information,
34:24digital distraction comes at a very high price.
34:28We are skimming and scrolling and word spotting.
34:33We know there's insufficient allocation of time to deep thinking, therefore critical thinking.
34:40But Maryanne believes there is a solution within easy reach.
34:45Reading is an act of resistance in an age of distraction in which critical thinking and empathy are on the
34:56decline.
34:59We were never born to read.
35:02We were born to speak, to see, to eat, to think, but not to read.
35:08There's no one region in the brain for reading.
35:12Rather, the brain has this fascinating design principle.
35:16And the design allows it for new functions, whether it's riding a bike or reading,
35:23to create a new set of circuits and connect them in a new way.
35:30That brain can learn to read multiple writing systems.
35:34It's amazing what we can do.
35:37I feel like I'm being decorated like a Christmas tree.
35:40It's all for science.
35:43Neurological tests demonstrate that our brains are plastic.
35:48Over our lifetimes, they create new neural connections,
35:51so we can keep learning and adapt to change.
35:55What's the connection between attention and our ability to analyse
35:59and our ability to actually process something?
36:02Attention is underpinning everything.
36:04Yeah.
36:05When you're needing to think about whether something is true or untrue,
36:09you're concentrating and that's attention.
36:11Yeah, OK.
36:12What scientists are now seeing is a shift in our neuroplasticity
36:17from prolonged time on devices,
36:20with negative implications for our cognition,
36:23our emotions and our behaviours.
36:27Social media is changing how we gather information
36:30and connect with each other.
36:33Well over five billion people across the globe now use social media
36:38to better understand issues, large and small.
36:40Here's how I would survive an alien invasion.
36:43Damp removal on my refrigerator.
36:45Online health advocates are particularly popular.
36:48I've been drinking my own urine for 17 years.
36:51Putting garlic cloves in my nose after a sinus infection.
36:54The majority have a financial interest in what they're promoting.
36:58This is day one of me ingesting borax.
37:01Borax.
37:02Borax.
37:02Borax.
37:03And less than 6% cite credible scientific evidence to back their claims.
37:08Why do I recommend turpentine cleanse for parasites?
37:13I have seen parasites jump out of someone's eye holes.
37:16Six months later she has another pap smear.
37:19They just so happened to take it on the same spot.
37:22What's the result?
37:23Abnormal cells.
37:24What caused the abnormal cells?
37:27The first pap smear that was never allowed to heal.
37:30Barbara O'Neill is an Australian natural health advocate
37:33with hundreds of thousands of followers.
37:37I'll tell you something.
37:38Australia has never ever had so much information from me
37:44since I was banned.
37:49Barbara has been formally banned from providing any health services
37:53after she was found to have delivered misinformation to her followers.
37:57And I know many were blackmailed into having the COVID shot.
38:02The vaccines.
38:04They are not safe.
38:07As well as her dangerous advice on vaccinations,
38:10Barbara has promoted debunked cancer treatments
38:12and advocated against the use of sunscreen and antibiotics.
38:17And that's what we use.
38:20Common sense.
38:21So I really can't understand why they claim I'm a threat to public safety.
38:26Today, Barbara's supporters have come together in a Sydney park
38:30directly opposite the Health Care Complaints Commission,
38:33which made the decision to shut Barbara down.
38:36I've never protested before in my life.
38:38This is my first time ever being front line
38:41and making a stand for something that I truly believe in.
38:44If she's dangerous, I choose to be in the danger zone.
38:47You're a gift from God.
38:49You're a blessing.
38:50Thank you so much.
38:51Have you got a pen?
38:52I do.
38:53I have a texter.
38:54I feel like I'm shaking.
38:58Barbara has a loyal following,
39:00an audience particularly receptive to the idea
39:02that governments can't be trusted when it comes to our health.
39:06What I do is I teach people how to look after their bodies.
39:09I just wanted to say thanks to you that you cured my stage 2 cancer.
39:14Oh, that is so nice to hear.
39:16It really is based on our God-given right to choose.
39:20You're a wonderful teacher.
39:21That is good.
39:22It's my right to choose actually what I do
39:25and what I do not do to my body.
39:27So I see one of my main roles is just to give information.
39:31One lady told me that she totally conquered her breast cancer
39:35by just using the castor oil compresses.
39:37Australian authorities say Barbara poses a risk
39:40to the health and safety of the public.
39:42But like so many unregistered wellness promoters,
39:45her business is online,
39:47where Australia's regulatory bodies can't touch her.
39:51Please don't have a pap smear
39:53because they are contributing.
39:55Every time you have a pap smear, you're damaging the cervix.
39:59Barbara's legion of followers spans the globe.
40:02According to the Health Care Complaints Commission,
40:05she's misled vulnerable people with health misinformation.
40:10The great cholesterol con, the great cholesterol lie,
40:12the great cholesterol hoax, the great cholesterol deception.
40:16But here is where Barbara's story gets really interesting.
40:20Barbara's voice and image have been hijacked
40:22to create disinformation.
40:25She's now all over the internet,
40:26offering advice and products that she would never endorse.
40:31How much of what's out there is really you?
40:35And how much of it is fake you?
40:37Well...
40:38Most of it's fake.
40:39Most of it's fake?
40:40Most of it's fake.
40:41I want to show you some, if that's all right,
40:43because I've had a look and I kind of can't tell the difference.
40:46Very interesting to have a look.
40:47Can we watch some and see?
40:48Absolutely.
40:49Okay.
40:50I'll be talking about this incredible body that we live in
40:53and how you can give it the right conditions
40:56to bring about a healing response.
40:58You'll be hearing from about 35 world-renowned speakers on health.
41:03So is this you?
41:03This is one of the must-attend health events of the year.
41:07Yeah, it is.
41:07And put in...
41:08And that was a real video.
41:09Barbara.
41:10Okay.
41:11So a billion dollar industry was created...
41:13Ah, yeah.
41:14That's me.
41:14That's you?
41:15The Bible says God gave herbs for the service of men.
41:18They work with the needs of the body.
41:20And this is truth.
41:24Yep.
41:25Cool.
41:25Hello, friends.
41:26Welcome back.
41:27A cancer expert pointed to a common food that may make cancer worse.
41:31We're going to reveal that food.
41:33We'll also talk about daily habits and patterns.
41:36Leave me a little note in the comments below.
41:38That's definitely not Barbara.
41:39It always warms my heart to see how far this message goes.
41:42Sounds like you.
41:43Looks like you.
41:44A bit weird seeing this.
41:46It's crazy.
41:48First, you are going to wash a red onion well
41:51and cut it with the skin on.
41:53And cut it into squares and boil a cup of water.
41:56The voice is not me, but that's me in the background.
41:58Let it boil for a few minutes.
41:58Serve it in a cup and drink.
42:00You can drink it on an empty stomach.
42:02That's definitely not Barbara.
42:04So you do talk about cleaning the uterus?
42:07Yes.
42:09But...
42:10But with some herbs.
42:12So that's not the right recipe?
42:14Yeah, no, no.
42:15I don't use an onion to clean out uteruses.
42:17Okay.
42:19If even Barbara is struggling to recognise true
42:22and phony versions of Barbara,
42:24what chance do her followers have?
42:26Do you see Barbara's stuff on social media?
42:29Oh, everywhere.
42:30Everywhere.
42:31I see her stuff on Facebook, Instagram and YouTube.
42:34And some of her stuff's been manipulated using AI?
42:37Yes.
42:38Have you seen that?
42:39I can actually tell the difference now.
42:41I can actually tell when it's been AI,
42:44just the way she's speaking.
42:46So you feel confident that you know the difference?
42:48I do, personally.
42:49Okay.
42:49But that's maybe because I've been following her for quite a while.
42:52Can I show you some and see how you go?
42:54Yeah.
42:57What do you reckon is that?
42:59Real or fake, Barbara?
43:02I think that's real.
43:03Yeah.
43:03That's definitely it.
43:05Okay.
43:06How about this one?
43:06Lindsay?
43:08No.
43:13That looks like her in one of her group meetings.
43:18Lectures.
43:18Lectures.
43:18Yes.
43:19I would take that as yes.
43:21Okay.
43:21With the onion, the honey.
43:23Yeah.
43:23I had onions in my socks last week when I was unwell.
43:26Really?
43:27Yeah.
43:27What did the onions do in your socks?
43:28Oh, haven't you heard this one?
43:30No.
43:30They draw out all the, um, sickness basically, the impurities.
43:36Like literally my feet were wet, like the onions had sweated.
43:40Okay.
43:40Um, my sickness went out.
43:43Okay.
43:43So you reckon it works?
43:44It works.
43:45If Barb says it works, it works.
43:48So just on the uterus cleaning video, I need to be clear, Barbara says that's a fake.
43:54She does say that's a fake.
43:55I thought that was, yeah.
43:55So she does talk about uterus cleaning.
43:57Yeah, but not like that.
43:58But not using onions.
43:59See, that's so dangerous.
44:00I think they took information from a separate video and put it in that one.
44:04But does that worry you?
44:05It does.
44:06What's real?
44:07Like pushing through the fog.
44:08And you feel like you know what's real and what's fake?
44:10I certainly don't, but I think I'm learning.
44:14I think when you listen to these stories, you can start to understand how these layers
44:20and layers of stories, some of them factual, some of them fictional, some of them based
44:26on pseudoscience, start to enmesh and get lost in themselves.
44:32It kind of feels a little bit like being caught in a riddle.
44:35And you can also hear people's vulnerabilities and how easy it is to become kind of caught
44:41up in that.
44:42What is fact?
44:43What is real?
44:44What is truth?
44:46What is truth?
44:47And now it's even difficult to know whether what you're seeing is the real thing.
44:59Truth is fragile.
45:02And truth has been, if you will, captured, kidnapped by those who will say, this is the truth.
45:11All women, and especially all modern women.
45:16People have been so inundated with information that what we do is actually end up reading
45:26less of what is outside our silos of information.
45:32And then if we're only going into that silo, we don't use critical thinking as much.
45:41We become vulnerable to misinformation and we are extremely susceptible to disinformation.
45:52We really are seeing decreases in all the deep reading skills, especially empathy and critical thinking.
46:02The act of reading gives us a chance to think differently than our own perspective.
46:12That allows us to leave the egocentrism that we all have and really connect to others.
46:25That, however, is only the beginning of what has to happen, which is critical thinking.
46:33And what the brain does, it's a wild dance where you are saying, yes, no, yes.
46:40Finally, you come to an agreement that this is the truth or this is falsehood or something in between.
46:48But that is critical thinking.
46:52And it takes time.
46:56One of the questions that I'm most concerned about is the role of empathy in reading and whether that is
47:05changing
47:06and whether, if it changes, it will affect our society, including our democracy.
47:15One of the hardest things for me of late is to see the polarization that exists in my country and
47:25around the world.
47:26Democracy is cringe.
47:28Which do you think are more dangerous, drag shows or guns to children?
47:32Drag shows.
47:33Because I think people who come here illegally should be deported.
47:37The trend towards seeing others as sinister or as enemy is the greatest threat to our democracy.
47:47So here we have a connection between reading, empathy and democracy.
47:57Next time, I follow disinformation across the globe.
48:01There might be people here hiding from us.
48:04Then discover the damage it's doing to democracy.
48:06The chief executive of one of the Philippines' most popular news sites has been arrested.
48:12I meet people actually creating disinformation.
48:15And people exposing it.
48:20Companies will behave in ways that are nefarious.
48:23We have an administration that's pushing our democracy to the limit, aiming to break it.
48:33Like what you just watched?
48:35Head over to ABC iview now to stream all episodes of The Matter of Facts.