The Matter Of Facts S01E01

Transcript
00:03Hi, I'm Hamish McDonald, and I'm a journalist, which means facts are my business.
00:09Where do you get your information from?
00:11Social media.
00:12Facebook.
00:12Facebook, Instagram and YouTube.
00:15Social media is where so many of us now get our information.
00:20But can we rely on it?
00:22When tech makes it so easy to create a fictional story, which can look and sound real,
00:29I went to the moon last week, don't you know?
00:31Even though it's entirely fake.
00:35Are you like so many people now, struggling to know what's true and what's false?
00:41In this series, I'm going to explore how big tech is transforming our lives and our minds.
00:47We're going through a period of radical change in the way that people consume information.
00:51We've created this monster.
00:53An algorithm is outsourcing the human choice into a machine's decision.
00:59The end goal is keep you scrolling.
01:01Hack our biology, change the way we feel to change the way we see the world.
01:06Is anyone else paying attention to how all of this is affecting society?
01:12I want to understand how technology is changing the way we think.
01:16It really does question your reality.
01:20And why we're all susceptible to mis- and disinformation.
01:23We are just as vulnerable, but we think we're not.
01:27And actually, that's what makes us more vulnerable.
01:30Do you worry about what happens if we can't agree on facts?
01:34This has been an information war. Facts stop mattering.
01:37This growing sense that anything could be fake at any time.
01:42Looks like you.
01:44That's not me.
01:45The facts are so polar opposite.
01:47Which ones are the facts?
01:49We can't even agree on how to determine what a fact is.
01:52Is distrust creating fracture lines in your community?
01:56I've never seen so much anger and hate in my lifetime.
02:00I was getting an average of 90 hate messages per hour.
02:04Are you scared?
02:05I'm very scared.
02:08Is disinformation really a threat to democracy?
02:12A world where people don't trust anything.
02:14Like, who is that good for?
02:15It's part of the propaganda. It's a biased content production.
02:19The tearing down of democracy began by tearing down facts.
02:24Join me on a journey right around Australia and the world
02:27to meet people determined to keep facts alive
02:31in this real-time information revolution.
02:50Facts are being eroded in communities all over Australia,
including where I grew up in the Snowy Mountains High Country.
03:01It's still the same small, tight-knit community,
03:06with one fundamental difference.
03:09Everyone is now online.
03:13And on the local Facebook forum,
03:15things are getting increasingly spicy.
03:24The single biggest debate that plays out here on social media
03:28involves horses and this national park.
03:31The community is divided, deeply divided,
03:34between those who believe the Brumbies, as they're known,
03:37should roam freely in these mountains,
03:40and those who see them as pests damaging this pristine environment.
03:49The horses have adapted and multiplied
03:51since they were introduced here over 160 years ago.
03:57For decades, the fight to protect Brumbies has been waged
04:01by a well-known local,
04:03former Nationals politician Peter Cochrane.
04:06We will have 1,000 horses on Macquarie Street.
04:09Now, you'll need a shovel after that, let me tell you.
04:17Now, at 81, Peter is still wrangling horses on his farm.
04:22The great advantage of getting out of politics was
04:24I got decontaminated early.
04:26What do you mean by decontaminated?
04:28Oh, don't tell lies anymore.
04:31Peter rides every day
04:33and takes horse treks up through the mountains.
04:36I know, so if you follow me...
04:40Peter's family has farmed here for generations.
04:44I don't think I could be much happier than when I'm on a horse.
04:47It's that spiritual connection with man, animal and the land.
04:52But as the National Park expanded,
04:55their grazing rights contracted.
04:58Progressive removal of the livestock grazing from the High Country
05:02was the detriment of the income or the livelihood
05:05of a lot of the people surrounding the park.
05:07So that created a massive political conflict within the community.
05:12But if we wanted to resolve an issue with government,
05:16a meeting would be called in one of the town halls
05:19and they'd thrash it out.
05:21There wasn't the opportunity for people to tell as many lies.
05:25Today, the community is divided
05:28around one central fact.
05:30How many Brumbies actually exist in the National Park?
05:34The numbers are far too high
05:37and the damage to the landscape is too great.
05:39They are not the right animal in the right place right now.
05:42Thousands of horses have already been shot.
05:45Save our Brumbies! Save our Brumbies!
05:48Brumbies supporters are dismayed and angry.
05:51And that anger is red-hot across social media.
05:57Do you think there's more friction in the community now than there was before?
06:02Like, I observe it as someone that grew up here,
06:04I don't live here now,
06:06but it seems like it's become more fractious.
06:10Like, there's more hate, there's more anger.
06:11I've never seen so much anger and hate in the community in my lifetime.
06:16The social media's played a major part in the division within the community.
06:21But I guess these social media pages are meant to bring us together.
06:24Exactly. And it's done exactly the opposite.
06:27Yeah. It's done exactly the opposite.
06:28Could you still get the community together in a town hall
06:31and nut this out in a respectful way?
06:33The risk you'd have of having a town hall meeting now
06:36is that conflict that had arisen over social media
06:38would emerge at the town hall meeting.
06:41And there are people getting very angry.
06:44That's what worries me.
06:45There's incredible anger in the place.
06:51There are people getting very angry.
06:57Social media has no boundaries, you know.
06:59People are jumping on and making statements like they're absolute fact
07:04and they live on the north coast and never been to the park.
07:08Rocky Harvey runs a deli in Jindabyne.
07:11He's convinced authorities exaggerate the horse numbers
07:15to justify slaughtering them in their thousands.
07:19Would it have helped in this whole debate
07:22if there was an agreed set of facts between everyone?
07:26No doubt. No doubt.
07:28But the facts are so polar opposite, you know.
07:31Well, everybody says...
07:33Can they all be facts?
07:33No, they're facts. I can promise you they're facts.
07:36It seems like everyone in this whole argument
07:40has their own individual set of facts.
07:42Yep. That's social media though, isn't it?
07:48Maybe one thing we can all agree on
07:50is that when it comes to communication between humans,
07:55social media is now a very wild west.
08:01So, how did it all get so feral?
08:07On the 14th of June, 1995, a US senator tried to do something
08:11that no one had ever done before.
08:16Regulate the internet.
08:19I invite some of you to take a look at what is on the internet system today.
08:23We have not only the right, but the responsibility to protect our children.
08:29Senator James Exon was deeply concerned
08:32about the terrible things young people might see on the internet.
08:35He wanted to protect them.
08:36So, he introduced the Communications Decency Act to Congress.
08:44The senator successfully got his bill through Congress,
08:47but with a clause that's been described
08:50as the 26 words that created the internet.
09:00These words effectively freed the online platforms
09:03from liability around what they were publishing.
09:06They were just providing a space for free speech.
09:10This 1996 law gave social media companies
09:15a total immunity shield.
09:17They're not responsible for any consequence,
09:19let alone whether it's harmful content.
09:21Section 230 helped create the internet as we know it.
09:25It has helped new ideas get built
09:27and our companies to spread American values around the world
09:30and we should maintain this advantage.
09:33Section 230 has helped make US tech companies so powerful
09:37they're difficult to regulate in Australia
09:39or anywhere else for that matter.
09:42Social media has placed fact and fiction on a level playing field.
09:47Who would know whether something has emerged from a credible source
09:50or just from a conspiracy theory?
09:52You might think, well, there's professional fact-checkers for that.
09:56But in this new digital reality, they're gone too.
10:00We're going to get rid of fact-checkers
10:01and replace them with community notes.
10:03The fact-checkers have just been too politically biased
10:06and have destroyed more trust than they've created.
10:08And all of that means that it's the users of social media
10:13left with the job of figuring out what's true and what's false.
10:17That means you.
10:19So how do you reckon you're going with that?
10:23A study by four leading universities has found that 97% of Australians
10:29struggle to fact-check online information.
10:31Do you know what the definition of misinformation is?
10:35Um, putting out information that's completely false,
10:37knowingly knowing it's false.
10:39So that's actually disinformation?
10:41Oh, is it? Oh.
10:43Do you know what the definition of disinformation is?
10:47Um, no, I don't.
10:49I think it's just outright false information.
10:52I'm not sure.
10:54Disinformation looks accurate
10:56but is deliberately delivered to you to deceive you.
10:59So, to be clear,
11:01misinformation is false information
11:03but the person sharing it might believe it's true.
11:08Disinformation is malicious.
11:10False information that's designed and shared to deceive you.
11:15Both catch us out online more than we think.
11:19I don't think I'm ever tricked by false information,
11:22but how would I know if I'm being tricked by false information?
11:28With the help of ten intrepid participants
11:30and two cognitive psychologists,
11:33we're about to put Australia's online lie detector skills to the test.
11:37We have been aware for well over a hundred years
11:40that perhaps we're not perfect in our perceptions,
11:44that we are vulnerable to being tricked.
11:48Over the course of the series,
11:50our psychologists will put these participants through a range of tests,
11:54revealing how our brains decipher fact from fiction.
11:58Your task is pretty simple.
12:01There will be three different pictures that come up on the page
12:04and I just want you to respond based on what you see.
12:10You might be tempted to try and guess what the correct answer is,
12:14but what we're interested in for this one is what you see.
12:21I was a detective for 22 years.
12:24I just had the mantra I'd say to myself that not all is as it appears.
12:42Okay.
12:44Oh, fail.
12:45We all did.
12:47So, what you just saw was a bunch of what we would call in psychology visual illusions.
12:55They're particular scenarios that trick our brains into seeing things not as they are.
13:03Our first one, these are exactly the same size.
13:08What this one is telling us about how our brains work is that what we see is not exactly what's
13:15out there in the world,
13:17but it's influenced by context.
13:20This example is an example of visual context,
13:23but all sorts of things can influence how we interpret the world.
13:28The next one asks which line is bigger.
13:31The typical answer is the top line,
13:33but again, it's the context that is changing what we see here.
13:38These are exactly the same length.
13:41Oh, yeah, I didn't think like that.
13:43And then our last one.
13:46There's actually no triangles or circles in there.
13:49There's just contours that are set up to make it look like there is.
13:54But this is actually functional.
13:56Our brains trick us in this way for really important reasons.
13:59When we're looking at the world around us, like a tree obscuring a river,
14:03we still want to know it's the same river.
14:04So it's the thing that makes us see continuity there.
14:07You're experiencing this kind of filling in information every day as you look around.
14:14The brain has developed to be quite clever in navigating its environment efficiently.
14:20But those conditions that we face in the modern information environment are very different to the conditions in which we
14:26evolved.
14:27And so we're seeing it being tricked and harnessed in very unusual and problematic ways that are driving some of
14:34these problems around misinformation.
14:39Big tech platforms have tricks to exploit our psychology too.
14:44It's something in the computer. I don't know how it gets there.
14:48Their number one tool is the algorithm.
14:51Allerhythm? Alg...algorithm?
14:54I put my teeth back in.
14:59It learns what we click on and keeps feeding us variations of that content so that we stay engaged.
15:05When you're leaving your digital footprint online, you're letting the internet know about your gender, your age, your political preferences.
15:14And it's using that information to tailor what's coming your way.
15:19An algorithm is outsourcing the human choice into a machine's decision.
15:25If you...
15:27We've created these supercomputers pointed at people's brains
15:33that's trying to get them to do the next thing.
15:35If you say the truth...
15:37This race to the bottom of the brainstem for attention pulls out the worst in society.
15:48Algorithms rely on emotional content to get our attention, especially anger and outrage.
15:58I have a lot of sympathy for people who see a horse and have a really emotional reaction.
16:04But the issue is we've got species here that are found nowhere else on Earth.
16:09Jack Gough is the CEO of the Invasive Species Council.
16:13He's fighting to get feral horses out of the park.
16:17Do you have any sense of how many Brumbies there are today?
16:21We rely on what the scientists have said.
16:24I mean, helicopter aerial distance sampling suggests that there's somewhere between
16:27about 2,500 and about 5,500 horses.
16:31Obviously, it's pretty difficult to get a precise number.
16:36How wrong were the numbers in your mind?
16:38Oh, probably about 80% wrong.
16:40We ride up there up to five days a week.
16:43We know how many horses were there.
16:45It's just so polarised, the Brumby world, you know.
16:48You've got some real hardcore pro Brumbies who were underestimating the numbers
16:53and the other end of the spectrum where, you know, you've got to exterminate them all.
16:58This has been an information war, absolutely.
17:01The reality of social media is that people get into their own bubbles
17:05and it becomes very difficult to talk to each other.
17:08And also there's a degree of egging each other on.
17:11It's sort of, they're no longer about the facts on the ground.
17:16When it comes to herbivores causing damage, people don't intrinsically understand that.
17:22And so it's very easy to paint a picture of killing a horse as an evil act.
17:30Those things go to people's deep worldviews
17:32and it seems to then pull out a level of aggression when they feel threatened.
17:36Sadly, the online space has allowed those people to escalate in their rhetoric
17:41and that's actually then translated offline.
17:44Groups, the intimidating National Parks staff,
17:48or sending threats to firebomb National Parks officers.
17:52One of the greatest risks that I see in the organisation
17:55is personal safety of some of our staff.
17:58Rocky Harvey isn't accused of making such threats, but the discovery of dead horses did inspire him to act.
18:06Snowy Plain, they shot 69 horses and left the carcasses in a really pristine environment.
18:13And that, to me, pushed me over the edge.
18:22So then I sort of took a horse head and threw it in the doors at the National Parks office
18:28and filmed myself doing it.
18:32I had a clear message with my post online.
18:37I said, nobody is cleaning up at Snowy Plains, they need to get this shit sorted out.
18:50I reckon what Rocky Harvey did was the best thing that's happened for years.
18:54I loved every bit of it. It's challenging authority.
18:57You know, there was a bit of human involvement. It was related to the Godfather.
19:02The National Parks and Wildlife Service deserved everything they got out of that.
19:06Some of them got personal threats.
19:08Who?
19:08National Parks workers.
19:10Well, why wouldn't they? I mean, they bloody well deserve it.
19:12We can have disagreements over things.
19:15They don't need to escalate into threatening and violent behaviour.
19:19I mean, you can't lay down every time somebody confronts you with something you don't like and say,
19:24put the white flag up.
19:26No, to hell with it.
19:28Do you regret doing it?
19:29No, not at all.
19:30I probably would have done a whole horse, but it was just a bit heavier.
19:36Look, I see it as a clash of world views that's going on here.
19:41Unfortunately, what we tend to see is facts stop mattering.
19:45Why do you think anyone would want to lie about this stuff?
19:49They don't want us in the mountains at all.
19:51Why would the scientists be dishonest?
19:55Follow the money.
19:57What do you mean?
19:59I think you can get scientists to sort of produce stuff that's in your favour if you encourage it that
20:06way.
20:07You know, when you've got groups saying, we know for certain and we've brought out this expert who said there's
20:13only 900 horses in the park and then the New South Wales government is able to remove 9,000 horses
20:20over the next 12 months.
20:22You'd have to say that that's approaching a lie.
20:25Everybody tells lies.
20:26I mean, and they do.
20:28That's a fact.
20:28Have you told lies about this stuff?
20:30I don't need to because I'm on the truthful side.
20:32I don't need to tell lies.
20:33But everyone says that, right?
20:34Well, I can justify everything I said.
20:37You don't need to tell lies.
20:39I mean, the facts are there.
20:40There's a lot of information that is disinformation in the entire social media platform.
20:48That's what's changed the world.
20:53So, look, I know I'm talking to you about where I come from, but I reckon there's a chance that
20:58some of this feels familiar.
21:00There are versions of these arguments going on in communities all over the country, all over the world.
21:05Where you live, it might be about wind turbines or housing developments or maybe even immigration.
21:11And the thing is, none of this is happening by coincidence.
21:21Algorithms have created an online world where facts and truth are receding.
21:27In the public information ecosystem, lies spread six times faster than facts.
21:33When you don't have facts, you can't have truth.
21:37Without truth, you can't have trust.
21:40Without these three, we have no shared reality.
21:43We can't begin to solve any problem.
21:49Once trust is fractured, a community might then struggle to believe anything they're told by media, by science, by government.
21:57And this creates a perfect target for disinformation.
22:10On the last day of 2019, I was among the thousands of Australians on holiday, caught in the black summer
22:17bushfires.
22:19While some people fled towns by road, for thousands of others, it was too late to leave.
22:25Oh my God!
22:26Whoa!
22:27They gathered on beaches for safety.
22:29An unprecedented emergency situation.
22:32It's like a war zone.
22:37Millions of hectares were burnt across multiple states.
22:40Right now, we've got just under 100 fires burning.
22:43That's uncharted territory for here in New South Wales.
22:46In your car and evacuate!
22:48Immediately!
22:51I reported from the New South Wales coast through January as the fires spread.
22:56And now we're facing multiple new significant fire fronts.
22:59It's felt like night all day.
23:01There's not a lot left.
23:02Nothing.
23:03We've got nothing.
23:06Emergency workers were overwhelmed.
23:09I haven't seen anything like this in my lifetime.
23:11My house is burning down!
23:14What are you doing?
23:16Nothing!
23:17People were traumatised.
23:19Communities were confused.
23:21And angry.
23:23They demanded answers.
23:25How does this happen?
23:27How can we stop it?
23:28The facts are that Australia is burning,
23:31while we turn a blind eye to the driving force,
23:35which is climate change and a warming planet.
23:38The world watched as the fires chewed through the landscape.
23:42And as that happened,
23:43something started to trend on social media globally.
23:47Hashtag arson emergency.
23:51It was messaging that the majority of the bushfires
23:54had been caused by arsonists.
24:00By January 7th, the arson emergency narrative
24:03was being repeated around the world.
24:07Australia has an arson problem.
24:09You can't pin on global warming, climate change,
24:12or whatever title you're giving your environmental boogeyman.
24:15So, at this point in the season,
24:1724 people had been arrested in New South Wales
24:20on charges of arson.
24:21But only 1% of the land burned across the state during that summer
24:25could be officially attributed to arson.
24:29Lightning sparked the vast majority of the fires.
24:35Now, researchers couldn't identify the origins of the hashtag,
24:39but they found arson emergency could be a disinformation campaign
24:43designed to undermine the link between fires and climate change.
24:48There used to be something called consensus reality,
24:51that mostly we are seeing a shared set of stories,
24:55a shared set of events.
24:57But if you look at a big moment in culture,
25:00you'll see how different tribes or political groups
25:04who live in different information bubbles,
25:06when they see this story, they see specific evidence
25:10that is different than the evidence that other tribes see.
25:13Dr Timothy Graham from Queensland University of Technology
25:16analyzed more than 300 accounts
25:18using the arson emergency hashtag,
25:21and a third of them could be from bots.
25:23Not only individuals, but also media,
25:26as well as potentially malicious actors,
25:28are picking up on this content and spreading it.
25:30It spreads like wildfire.
25:32So whatever is maximally inflammatory
25:35is what the algorithms are sorting for
25:38and then making that the centre of attention
25:40for every topic that is inflammatory.
25:44And we can't even see it
25:45because it has literally been our 24-7 reality
25:48for the last 12, 15 years.
25:54Australia's not alone in this.
25:56It's all over the world when disasters happen,
25:58floods, bushfires, disinformation follows.
26:01It's entirely human that in those moments
26:03of panic and confusion,
26:05we want to find ways of understanding what's going on
26:09even if there aren't clear or obvious answers.
26:12It's that very vulnerability
26:14that disinformation attaches itself to
26:16and feasts upon.
26:23If you're feeling overwhelmed, there's a reason.
26:27On average, we're devoting more than six hours a day
26:30to the consumption of digital information.
26:34And scientists are only just discovering
26:37what this is doing to our brains.
26:43I reckon since I woke up this morning,
26:45I've probably checked my phone a couple of hundred times.
26:47Emails, text messages, newspapers, all the usual stuff.
26:52On average, it says I use the phone about four hours a day.
26:56Sorry.
26:59It says I've actually used it on average four hours a day.
27:03Sorry, just checking a message.
27:08Anyway, that's why I've come here to Melbourne University
27:11to the School of Psychological Sciences
27:13because I've been thinking
27:14maybe it's actually doing something to my brain.
27:17What's it doing to your brain?
27:22Getting nervous now.
27:23Don't get nervous. It's OK. Just come and sit here.
27:26Professor Katherine Johnson is going to test
27:28how distraction affects my comprehension.
27:31I'm taking the phone.
27:32OK. Sorry.
27:34That's all right.
27:35You might get a sense of anxiety in your heart.
27:37I'll cope.
27:38So we've designed this new experiment
27:40trying to simulate more of an everyday life situation
27:44but still test your attention control.
27:47So this kind of goes to the avalanche of information
27:49that we're all taking in constantly now.
27:52Yeah. That multitasking approach that we have to life now.
27:55So the task you'll do is a proofreading task.
27:58OK. OK.
28:00Crazy scientific experiments win prizes at leading US university.
28:06Katherine's given me a text with spelling and grammar mistakes
28:09that I'm meant to correct
28:10through their bottoms in emergencies
28:13while I'm being distracted by other prompts.
28:17Project Pigeon was called off in 1944
28:21despite a seemingly successful...
28:23If I don't do my correction fast enough...
28:27That's gone.
28:27..the prompt disappears.
28:29The botany prize was awarded for a search which found that...
28:34..which found that...
28:36OK.
28:38Real plants...
28:38I also have to press specific buttons when they light up.
28:41It feels a bit like multi-multi-tasking.
28:46Ah, I've pushed the wrong thing.
28:50Good.
28:51Well done.
28:53How did you feel you went with that?
28:56I mean, I know I made mistakes.
28:58I mean, it's interesting, isn't it?
29:00Because we're thinking about whether we have the capacity
29:04to critically assess information that's in front of us.
29:07If we're trying to consume a huge amount of information
29:11at rapid speed...
29:14You're going to make mistakes
29:15and you won't even realise you've made a mistake.
29:16So, um...
29:18So, how many mistakes did I make?
29:20Oh, sorry.
29:22Right.
29:24I'm just seeing all the errors.
29:26Oh, right.
29:26Yeah, sorry.
29:28It's not looking good.
29:30You made ten incorrect responses and you missed seven.
29:34So, is that a bad average response or...?
29:37On average, they didn't miss any words.
29:41Really?
29:42They made no mistakes?
29:43Yep.
29:44Okay.
29:45So, they got the whole thing, right?
29:47So, I'm really below par.
29:49Well, you know, it's, um...
29:51I sort of did sub-editing work as a career.
29:56But maybe I just am used to the mistakes?
29:59Oh, do you use, like, autocorrect?
30:01No.
30:02No.
30:05What I've just very clearly demonstrated
30:07is that our brains aren't wired for multitasking.
30:11Research shows it stretches us thin,
30:14reducing our IQ up to ten points
30:16and rendering us 40% less productive.
30:20Even simple tasks require the brain to transfer attention
30:24and every time that happens,
30:26we risk missing something important.
30:29Just go and look at the camera.
30:31Yep.
30:31Walk at your normal pace.
30:33I'll be shooting you from over here, so I'll cue you.
30:35So, if you just wait there...
30:36Sure. No worries.
30:38Our participants don't realise they're players
30:40in one of psychology's most instructive tests
30:43of human attention.
30:46Hey, sorry, man.
30:47Hey, bud.
30:48Can you tell me where we are?
30:50About to get filmed on something, but I'll give you the game.
30:52Oh, sorry.
30:52Where are we?
30:53Yeah.
30:53You're asking someone that doesn't know that well,
30:55but we're here.
30:56Where?
30:56Where are you trying to get to?
30:57I'm trying to get to Circular Quays.
30:59Alright.
31:00Just walk up those stairs.
31:01See that stair where those people are?
31:03Oh, yeah, yeah.
31:04Up and over and that'll take you straight.
31:05That'll take you straight.
31:12We're here.
31:13Yeah.
31:14Yeah.
31:15And then...
31:16You go up the stairs and you follow it down.
31:18Cool.
31:18George Street.
31:19Here we are.
31:20No worries.
31:25This is the bridge there.
31:27And then so we're on this pier here.
31:29How do I get to...
31:30Oh, sorry.
31:31Sorry.
31:33Um...
31:34So see those stairs over there?
31:35Yeah.
31:40Oh, this is part of it.
31:42You're tricky.
31:44I'm highly observant here.
31:46Good on you.
31:48Did I wreck it?
31:52You're my hero, Shelley.
31:55I'm highly observant.
31:57Hey, um...
31:58Could you tell me where we are at the moment?
32:00We are in Sydney.
32:01Yeah.
32:02Yeah.
32:03Just see this bridge here?
32:04That's that bridge there.
32:05Yeah.
32:05Okay.
32:05I'm looking for...
32:06Oh, sorry.
32:08Sorry.
32:12Um...
32:13Sorry.
32:13What are you looking for?
32:14Circular Quay.
32:14Circular Quay, yes.
32:16You can get there.
32:19See these stairs here?
32:21Behind you.
32:22See those stairs that go up through the middle?
32:24Oh, okay.
32:24Yeah, yeah, yeah.
32:24You want to go through those, because those stairs...
32:26Oh, wow.
32:27Yeah, that's amazing.
32:28So if you go around, that's crazy.
32:30But it's a nice walk, but...
32:32I'd go the stairs.
32:33Wow.
32:34This is something we see people being fooled by it all the time.
32:39So what attention does to our world, when we're really focused on something else,
32:45we are filtering out a bunch of other information that seems less relevant for us.
32:50And it goes back to things like visual illusions, all the cognitive biases.
32:56Our brains are looking to make sense of the world in the most efficient way possible.
33:01I didn't look.
33:02So, like, I'm not surprised at all, but it is very hard to watch when you see yourself get duped
33:09and just, like, oh, just continue on and, like, yeah, it's not a problem.
33:13It really does question your reality.
33:16How do I then know when I'm not being tricked?
33:19Or how else do I let myself down and not see what's right in front of my very eyes?
33:26How?
33:27Like, I have a completely different view now of how people feel if they've been swindled,
33:33or they've been hacked, or they've been in some way fooled online,
33:36how that can have a really adverse effect on them.
33:40I wouldn't have thought I would ever have that enlightening kind of moment.
33:45But I really do now.
33:47We're really susceptible in ways that I don't think people understand.
33:51They think that, well, if I got a PhD or went to college, I'm not vulnerable to this.
33:56But actually, these are psychological biases that are deeper than that,
34:00and that we're all vulnerable to.
34:04This vulnerability isn't a personal failing.
34:07It's how evolution has wired us.
34:10The reality is, we are so distracted, we cannot give the same amount of attention to comprehension.
34:19According to Professor Marianne Wolfe, who spent decades studying how the brain takes in information,
34:25digital distraction comes at a very high price.
34:28We are skimming and scrolling and word spotting.
34:33We know there's insufficient allocation of time to deep thinking, therefore critical thinking.
34:40But Marianne believes there is a solution within easy reach.
34:45Reading is an act of resistance in an age of distraction
34:51in which critical thinking and empathy are on the decline.
34:59We were never born to read.
35:02We were born to speak, to see, to eat, to think, but not to read.
35:08There's no one region in the brain for reading.
35:12Rather, the brain has this fascinating design principle.
35:16And the design allows it for new functions, whether it's riding a bike or reading,
35:23to create a new set of circuits and connect them in a new way.
35:30That brain can learn to read multiple writing systems.
35:34It's amazing what we can do.
35:37I feel like I'm being decorated like a Christmas tree.
35:41It's all for science.
35:43Neurological tests demonstrate that our brains are plastic.
35:48Over our lifetimes, they create new neural connections so we can keep learning and adapt to change.
35:55What's the connection between attention and our ability to analyse and our ability to actually process something?
36:02Attention is underpinning everything.
36:04Yeah.
36:05When you're needing to think about whether something is true or untrue, you're concentrating and that's attention.
36:11Yeah, okay.
36:12What scientists are now seeing is that prolonged time on devices is reshaping our neuroplasticity,
36:20with negative implications for our cognition, our emotions and our behaviours.
36:27Social media is changing how we gather information and connect with each other.
36:33Well over five billion people across the globe now use social media to better understand issues, large and small.
36:40Here's how I would survive an alien invasion.
36:43Dimp removal on my refrigerator.
36:45Online health advocates are particularly popular.
36:48I've been drinking my own urine for 17 years.
36:51Putting garlic cloves in my nose after a sinus infection.
36:54The majority have a financial interest in what they're promoting.
36:58This is day one of me ingesting borax.
37:01Borax.
37:02Borax.
37:02Borax.
37:03And less than 6% cite credible scientific evidence to back their claims.
37:08Why do I recommend turpentine cleanse for parasites?
37:13I have seen parasites jump out of someone's eye holes.
37:16Six months later, she has another pap smear.
37:19They just so happen to take it on the same spot.
37:22What's the result?
37:23Abnormal cells.
37:25What caused the abnormal cells?
37:27The first pap smear that was never allowed to heal.
37:30Barbara O'Neill is an Australian natural health advocate with hundreds of thousands of followers.
37:36I'll tell you something, Australia has never ever had so much information from me since I was banned.
37:49Barbara has been formally banned from providing any health services after she was found to have delivered misinformation to her
37:57followers.
37:57And I know many were blackmailed into having the COVID shot.
38:03The vaccines.
38:04They are not safe.
38:06As well as her dangerous advice on vaccinations, Barbara has promoted debunked cancer treatments and advocated against the use of
38:15sunscreen and antibiotics.
38:16And that's what we use.
38:20Common sense.
38:21So I really can't understand why they claim I'm a threat to public safety.
38:26Today, Barbara's supporters have come together in a Sydney park directly opposite the Health Care Complaints Commission, which made the
38:34decision to shut Barbara down.
38:36So I've never protested before in my life.
38:38This is my first time ever being front line and making a stand for something that I truly believe in.
38:44If she's dangerous, I choose to be in the danger zone.
38:47You're a gift from God.
38:49You're a blessing.
38:50Thank you so much.
38:51Have you got a pen?
38:52I do.
38:53I have a texter.
38:54I feel like I'm shaking.
38:58Barbara has a loyal following.
39:00An audience particularly receptive to the idea that governments can't be trusted when it comes to our health.
39:06What I do is I teach people how to look after their bodies.
39:10I just wanted to say thanks to you that you cured my stage two cancer.
39:14Oh, that is so nice to hear.
39:16It really is based on our God-given right to choose.
39:20A wonderful teacher.
39:21That is good.
39:22It's my right to choose actually what I do and what I do not do to my body.
39:27So I see one of my main roles is just to give information.
39:31One lady told me that she totally conquered her breast cancer by just using the castor oil compresses.
39:37Australian authorities say Barbara poses a risk to the health and safety of the public.
39:42But like so many unregistered wellness promoters, her business is online, where Australia's regulatory bodies can't touch her.
39:51Please don't have a pap smear.
39:53Because they are contributing.
39:56Every time you have a pap smear, you're damaging the cervix.
39:59Barbara's legion of followers spans the globe.
40:03According to the Health Care Complaints Commission, she's misled vulnerable people with health misinformation.
40:09The great cholesterol con, the great cholesterol lie, the great cholesterol hoax, the great cholesterol deception.
40:15But here is where Barbara's story gets really interesting.
40:20Barbara's voice and image have been hijacked to create disinformation.
40:24She's now all over the internet, offering advice and products that she would never endorse.
40:31How much of what's out there is really you?
40:35And how much of it is fake you?
40:38Well...
40:38Most of it's fake.
40:39Most of it's fake?
40:40Most of it's fake.
40:41I want to show you some, if that's alright.
40:43Because I've had a look and I kind of can't tell the difference.
40:46Be very interested to have a look.
40:47Can we watch some and see?
40:48Absolutely.
40:49Okay.
40:50I'll be talking about this incredible body that we live in and how you can give it the right conditions
40:56to bring about a healing response.
40:58You'll be hearing from about 35 world renowned speakers on health.
41:03So is this you?
41:04This is one of the must attend health events of the year.
41:07Yeah it is.
41:07And put in...
41:08And that was a real video?
41:09Barbara.
41:10Okay.
41:11So a billion dollar industry was created...
41:13Oh yeah.
41:14That's me.
41:14That's you?
41:15The Bible says God gave herbs for the service of men.
41:18They work with the needs of the body.
41:20And this is truth.
41:24Yep.
41:25Cool.
41:26Hello friends, welcome back.
41:27A cancer expert pointed to a common food that may make cancer worse.
41:31We're going to reveal that food.
41:33We'll also talk about daily habits and patterns.
41:36Leave me a little note in the comments below.
41:39That's definitely not Barbara.
41:39It always warms my heart to see how far this message passed.
41:42Sounds like you.
41:43Looks like you.
41:43Yeah.
41:44A bit weird seeing this.
41:46It's crazy.
41:48First, you are going to wash a red onion well and cut it with the skin on.
41:54And cut it into squares and boil a cup of water.
41:56The voice is not me but that's me in the background.
41:58Serve it in a cup and drink.
42:00You can drink it on an empty stomach.
42:02That's definitely not Barbara.
42:04So you do talk about cleaning the uterus?
42:07Yes.
42:09But with some herbs.
42:12So that's not the right recipe?
42:14Yeah, no, no.
42:15I don't use an onion to clean out uteruses.
42:17Okay.
42:19If even Barbara is struggling to recognise true and phony versions of Barbara, what chance
42:25do her followers have?
42:26Do you see Barbara's stuff on social media?
42:29Oh, everywhere.
42:30Everywhere.
42:30I see her stuff on Facebook, Instagram and YouTube.
42:34And some of her stuff's been manipulated using AI?
42:37Yes.
42:37Yes.
42:38Definitely.
42:38Have you seen that?
42:39I can actually tell the difference now.
42:41I can actually tell when it's been AI, just the way she's speaking.
42:46So you feel confident that you know the difference?
42:48I do personally, but that's maybe because I've been following her for quite a while.
42:52Can I show you some and see how you go?
42:54Yeah.
42:57What do you reckon is that?
43:00Real or fake, Barbara?
43:02I think that's real.
43:03That's definitely it.
43:05Okay.
43:06How about this one?
43:08No.
43:08Ooh.
43:11Ooh.
43:14That looks like her.
43:15In one of her group meetings.
43:18Lectures.
43:18Yes.
43:19I would take that as yes.
43:21Okay.
43:22With the onion, the honey.
43:23Yeah.
43:23I had onions in my socks last week when I was unwell.
43:26Really?
43:27Yeah.
43:27What did the onions do in your socks?
43:28Oh, haven't you heard this one?
43:30No.
43:30They draw out all the sickness, basically.
43:35The impurities.
43:36Like, literally, my feet were wet.
43:38Like, the onions had sweated.
43:40Okay.
43:41My sickness went out.
43:43Okay.
43:43So you reckon it worked?
43:45It works.
43:45If Barb says it works, it works.
43:48So just on the uterus cleaning video, I need to be clear, Barbara says that's a fake.
43:54She does say that's a fake.
43:55So she does talk about uterus cleaning.
43:57Yeah, but not like that.
43:58But not using onions.
43:59See, that's so dangerous.
44:00I think they took information from a separate video and put it in that one.
44:04But does that worry you?
44:06It does.
44:06What's real?
44:07Like, pushing through the fog.
44:08And you feel like you know what's real and what's fake?
44:10I certainly don't.
44:11But I think I'm learning.
44:14I think when you listen to these stories, you can start to understand how these layers and
44:20layers of stories, some of them factual, some of them fictional, some of them based on
44:28pseudoscience, start to enmesh and get lost in themselves.
44:32It kind of feels a little bit like being caught in a riddle.
44:35And you can also hear people's vulnerabilities and how easy it is to become kind of caught
44:41up in that.
44:42What is fact?
44:43What is real?
44:44What is truth?
44:47And now it's even difficult to know whether what you're seeing is the real thing.
44:54Green Barbara again!
44:56Green Barbara again!
44:58Green Barbara again!
45:00Truth is fragile.
45:02And truth has been, if you will, captured, kidnapped by those who will say, this is the truth.
45:12All women, and especially all modern women.
45:16People have been so inundated with information that what we actually end up doing is reading less
45:27of what lies outside our silos of information.
45:33This is a problem.
45:34And then if we're only going into that silo, we don't use critical thinking as much.
45:42We become vulnerable to misinformation and we are extremely susceptible to disinformation.
45:50To the system, not the people.
45:52We really are seeing decreases in all the deep reading skills.
45:57Especially empathy and critical thinking.
46:02The act of reading gives us a chance to think differently than our own perspective.
46:12That allows us to leave the egocentrism that we all have and really connect to others.
46:25That, however, is only the beginning of what has to happen, which is critical thinking.
46:33And what the brain does, it's a wild dance, where you are saying, yes, no, yes.
46:40Finally you come to an agreement that this is the truth.
46:44Or this is falsehood.
46:46Or something in between.
46:48But that is critical thinking.
46:52And it takes time.
46:56One of the questions that I'm most concerned about is the role of empathy in reading.
47:04And whether that is changing.
47:06And whether, if it changes, it will affect our society.
47:10Including our democracy.
47:15One of the hardest things for me of late is to see the polarization that exists in my country and
47:25around the world.
47:26Democracy is cringe.
47:28Which do you think are more dangerous, drag shows or guns to children?
47:32Drag shows.
47:33Because I think people who come here illegally should be deported.
47:37The trend towards seeing others as sinister or as enemy is the greatest threat to our democracy.
47:48So here we have a connection between reading, empathy and democracy.
47:57Next time, I follow disinformation across the globe.
48:01There might be people here hiding from us.
48:03And discover the damage it's doing to democracy.
48:06The chief executive of one of the Philippines' most popular news sites has been arrested.
48:12I meet people actually creating disinformation.
48:18And people exposing it.
48:20Companies will behave in ways that are nefarious.
48:24We have an administration that's pushing our democracy to the limits, aiming to break it.
48:33Like what you just watched?
48:35Head over to ABC iview now to stream all episodes of The Matter of Facts.
48:39See you next time.