Transcript
00:00This is Crime Night, where we look at crime through the ideas, science and psychology
00:28that shape it.
00:29Tonight we're exploring scams, or are we?
00:33We are.
00:34We'll look at where they come from, why they work and how even the smartest people can
00:38be taken for a ride.
00:39But before we hand over our address and mother's maiden name, let's meet the panel.
00:43She holds three degrees from three countries and yes, all from actual universities.
00:48I can only imagine what her HECS debt looks like.
00:51Please welcome Professor Danielle Reynald.
00:58I mean honestly, three degrees is very impressive.
01:02Is your collection complete?
01:04Nah, I doubt it.
01:05If somebody paid me, I would totally study another degree.
01:08What would that be?
01:09Probably environmental science to start with.
01:11Oh, correct.
01:12We need more of those, because it's pretty bad out there.
01:16He's a consulting criminologist and expert in fraud and cyber crime.
01:22Check out your browser history and shred your receipts.
01:24It's Dr David Bartlett.
01:26Now David, does anything still surprise you when it comes to white collar crime?
01:36If you asked me that question 12 months ago, I'd say nothing surprises me anymore.
01:41But in the last 12 months with generative AI and some of the things we've seen there,
01:45yeah, I get a little surprised.
01:46Oh, I'm getting a little bit terrified.
01:48Alright, jeez.
01:49Up next, she studied law, trained in journalism, then realised she'd rather do something useful
01:54and took up comedy.
01:55The justice system's loss is very much our gain.
01:58It's Steph Tisdall.
02:05Steph, I'm wondering what kind of lawyer you think you might have made.
02:08I wanted to be a human rights lawyer.
02:10Oh.
02:11And then I met more humans and went, nah, no wrong.
02:17And finally, they're not just a funny person, they're a skilled negotiator,
02:20once haggling their way through a mugging by offering a pack of cigarettes.
02:24Please welcome Rhys Nicholson.
02:31Did that really happen?
02:32Did you?
02:33Yeah, it was before marriage equality had happened and this guy was mugging me
02:37and I haggled and I gave him a pack of cigarettes and I was like,
02:40I'm just going to cancel my bank card so you don't need that.
02:43I gave him some cash.
02:44And then as he was walking away, and I'm not making this up,
02:47he turned around and said, by the way, my sister's a lesbian
02:49and I think it's really sad you guys don't have gay marriage.
02:52Have a good night.
02:54And that's how I met my husband.
02:59Scams aren't rare, they're relentless.
03:02Every week, there's a new one in the headlines.
03:04Here are just a few that made the news.
03:06Investment scams.
03:07Obituary pirate scams.
03:08Crypto scams.
03:09Blessings scams.
03:10Kidnapping scams.
03:11Recruitment scams.
03:12Scams.
03:13Scams.
03:14And we can't go without mentioning perhaps the biggest scam of all,
03:18duping millions of Australians every year.
03:20I really do believe in love.
03:23Sweet Jesus, how are they still trying to convince us that it's a social experiment?
03:28Last year alone, scammers drained nearly $1.5 trillion globally.
03:35Australians themselves lost a staggering $2 billion to scammers in 2024.
03:40That was the quote to rebuild the Gabba for the Brisbane Olympics, which also turned out
03:45to be a bit of a scam.
03:46And tonight, we're zeroing in on one of the most financially devastating of the lot.
03:51Phishing scams.
03:52I'll let the Australian Federal Police explain what they are.
03:55Phishing is an attempt by cyber criminals to trick victims into giving them personal information
03:59or downloading malware.
04:01To do this, they often send messages pretending to be from companies you know and trust
04:05and direct you to click links or reply with sensitive information.
04:09That's very helpful information that everyone should know, but I don't think the message
04:13is really getting out there.
04:14Phishing is exactly like, well, fishing.
04:19You cast the nets wide, but instead of catching dinner, scammers are hoping to reel in your details.
04:24David, are phishing scams increasing or are we just hearing more about them?
04:28Actually both.
04:29They are increasing and we are hearing more about them.
04:32But what we know is that these types of scams are still really under-reported.
04:35And there's a whole range of reasons why people don't report.
04:38So it might be, for example, that they think it's just a small amount of money and not
04:42worth reporting.
04:43But it might be that it's a romance scam and they're actually embarrassed to admit that
04:47they'd sort of fallen victim to this.
04:48And this falls into what we call the dark figure of crime.
04:51And that's the proportion of crime we know that sort of exists but doesn't get reported.
04:56So that $2 billion really is the tip of the iceberg.
04:59That's what I was going to say.
05:00Is that $1.5 trillion worldwide based on just what we know?
05:05Absolutely.
05:06That's not a prediction.
05:08That is insane.
05:10So as criminologists we work off of data, right?
05:15Like the data on scams is not thorough.
05:18And it's because of that dark figure.
05:20There's so much about it that we just don't know.
05:22Rhys, how often are you contacted by scammers?
05:25I reckon quite a bit.
05:26Like a lot.
05:27Like I feel like every third text at the moment seems to be a phishing scam or my family.
05:33And it's really the same.
05:34So you've noticed an increase?
05:37Yeah, I reckon since the Qantas hack.
05:40I genuinely think since the Qantas hack it all kind of kicked off and I'm getting constant
05:45text all the time and emails and never a phone call though.
05:48Just pick up the phone, you know?
05:49Well that's actually completely predictable.
05:53So hackers, what they're interested in is the data.
05:56And your data has probably ended up on the dark web and been sold to hackers who are now
06:00using that information, yours and everyone else that's been hacked, to try and execute
06:05a phishing scam on you.
06:06Because I keep getting the, like, the job offer ones.
06:09Like, is anyone, you know, the ones where they're kind of like, hi, and they're always
06:12the wrong name.
06:13It's like, hi Martine, I'm from blah, blah, blah.
06:15And you can make $1,000 a minute if you come and sit in this box.
06:20Yeah, that's actually a bit different.
06:22That's probably part of a money laundering syndicate.
06:26They want to get you in to be a money mule.
06:28So what they, they've got all this money that they need to launder and basically they want
06:33to use your, your bank account probably to do that.
06:36And I have a follow up question.
06:38How would you get out of one of them?
06:42Move on Julia.
06:46I have a question for you.
06:47With this reporting thing, right, are there like squads that are set up for scams?
06:51I'm not even joking.
06:52No.
06:53It's a great idea.
06:54Most police services in Australia have a sort of fraud section.
06:57Some of those are very specific in relation to scams.
07:00But in Australia, it's actually the Australian Competition and Consumer Commission, the ACCC,
07:04that does a lot of the work around scams.
07:06Hmm.
07:07And the interesting thing is that in 2025, reporting is down 25%.
07:11The amount of money lost is actually up 28%.
07:15And that suggests that scammers are actually becoming more successful.
07:18Hmm.
07:19Over 100 million spam emails are sent every minute.
07:22Not only this, a third of your phone calls are now scam related.
07:26Like everyone here, I've received bogus text messages requesting payment from toll company
07:31Linkt.
07:32I've had fake packages delivered by fake Australia Post.
07:35But before dodgy texts and crypto bros, there was this guy.
07:39It's the oldest rip off on the internet.
07:41An email from a Nigerian prince promising millions of dollars in exchange for a small fee.
07:46Oh, you're laughing now.
07:50The father of phishing scams.
07:52A fake royal with fake problems and a fake fortune.
07:55But the victims?
07:57Very real.
07:58The scam first appeared in the 1980s, often run by mail and fax.
08:02I mean, imagine scamming someone by fax.
08:05So noisy.
08:06Once email took off, so did the scam, which amazingly is still alive today
08:11and is still raking in thousands of dollars a year.
08:14And with the rise of social media, scams are surging.
08:17Scammers can slide into message threads, into your DMs.
08:20They can pretend to be your long lost friend, a distant relative,
08:24even a dead celebrity.
08:26Here's Michael Jackson, who's keen to make a comeback
08:28and trying to get his finances in order.
08:31Hey, it's Michael Jackson.
08:34I'm not really dead.
08:35Can you Cash App me $600 so I can come back to the United States
08:39and put out some more music?
08:41The original Smooth Criminal.
08:43The original Smooth Criminal.
08:53But Rhys, what kind of phishing scam are you likely to fall for?
08:57Oh, 100% romance scams.
08:59No.
09:00Totally.
09:01I reckon if I wasn't married, I'd be one of those people at the airport
09:03with a bunch of flowers saying,
09:04he said he'd be here.
09:06Even when I hear you talk about the dark figure of crime,
09:09I'm like, ooh, what's his vibe?
09:12Danielle, how many people are these scams catching out?
09:16Well, these phishing scams aren't targeted, right?
09:19They're literally phishing for victims.
09:21So if only 1% of those scam emails and texts successfully result in a victim,
09:27the scammers are winning, you know?
09:29You think about that Qantas hack.
09:31There were, like, 6 million, approximately,
09:34people whose information was hacked.
09:36If the hackers are successful at scamming 60,000 people,
09:40they are victorious.
09:42The scammers also adapt their methods.
09:45So there's a whole lot of work that goes into trying to prevent
09:48these phishing emails getting through the system to their victims.
09:52But the scammers are working to get around it.
09:54So there's, like, spam filters, for example.
09:56And the idea is that the spam filter will search on keywords
09:59and then just send that message automatically to junk.
10:02So the scammers get around that by putting deliberate typos in.
10:05But not only that, coming back to Danielle's point,
10:07they're looking for that 1% of people who will actually fall for the scam.
10:12Now, someone who doesn't detect the spelling error
10:14might be more likely to actually part with money.
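To make the trick David describes concrete, here is a minimal sketch of keyword-based filtering, assuming a hypothetical keyword list; real spam filters score many statistical signals, but the evasion works the same way: a deliberate typo defeats an exact keyword match.

    import re

    # Hypothetical keyword list -- real filters use statistical models,
    # but the evasion idea is identical.
    SPAM_KEYWORDS = {"winner", "urgent", "verify", "password"}

    def is_flagged(message: str) -> bool:
        # Send the message to junk if any keyword appears exactly.
        words = re.findall(r"[a-z]+", message.lower())
        return any(word in SPAM_KEYWORDS for word in words)

    print(is_flagged("URGENT: verify your account now"))    # True  -> junk folder
    print(is_flagged("URG3NT: ver1fy your acc0unt now"))    # False -> the typos slip through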
10:16You might be thinking, I'd never fall for that.
10:19I'm smarter than that.
10:21Well, let's put that to the test in our experiment of the week.
10:30Now, we set up our very own harmless phishing scam
10:34to see just how many of you would take the bait.
10:37Our lure, a QR code,
10:39and the promise of a gift to remember your time here tonight.
10:42Because nothing says, trust me, like free stuff.
10:45This is quishing, which sounds like something you should keep to yourself,
10:48but it's actually a phishing scam.
10:50Earlier this evening, our audience was relaxing in the foyer
10:53when they were presented with this enticing poster.
10:56To receive their gift, they had to scan the code,
11:00which directed them to an official-looking United Parcel Post website,
11:05where they were invited to enter their details.
11:08Anyone feeling a little nervous right now?
11:11Now, before we reveal the results,
11:15we want to point out that this experiment was designed
11:17to dupe our audience like many scams out there.
11:20We wanted to test their trust in an institution,
11:22like the ABC and my face,
11:25and use that trust to access their personal details.
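A side note on why the lure is so cheap to build: a QR code is just a machine-readable wrapper around arbitrary data, usually a URL, and nothing in the printed square tells you where it points. A minimal sketch using Python's qrcode library, with an invented domain:

    # A QR code encodes whatever URL you give it, and a phishing code looks
    # exactly as trustworthy on a poster as a legitimate one.
    # Requires: pip install "qrcode[pil]". The domain below is made up.
    import qrcode

    img = qrcode.make("https://united-parcel-post.example/mystery-gift")
    img.save("foyer_poster_code.png")  # phones will happily follow wherever this points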
11:28All right.
11:29Studio audience, let's see how many of you scan the code.
11:3262%.
11:34That's a pretty alarming figure.
11:36You're all keen to quish.
11:38But how many of you really wanted that mystery gift enough
11:41to add your details into a United Parcel Post website?
11:45Apparently 36% of you.
11:48Now, Cy Y, hello.
11:51You were one of the people that shared your details.
11:54Did you at any point think that this might be a scam?
11:56Well, I trust the ABC.
11:57Of course you do, me also.
11:59I trust you, Julie.
12:00That's correct.
12:03It's possible for any of us to fall for a scam.
12:06Have you ever fallen for a scam before?
12:08Oh, yeah.
12:09Like, it can happen.
12:10I'm a chronic online shopper.
12:12So, it was starting to twig when I was like,
12:14I don't know this delivery service.
12:17Because I'm quite familiar with the delivery service.
12:21And that's good.
12:22You're figuring out the things, your limits around it.
12:24That's excellent.
12:25Are you ever freaked out by what you might be scammed on?
12:28It's just something you sort of have to be aware of,
12:30but you can't be shut off to everything.
12:32And I love free shit.
12:33And is that not the moral of the story?
12:37I love free shit.
12:38Thank you so much, Cywise.
12:40Thank you for scanning.
12:46The most dangerous scams aren't the ones promising millions.
12:50They're the ones that feel personal.
12:52Scammers have graduated to a masterclass in psychological manipulation.
12:56Let's take a look at the evolution of one of the most insidious phishing scams
13:00over the past few years.
13:01It's been dubbed the Hi Mum scam.
13:03Parents are being targeted in a new text message scam.
13:07Authorities say the scammers are contacting parents claiming to be their children in an emergency.
13:13The scam texts were usually vague enough that parents would assume they were real.
13:17The child needs money for a new phone or their card has just been declined.
13:21They were basic.
13:22It was a simple text message that caught out Jen Irishu.
13:27Well, it started with, hi mum, it's me.
13:30This is my new number.
13:31You can save this one and delete the other.
13:34Believable.
13:35Believable.
13:36A message from who she thought was her son, which has now cost her more than $3,000.
13:43What started as vague text messages has evolved into something far scarier.
13:48Scammers can now clone voices using AI.
13:51All it takes is a few seconds of audio, often taken from social media,
13:55and suddenly they can sound just like your child.
13:58Unsurprisingly, it gets worse.
14:01I received a phone call from my daughter and she keeps crying and sobbing.
14:05She's like, mom, help me, help me.
14:07Mom, these bad men have me.
14:08They have me.
14:09And this man takes over and he tells me, listen here, I have your daughter.
14:13If you call anybody, if you call the police, I'm going to pop her stomach so full of drugs
14:18and then drop her for dead in Mexico and you'll never see your daughter again.
14:21I never had any doubt that it wasn't my daughter.
14:23It wasn't just her voice.
14:24It was her personality.
14:26It was the way she talked to me.
14:27It's the way she'd cry and sob.
14:28It was her replicated.
14:30A terrifying situation.
14:32David, what makes these techniques so effective?
14:34As a parent, that is frightening.
14:37Scammers really play on our emotions.
14:40That particular scam has combined two sets of emotions.
14:43One is the time urgency.
14:45So, you know, they need it now.
14:47And the other thing is the emotion of being a parent.
14:50And those two together are really, really powerful.
14:53My mum got hi-mummed from me.
14:57And she got a message saying, hi, mum.
14:59It's my new phone.
15:00The other one's broken.
15:02Can you send me some money so I can get a new phone?
15:06And she said, oh, well, we literally live five minutes down the road from each other.
15:10Codependence.
15:11And it just happened to be that I had seen something like on social media about the hi-mum thing.
15:18And I just called her probably to borrow money at the same time.
15:22She said, what are you doing?
15:25You've just been texting me.
15:26Are you getting a new phone or what?
15:28And I was like, what are you on about?
15:30And she was very convinced.
15:32And I said, mum, that's a scam.
15:33And she went, really?
15:34I didn't know that.
15:35Because, I mean, my mum's a very smart woman.
15:37It was just, why would you ever think to question it?
15:40I think that's the thing.
15:41It was really quite fortunate that you happened to have that conversation with her.
15:44Oh, yeah.
15:45To stop her engaging in that scam even further.
15:47Well, luckily she was just going to rock up to my house as well.
15:49She said, I don't like giving numbers out, you know.
15:51Like, luckily.
15:52But she was 100% convinced that that was me.
15:55That, yeah, I don't like giving numbers out.
15:57That's actually a really good strategy.
15:59Yeah, yeah, yeah.
16:00Are there certain types of people that are more likely to get scammed?
16:04Look, I think it's a bit of a myth that there are certain types of people who fall for scams.
16:08Like, oh, you have to be dumb.
16:10Or it's older people because they're not as technologically savvy.
16:13There's actually some recent research from last year.
16:16And it actually showed that teenagers' susceptibility to scams was increasing.
16:21One in four of them fell victim to these social media scams.
16:26And they were most vulnerable to these buying and selling scams.
16:29So, you know, like on Marketplace, for example, you say, oh, I really like that bicycle.
16:34I'll send you $500 and you send them the bicycle.
16:37They would send the money and not get the product that they paid for.
16:40So those are the types of scams that young people are falling victim to.
16:44Well, ultimately, I think anyone can be scammed.
16:46If it's the right scam delivered to you at the right time, when you're a bit vulnerable to that type of scam or you're busy and you're not paying attention, ultimately I think anyone can be scammed.
16:58Is there a particular time of day where people might get scammed more than another time?
17:03Yeah, there is actually.
17:04People are most vulnerable when they're busy or when they're trying to shift gears.
17:09So, in fact, banks did some research that showed that Tuesdays at four o'clock were the time that people are most likely to get scammed.
17:16Look out, everyone.
17:17But when you think about it, it's people at four o'clock packing up from work, they're thinking about what they're going to do next, they're going home.
17:25So their guard's down a little bit and they're not quite sort of switched on.
17:29There's other research around, though, that shows that Fridays, Friday afternoons in particular, people are more susceptible.
17:35And that, again, makes sense because your mind is sort of shifting from work into the weekend.
17:40Yeah.
17:41Image and video manipulation has evolved so quickly that we're reaching a point where
17:46it's almost impossible to tell real from fake, which makes it the perfect tool for scams.
17:50Here's Lou Wall.
17:54Humans love a good trick.
17:56Magic, multilevel marketing, monogamy.
17:59The only thing we love more is falling for them.
18:02And now scammers have taken that love digital with deep fakes.
18:06AI-generated replicas of real people, from Tom Cruise and Paris Hilton eating cereal
18:11to a fake of the late Pope Francis in Balenciaga.
18:14Rest in polyester Popey.
18:16Deep fakes are advancing with every scroll.
18:19So how did AI get so good at making fake people?
18:22Well, one of the ways was Snapchat filters.
18:25Yep, those puppy ears and rainbow vomit effects weren't just for fun.
18:29Every Technicolor transformation was teaching AI to map human faces with precision.
18:34And while the machines were learning, our brains were learning to become accustomed to manipulated imagery.
18:39So these filters tricked us into teaching them how to get better at tricking us.
18:45Run.
18:46All it takes is a few minutes of footage.
18:49A TikTok, a wedding speech, a hot take.
18:51That's enough to clone you.
18:52These days it's easier to deep fake your boss than get promoted.
18:56And I should know.
18:57I made one.
19:01Relax.
19:02Obviously I outsourced it to a deep fake expert.
19:04Hello Liam.
19:05Who are you?
19:06Where are we?
19:07What do you do?
19:08We're a cyber security company.
19:09Okay.
19:10And I'm a hacker here.
19:11Do you hack for good or for evil?
19:12Exclusively for good.
19:13What are the main indicators that I should be looking for to spot a deep fake?
19:18They struggle with beards.
19:19They struggle with people that don't have any hair.
19:21Liam, you've nailed this.
19:22You need surprisingly little footage to create a deep fake.
19:24The real game changer was when things were fast enough to be able to do it in real
19:25time.
19:26We're hacking into the mainframe.
19:27Using one five-minute red carpet clip and disturbingly little time, Liam knocked
19:45up my very own petit faux Zemiro.
19:47I'm ready to be Julia.
19:49You are Julia.
19:50Oh my.
19:53That is insane.
19:54Hey, I'm Julia Z.
19:57I'm going to go scare some Fisk fans.
19:59Hey Kitty Flanagan.
20:01That's crazy, isn't it?
20:11It's fun on one level, but it's kind of extremely terrifying on the other.
20:15David, how worried should we be about getting deep faked?
20:18The question you've probably got to ask yourself is why would someone target you to make a deep
20:22fake?
20:23So what we tend to see is that it's celebrities, politicians, people like that who tend to get
20:27targeted.
20:28So it's possible, but it's probably more a case of being alert, but not necessarily alarmed
20:33at this point.
20:34AI is changing all of this as we speak.
20:37And I know that you're right.
20:38Like, you know, you really have to ask yourself, why would somebody deep fake you?
20:43But even here in Australia, there have been cases of these high school students in New South Wales and Victoria who were accused of deep faking pornographic pictures of their female classmates.
20:55So I think that this is a real problem.
20:57Can we just pretend we don't know?
20:59Can you just wipe my memory, please?
21:03It's terrifying.
21:04Yes, scammers keep getting smarter, but all is not lost.
21:07There's a glimmer of LED light in the darkness proving we're not helpless in this fight.
21:12Is it the AFP?
21:13No.
21:14People don't even watch their videos by accident.
21:17I am talking about a vigilante YouTuber with over 4 million subscribers who turns the tables
21:23on scammers and exposes their methods.
21:26Enter Jim Browning.
21:27The man is totally unknown.
21:29We can't tell you where he is or show you his face.
21:32He's helped lead authorities to fraud boiler rooms, resulting in the takedown of major operations.
21:37The scammers would like nothing more than to expose him.
21:40Can we just acknowledge the handcuff method that they've got going on there?
21:45They're just holding hands.
21:51I think I went to that wedding last summer.
21:56Browning is one of many scambaiters who expose and disrupt scam operations by hacking into their call centres,
22:02accessing their computers and finding out exactly who they are.
22:06Yeah, hi sir.
22:07My name is Carolina Fernandez.
22:09I am calling you from the Microsoft.
22:12Oh, hi Priya.
22:17Browning's huge YouTube popularity allows him to educate and protect the public against many worldwide scams.
22:23And Browning is not alone.
22:25An Australian company, Apate.ai, is also using technology to scambait scammers.
22:31Apate chatbots engage with scammers on long, fake calls.
22:35The idea being if they're chatting to a bot, they're not chatting to us.
22:38Hi sir, this is Mary speaking and I'm calling you from NBN.
22:43So we decided to upgrade your line.
22:46Right, right.
22:47So about the technician, how long do they usually take to do their work?
22:50You know, I might need to grab a cuppa while waiting.
22:52Yes, it's a quick job.
22:54Oh, that's good to hear.
22:55Just worried I might need to entertain them.
22:57Any tips on what to talk about with the technician?
22:59I could be a bit chatty, you see.
23:01Can you just hold for a minute?
23:03I will transfer my call to my sister.
23:06With people like Jim Browning and Apate on our side, there's a real chance to fight back and reclaim our digital space.
23:13So Danielle, how do we arm ourselves against scams?
23:16Well, I think a really good tip for friends and family is to use a safe word.
23:21So a safe word is a word that only you and your group know.
23:25And I think that it's a really great tip to help you in scam situations like the Hi Mum one.
23:31Really easy to prepare, but it could be really critical in a moment where you don't have a lot of time to think.
23:37And reporting helps.
23:38Yeah, if there's sort of one message, it's report, report, report.
23:41Because the more reporting that happens, the more data that we get, the more we can understand what sorts of scams are out there,
23:47who's more likely to be targeted, who's more likely to fall victim and so on.
23:51Scamwatch.gov.au is where you report scams in Australia.
23:54It's pretty much sort of like a clearinghouse.
23:56And they will send that information on, in some cases, to the police and other authorities.
24:02I love the idea of a scamming clearinghouse.
24:04Like, we've ordered too many scams!
24:06Bring the truck!
24:07Bring the trailer!
24:08Face painting for the kids!
24:09No windows!
24:10There's a whole lot of other things we can do.
24:15For example, stay vigilant around knowing how your bank, your insurance company, your utility providers all interact with you.
24:22The other thing is to keep a perspective on urgency.
24:25So, a lot of these scams really use time urgency as the emotion to sort of hook you in.
24:31And look, the reality is, in life, very little is super time urgent.
24:36So, I think just take a breath and just think for a minute.
24:39Because that will be enough to actually sort of get your cognitive processes going again.
24:43Is it better to hang up on scammers or be polite to them?
24:47We've got to remember that these people who contact us aren't always doing this willingly.
24:54So, organised crime groups engage in human trafficking to get people against their will into these call centres to make these phone calls.
25:03So, the person contacting you may not actually be the person where the money's going to.
25:07And they're a victim of crime as well.
25:09They're victims of human trafficking who are being used to try and extract money from you.
25:12And the real crime there is capitalism.
25:15100% my friend.
25:17That is the most ABC thing ever said.
25:20Actually, Rhys, I don't think they say it enough.
25:24Have you ever been on the phone with a scammer and started to get to know them?
25:28I had a friend of mine get scammed.
25:31We were pretty sure it was a romance scam.
25:33It was quite obvious.
25:34And we said to her, look, this isn't real.
25:37And to prove it to her, we found the person whose photo had been taken and asked, have you been speaking with her?
25:43And he said no.
25:45And she said, I'll verify it myself.
25:46And then tried to pick him up.
25:48So...
25:49Slightly different story.
25:52But a really good one, though.
25:56Yeah, I liked it.
25:57You can actually do a reverse image search and see if that image has popped up anywhere else on the internet.
26:01And what we know is that, particularly in romance scams, you'll see the same profile image used across multiple profiles.
26:08And that's a surefire bet that it's a scammer trying to lure you in, basically.
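Reverse image search services don't expose a simple public API, but the underlying check can be sketched with perceptual hashing, which survives resizing and recompression. A minimal illustration using the Python imagehash library; the filenames and the distance threshold are assumptions:

    # Perceptual hashes of the same photo stay close even after resizing or
    # recompression, so a small Hamming distance suggests a reused image.
    # Requires: pip install imagehash pillow. Filenames are hypothetical.
    from PIL import Image
    import imagehash

    hash_a = imagehash.phash(Image.open("dating_profile_photo.jpg"))
    hash_b = imagehash.phash(Image.open("photo_seen_elsewhere.jpg"))

    if hash_a - hash_b <= 5:  # threshold chosen for illustration
        print("Likely the same photo reused across profiles -- treat it as suspect.")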
26:13Well, I've been deep faked and the audience have been quished.
26:18Which leaves our comedian friends as the only ones who haven't been the victims of a scam tonight.
26:23But we're about to change all of that.
26:25Rhys, Steph, I'd like to invite you to play a game we're calling There's No AI In Me.
26:30In this game, we've taken some real audio clips of things you have both actually said and recreated them using a pretty basic AI voice emulator.
26:46Your job is to tell your real voice from the AI recreation.
26:51Easy?
26:52Well, we'll see.
26:53Rhys, you're up first.
26:54You're going to hear two clips of the exact same phrase.
26:57Which one is you?
26:59Hello, I'm Rhys.
27:00I'm from Australia.
27:01This is my real voice.
27:02Ridiculous.
27:03Uh...
27:04I got...
27:05Here's the thing.
27:06I got engaged a couple of years ago,
27:07to a man.
27:08Sorry, ladies.
27:09Wrong tree.
27:10Stop barking.
27:11Uh...
27:12Hello, I'm Rhys.
27:15I'm Rhys.
27:16I'm from Australia.
27:17This is my real voice.
27:18Ridiculous.
27:19Uh...
27:20I got...
27:21Here's the thing.
27:22I got engaged a couple of years ago,
27:23to a man.
27:24Sorry, ladies.
27:25Wrong tree.
27:26Stop barking.
27:27Uh...
27:28I mean, I don't sound like that.
27:30Um...
27:31I think the second one is me.
27:34And I'm basing this purely on there is a mic pop.
27:38Like, I can't tell the voice difference.
27:41But that's...
27:42What I'm saying is that's the only thing that's making me think that.
27:44Well...
27:45I cannot pick it out.
27:46Well, maybe that's the kind of thing you should be looking out for.
27:48Because it's correct.
27:49You are.
27:54That's still terrifying though.
27:56Yeah, I know.
27:57It is.
27:58It's terrifying.
27:59Yeah.
28:00Okay, Steph, you're up.
28:01Here's your first clip.
28:02I know that what we're really using AI for is, um, to make our Instagram pictures look
28:07really hot.
28:08That is bloody hard.
28:09Mm.
28:10Because my initial reaction was to go for the first one until I heard the second one.
28:13But they've done that edit to make it sound like it's on stage.
28:14Mm.
28:15I thought there was a mic pop again, like you said, on the second one.
28:16I don't know why.
28:17I think I'm gonna be wrong, but my gut is to go for the first one.
28:18Well, the AI has fooled you.
28:19It was two.
28:20Ah!
28:21Hush!
28:22No, it was not!
28:23And also, my voice sounded great, so thank you.
28:38Please give them a round of applause.
28:40Such great work.
28:50Please give a huge thank you to our guests, Rhys Nicholson and Steph Tisdall.
28:59And of course to our resident experts, Professor Danielle Reynald and Dr David Bartlett.
29:08This week on Crime Night we only scratched the surface of the science behind the scams
29:14but we've learned that a little vigilance goes a long way and how safe words aren't just
29:19for the bedroom.
29:20I'm Julia Zemiro, goodnight.