Crime Night! Season 1 Episode 2
Transcript
00:00I'm Julia Zemiro, this is Crime Night, where we look at crime through the ideas, science
00:28and psychology that shape it. Tonight we're exploring scams. Or are we? We are. We'll
00:34look at where they come from, why they work and how even the smartest people can be taken
00:38for a ride. But before we hand over our address and mother's maiden name, let's meet the panel.
00:43She holds three degrees from three countries and yes, all from actual universities. I can
00:48only imagine what her HECS debt looks like. Please welcome Professor Danielle Reynolds.
00:58I mean honestly, three degrees is very impressive. Is your collection complete?
01:03Nah, I doubt it. If somebody paid me, I would totally study another degree.
01:07What would that be? Probably environmental science to start with.
01:11Oh, correct. We need more of those because it's pretty bad out there.
01:15He's a consulting criminologist and expert in fraud and cybercrime. Clear your browser history
01:23and shred your receipts. It's Dr. David Bartlett. Now David, does anything still surprise you
01:34when it comes to white collar crime?
01:36If you asked me that question 12 months ago, I'd say nothing surprises me anymore. But in
01:41the last 12 months with generative AI and some of the things we've seen there, yeah, I get
01:45a little surprised.
01:46I'm getting a little bit terrified. Alright, geez. Up next, she studied law, trained in journalism,
01:52then realised she'd rather do something useful and took up comedy. The justice system's loss
01:57is very much our gain. It's Steph Tisdall.
01:59Steph, I'm wondering what kind of lawyer you think you might have made?
02:08I wanted to be a human rights lawyer.
02:10Ah.
02:11And then I met more humans and went, nah, no wrong.
02:17And finally, they're not just a funny person, they're a skilled negotiator once haggling
02:21their way through a mugging by offering a pack of cigarettes. Please welcome Rhys Nicholson.
02:25Yeah.
02:27Did that really happen? Did you?
02:32Yeah. It was before marriage equality had happened. And this guy was mugging me and I
02:38haggled and I gave him a packet of cigarettes and I was like, I'm just going to cancel my
02:41bank card so you don't need that. I gave him some cash. And then as he was walking
02:45away, and I'm not making this up, he turned around and said, by the way, my sister's a
02:49lesbian and I think it's really sad you guys don't have gay marriage. Have a good night.
02:52And that's how I met my husband.
02:59Scams aren't rare, they're relentless. Every week there's a new one in the headlines.
03:04Here are just a few that made the news.
03:06And we can't go without mentioning perhaps the biggest scam of all, duping millions of
03:19Australians every year.
03:20I really do believe in love of this one.
03:24Sweet Jesus, how are they still trying to convince us that it's a social experiment?
03:28Last year alone, scammers drained nearly $1.5 trillion globally. Australians themselves
03:37lost a staggering $2 billion to scammers in 2024. That was the quote to rebuild the Gabba
03:42for the Brisbane Olympics, which also turned out to be a bit of a scam.
03:46And tonight we're zeroing in on one of the most financially devastating of the lot, phishing
03:52scams. I'll let the Australian Federal Police explain what they are.
03:55Phishing is an attempt by cyber criminals to trick victims into giving them personal information
03:59or downloading malware. To do this, they often send messages pretending to be from companies
04:04you know and trust and direct you to click links or reply with sensitive information.
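The AFP's advice boils down to one mechanical check: does the link's actual host match the brand the message claims to come from? A minimal Python sketch of that check, using invented example URLs rather than real scam links:

```python
# Compare a link's real host against the domain the message claims to be from.
# The URLs below are made-up examples for illustration only.
from urllib.parse import urlparse

def looks_suspicious(link: str, claimed_domain: str) -> bool:
    """True unless the link's host is the claimed domain or one of its subdomains."""
    host = urlparse(link).hostname or ""
    return not (host == claimed_domain or host.endswith("." + claimed_domain))

print(looks_suspicious("https://auspost.com.au/track", "auspost.com.au"))
# False: the host really is auspost.com.au
print(looks_suspicious("https://auspost.com.au.parcel-fee.net/pay", "auspost.com.au"))
# True: the real host is parcel-fee.net; "auspost.com.au" is just a decoy prefix
```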
04:09That's very helpful information that everyone should know, but I don't think the message
04:13is really getting out there.
04:15Phishing is exactly like, well, fishing. You cast the nets wide, but instead of catching
04:21dinner, scammers are hoping to reel in your details. David, are phishing scams increasing,
04:26or are we just hearing more about them? Actually both. They are increasing and we
04:30are hearing more about them. But what we know is that these types of scams are still really
04:34underreported. And there's a whole range of reasons why people don't report. So it might
04:39be, for example, that they think it's just a small amount of money and not worth reporting.
04:43Or it might be that it's a romance scam and they're actually embarrassed to admit that
04:47they'd sort of fallen victim to this. And this falls into what we call the dark figure
04:51of crime. And that's the proportion of crime we know that sort of exists but doesn't
04:55get reported. So that $2 billion really is the tip of the iceberg.
04:59That's what I was going to say. Is that $1.2 trillion worldwide based on just what we know?
05:05Absolutely.
05:06That's not a prediction. That is insane.
05:10So as criminologists, we work off of data, right? Like the data on scams is not thorough.
05:18And it's because of that dark figure. There's so much about it that we just don't know.
05:22Rhys, how often are you contacted by scammers?
05:25I reckon quite a bit. Like a lot. Like a lot. Like I feel like every third text at the moment
05:30seems to be a phishing scam or my family, and it's really the same thing.
05:34So you've noticed an increase?
05:37Yeah. I reckon since the Qantas hack. I genuinely think since the Qantas hack it all kind of kicked
05:43off and I'm getting constant texts all the time, and emails, and never a phone call though.
05:48Just pick up the phone, you know?
05:50Well that's actually completely predictable. So hackers, what they're interested in is the data.
05:55And your data's probably ended up on the dark web and been sold to hackers who are now using that
06:00information, yours and everyone else that's been hacked, to try and execute a phishing scam on you.
06:05Because I keep getting the, like, the job offer ones. Like, is there anyone, you know, the ones where they're
06:10kind of like, hi, and they're always the wrong name. It's like, hi Martine, I'm from blah, blah, blah.
06:15And you can make a thousand dollars a minute if you come and sit in this box.
06:20Yeah, that's actually, that's actually a bit different. That's probably part of a money laundering
06:25syndicate, trying to get you in to be a money mule. They've got all this money that they
06:30need to launder and basically they want to use your bank account, probably, to do that.
06:36And I have a follow up question. How would you get out of one of those?
06:44Move on Julia!
06:45Julia?
06:46A question for you. With this reporting thing, right, are there like squads that are set up
06:50for scams? I'm not even joking.
06:52No, absolutely.
06:53Yeah.
06:54Most police services in Australia have a sort of fraud section. Some of those
06:58are very specific in relation to scams. But in Australia, it's actually the Australian Competition
07:02and Consumer Commission, the ACCC, that does a lot of the work around scams.
07:06And the interesting thing is that in 2025, reporting is down 25%. The amount of money lost is actually
07:13up 28%. And that suggests that scammers are actually becoming more successful.
07:18Hmm. Over 100 million spam emails are sent every minute. Not only this, a third of your
07:23phone calls are now scam related. Like everyone here, I've received bogus text messages requesting
07:29payment from toll company Linkt. I've had fake packages delivered by fake Australia Post.
07:34But before dodgy texts and crypto bros, there was this guy.
07:38It's the oldest rip off on the internet. An email from a Nigerian prince promising millions
07:43of dollars in exchange for a small fee.
07:46Oh, you're laughing now.
07:49The father of phishing scams. A fake royal with fake problems and a fake fortune.
07:55But the victims? Very real. The scam first appeared in the 1980s, often run by mail and fax.
08:01I mean, imagine scamming someone by fax. So noisy. Once email took off, so did the scam,
08:08which amazingly is still alive today and is still raking in thousands of dollars a year.
08:13And with the rise of social media, scams are surging. Scammers can slide into message threads,
08:18into your DMs. They can pretend to be your long lost friend, a distant relative, even a dead celebrity.
08:24Here's Michael Jackson, who's keen to make a comeback and trying to get his finances in order.
08:31Hey, it's Michael Jackson. I'm not really dead. Can you cash out me $600
08:37so I can come back to the United States and put out some more music?
08:41The original Smooth Criminal.
08:42But Rhys, what kind of phishing scam are you likely to fall for?
08:56Oh, 100% romance scams.
08:59No.
09:00Totally. I reckon if I wasn't married I'd be one of those people at the airport with a bunch of flowers saying,
09:04he said he'd be here.
09:06Even when I hear you talk about the dark figure of crime, I'm like, ooh, what's his vibe?
09:12Danielle, how many people are these scams catching out?
09:16Well, these phishing scams aren't targeted, right?
09:19They're literally phishing for victims.
09:21So if only 1% of those scam emails and texts successfully result in a victim,
09:27the scammers are winning, you know?
09:29You think about that Qantas hack.
09:31There were like 6 million approximately people whose information was hacked.
09:36If the hackers are successful at scamming 60,000 people, they are victorious.
09:42Mmm.
09:43The scammers also adapt their methods.
09:45So there's a whole lot of work that goes into trying to prevent these phishing emails
09:50getting through the system to their victims.
09:52But the scammers are working to get around it.
09:54So there's like spam filters, for example.
09:56And the idea is that the spam filter will search on keywords and then just send that message automatically to junk.
10:01So the scammers get around that by putting deliberate typos in.
10:05But not only that, coming back to Danielle's point,
10:07they're looking for that 1% of people who will actually fall for the scam.
10:11Now someone who doesn't detect the spelling error might be more likely to actually part with money.
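David's spam-filter point is easy to see in miniature. A deliberately naive keyword filter (a toy, not any real mail provider's system) catches the straight spelling and lets one deliberate typo straight through:

```python
# A toy keyword filter, as described above: real spam filters are far more
# sophisticated, but the evasion principle is the same.
SPAM_KEYWORDS = {"lottery", "verify your account"}

def goes_to_junk(message: str) -> bool:
    text = message.lower()
    return any(keyword in text for keyword in SPAM_KEYWORDS)

print(goes_to_junk("Please verify your account today"))   # True: filtered
print(goes_to_junk("Please verify your acc0unt today"))   # False: one typo slips past
```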
10:16You might be thinking, I'd never fall for that.
10:19I'm smarter than that.
10:21Well, let's put that to the test in our experiment of the week.
10:24Now, we set up our very own harmless phishing scam to see just how many of you would take the bait.
10:37Our lure, a QR code, and the promise of a gift to remember your time here tonight.
10:42Because nothing says, trust me, like free stuff.
10:45This is quishing, which sounds like something you should keep to yourself,
10:48but it's actually a phishing scam.
10:50Early this evening, our audience was relaxing in the foyer when they were presented with this enticing poster.
10:56To receive their gift, they had to scan the code, which directed them to an official-looking United Parcel Post website,
11:04where they were invited to enter their details.
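The show doesn't say how the poster was built, but the mechanics are trivial, which is the whole problem with quishing: a QR code gives no visible hint of where it leads. A sketch using Python's third-party qrcode package, with the show's fictional United Parcel Post address as a stand-in URL:

```python
# pip install qrcode[pil]
# Any URL at all sits behind the same innocuous black-and-white square;
# the destination below is a fictional stand-in, not a real site.
import qrcode

img = qrcode.make("https://united-parcel-post.example/free-gift")
img.save("mystery_gift_poster.png")  # print it, hang it in the foyer, wait
```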
11:08Anyone feeling a little nervous right now?
11:11Now before we reveal the results, we want to point out that this experiment was designed to dupe our audience,
11:18like many scams out there.
11:20We wanted to test their trust in an institution, like the ABC, and my face,
11:24and use that trust to access their personal details.
11:28Alright, studio audience, let's see how many of you scan the code.
11:3262%.
11:34That's a pretty alarming figure.
11:36You're all keen to quish.
11:38But how many of you really wanted that mystery gift enough to add your details into a United Parcel Post website?
11:45Apparently 36% of you.
11:48Now, Cy Y, hello.
11:51You were one of the people that shared your details.
11:54Did you at any point think that this might be a scam?
11:56Well, I trust the ABC.
11:57Of course you do, me also.
11:59I trust you, Julia.
12:02That's correct.
12:03It's entirely possible for any of us to fall for a scam.
12:06Have you ever fallen for a scam before?
12:08Oh, yeah.
12:09Like, it can happen.
12:10I'm a chronic online shopper.
12:12So, it was starting to twig when I was like, I don't know this delivery service.
12:17Because I'm quite familiar with the delivery service.
12:21And that's good.
12:22You're figuring out your limits around it.
12:24That's excellent.
12:25Are you ever freaked out by what you might be scammed on?
12:28It's just something you sort of have to be aware of.
12:30But you can't be shut off from everything.
12:31And I love free shit.
12:33And is that not the moral of the story?
12:37I love free shit.
12:38Thank you so much, Cy Y.
12:40Thank you for scamming.
12:41The most dangerous scams aren't the ones promising millions.
12:50They're the ones that feel personal.
12:52Scammers have graduated to a master class in psychological manipulation.
12:56Let's take a look at the evolution of one of the most insidious phishing scams of the past few years.
13:01It's been dubbed the Hi Mum scam.
13:04Parents are being targeted in a new text message scam.
13:07Authorities say the scammers are contacting parents claiming to be their children in an emergency.
13:13The scam texts were usually vague enough that parents would assume they were real.
13:17The child needs money for a new phone or their card has just been declined.
13:21They were basic.
13:22It was a simple text message that caught out Jen Irishu.
13:27Well, it started with, Hi Mum, it's me.
13:30This is my new number.
13:31You can save this one and delete the other.
13:34Believable.
13:35Believable.
13:36A message from who she thought was her son, which has now cost her more than $3,000.
13:43What started as vague text messages has evolved into something far scarier.
13:48Scammers can now clone voices using AI.
13:51All it takes is a few seconds of audio, often taken from social media.
13:55And suddenly, they can sound just like your child.
13:58Unsurprisingly, it gets worse.
14:00I received a phone call from my daughter.
14:03And she keeps crying and sobbing.
14:04She's like, Mom, help me.
14:06Help me.
14:07Mom, these bad men have me.
14:08They have me.
14:09And this man takes over and he tells me, listen here, I have your daughter.
14:13If you call anybody, if you call the police, I'm going to pop her stomach so full of drugs
14:17and then drop her for dead in Mexico and you'll never see your daughter again.
14:20I never had any doubt that it wasn't my daughter.
14:23It wasn't just her voice.
14:24It was her personality.
14:25It was the way she talked to me.
14:26It's the way she'd cry and sob.
14:27It was her replicated.
14:29A terrifying situation.
14:31David, what makes these techniques so effective?
14:34As a parent, that is frightening.
14:37Scammers really play on our emotions.
14:40That particular scam has combined two sets of emotions.
14:43One is the time urgency.
14:45So, you know, they need it now.
14:46And the other thing is the emotion of being a parent.
14:50And those two together are really, really powerful.
14:53My mum got hi-mummed from me.
14:57And she got a message saying, hi mum, this is my new phone.
15:00The other one's broken.
15:02Can you send me some money so I can get a new phone?
15:06And she said, oh, well, we literally live five minutes down the road from each other.
15:10Co-dependence.
15:11And it just happened to be that I had seen something like on social media about the hi-mum thing.
15:18And I just called her probably to borrow money at the same time.
15:22She said, what are you doing?
15:25You've just been texting me.
15:26Are you getting a new phone or what?
15:28And I was like, what are you on about?
15:30And she was very convinced.
15:31And I said, mum, that's a scam.
15:33And she went, really?
15:34I didn't know that.
15:35Because, I mean, my mum's a very smart woman.
15:36It was just, why would you ever think to question it?
15:39I think that's the thing.
15:41It was really quite fortunate that you happened to have that conversation with her.
15:44Oh, yeah.
15:45To stop her engaging in that scam even further.
15:47Well, luckily she was just going to rock up to my house as well.
15:49She's like, I don't like giving numbers out, you know?
15:51Like, luckily.
15:52But she was 100% convinced that that was me.
15:55That, yeah, I don't like giving numbers out.
15:57That's actually a really good strategy.
15:59Yeah, yeah, yeah.
16:00Are there certain types of people that are more likely to get scammed?
16:03Look, I think it's a bit of a myth that there are certain types of people who fall for scams.
16:08Like, oh, you have to be dumb.
16:09Or it's older people because they're not as technologically savvy.
16:13There's actually some recent research from last year.
16:16And it actually showed that teenagers' susceptibility to scams was increasing.
16:21One in four of them fell victim to these social media scams.
16:26And they were most vulnerable to these buying and selling scams.
16:29So, you know, like on Marketplace, for example, you say, oh, I really like that bicycle.
16:34I'll send you $500 and you send them the bicycle.
16:37They would send the money and not get the product that they paid for.
16:40So those are the types of scams that young people are falling victim to.
16:44Well, ultimately, I think anyone can be scammed.
16:46If it's the right scam delivered to you at the right time, when you're a bit vulnerable
16:53to that type of scam or you're busy and you're not paying attention, ultimately I think anyone
16:57can be scammed.
16:58Yeah.
16:59Is there a particular time of day where people might get scammed more than another time?
17:02Yeah, there is actually.
17:04People are most vulnerable when they're busy or when they're trying to shift gears.
17:08So, in fact, banks did some research that showed that Tuesdays at four o'clock were the time
17:14that people are most likely to get scammed.
17:16Look out, everyone.
17:17But when you think about it, it's people at four o'clock packing up from work.
17:22They're thinking about what they're going to do next.
17:24They're going home.
17:25So their guard's down a little bit and they're not quite sort of switched on.
17:29There's other research around that shows that Fridays, Friday afternoons in particular,
17:34people are more susceptible.
17:35And that, again, makes sense because your mind sort of shifts, changing from work into
17:39the weekend.
17:40Yeah.
17:41Image and video manipulation has evolved so quickly we're reaching a point where it's
17:45almost impossible to tell real from fake, which makes it the perfect tool for scams.
17:50Here's Lew Wall.
17:54Humans love a good trick.
17:56Magic, multi-level marketing, monogamy.
17:59The only thing we love more is falling for them.
18:02Now scammers have taken that love digital with deepfakes.
18:05AI generated replicas of real people, from Tom Cruise and Paris Hilton eating cereal,
18:11to the late fake Pope Francis in Balenciaga.
18:14Rest in polyester, Popey.
18:17Deepfakes are advancing with every scroll.
18:19So how did AI get so good at making fake people?
18:22Well, one of the ways was Snapchat filters.
18:25Yep, those puppy ears and rainbow vomit effects weren't just for fun.
18:29Every Technicolor attack bomb was teaching AI to map human faces with precision.
18:34And while the machines were learning, our brains were learning to become accustomed to manipulated imagery.
18:39So these filters tricked us into teaching them how to get better at tricking us.
18:44All it takes is a few minutes of footage.
18:48A TikTok, a wedding speech, a hot take, that's enough to clone you.
18:52These days it's easier to deepfake your boss than get promoted.
18:56And I should know, I made one.
19:01Relax, obviously I outsourced it to a deepfake expert.
19:04Hello Liam, who are you, where are we, what do you do?
19:10We're a cyber security company and I'm a hacker here.
19:15Do you hack for good or for evil?
19:16Exclusively for good.
19:18What are the main indicators that I should be looking for to spot a deepfake?
19:22They struggle with beards, they struggle with people that don't have any hair.
19:27Liam, you've nailed this.
19:29You need surprisingly little footage to create a deepfake.
19:33What the real game-changer was when things were fast enough to be able to do it in real time.
19:38We're hacking into the mainframe.
19:40Using one five-minute red carpet clip and disturbingly little time,
19:44Liam knocked up my very own petit peu Zemiro.
19:47I'm ready to be Julia.
19:49You are Julia.
19:50Oh my...
19:52That is insane.
19:54Hey, I'm Julie Z.
19:57I'm gonna go scare some Fisk fans.
19:59Hey Kitty Flanagan!
20:03That's crazy, isn't it?
20:10It's fun on one level but it's kind of extremely terrifying on the other.
20:14David, how worried should we be about getting deepfaked?
20:17The question you've probably got to ask yourself is why would someone target you to make a deepfake?
20:22So what we tend to see is that it's celebrities, politicians, people like that who tend to get targeted.
20:27So it's possible but it's probably more a case of being alert but not necessarily alarmed at this point.
20:34AI is changing all of this as we speak.
20:37And I know that you're right, like, you know, you really have to ask yourself why would somebody deepfake you?
20:42But even here in Australia, there have been cases of these high school students in New South Wales and Victoria
20:48who were accused of deepfaking pornographic pictures of their female classmates.
20:54So I think that this is a real problem.
20:57Can we just pretend we don't know?
20:59Can you just wipe my memory?
21:02Memory, please.
21:03It's terrifying.
21:04Yes, scammers keep getting smarter but all is not lost.
21:07There's a glimmer of LED light in the darkness proving we're not helpless in this fight.
21:12Is it the AFP?
21:13No.
21:14People don't even watch their videos by accident.
21:17I am talking about a vigilante YouTuber with over 4 million subscribers who turns the tables on scammers
21:23and exposes their methods.
21:25Enter Jim Browning.
21:27The man is totally unknown.
21:28We can't tell you where he is or show you his face.
21:31He's helped lead authorities to fraud boiler rooms resulting in the takedown of major operations.
21:36The scammers would like nothing more than to expose him.
21:39Oh, can we just acknowledge the handcuff method that they've got going on there?
21:44They're just holding hands.
21:47I think I went to that wedding last summer.
21:55Browning is one of many scambaiters who expose and disrupt scam operations by hacking into their call centres,
22:02accessing their computers and finding out exactly who they are.
22:05Yeah, hi sir.
22:06My name is Carolina Fernandez.
22:09I am calling you from the Microsoft.
22:12Oh, hi Priya.
22:14Browning's huge YouTube popularity allows him to educate and protect the public against many worldwide scams.
22:23And Browning is not alone.
22:25An Australian company, Apate AI, is also using technology to scambait the scammers.
22:30Apate chatbots engage with scammers on long fake calls.
22:35The idea being if they're chatting to a bot, they're not chatting to us.
22:38Hi sir, this is Mary speaking and I'm calling you from NBN.
22:43So we decided to upgrade your line.
22:45Right, right.
22:46So about the technician, how long do they usually take to do their work?
22:49You know, I might need to grab a cuppa while waiting.
22:52Yes, it's a quick job.
22:53Oh, that's good to hear.
22:55Just worried I might need to entertain them.
22:57Any tips on what to talk about with the technician?
22:59I could be a bit chatty, you see.
23:01Can you just hold for a minute?
23:03I will transfer my call to my teacher.
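Apate's actual system isn't public, so what follows is only a toy sketch of the time-wasting idea in Python: never refuse, never conclude, just keep offering one more stalling question so the scammer's minutes are spent on a bot instead of a person.

```python
# A toy scambaiting loop: cycle through stalling lines, one per scammer turn.
# This illustrates the concept only; it is not how Apate's bots actually work.
import itertools

STALLS = [
    "Sorry love, the kettle's boiling. What was that again?",
    "Which company did you say you were calling from?",
    "I can't find my glasses. Can you read that number out once more?",
    "Hang on, there's someone at the door. Are you still there?",
]

def chat_with_scammer(scammer_lines):
    replies = itertools.cycle(STALLS)
    for line in scammer_lines:
        print("Scammer:", line)
        print("Bot:    ", next(replies))

chat_with_scammer(["Hi, this is Mary from NBN.", "We upgraded your line.", "Sir? Hello?"])
```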
23:06With people like Jim Browning and Apate on our side, there's a real chance to fight back and reclaim our digital space.
23:12So Danielle, how do we arm ourselves against scams?
23:16Well, I think a really good tip for friends and family is to use a safe word.
23:20So a safe word is a word that only you and your group know.
23:24And I think that it's a really great tip to help you in scam situations like the hi mum situation.
23:30And really easy to prepare, but it could be really critical in a moment where you don't have a lot of time to think.
23:37And reporting helps.
23:38Yeah, if there's sort of one message, it's report, report, report.
23:41Because the more reporting that happens, the more data that we get, the more we can understand what sorts of scams are out there,
23:47who's more likely to be targeted, who's more likely to fall victim, and so on.
23:51Scamwatch.gov.au is where you report scams in Australia.
23:54It's pretty much sort of like a clearing house.
23:57And they will send that information on, in some cases, to the police and other authorities.
24:02I love the idea of a scamming clearing house.
24:04Like, we've ordered too many scams!
24:06Bring the truck, bring the trailer, face painting for the kids, no windows!
24:10There's a whole lot of other things we can do.
24:14Like, for example, stay vigilant around knowing how your bank, your insurance company, your utility providers all interact with you.
24:22The other thing is to keep a perspective on urgency.
24:24So, a lot of these scams really use time urgency as the emotion to sort of hook you in.
24:30Look, the reality is, in life, very little is super time urgent.
24:36So, I think just take a breath and just think for a minute, because that will be enough to actually sort of get your cognitive processes going again.
24:42Is it better to hang up on scammers or be polite to them?
24:46We've got to remember that these people who contact us aren't always doing this willingly.
24:53So, organised crime engage in human trafficking to get people against their will into these call centres to make these phone calls.
25:02So, the person contacting you may not actually be the person where the money's going to.
25:07And they're a victim of crime as well.
25:08They're victims of human trafficking who are being used to try and extract money from you.
25:12And the real crime there is capitalism.
25:14A hundred percent, my friend.
25:17That is the most ABC thing ever said.
25:20Actually, Rhys, I don't think they say it enough.
25:23Have you ever been on the phone with a scammer and start to get to know them?
25:28I had a friend of mine get scammed.
25:31We were pretty sure it was a romance scam.
25:33It was quite obvious.
25:34And we said to her, look, this isn't real.
25:37And to prove it to her, we found the person whose photo had been taken and asked, have you been speaking with her?
25:43And he said no.
25:45And she said, I'll verify it myself.
25:46And then tried to pick him up.
25:48So, slightly different story.
25:51But a really good one, though.
25:55Yeah, I liked it.
25:57You can actually do a reverse image search and see if that image has popped up anywhere else on the internet.
26:01And what we know is that, particularly in romance scams, you'll see the same profile image used across multiple profiles.
26:08And that's a surefire bet that it's a scammer trying to lure you in, basically.
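A reverse image search itself runs on a search engine's index, but the underlying idea can be sketched with a perceptual hash: visually similar images hash to nearby values even after resizing or recompression, so a reused profile photo shows up as a near-zero distance. A sketch with the third-party Pillow and imagehash packages; the file names are made-up examples:

```python
# pip install Pillow imagehash
# Perceptual hashing: near-identical images produce similar hashes,
# so a stolen profile photo is detectable even after resizing.
from PIL import Image
import imagehash

hash_a = imagehash.average_hash(Image.open("dating_profile.jpg"))        # made-up file
hash_b = imagehash.average_hash(Image.open("photo_found_elsewhere.jpg")) # made-up file

# Subtracting two hashes gives a Hamming distance; small means "same picture".
if hash_a - hash_b <= 5:
    print("Likely the same image reused on another profile.")
```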
26:13Well, I've been deepfaked and the audience have been quished,
26:18which leaves our comedian friends as the only ones who haven't been the victims of a scam tonight.
26:22But we're about to change all of that.
26:24Rhys, Steph, I'd like to invite you to play a game we're calling There's No AI In Me.
26:30In this game, we've taken some real audio clips of things you have both actually said and recreated them using a pretty basic AI voice emulator.
26:46Your job is to tell your real voice from the AI recreation.
26:50Easy?
26:51Well, we'll see.
26:52Rhys, you're up first.
26:53You're going to hear two clips of the exact same phrase.
26:56Which one is you?
26:58Hello, I'm Rhys.
27:00I'm from Australia.
27:01This is my real voice.
27:02Ridiculous.
27:03Here's the thing.
27:04I got engaged a couple of years ago.
27:06To a man.
27:07Sorry, ladies.
27:08Wrong tree.
27:09Stop barking.
27:10Hello, I'm Rhys.
27:15I'm from Australia.
27:16This is my real voice.
27:17Ridiculous.
27:18I got engaged a couple of years ago.
27:21To a man.
27:22Sorry, ladies.
27:23Wrong tree.
27:24Stop barking.
27:25I mean, I don't sound like that.
27:28I think the second one is me.
27:33And I'm basing this purely on there is a mic pop.
27:38Like, I can't tell the voice difference.
27:40But what I'm saying is that's the only thing that's making me think that.
27:44I cannot pick it out.
27:45Well, maybe that's the kind of thing you should be looking out for.
27:47Because it's correct.
27:48You are.
27:53That's still terrifying though.
27:55Yeah, I know.
27:56It is.
27:57It's terrifying.
27:58Okay, Steph, you're up.
27:59Here's your first clip.
28:00Because obviously we all know that what we're really using AI for is to make our Instagram
28:06pictures look really hot.
28:08Because obviously we all know that what we're really using AI for is to make our Instagram
28:14pictures look really hot.
28:17That is bloody hard.
28:21Because my initial reaction was to go for the first one until I heard the second one.
28:26But they've done that edit to make it sound like it's on stage.
28:29I thought there was a mic pop again, like you said, on the second one.
28:33I don't know why.
28:35I think I'm going to be wrong, but my gut says the first one.
28:38Well, the AI has fooled you.
28:40It was true.
28:41No, it was not.
28:42And also my voice sounded great.
28:44So thank you.
28:45Yeah.
28:46Please give them a round of applause.
28:48Please give a huge thank you to our guests, Rhys Nicholson and Steph Tisdall.
29:07And of course, to our resident experts, Professor Danielle Reynolds and Dr. David Bartlett.
29:07This week on Crime Night, we only scratched the surface of the science behind the scams.
29:14But we've learned that a little vigilance goes a long way.
29:17And how safe words aren't just for the bedroom.
29:19I'm Julia Zemiro.
29:20Good night.