00:26Hello and thank you for joining us for
00:29Crime Watch Live. We have got a busy program this morning so please do keep watching. You never
00:34know when you might be able to help the police with their appeals. Today, do you recognize this
00:39man, Luke Williamson? He was jailed earlier this year for child sex offences, but police think
00:45there are more girls out there who he may have exploited. We'll hear the inside story of how a
00:50far-right keyboard criminal Gabriel Budash was jailed for terrorism offences. He was live streaming
00:58the most vile hatred and extreme right-wing views that you could possibly put out.
01:04Nobody! Keep down! Keep down!
01:10Bet he was shocked when that happened. And we're out with a court artist whose job gives her a front
01:14row seat on the biggest cases in the country. Sometimes in court, although I'm there, I can't
01:21hear, because in my mind there's a different monologue going on, describing the eyes and the expression on
01:27the face, and those things are taking over my thought processes. Backstage here is a hive of activity
01:34as our team and the police are ready to take your calls. To get in touch, you can scan the
01:39QR code
01:39below using your phone's camera and just follow the link. Or you can call us on 08000 468 999.
01:46You can text us on 633 99. Start with the word crime, leave a space, then write your message.
01:52Or email us. And the address for that is cwl at bbc.co.uk.
01:58And remember, the details will be on your screen throughout the programme, so please do get in touch.
02:04First, you may have heard about the horrific murder of Chamali Turhan, whose death led to a 45-day hunt
02:12for his remains. They were found, but his killer is still at large. This film contains some disturbing details.
02:24Essex Police, what's your emergency?
02:26Hiya. We have a friend, but from last night till now, we can't hear from him.
02:36We know that he left the country within 30 minutes of the police being called.
02:42The site is huge. We knew we had to find his body.
02:48He needs to provide answers for his involvement.
02:55On the 19th of November 2024, 45-year-old Chamali Turhan was reported missing.
03:03We received a call into Essex Police about 8 o'clock on the 19th of November from a friend of
03:09Chamali
03:09who hadn't been able to get hold of him on the phone, and he hadn't turned up for work.
03:15Two hours later, police received more information from another friend,
03:20leading them to suspect their missing man had come to harm.
03:28Police began searching for footage of Chamali at the time of his last sighting.
03:35We searched CCTV on the night of the 18th of November,
03:39and we were able to see that he went to the Globe public house.
03:49And then back into Duke Street with another male.
03:52The CCTV showed the unknown man leave Barista, but without Chamali.
04:06We see Chamali on the 18th of November.
04:09At that point, I declared it a homicide investigation.
04:17Convinced Chamali had never left the Barista bar,
04:20they dispatched a team to search the property.
04:24Officers go into the bar and close the property down,
04:27and then we take it as a crime scene.
04:29Police! Anyone in here, come out! Show us your hands!
04:34The first call I received was an exclamation, really,
04:38at how much blood there was at the scene.
04:41There were blood spatters on the ceiling, on the walls.
04:46There were blood spatters on the salt and pepper pot.
04:50There was a blood spatter on the underneath of a table.
04:54There were blood marks to the back of the bar,
04:56and in the carpet, up the stairs, to the back entrance.
05:00But there was no sign of a body.
05:04Then a man called Illy Cyprian Joseph came forward with information.
05:10Who's Joseph? Are you Joseph, are you?
05:13Yes.
05:13Take a seat for me, sir.
05:14Yep.
05:15Cyprian approaches officers to say that Chamali is dead,
05:20and he helped to clean up.
05:22I'm going to arrest you on suspicion of murder at this time.
05:26Illy said he had helped move the body.
05:30My name is Illy Cyprian Joseph.
05:32Whilst explaining how he was involved in cleaning up the scene,
05:37he gave detectives another name.
05:39I received a phone call from Jay.
05:44Illy explains to us that Jay tells him to come to the bar to clean up.
05:50He tells us that when he arrives,
05:51he sees Jay standing over a male,
05:55a bloodied male who he sees as dead.
06:01He helps to carry the body out of the premises.
06:06Police already knew the manager of Barista was a man called Cheyhan,
06:10known to his friends as Jay.
06:16Cheyhan, the manager of the bar,
06:18matches the description of Jay,
06:20who we see walking around with Chamali.
06:22So I now know my murder suspect is called Cheyhan.
06:25His surname is Dinla.
06:28Detectives started to track down their suspect
06:30and then received new information.
06:33We had a phone call from a taxi company.
06:36The taxi took Cheyhan to Stansted Airport,
06:40and we then know that he left the country.
06:42And that was at approximately 20.00 on the 19th of November.
06:48So within 30 minutes of the police being called,
06:51he's gone.
06:58With their prime suspect on the run
07:01and one man in custody,
07:03police try to establish what possible motive
07:05there could be for this brutal crime.
07:10Chamali meets Cheyhan outside the Globe.
07:13They go in together.
07:14They have a beer.
07:17They talk, using a lot of hand gestures.
07:21They shake hands,
07:23and then they leave the bar together.
07:25They walk back to Barista.
07:28Although they appeared to be friends,
07:30it turned out they shared a love interest.
07:35Chamali wasn't too bothered about that interest,
07:39but Cheyhan really struggled with Chamali's relationship with the same female.
07:4772 hours after Chamali's murder,
07:50police discover some of his body parts in a bin behind the Barista bar.
07:59Just finding these body parts,
08:02I mean, just brutal.
08:03And it, you know, just really focused the team on how important it was to not only find Cheyhan,
08:08but find Chamali.
08:13Determined to recover his remains,
08:15police realise that his body is likely to have ended up at a landfill site.
08:21It was 1,500 tonnes of rubbish,
08:25several football pitches and a couple of metres deep
08:28that we knew we had to search to find Chamali.
08:32We needed in excess of 30 officers a day.
08:35The site is huge, but the area that we searched was confined.
08:39So we knew we had the best chance possible to locate his body.
08:44Then on the 11th of January, 2025,
08:47after a 45-day search at the landfill,
08:51officers finally recovered the body of Chamali Turhan.
08:58I was, you know, incredibly proud of Team Essex, really.
09:03We never gave up on finding Chamali Turhan's body.
09:07I had the support of senior officers,
09:08but, you know, life had to go on and other work had to be done.
09:11So, you know, everyone sort of pulled out all the stops.
09:14And we did what no other experts thought we could do.
09:19On the 10th of October, 2025,
09:21Illy Cyprian Joseph was sentenced to seven years in prison
09:25for his part in the murder.
09:29Police are still looking for Cheyhan Dinla.
09:33We're yet to deal with Cheyhan Dinla.
09:36He needs to provide answers for his involvement.
09:42At the heart of this case, as a father,
09:45there's a daughter who is now going to grow up without her dad.
09:50And it's so important now that we seek justice,
09:53not only for her, but the rest of Chamali's family.
09:58I'm now joined by DCI Louise Metcalfe from Essex Police.
10:02You saw the film there.
10:03It really is a disturbing case, this one, Louise.
10:05Before I come to you,
10:07I just want to read some words from Chamali's family,
10:10who said,
10:18It's a hard question.
10:19I know you've been dealing with them a lot.
10:21How are they coping at the moment?
10:23They are devastated by the loss of Chamali,
10:27both friends and family.
10:29And they've still got questions that need answers.
10:32So, one of the big questions is,
10:34do you know anything about a motive
10:35as to what could be behind this attack?
10:39So, we know that Chamali and Cheyhan Dinla
10:42were in a relationship with the same lady.
10:44Chamali seemed to be quite relaxed about that,
10:47but Cheyhan Dinla was not.
10:49We believe that was the motive behind the events
10:52of the 18th into the 19th of November.
10:55Okay.
10:56We've got some CCTV we're going to look at now.
10:58Now, this was actually on the night of the murder.
11:02What do you think may be happening here?
11:04So, we see Cheyhan Dinla at the bottom of the screen
11:07in the camel coat, shaking Chamali's hand.
11:09Now, has the issue been resolved?
11:13Is Cheyhan happy?
11:14I think probably not,
11:16because now he's tapping his fingers on the bar afterwards.
11:19Chamali looks very relaxed.
11:20Quite pensive, isn't he?
11:21Absolutely.
11:23And this is both males there leaving that same night,
11:27the night of the murder.
11:28They're together.
11:29Correct, yes.
11:30Cheyhan has invited Chamali back to Barista.
11:34So, that's the last we see of Chamali alive.
11:37Okay.
11:38So, then Barista was the bar that Cheyhan,
11:40also known as Jay, owned.
11:43What do we know that happened in the hours after the murder?
11:47So, we know that Chamali wasn't seen alive coming out of the bar,
11:54but Cheyhan was.
11:55Okay.
11:55And Cheyhan left the bar wearing a different outfit that we see here.
11:59You remember he arrived in a camel coat and light clothing.
12:03He did.
12:03He's seen walking there in a dark outfit
12:06that we know was brought to him during the night.
12:08Yeah, so we can see that.
12:09And this is the night of the murder.
12:12This is it.
12:12And he looks incredibly calm there, strolling around.
12:16You noted that he's in a different jacket to the one he arrived in,
12:19but that was the same night.
12:21So, two went in, one came out,
12:23and sadly you discovered what happened within there.
12:27I have to give credit, we all do,
12:29for the incredible job,
12:31a huge-scale job that you and the police did
12:34to find the body.
12:35It took some time.
12:36So, it was 45 days of 30 officers a day searching,
12:411,500 tonnes of landfill, domestic and commercial waste.
12:46And we are the only police force
12:51to have had a successful active search where we've recovered a body.
12:55Wow, so you did recover that, as sad as it is,
12:58but it gave some kind of closure to the family
13:01that you were able to find a body.
13:03At this time, what you don't have is the prime suspect.
13:08What do we know about where he could be?
13:10And let's just remind us who you are looking for.
13:13Within 30 minutes of the police being notified of Chamali's disappearance,
13:17Cheyhan booked a taxi to Stansted Airport.
13:20We don't know where he is.
13:21We need him.
13:21We need to find him.
13:22We need to know where he is.
13:23And he's got questions that he needs to answer.
13:26Yeah, so this is him.
13:27This is him on screen, the owner of that bar.
13:30Often known as Jay to his friends.
13:32It's worth just reiterating that, isn't it?
13:33They may know him as Jay, but you know he went via Stansted or to Stansted.
13:38We don't know where he is now.
13:39And if anyone does, that's who we need to hear from.
13:43Thank you, Louise, for coming in.
13:45I should say there is a Crimestoppers reward as well of £10,000,
13:51which could help with this investigation.
13:52So do get in touch if you can help.
13:54And if you know the whereabouts of this man, Mr Dinla, we need to hear from you.
13:57Do get in touch.
13:58Details are at the bottom of your screen.
14:02Coming up, have you heard of synthetic Frankenstein identities?
14:06Well, we meet the team tracking down the online personas behind this new type of fraud.
14:14But first, we're with Detective Constable Stacey Waslin and D.I. Kat Slaughter from Humberside Police.
14:21They've been investigating a high-risk sex offender called Luke Williamson.
14:25He's been convicted of a number of sexual offences involving underage girls,
14:30but they believe there may be other victims out there.
14:33Thank you so much to both of you for coming in today.
14:36And Kat, let's start with you.
14:38Could you just tell us a bit more about what Luke Williamson has been convicted for?
14:42So Luke Williamson's a 34-year-old male from the Bedfordshire area.
14:46And in May 2025, the investigation commenced after his mobile phone was seized as part of another investigation.
14:53When his phone was examined by the digital forensic team,
14:57there were messages found within that which indicated there were further sexual offences against other victims,
15:03which he later admitted, pleading guilty to 20 sexual offences,
15:07including sexual communication with a child, indecent images, and arranging to meet them to commit sexual acts against them.
15:14He was later sentenced in January this year, where he received an extended sentence of 11 years,
15:20which equated to eight years in custody and three years on licence, with stringent conditions to manage him
15:28once he got into the community.
15:29I mean, this is this is just awful and really good to hear that he's been sentenced.
15:33But the reason that you're here today is because you don't believe that this investigation is over, do you?
15:39No, so he was dealt with for offences against three young girls.
15:43But with the inquiries that have been ongoing, we're keen to speak to anybody else that's been in contact with
15:48him.
15:49So we're now in phase two of the investigation called Operation Blue Neon.
15:53We're encouraging people to get in touch with us if they've been in any contact with him.
15:58We're looking to find the victims, support them and ultimately bring any of their cases to justice.
16:03Absolutely. As you say, that is so key, isn't it?
16:06Well, Stacey, let's take a look at this in a bit more detail.
16:09And we've got an image here of the areas that you think Luke Williamson frequents.
16:14And how does he initially kind of reach out to the victims? What's the process?
16:19So Luke Williamson was a travelling salesman.
16:22However, in his own time, he was willing to travel considerable distances to meet with victims.
16:28So these are the areas that we know he is offended in because of his convictions.
16:34However, ultimately, we are aware that there could be victims all over the UK.
16:39What Luke would do is send a large volume of blanket messages to many social media accounts of young girls.
16:48And these messages would be telling them, hey, hi, how are you?
16:54And then they'd move very quickly into sexualised contact.
16:59He would make them feel as if he loved them.
17:02He would tell them that he adored them, that they were beautiful.
17:05He would even belittle himself to seek reassurance from the girls and make them feel that he loved them and
17:12that they were in a relationship with him.
17:15And he would agree or try and encourage them to meet with him with the promise of vapes and alcohol
17:22and was even willing to sneak into young girls' bedrooms while their family slept to commit sexual offences against them.
17:32Incredibly manipulative, isn't it?
17:34And he used a variety of different handles, didn't he?
17:37We can actually take a look at some of them now, can't we?
17:40So these are the social media handles that we know that he did use with our victims.
17:47However, we are aware that there could be more social media handles.
17:51Now, Kat, you mentioned that Luke is a 34-year-old male, but he was actually engaging in conversations with
17:58females as young as 12 to 16 years old, wasn't he?
18:02Yes, he was. He manipulated them to believe they were in a relationship with him, so some didn't see
18:08themselves as victims or feel ready or able to contact us to report any offences.
18:13And again, this is why you're here today. It's so important that these victims feel like they are supported and
18:19they can reach out to you.
18:20Absolutely. We're keen to speak to anybody that's been in contact with Luke.
18:26It's not just teenagers. Some of the victims might be older now, having had contact when they were teenagers some
18:31years ago.
18:32We're appealing to any parents, carers or friends.
18:35If you think your child or a friend or anybody you know has been in contact with Luke or you
18:41recognise his photograph, please get in touch with us.
18:43So we've opened a major incident public portal. So if you've got any concerns or you know anything about any
18:50offending, go to the Humberside Police website, go through the Operation Blue Neon link and quote log 72 of the
18:5817th of November 2025.
19:00Or if you want to report anonymously, please go through Crime Stoppers.
19:04It's really good to know. And sadly, Stacey, with investigations like this, it's often the victims that can feel that
19:11real sense of shame and feel like it's their fault when it's not, is it?
19:16Absolutely. It's so important to stress that this is not their fault and they shouldn't feel any shame from this
19:22at all.
19:24Clearly vulnerable and he preyed upon that.
19:27We want to make it clear to victims that even if they don't feel ready to talk to us about
19:32this yet, we are there.
19:34We will listen, even if it's in 10 years time.
19:36It's their voice, their choice, and they can choose when and how they speak to us.
19:42It's a really powerful message, Stacey and Kat. Thanks again for coming in this morning.
19:46Appreciate it.
19:47If you have had any contact with Luke Williamson or, you know, anybody that has, then please do get in
19:53touch.
19:55Really is such a disturbing case, that one.
19:58But sadly, it's yet another example of how social media and the internet can be utilised to target victims.
20:06I'm now with Nairi Alexander from the Internet Watch Foundation, which advises and works with the government and tech
20:11companies to eliminate child sexual abuse imagery online.
20:15And, Nairi, this, unfortunately, seems to be very much a growing problem.
20:20Yes, Rav, unfortunately, it does feel like that.
20:23I've worked in online child abuse now for over 10 years, so as a police officer and now at the
20:29Internet Watch Foundation.
20:30And despite our best efforts, the increase in accessibility, the technological advancements, it feels like this is a problem that
20:39we're not winning.
20:40And actually, at the Internet Watch Foundation in 2025, we had a record-breaking year in terms of the number
20:46of reports that our analysts actioned.
20:49We saw over 300,000 separate occasions where child sexual abuse material was found in the wild online.
20:57That's a staggering amount, isn't it? It really is. So, so high.
21:01And you've seen there's lots of different ways that this abuse can actually happen.
21:06Yeah, absolutely. And I think we need to remind ourselves that, you know, whilst this doesn't happen to everyone, it
21:12can happen to anybody.
21:14And, you know, all it takes is a child that has a device that's got Internet access and a camera
21:20and somewhere where they can use that device unsupervised, often a bedroom, sometimes a bathroom.
21:26And we're leaving them vulnerable to being groomed online in this way.
21:30And what we see really frequently, as we've heard, is offenders using social media channels and often posing as children
21:37or young people themselves to groom children online.
21:40And it can happen really, really quickly. I'm talking in minutes.
21:43They will be asking children to share intimate images of themselves.
21:47It's just so scary, isn't it, to think that something like that could happen in minutes.
21:50And this would have a huge impact on the young person themselves, wouldn't it?
21:55Yeah, absolutely. I mean, it's a real power play by offenders.
21:58They're absolutely leaning in to those feelings of shame and guilt and fear and making young people feel like they
22:06have nowhere to turn, no one to talk to and scared of taking action.
22:10But they have got somewhere to turn. So what can we do to fight back?
22:14Well, you know, I'm a parent myself.
22:16It's scary, isn't it? And I'm constantly thinking, am I doing enough?
22:20Am I having enough conversations? Do I know enough about what's happening, even though I work in this space?
22:25And, you know, my view is that the best way we can keep our children safe online is actually what
22:31we do offline.
22:32And it's about building those trusted relationships, having clear lines of communication, making it a safe space for children to be
22:40able to speak to you about what they're doing online,
22:42what they're seeing, what risks they're facing and removing that fear of judgment and ensuring that they know they can
22:49talk to you or a trusted adult.
22:51Talking is so important.
22:52Yes.
22:52Speaking of which, you have got a talk checklist. Talk me through that.
22:57Yeah. So the IWF has some amazing resources. The talk checklist is one of them.
23:01It's a resource for parents, carers, professionals, adults working with children.
23:04And it gives some structure and guidance on how to start these conversations off.
23:09So talking to your child about online sexual abuse.
23:12Keep it age appropriate. There's lots of advice on that.
23:15But little and often, built into regular communication so it doesn't suddenly become this taboo subject.
23:21Agree to ground rules. So where are your children using their devices?
23:24Are they supervised? Are they not? Come to some kind of agreement with your children about expectations on using devices
23:31online and learn about the platforms.
23:33Learn from your children. What are they using? What are their friends using?
23:37And we need to speak to young people to find out where and what they're using online.
23:41And finally, take some responsibility to know how to use the parental controls and safety measures that are available
23:49online and on certain platforms, social media, etc.
23:52That's very important. What about Report Remove?
23:55Yeah, amazing. I mean, for me, this is one of the jewels in the IWF's crown.
23:59It's an amazing resource for under 18s in the UK that allows them to confidentially, anonymously, completely free of charge,
24:07report nude imagery of themselves to the IWF.
24:10And this tool was co-created by the IWF and the NSPCC's Childline, and it gives children under 18 this opportunity
24:20to share their imagery with us and prevent it from being further distributed online.
24:25Our analysts in Cambridge will assess the image against UK legislation, confirm that it is child sexual abuse material.
24:32And then we will give it a unique fingerprint, which we call a hash.
24:36It's a series of letters and numbers, which means it's only attributable to that image.
24:41IWF has over three million of these image hashes and we share them with our member organisations.
24:46So that's tech companies, social media platforms, law enforcement as well.
24:52And it's a really, it's a proactive way of young people being able to prevent the distribution of their images
24:57and alleviate that fear of them going viral, which we know is a massive, massive worry for young people.
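[Editor's note: the "unique fingerprint" Nairi describes can be illustrated with a few lines of Python. This is only a sketch using a generic cryptographic hash (SHA-256), not the IWF's actual matching system, which uses hashes designed for images so that near-duplicates also match; the function name and file path are hypothetical.]

```python
import hashlib

def image_fingerprint(path: str) -> str:
    """Return a hex digest (a 'hash') that identifies a file's exact bytes."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large images don't need to fit in memory at once.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# A platform holding a list of known hashes can block a re-upload
# by comparing the new file's fingerprint against that list,
# without ever storing or re-viewing the image itself.
```

Because the hash, not the image, is shared with member organisations, platforms can detect known material while the imagery itself stays with the IWF.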
25:03It really is.
25:04Yeah. And in parallel to that, the service offers confidential counselling via Childline.
25:10So it's a really holistic approach to supporting children when they might feel like they have nowhere else to turn.
25:15All of it is so important. They have got somewhere to turn.
25:17We put the details on the screen when you were just speaking about that, Nairi.
25:19But thank you for joining us, Michelle.
25:23Now, how the counter-terrorism unit unmasked a far-right extremist who is encouraging racism and hatred online.
25:33He was live-streaming the most vile hatred and extreme right-wing views that you could possibly put out.
25:43He knew what he was doing when he was making these posts online.
25:48He was a very dangerous individual.
25:59For the last 14 years, I have seen a real increase in the online space and also the emergence of
26:07the extreme right-wing that want to circulate bad information to a national and global network.
26:17In June 2023, the South East Counter-Terrorism Unit were alerted to a chilling online post from someone calling themselves
26:26'That One Domestic Terrorist'.
26:29To any current or future state agents investigating me, I meant everything I said and more.
26:35I'm a genuine unhinged threat to you and your society.
26:38I'm going to build a bomb.
26:43Anybody alleging to build a bomb is deeply concerning.
26:47We need to do everything in our power in order to ensure that our communities are kept safe.
26:54That's why we looked at every opportunity to disrupt any activity that he was planning to cause.
27:01A specialist officer was tasked to find out who was behind the videos.
27:07To protect his identity, he will remain anonymous.
27:13My initial thoughts are, it felt real, and I needed to understand the risk because the public safety will always
27:19be paramount.
27:21He was sat in front of his computer. He had his face covered with sunglasses and a skull mask,
27:28and he had a ballistic vest, and at points was holding a firearm or a pistol.
27:38Police pored over hours of shocking material he had made.
27:43He was live streaming a number of episodes that he posted online with the most vile hatred and extreme right-wing
27:52views that you could possibly put out.
27:57Targeting members of the black community, the Jewish community, and other ethnic groups.
28:04He was trying to push white supremacy.
28:07He would have a picture of Auschwitz behind him on some of his episodes, and he would have extreme right
28:14-wing memorabilia on his ballistic vest as well.
28:18He said he wanted somebody to take on his legacy.
28:21This isn't just the dark web.
28:23This is easily accessible where anybody could see that, including children.
28:29It really was horrendous what he was doing.
28:32With concern rising about his incitement to engage in dangerous behaviour, police needed to find out who he was and
28:40where he was.
28:43We couldn't get a positive identification of who he was through the video footage.
28:48So through a combination of digital forensic investigation and intelligence that was received, we identified who I believed was our
28:58main suspect.
29:00His name was Gabriel Budash, a 23-year-old who lived alone in Weston-super-Mare, Somerset.
29:09He wasn't known to police at all.
29:12He was originally from Poland, but brought up in this country.
29:19Though police had a name, they had no idea what he was really capable of.
29:25He did have a gun, but I didn't know if he had explosives or not.
29:32Is he solely online, or is there something in his head that's going to flick, so that suddenly he moves to
29:37the real world?
29:39Then they made an even more disturbing discovery.
29:45Through financial forensic analysis, I found out that he had a 3D printer.
29:50It was being potentially used to manufacture a firearm or a pistol.
29:55That really did up the ante, because he had the pistol.
29:59He had the ability to print a weapon.
30:01And for me, that is probably the highest risk.
30:07Officers had to act fast, but couldn't risk him finding out they were coming.
30:14I wanted to make sure that the individual was at that location.
30:17And if we could catch him online offending, that would be perfect evidence for a prosecution.
30:26So, on the 1st of August, 2023, they arrived at his front door.
30:33You get one chance to get it right.
30:37Nobody! Keep standing!
30:40Keep standing!
30:42He was shocked and surprised.
30:46Oh, Jesus Christ, man!
30:49He was sat at his computer, logged on.
30:52It made it easier for us to access all of his material,
30:57and also shows that he was active at that point in time as well.
31:00What's your first name?
31:02Gabriel.
31:03Gabriel. Armed police, obviously.
31:04Oh, yeah.
31:05Gabriel, stand up for me.
31:07Yeah.
31:09What's going on?
31:11Yeah.
31:14Unscrewed.
31:17Inside his room was a wealth of evidence.
31:21The pistol was there.
31:22The ballistic vest was there.
31:24There was a green screen.
31:25So, he kind of had his own little production studio where he was making these episodes from.
31:33Police also seized Budash's computers and hard drives.
31:38There's somewhere in the region of 17 terabytes of data stored on them.
31:43If it was printed out on paper, that would be about 1.5 billion pages of text.
31:48That amount of material is extraordinary.
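[Editor's note: the "17 terabytes is about 1.5 billion pages" comparison checks out under a rough assumption of around 11,000 bytes of plain text per printed page; the figure below is only a sanity check of that arithmetic, not a statement from the programme.]

```python
# Rough sanity check of "17 terabytes printed out is about 1.5 billion pages".
# Assumption (ours, not the programme's): ~11,000 bytes of text per page.
bytes_total = 17 * 10**12      # 17 terabytes
bytes_per_page = 11_000
pages = bytes_total / bytes_per_page
print(f"{pages / 1e9:.1f} billion pages")
```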
31:53When we started to look at him more and look at his devices, it was clear that he really saw
31:59himself as a leader.
32:01We needed to understand how much influence he was having and the wider public safety threat of trying to radicalise
32:09others.
32:28Despite overwhelming evidence against Budash, he maintained he didn't intend to hurt anyone.
32:39Our sense of humour is, you know, extreme.
32:45Very, you know, what you'd expect.
32:48One of the things that we generally do as a meme is we post videos like that.
32:58I just roughly know that it's a video that shows how to make a bomb.
33:03I haven't watched the full thing.
33:05I posted it for the meme effect with an intentionally, you know, hysterical caption.
33:12He said this was his way of entertaining people, but the reality was he was committing acts of terrorism.
33:24Budash was found guilty of terror offences.
33:29When it came to the court proceedings, the judge determined that he knew what he was doing when he was
33:36making these posts online.
33:39On the 24th of October, 2024, Gabriel Budash was sentenced to 12 years in prison, followed by three years on
33:48licence.
33:49He was also given a five-year serious crime prevention order and a 30-year terrorism notification order.
33:58Spreading hatred, such as Gabriel Budash was doing, will not be tolerated.
34:05And the sentence that he was given prevents him from doing it again.
34:12Budash was a clear danger to society.
34:16Public safety will always come first.
34:22For the individuals that engage in this type of offending, we will find out who you are and you will
34:27be prosecuted.
34:33Well, it's not so funny for Gabriel now, is it?
34:36No, it's not.
34:36Not at all.
34:38Gabriel Budash had created an online alias to try and hide his criminal activity.
34:42But there is a new type of identity fraud that's hitting the World Wide Web.
34:47And it takes your personal information to rip off businesses, with the cost being passed on to us, the
34:52consumer.
34:53Yeah, synthetic Frankenstein identities, they're called.
34:57It may sound like something out of a horror movie, but I'm afraid they are very much real.
35:01So we've got fraud identity expert, Katerina Pranjic, with us now to explain more.
35:06Katerina, what are they then, these Frankenstein identities?
35:10Good morning.
35:11I mean, Frankenstein identities or synthetic identities, basically fake identities.
35:17They are created using bits of other people's identities.
35:21For example, it could be your surname or your date of birth and then my postal address.
35:27I live in a flat, so it's easily available.
35:30And then those identities are created so criminals can apply for loans, insurance or credit cards.
35:38And you might wonder, where do they get this data from?
35:41So it can come from phishing, from links attached to suspicious emails or text messages,
35:47or from the dark web, or even social media profiles, if that data is available.
35:52And it's a huge problem.
35:53Our latest cyber crime report found that one in ten frauds involved synthetic identities last year.
36:03And the fraudsters, they're playing the long game, aren't they, when it comes to this type of fraud?
36:09They're nurturing these false identities over time.
36:12It makes it, yeah, it makes it sound so odd, because we often see fraudsters as opportunists, right?
36:17Yeah.
36:18They would jump for an opportunity.
36:19And this is not the case.
36:21This is very different.
36:22They would create a brand-new persona, so it has no history whatsoever.
36:28They would need to apply for loans, and that takes time.
36:31It takes patience and hard work, meaning that after they apply for the loan, in the beginning
36:37they would act like a genuine customer.
36:39They would pay it back on time, in full, and then they would apply again and again and again
36:45until they are satisfied with the credit score and they can cash out.
36:50And what I mean by cash out is that they apply for a loan that's slightly bigger in
36:56amount,
36:57and then they don't pay it back.
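The "long game" Katerina describes can be sketched as a toy simulation. Everything here is illustrative: the class, the score increments, and the loan amounts are hypothetical, not a real credit-scoring model.

```python
# Toy model of the lifecycle described above: a synthetic identity repays
# small loans to build a credit score, then defaults on a larger one.
# All names and numbers are illustrative assumptions, not a real scoring model.

class SyntheticIdentity:
    def __init__(self):
        self.credit_score = 0      # brand-new persona: no history whatsoever
        self.lender_losses = 0.0   # what the eventual default costs the lender

    def repay_loan(self, amount):
        """Act like a genuine customer: borrow and repay on time, in full."""
        self.credit_score += 50    # each clean repayment builds trust

    def cash_out(self, amount):
        """Apply for a slightly bigger loan and never pay it back."""
        self.lender_losses += amount

fraudster = SyntheticIdentity()
for amount in (500, 1000, 2000):   # small loans, repaid patiently
    fraudster.repay_loan(amount)

fraudster.cash_out(10_000)         # the final, larger loan is defaulted on
print(fraudster.credit_score, fraudster.lender_losses)  # → 150 10000.0
```

The point of the sketch is the asymmetry: the fraudster invests time in small, honest-looking repayments precisely so the one large default succeeds.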
36:58So how does this affect the likes of you, us in the room, or anyone at home?
37:03How do we get affected and impacted by this?
37:06Fraud impacts the whole society, I would say, but this specific type of fraud, synthetic identity fraud,
37:12passes the cost back to us as consumers when you think about it.
37:16It breaks the trust, basically, between the businesses that provide the loans and individuals.
37:22And not only that, it makes the experience more uncomfortable for us when we are being onboarded and asked questions.
37:29You may be asked multiple questions to prove your identity, or the process can get a lot slower.
37:35And it increases costs and premiums overall, so it's being passed on to us.
37:40But I think it's worst for the businesses, because not only do they suffer a reputational loss,
37:45they also suffer a financial loss, because they will never know whether that customer was just a good customer gone
37:52bad.
37:53But, Katerina, please tell us, there's got to be something we can do to protect ourselves.
37:58Absolutely, that's good news.
38:00There are things that we can do.
38:01We can be vigilant about how we behave online, starting with not clicking on links from
38:09suspicious sources.
38:11Then also try not to overshare personal information online.
38:15The more you share, the more risk there is that somebody could misuse it.
38:18And probably a very important point to address: always check your credit reports.
38:24If you see anything unusual happening, do contact your credit provider or agency immediately.
38:30And again, if you do suspect fraud, a fraud-reporting website is available for anyone who wants to report it.
38:36Katerina, great advice.
38:37Thank you so much.
38:38Appreciate it.
38:40Now it's time to meet Elizabeth Cook, a woman literally drawing on her own experiences in the courtroom.
38:48My name's Elizabeth Cook and I'm what's referred to as a court artist.
38:53That doesn't mean I'm at Buckingham Palace.
38:56It means I go into the criminal courts to draw the scene and what's happening and the defendant and the
39:02judges and all the people involved.
39:04So that those can be seen on the television, in the press.
39:09Some people think, well, that's a nice job, but it's quite difficult because the point is that you can't draw
39:15in court.
39:16I'm committing the face and the details to memory because I have to memorise it and then leave the court
39:24to draw the scene.
39:26I want to convey exactly what's happening.
39:28It's important for me to concentrate hard because I don't want to interpret anything.
39:34I want the viewer to see exactly what I've seen.
39:40I went to Southwark Crown Court.
39:45I'm drawing Spacey as the jury came with the verdicts.
39:49And as the Not Guilty verdicts came through one by one, he started to shake a little and the very
39:58last count came through, Not Guilty.
40:01He broke down, and the usher went and got him a tissue, and here I'm drawing him now as he's
40:09breaking down.
40:11These are always the dramatic pictures that people want to see.
40:18Judge is reading out some notes.
40:20He has a microphone, a black stick mic, on a stand.
40:32These days it's rolling news, so really my picture is wanted as soon as possible.
40:39So I ration myself to 20 minutes per face.
40:43So if there are two barristers and the defendant, that's an hour's drawing.
40:49It's always a rush.
40:50That's why many of my drawings, you'll look at them and you'll see that the colouring in is done, but
40:54it doesn't go to the edges.
41:05I have drawn thousands and thousands of portraits over the years.
41:12Johnny Depp was a couple of years ago, and here he is, he's sitting at the back of the court,
41:18making a few notes and listening very intently.
41:20And I've tried to capture that expression, but you'll notice that in all of my drawings of Johnny Depp, he
41:27has very serious expressions.
41:30He's not the Johnny Depp that we usually see.
41:35Now, this is Harold Shipman.
41:38He's being questioned by the prosecution.
41:44Rosemary West here is listening as the judge is summing up to the jury.
41:51She's got her hands clasped and a very grim expression on her face.
41:57People ask me how I cope with hearing some of the worst things that you could ever hear, that one
42:04human has done to another.
42:05And it does affect me.
42:09Sometimes in court, although I'm there, I can't hear because in my mind there's a different monologue going on, describing
42:17the eyes and the expression on the face.
42:19And those things are taking over my thought processes so that I don't necessarily concentrate on the words I'm hearing.
42:28Looking back over 30 years, I've learned a great deal about human nature, as you can imagine.
42:35And it has kept me occupied doing the thing that I absolutely love, which is drawing.
42:44Such a fascinating job, and having to remember all those intricate details outside the court.
42:49And so much detail as well.
42:50Amazing.
42:51That's all we've got time for this morning.
42:53Thanks for watching today.
42:54And remember, you can catch us on iPlayer for up to 30 days after broadcast.
42:58And if you've been affected by any of the issues covered in today's programme, then head to BBC Action Line.
43:04Tomorrow, a wall can't even stand in the way of this group grabbing high-value handbags.
43:10You never expect something like this to happen to you.
43:14This is a heist.
43:15This is an actual heist.
43:18And it's sent a ripple through the whole village.
43:23They got away with around £250,000 worth of bags.
43:27Can you believe it?
43:28Even more than your wardrobe.
43:29See you at the same time, 10.45 tomorrow.
43:32Goodbye.
43:34Goodbye.