François Picard welcomes Alexander Abdelilah, an investigative journalist for Forbidden Stories who has just published a report on how the Iranian regime secretly acquired FindFace, a powerful facial recognition system developed by the Russian company NtechLab, in 2019. Abdelilah explores how this system operates both technically and politically. FindFace allows authorities to run recorded footage from CCTV cameras, street recordings, or even social media videos through an algorithm capable of matching a face in a crowd and tracing individuals who participate in protests.
Read more: Exclusive: Iran, massacre under a blackout

While the Iranian case illustrates the risks of such technologies in an authoritarian context, it also raises broader questions about the growing use of facial recognition in democratic societies.

Visit our website:
http://www.france24.com

Like us on Facebook:
https://www.facebook.com/FRANCE24.English

Follow us on Twitter:
https://twitter.com/France24_en

Category: News
Transcript
00:00We've talked about how Iran has exported its Shahed drones to Russia for its war against Ukraine.
00:05Well, powerful Russian facial recognition software has been acquired in complete secrecy by the Islamic Republic and put to use.
00:17It's our spotlight segment.
00:24And we say hello to Alexander Abdelilah, investigative journalist at Forbidden Stories.
00:30Tell us, please, about FindFace.
00:35Yeah, so FindFace, as you said, is a facial recognition software that is made by a Russian company called NtechLab.
00:44And what we reveal at Forbidden Stories in our investigation that was published yesterday, Eyes of Iran, is that the
00:53Iranian regime secretly acquired it in 2019.
00:58We obtained a data leak of confidential documents from Iranian and also Russian companies.
01:07And so what we saw is that the Iranian regime has at its disposal a top-notch tool, much more advanced than what experts thought the regime had in its hands.
01:24Give us a concrete example of how, for instance, it could have been put to use back in January during the crackdown on those protests.
01:36Sure. So this tool has actually two ways it can work.
01:41So the first one is what they call the offline mode.
01:44So you can just record videos on the street during a demonstration, for example, or use CCTV camera feeds or
01:53also videos that people put on social media.
01:55And then you run it through that software, and that software actually has an algorithm that compares the faces that
02:03you see in the video with a database that you provided.
02:08So it could match, let's say, one face in a crowd with an identity thanks to AI and biometric data
02:17that the security apparatus in Iran has on most of its citizens.
02:22So in just a few seconds, it could compare one face in a crowd with millions, potentially hundreds of millions of faces.
02:31So it's very powerful. And it could help the security forces identify everyone on the street, even if they hide behind hoodies or masks in some cases.
02:45It's very efficient. So it could really, in the hands of an authoritarian regime such as Iran, be really dangerous
02:53for people taking to the streets and make it way easier for the security forces to go after them.
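The matching step Abdelilah describes, comparing one face against a database of millions in seconds, can be sketched generically. FindFace's internals are not public, so everything below is an assumption: random unit vectors stand in for real face embeddings, and a brute-force cosine-similarity search plays the role of the matcher.

```python
# Hypothetical sketch of the matching step described above. FindFace's
# internals are not public; random vectors stand in for real face embeddings.
import numpy as np

rng = np.random.default_rng(0)

def normalize(v):
    # Unit-length vectors make the dot product equal to cosine similarity.
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

EMB_DIM = 128           # typical face-embedding size (assumption)
N_IDENTITIES = 100_000  # enrolled identities in the simulated database

# Simulated database: one embedding per known identity.
database = normalize(rng.standard_normal((N_IDENTITIES, EMB_DIM)))

# A face cropped from footage: here, a slightly noisy view of identity 42.
probe = normalize(database[42] + 0.02 * rng.standard_normal(EMB_DIM))

# Brute-force nearest neighbour: one matrix-vector product over the whole DB.
scores = database @ probe
best = int(np.argmax(scores))
print(best, round(float(scores[best]), 3))  # best-matching identity and score
```

A real system would replace the random vectors with embeddings from a trained face model and the brute-force product with an approximate nearest-neighbour index, but the shape of the computation, one probe vector scored against millions of enrolled identities, is the same.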
03:00And some are wondering if this surveillance system that you've been describing for us, Alexander, may have been the undoing
03:07of the supreme leader.
03:08The Financial Times reported how Israel's cyber unit had long hacked into Iran's surveillance cameras, using the footage to pinpoint its moment of opportunity.
03:20When Ayatollah Khamenei gathered senior staff at a defense compound on Saturday, these complex algorithms had added details to dossiers on members of the security guards,
03:31including their addresses, hours of duty, routes they took to work, and most importantly, who they were usually assigned to protect and transport.
03:40So is what you're describing what killed, in the end, Ali Khamenei?
03:46So you have to differentiate between the CCTV camera feeds, so the images that cameras are recording, and the tool, the software itself, which allows faces to be matched.
03:59So, obviously, I'm not aware of the tool that the Israeli secret service is using, but one can imagine that it's probably a similar tool, used to run a model on the images you get,
04:16to be able to say, okay, this face, this person, matches this ID from that database. But that's difficult to say.
04:24Okay. At Forbidden Stories, we really dive into the story with our partners, because our mandate is to pursue the stories that journalists have been trying to work on before being silenced.
04:40And Iran is the perfect example for that, because in that country, there is no investigation possible.
04:47Every potential investigation is a forbidden story, and this tool is like one more tool in the hands of an
04:56authoritarian regime that silences people.
04:59Because if you are aware of the fact that your government can identify you when you're in the subway, when
05:06you're on the street,
05:08and arrest you the next day or the next hour without having met any policemen on your way to the
05:15demonstration or on the way home,
05:17it's really scary, and that's what we wanted to highlight.
05:22And also, you know, name the companies involved, put names on that machinery.
05:28And so, the journalist consortium Forbidden Stories, well, you got help from whistleblowers from inside Iran.
05:35That's how you got tipped off to the story.
05:40So, actually, the starting point of the story is a data leak that we obtained.
05:46And in addition to that data leak that contained a lot of contracts and exclusive documents,
05:53we were able to talk to someone who knew the system from the inside and who helped us check whether what we found was in line with what that person had seen in the system.
06:09And we also talked to cybersecurity researchers, to people well aware of how the regime functions, like how it uses this kind of tool.
06:21I'm thinking of Nima Fatemi, whom we interviewed. When he saw the documents, he said that the regime most probably used the tool during the last uprising, because it would allow them to spare their resources.
06:40And instead of sending policemen in a demonstration, they could just, you know, sit behind a desk, look at the
06:47images, run it through the software,
06:49get the list of people who were on the street on that day, and then just pick them up one
06:53by one at their home.
06:55So, it's pretty chilling to imagine what it could mean in the hands of that regime.
07:01And Iran is not alone.
07:03You just mentioned how Israel has its own system.
07:07And, Alexander, I know you read the news.
07:10Last week you watched that tug of war in the United States between Silicon Valley giant Anthropic and the Pentagon.
07:20Anthropic saying it doesn't want its software to be used for mass surveillance.
07:26What was your reaction when you were putting the finishing touches on this investigation and watching what was going on
07:33in Washington?
07:36Well, I mean, this investigation collided, let's say, with the news of the war breaking out in Iran.
07:43Of course, we were working on it already for a few months before we published.
07:49And, I mean, you know, the facial recognition is an issue that also Western democracies, let's say, are confronted with.
08:00Take the example of France, where it's not officially allowed yet,
08:08but we know that some services are probably already using it.
08:13It's potentially dangerous when you connect it to databases containing private information of citizens.
08:23In itself, such software cannot do anything.
08:27It's really the data you feed it that makes it dangerous, and how you use that information once you have it.
08:36But, yeah, I mean, the example in the U.S. is one of the most recent ones.
08:43It raises all the ethical questions that AI brings up: how far can you go when you want to provide security?
08:53How far can you infringe on privacy and fundamental rights?
08:58So, yeah, that's a separate subject.
09:01But we try to focus on and shed light on one of these software tools and what it means when these capacities are used by a regime that is authoritarian
09:13and is, you know, perpetrating massacres against its own population.
09:20Yeah, and one final question on this, Alexander.
09:23When you think of the Iranians who went out in the streets and braved bullets back in January,
09:30those that weren't killed or arrested, they've got to be looking over their shoulder now in light of what you're
09:38revealing.
09:39So, I think it was the case before.
09:43We saw a lot of demonstrators hiding their faces as much as possible, destroying CCTV cameras.
09:51So, Iranians were aware that the regime was surveilling them.
09:57What we are revealing, and we hope this also helps the Iranian people, is how this specific software works.
10:07And I think it's important that people are informed, just to be able to think twice about how to go about what they are doing.
10:19I'm not saying that, now that this software is active, people should stay away from the streets or whatever.
10:25That's up to every citizen to make that decision.
10:29But I think it's important to be informed.
10:31And our role is really to bring this information to the public and say, here's a tool that is very
10:38powerful in the hands of that regime.
10:40That's something we didn't know.
10:44And potentially, it's very harmful.
10:47And then it's up to, you know, everyone to take their own responsibilities.
10:52And we're also naming the companies.
10:56I think that was something very important to us and to our partners.
11:00Alexander Abdelilah, your investigation is available on the website of Forbidden Stories.
11:06So many thanks for speaking with us here on France 24.
11:10Thanks for having me.