A Forensics Expert on Princess Kate’s Photo—and How Credentialing Tools Can Help Build Trust in a World of Increasing Uncertainty
TIME
2 years ago
Hany Farid, an academic who has spent the past 25 years developing techniques to detect photo manipulation, breaks down the Kate Middleton photo and living in the age of AI.
Transcript
Princess Kate is right that many amateur photographers bring their photos into Photoshop and edit them. I think what she probably did is one of these best-take montages: she took a series of photos because she wanted to get the smiling face just right for all the kids and herself, did a little montaging, and that invariably leaves a few artifacts. So I was pretty confident at that point that this was fairly benign. I understood why the AP pulled it, because they have very particular standards, but I don't think it was nefarious. I don't think it's a sign that she's dead, I don't think aliens have taken over her body, or any of the crazy nonsense the internet is coming up with. I think this is a classic story of photo editing that was slightly sloppy. At the end of the day, this is a pretty simple story. But what's interesting is that even though this is a story about Photoshop and fairly benign photo editing, there is the shadow of AI.
If we weren't living in the generative AI age, I don't think you'd be hearing much of what we are hearing. I do think the reason these conspiracies can take hold is that we live at a time when somebody could have fully fabricated that image, or it could have been a body double with her face swapped in. So this is still an AI story to some degree. But I think it really is quite innocuous.

We are seeing two things. One is that people are creating fake images, fake audio, and fake video of Trump, of Biden, of Harris, and trying to use that to damage them. But they're also doing something else: they're using the specter of deepfakes to try to dismiss reality. Because if you live in a world where anything could be fake, well, the video of Trump stumbling on his words and calling his wife Mercedes, he can just say it's fake. That's what's also very dangerous: you can deny reality when anything can be fake. And I think you're seeing both of those around the election.
And of course, we're seeing the awful content around non-consensual sexual imagery. We saw it with Taylor Swift, but high school girls are now being victimized by this every day. Journalists, human rights activists, politicians, people who attract unwanted attention are being inserted into explicit material. People are using it to create child sexual abuse material, and people are using it to commit small- to large-scale frauds: you get a phone call from what seems like a loved one, or you're talking to your CEO, but it's actually somebody faking them. So you're seeing really significant weaponization at the individual level, at the societal level, and at the democracy level. And I think it's probably a sign of things to come.

Individuals have to stop pretending that they can forensically analyze content reliably. You can't; this is a really hard job. I do this for a living, and it's hard. What you have to realize as a consumer is that you are vulnerable, and you are not going to be able to forensically analyze your way out of this. I think we just need to be more thoughtful about how we consume information and understand that journalists have a serious job. And by the way, when they get it wrong, there are consequences. You can't say that about the knuckleheads on Twitter; there's no consequence for them.

The first approach is what we call the reactive approach.
I wait for a reporter from the Telegraph or the BBC to contact me. They send me the photograph, I run a battery of tests, try to figure out what's going on, and let them know, and they set the record straight. Meanwhile, an entire news cycle has gone by. So it's a relatively slow process: we measure it in hours or days, and at the speed of the internet, that doesn't really work.

The proactive technique starts in the camera itself. Princess Kate's camera, when she took that photo, would have said: this photo was taken on this date and time at this location, and here is a credential that authenticates that. Then, as she put it into Photoshop, Photoshop would say: okay, here is what I did to the photo, bing, bing, bing. And when that goes to the AP, the AP can look at that content credential, its manifest, we call it, and say: okay, this is what happened, it either is or isn't consistent with our photo-editing standards, and we either will or won't accept it. None of this would have happened if that had existed. The nice thing about proactive techniques is that you're there at the point of creation, and as long as those credentials are there, you get to make decisions. The drawback is that we need those content credentials in some 10 billion devices around the world, and we need the infrastructure, so that's going to take a little bit of time.

I think at the end of the day, it's going to be a combination of those two solutions. The credentials will work when people aren't trying to fool anybody, and I don't think Kate was: in this case, if the Canon camera and the version of Photoshop she used to shoot and edit that photo had been compliant, it would have worked perfectly. A dedicated bad actor will find a way around it, and that's where the reactive techniques we talked about, that's where I come in. So a combination of those two is what you need.
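The proactive flow Farid describes (camera signs capture metadata, editing software appends a signed record of each change, publisher verifies the chain against its standards) can be sketched in a few lines. This is a toy illustration only, not the real C2PA/Content Credentials format: it uses a single shared HMAC key in place of per-device certificate signing, and all field names are made up for the example.

```python
import hashlib
import hmac
import json

KEY = b"demo-signing-key"  # stand-in for a device/software private key

def sign(record: dict) -> str:
    # Deterministic serialization so signer and verifier hash the same bytes.
    payload = json.dumps(record, sort_keys=True).encode()
    return hmac.new(KEY, payload, hashlib.sha256).hexdigest()

def capture(image_bytes: bytes, metadata: dict) -> dict:
    """Camera step: bind date/location metadata to a hash of the original pixels."""
    entry = {"action": "capture",
             "hash": hashlib.sha256(image_bytes).hexdigest(), **metadata}
    return {"chain": [{"entry": entry, "sig": sign(entry)}]}

def edit(manifest: dict, new_bytes: bytes, description: str) -> dict:
    """Editor step: append a signed record of what was done and the new hash."""
    entry = {"action": "edit", "what": description,
             "hash": hashlib.sha256(new_bytes).hexdigest()}
    manifest["chain"].append({"entry": entry, "sig": sign(entry)})
    return manifest

def verify(manifest: dict, final_bytes: bytes) -> bool:
    """Publisher step: every signature must check out, and the last recorded
    hash must match the file actually submitted."""
    for link in manifest["chain"]:
        if not hmac.compare_digest(link["sig"], sign(link["entry"])):
            return False
    last_hash = manifest["chain"][-1]["entry"]["hash"]
    return last_hash == hashlib.sha256(final_bytes).hexdigest()

original = b"raw sensor data"
m = capture(original, {"date": "2024-03-10", "location": "Windsor"})
edited = b"raw sensor data + composited smiles"
m = edit(m, edited, "merged best faces from a burst of frames")

assert verify(m, edited)        # intact chain: publisher sees every edit
assert not verify(m, b"other")  # submitted file doesn't match the manifest
```

A publisher in this scheme doesn't have to guess whether an image was edited; it reads the chain and applies its own policy to the listed edits, which is exactly the decision Farid says the AP would have been able to make.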