Intimate image abuse online is a significant problem worldwide. A new law in the US aims to tackle this issue - and might help provide relief for victims globally.
Transcript
00:00It's everyone's nightmare finding sexually explicit images of yourself online.
00:05Non-consensual intimate images, widely known as revenge porn, have become a huge problem
00:11worldwide. They can be real pictures or AI-generated deepfakes. Now US President Donald Trump has
00:18signed a law that makes it illegal to post such images. The Take It Down Act also puts
00:24pressure on tech companies to remove that content. Sounds like an overdue step,
00:28but critics are not convinced these efforts will actually help victims.
00:33Here's what you need to know. Having someone share explicit content of you online is a traumatic
00:39experience, whether the images are real or AI-generated. Victim support groups worldwide are
00:46sounding the alarm because the number of incidents is rising exponentially. South Korea is the most
00:53affected. According to a study, 53% of all non-consensual sexual deepfakes online show women
01:00from there. By the way, the term revenge porn is very problematic in itself. It suggests the victim
01:07did something to deserve retaliation. It also downplays the seriousness of the abuse by framing
01:14it as pornography. Victim support organizations suggest we should instead use terms like image-based
01:20sexual abuse or intimate image abuse. Unfortunately, legislation in many parts of the world
01:27lags behind and prosecuting these kinds of crimes is difficult. So the US Take It Down Act is seen as
01:35a step in the right direction. What does the new bill say? It will fully come into effect next year.
01:42Websites and social media platforms will have to remove reported content within 48 hours after a
01:48victim requests it. And they have to make reasonable efforts to remove any reposts or copies of the
01:55offending material. Apart from that, the law empowers victims to sue perpetrators for damages. So
02:02there will be a legal path to get compensated for any emotional and reputational harm. Perpetrators also
02:08potentially face years in jail, depending on the severity of the crime. These are good intentions,
02:15but how will this work? Well, it mainly comes down to the platforms. They will have to build new
02:21infrastructure for a takedown system. It could work like this. Firstly, they need reliable ways to verify
02:29that a takedown request is legitimate. The problem? Automated identity verification has its limits. For example,
02:37a face might not always be clearly visible. Apart from that, they must create technology that detects
02:44and removes identical copies. And they also need to set up a clear framework on what users must do
02:52if they want an intimate image taken down. The US Federal Trade Commission will be in charge of
02:57enforcing the law. Well, we don't yet know whether this will really work out as planned, not least from a technical
03:03perspective. That's not the only reason why the new bill isn't supported by everyone. NGOs like the
03:10Electronic Frontier Foundation and the Center for Democracy and Technology criticize the law for being
03:16too vague. In their eyes, it could be used to remove a much wider array of content than intended,
03:23and become a weapon to threaten or stifle political opponents. And then there are privacy concerns.
03:29To work effectively, platforms would most probably have to check private messages to keep people from
03:36sharing reported content. Overall, some experts argue that the law might traumatize victims even more
03:43by promising justice that, as of now, it cannot deliver. Fortunately, there are some organizations
03:50that support affected people. Take It Down is an online tool specifically designed to help minors.
03:56It was created long before the recent bill of the same name and is not directly related. With the tool,
04:03kids can anonymously flag and remove nude or sexually explicit images of themselves from
04:08participating platforms. For adults, a similar tool called Stop NCII has been available since 2021.
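Such tools, like the copy-detection systems platforms would need under the new law, typically rely on image hashing: instead of circulating the picture itself, a short digital fingerprint is computed and compared against the fingerprints of reported material. Stop NCII, for instance, is designed so that this fingerprint is generated on the victim's own device and only the hash is shared. Purely as an illustration, here is a minimal Python sketch of that matching idea using a simple "average hash"; the file names, distance threshold, and helper functions are hypothetical, and real services use far more robust, purpose-built perceptual hashes rather than anything this basic.

```python
# Illustrative sketch only: hash-based detection of re-uploaded copies.
# File names, the distance threshold and function names are hypothetical;
# production systems use far more robust perceptual hashes than this.
from PIL import Image

HASH_SIZE = 8  # 8x8 grayscale grid -> 64-bit fingerprint

def average_hash(path: str) -> int:
    """Compute a simple perceptual 'average hash' of an image file."""
    img = Image.open(path).convert("L").resize((HASH_SIZE, HASH_SIZE))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for value in pixels:
        bits = (bits << 1) | (1 if value >= mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count the bits in which two fingerprints differ."""
    return bin(a ^ b).count("1")

def is_likely_copy(upload_path: str, reported_hashes: set, max_distance: int = 5) -> bool:
    """Flag an upload whose fingerprint is close to any reported image's fingerprint."""
    h = average_hash(upload_path)
    return any(hamming_distance(h, reported) <= max_distance for reported in reported_hashes)

# Example flow: a victim reports one image, later uploads are checked against its hash.
reported_hashes = {average_hash("reported_image.jpg")}
if is_likely_copy("new_upload.jpg", reported_hashes):
    print("Possible copy of reported content - queue for human review and removal")
```

Matching fingerprints rather than raw images is also what would make the 48-hour takedown and repost-removal duties described above workable at scale - though applying the same matching to private messages is exactly what raises the privacy concerns critics point to.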
04:16Both tools are supported by big platforms like Meta. Even though it might feel awkward, it is also
04:22important to inform the police. If you don't report it, the perpetrators simply get away with it. And if it all
04:28gets too much, don't hesitate to seek professional help. Visit this website to find out who you can turn to
04:35in your country. That's all from me today. See you next time.