UNFREIWILLIG PORNOSTAR
When Kate Isaacs discovered the video on the internet, she could hardly believe it: herself, having sex with a strange man. Who filmed this? And how did it end up online? Kate is not an isolated case. Worldwide, millions of women are unknowingly victims of deepfake porn: pornographic content created with the help of artificial intelligence (AI).
Rapid technological progress now enables anyone with a phone and a few pictures of a target person to create and distribute deepfake porn using freely available apps. The victims suffer immensely from this virtual abuse: damaged reputations, career setbacks, threats of violence, depression, and even suicide are the real consequences.
The documentary examines this disturbing phenomenon. Defending oneself against this form of digital violence seems almost impossible. "NZZ Format" talks to victims, AI experts, lawyers, and activists and shows the frightening ease with which this content is created. An important and unsettling look at a new form of abuse in the digital age.


Deepfake,
Deepfakes,
Deepfake porn,
Deepfake victims,
Artificial intelligence,
AI,
Cyberbullying,
Cyber violence,
virtual abuse,
digital violence,
Revenge porn,
Non-consensual porn,
Reputational damage,
Victims,
Abuse,
Pornography,
Internet crime,
NZZ Format,
Documentary,
Film,
Media,
Ethics,
Law,
Lawyers,
Activists,
Technology,
Women's fates,
unknowingly a porn star,
#Deepfake,
#Deepfakes,
#DeepfakePornos,
#VirtualAbuse,
#KünstlicheIntelligenz,
#KI,
#Cybermobbing,
#CyberViolence,
#Opfer,
#DigitalAbuse,
#NonconsensualPorn,
#Racheporno,
#Reputationsschaden,
#Betroffene,
#Missbrauch,
#Pornografie,
#InternetKriminalität,
#NZZFormat,
#Doku,
#Film,
#Medien,
#Ethik,
#Recht,
#Juristen,
#Aktivismus,
#Technologie,
#Frauenschicksale,
#UnfreiwilligPornostar,
#DeepNude,
#AIAbuse,
#Opferhilfe,
#HassImNetz,
#NetzDG,
#Privatsphäre,
#DigitalRights,

Category

📚
Learning
Transcript
00:00Just because something is artificially produced doesn't mean it has no impact on the real world.
00:24Whenever a new technology comes out, it is of course first used for porn.
00:31My face has been digitally superimposed countless times onto other people's naked bodies to create sexual content that I never consented to.
00:39Whatever information about me can be found on the internet, I can now tell you it's a deepfake. That's not me.
00:46Someone put my face in a porn video and sold it. There's a website, several websites, that exist solely to create pornographic videos of people on request.
01:01My sense of security, my daily life, my entire career have changed forever as a result.
01:06People say, "That's not really you. Nothing was really done to you."
01:14But I consider what happened to me to be a form of violence.
01:18I was so ashamed. I felt unsafe in my own city.
01:22If that's not a crime, what has to happen to a woman before we take action?
01:26It's simply a new way to exploit women and girls. It's a new way to abuse women and girls.
01:36It's so obvious. It sucks.
01:40Shit.
01:50I am Dr. Faker and I have been producing deepfake porn for three years.
01:56My clients are mostly heterosexual men who want a woman deepfaked into a porn video.
02:06Normally, people send me pictures or videos, sometimes also recordings of video calls from meetings.
02:13Once I have enough material of the target person and know which porn I should fake them into, then I can actually start working.
02:20Artificial intelligence does about 60 percent of the work, and I do about 40 percent manually.
02:27While searching for source material, I learn quite a bit about the person.
02:32Most of the time, it's a woman the man is attracted to.
02:35The boss, the neighbor, or the brother's wife.
02:40Nothing really surprises me anymore.
02:42I have no problem telling my story.
02:44But I don't want to see my face on television or hear my voice.
02:49Therefore, this is a re-enacted interview.
02:52My family, my wife, and my friends know that I produce deepfake porn.
02:57But I'm not proud of it.
02:58I do the job for the money.
03:01I have no guilty conscience.
03:03The woman doesn't know she's in a porn film, so it doesn't harm her.
03:06My name is Cara Hunter.
03:24I am a victim of image-based sexual violence.
03:28This experience has definitely changed me.
03:30I don't think it's possible to experience something so traumatic without it changing you.
03:36I grew up here on the coast of Northern Ireland.
03:41I didn't really like going to school, but I always enjoyed learning and I really enjoyed my time at university.
03:47I have a degree in journalism and wanted to make a film about Northern Ireland and the difficulties we face here.
03:54That's how I got into politics.
04:01Hi Bobby, how are you?
04:03Hello, nice to see you.
04:04You already know Helen, right?
04:06Nice to see you.
04:07I wanted to leave you this information sheet.
04:10If you ever need anything, you know where to find us.
04:14It was nice to see you again.
04:16I think my family was quite shocked because I wasn't usually a conspicuous person.
04:22But I made it clear to them that this is something that is really important to me.
04:27Things like mental health, addiction, and the concerns of our children and young people.
04:33These are all topics that are really important to me.
04:36So I just gave it a try and lo and behold, I was elected.
04:39Good evening, I would like to thank all the voters who went out yesterday in freezing winter weather and said no to the division of our nation and the mistrust we have here in Northern Ireland.
04:53Should I have a political future, I look forward to representing all the people of Northern Ireland.
05:00Thank you.
05:02It was quite a surprise for my family when I went into politics.
05:06They all told me, don't do that, your life will no longer be your own.
05:14I believe it was 18 days before the elections.
05:18A stranger sent me a message on Facebook and wrote,
05:22Hi, is that you in this video?
05:23He also sent a video that I would describe as extremely pornographic.
05:33The woman in the video was about my height, weight, and figure.
05:41I saw the video and wondered, is this a joke?
05:44So I wrote back, this is crazy, this isn't me?
05:48That's crazy.
05:50That's not me.
05:53From that moment on, all I could think was, what the hell is happening here?
05:59Why am I doing this to myself?
06:00Why do I do this job?
06:02But that was precisely the intention of this video.
06:05They wanted to isolate me and embarrass me.
06:10But at some point I thought, screw it.
06:12I cannot sit here and allow this terrible person,
06:15who did something so disgusting and despicable,
06:20to ruin not only my life but also my career,
06:23which I worked so hard for.
06:29Good day everybody.
06:31I would like to thank you all for inviting me here,
06:34so that I can get an idea of the course "Fighting Habits".
06:39We often deal with people who are mentally ill.
06:42and simultaneously struggle with addiction problems,
06:44be it with drugs or alcohol.
06:46And it is my duty to do everything in my power,
06:50to support these people.
06:53I am happy to be here today.
06:55And maybe I'll even indulge a little myself.
06:57Thank you.
06:57So many people have sent me messages.
07:12I received around 100 messages within 15 minutes.
07:16The video went viral.
07:17People laughed at me and said terrible things.
07:31They talked about my body, the size of my breasts.
07:35Some people described what they would like to do to me.
07:38Men from all over the world called me a whore, a slut.
07:44Someone said, "I saw your video."
07:47Wow.
07:47But now you've learned your lesson.
07:50I think you've learned it now.
07:56I couldn't believe the messages people were sending me.
08:00Because they thought they had seen me naked,
08:02they believed they had the right
08:04to send me horrible, misogynistic, cruel messages.
08:09It makes you really paranoid.
08:11I no longer felt safe anywhere.
08:20When these tools first appeared,
08:23they were not very user-friendly,
08:25and most people shared face-swap videos of famous women.
08:30However, as the apps became more user-friendly
08:32and thus found favor with the general public,
08:35many private individuals suddenly became targets of deepfakes as well.
08:39I think we're now talking about millions of women worldwide.
08:43This is truly a global problem.
08:46My name is Henry Ajder.
08:47I am an expert in deepfakes and generative AI.
08:50A deepfake describes the use of artificial intelligence
08:53to create highly realistic but fake content.
08:56For example, you can swap faces
08:58or place one person's face onto another person's body.
09:01It's also possible to clone voices.
09:05I think this video became so famous,
09:14because it was one of the first ones that made people think,
09:17Oh God, I could fall for that myself.
09:19I know it's not real.
09:21And yet it feels so real.
09:23In 2018, BuzzFeed and director and comedian Jordan Peele
09:39together issued a kind of security alert about deepfakes.
09:44Since then, the media have reported very frequently
09:46on how this technology will destroy democracy
09:49and change the media landscape.
09:51In fact, in a study
10:17that I conducted in 2019,
10:20I found that at that time 96% of deepfake videos
10:23were of a pornographic nature.
10:27The fear surrounding the use of deepfakes
10:30in connection with elections and political campaigns
10:33could therefore not be scientifically substantiated.
10:35Instead, the study showed
10:38that most deepfakes
10:39in which image material is maliciously misused
10:42are directed against women.
10:44And millions of women worldwide are now affected by this.
10:54Yes, nowadays practically anyone with a mobile phone can
10:57download an app and produce deepfakes.
11:00But if the source material isn't very good,
11:02or if you want porn in particularly high quality,
11:05then you need a bit more technical know-how.
11:08Then people come to people like me,
11:11who do it for them.
11:13Once I have enough material from the target person,
11:16I incorporate it into the porn using software.
11:19Depending on the quality of the source or target material,
11:22it can be really tedious work.
11:24I then have to check every frame to make sure everything looks good.
11:27Blow jobs, for example, are really hard work.
11:30Or semen, because it is transparent
11:32and is difficult to separate from the background.
11:34And what is almost completely impossible is kissing,
11:36because the artificial intelligence can then no longer tell the faces apart.
11:40So all the romantic stuff is out.
11:42When deepfakes first appeared in 2017,
11:49I had a feeling that this technology could change the world.
11:53Let's fast forward to the year 2023.
11:56Has the technology really improved so much
11:59that we can no longer believe what we see?
12:03We can no longer say with certainty
12:04whether something is a deepfake or real.
12:10We have reached a truly critical moment
12:13in which these technologies are easily accessible and highly efficient,
12:17require less and less data, and are extremely realistic.
12:21As a society, we must therefore ask ourselves
12:25how we want to integrate these technologies into our daily lives.
12:29How will they improve our lives?
12:31And in which cases do the negative aspects outweigh the positive ones?
12:37Hello.
12:38Hello, my dear.
12:39Would you like a coffee?
12:41Oh yes, with pleasure.
12:43I desperately need a cup of coffee.
12:46My mother is a nurse.
12:49My father took care of us at home.
12:52We are passionate Irish nationalists,
12:55But we never shared our views publicly.
12:58We never talked about politics at the dinner table.
13:01This won't simply disappear.
13:03It's now a year later and we're still dealing with it.
13:08And it didn't just affect you,
13:10It had an impact on the whole family.
13:13Neighbors and friends approached me.
13:15Even strangers approached me and said,
13:18Your daughter is a slut.
13:19And I said what?
13:21That wasn't my daughter.
13:23And they said, but we don't know.
13:25And you don't know either.
13:26It's tough.
13:29It only takes a spark of doubt for people to change their opinion of you.
13:33It was very difficult for your father.
13:36He took it very badly.
13:38And when your uncle came to tell us about the video,
13:41He was extremely upset.
13:44He even wanted you to leave politics because of it.
13:47The hardest part for me was this:
13:50I had built up trust and respect among the voters.
13:53And then there's someone sitting somewhere in a bedroom
13:56who takes a holiday photo of me and turns it into a video.
14:01If you think about it.
14:01Whoever did that was clever.
14:05Because this crime leaves no trace.
14:08To this day, we don't know who is behind it.
14:10This is extremely disappointing.
14:13You go to the police and they say,
14:16We are sorry that this happened to you.
14:18But actually, nobody broke any laws.
14:21Why don't you send out a press release? Good luck.
14:24I thought it was a terrible time to be a woman.
14:27There are no laws to protect us.
14:32The distribution and sharing of deepfake pornography
14:35is still legal in most countries of the world.
14:37Only a few countries have taken measures,
14:40to criminalize the distribution of deepfake pornography.
14:46I am Clare McGlynn
14:47and I am a law professor at Durham University in Great Britain.
14:52In most countries there are laws
14:55that could be used against specific aspects
14:58of the creation and distribution of deepfake pornography.
15:00For example, it is possible
15:02that such videos violate copyright law
15:05or constitute a form of reputational damage.
15:07However, the reality is
15:09that you need money to be able to take legal action.
15:13Furthermore, you must know
15:14who created and distributed the deepfake pornography.
15:17So there is very little
15:18that ordinary people like you and me can do
15:21to take action against such activities.
15:23That was the hardest part for me.
15:26We as women know
15:27that such things are part of women's everyday reality.
15:30Our society believes
15:31that the best way to humiliate a woman
15:34is to turn her into an object
15:37and to sexualize her.
15:38And that was probably the reason
15:40why I absolutely wanted to speak about it publicly.
15:43No woman should have to experience that.
15:45You did a really good job.
15:50I probably wouldn't have been able to handle it so well.
15:52But you did a great job.
15:54Thank God I have the support of my family.
15:56I see you.
16:24Clear and concise.
16:26My name is Kate Isaacs
16:29and I am the founder of the Not Your Porn campaign.
16:33I launched the Not Your Porn campaign in 2019
16:37after a friend of mine, without her consent,
16:39ended up on Pornhub.
16:41Her account had been hacked,
16:42and a video of her
16:43having sex with her partner
16:45was uploaded to Pornhub.
16:46It was incredibly difficult
16:48to get this video removed again.
16:50So I wrote to the Ministry of Justice,
16:52and it turned out
16:53that there was no law here
16:54that prevented a company
16:56from profiting from the publication of sexual content
16:58without the consent of the people depicted.
17:00That's what I wanted to change.
17:03I had absolutely no experience in activism.
17:06I didn't know what I was getting myself into.
17:07It was much more difficult than I had expected.
17:10I was very naive.
17:11But yes, the Pornhub story,
17:13that was quite an experience, to say the least.
17:15I think if you've seen what I've seen,
17:21there's no going back.
17:23The content was so humiliating.
17:26The lives of the victims were completely turned upside down.
17:29These women have lost so much.
17:32If not their job, then at least their dignity.
17:34So I designed a logo,
17:40launched a campaign
17:41and published it on Twitter.
17:43Image-based sexual violence is steadily increasing here.
17:46The porn industry has long profited
17:48from this type of abuse.
17:51These are companies registered here.
17:54The government must shut these porn companies down
17:55and hold them accountable.
17:57We need to solve this problem.
17:58The campaign has grown significantly
18:05since my angry Twitter days.
18:07I'm really proud of it.
18:08And also of the women
18:09who have made it what it is today.
18:12A fully-fledged, non-profit organization
18:14with numerous women,
18:16who work full-time for the campaign.
18:18We offer educational programs in schools
18:22and police training to teach officers how to support victims
18:25of image-based sexual violence.
18:28We are raising awareness within society
18:30and advise the government on the drafting
18:33of various pieces of legislation.
18:39Hello.
18:40You look good.
18:42How are you doing?
18:43Very good. And how are you?
18:45Yes, I'm fine. Thank you.
18:46How was your week?
18:48Hectic, but not too bad.
18:50I can't complain. And yours?
18:51Yes, very hectic too.
18:54I mean, there are just so many things that need doing.
18:56There is so much to discuss, so much to do.
19:01As always.
19:02All right then.
19:03Shall we start with a brief update?
19:05That's coming up.
19:07We are currently helping to draft a new article of law.
19:10Our current campaign is really effective,
19:13because we found some victims who spoke openly about their experiences.
19:16Their statements have once again shown
19:18how outdated not only the legislation is,
19:20but also the way in which political decision-makers
19:23and legislators think about this issue.
19:25It's simply not practical.
19:28And what do you hope to achieve?
19:30There are ways to enforce the law,
19:33but they are not strong enough.
19:34On paper, everything looks great,
19:36but it's not well thought out.
19:37The measures come from people with a legal background
19:40and political understanding,
19:41not from victims who had to go through all of that.
19:44We have now reached a point,
19:45where we need to start with prevention,
19:47But we are not currently doing that on these platforms.
19:50And that was the crux of the campaign.
19:52When I started, it was more about damage prevention
19:54than damage control.
19:56The goal was to prevent
19:58this content from even being uploaded to these websites.
20:01Because as soon as it is up there,
20:03it can be downloaded and then uploaded again.
20:05Then we have no chance of stopping the spread.
20:11We are dealing with an epidemic of deepfake pornography.
20:16Despite this, very little is being done
20:18to take action against the existence of websites
20:21that have specialized in the distribution
20:23of deepfake pornography.
20:24That's simply because
20:28governments and politicians
20:30have not yet taken any measures
20:31to stop such websites.
20:33We could do it if we wanted to.
20:35But the governments and politicians
20:37actively decide against it.
20:38I think the real reason for this is
20:41that the vast majority of victims are women.
20:44The abuse of women
20:46is simply not taken seriously enough.
20:48Women are still often marginalized,
20:50are vulnerable
20:51and are often subjected to social discrimination.
20:55Now new ways have simply been found
20:57to exploit this vulnerability
20:58and to commit new forms of violence and abuse.
21:01Right now, these online companies
21:08and porn platforms are living the good life
21:10and earning millions in the process
21:12while they are hardly regulated.
21:17It is up to governments around the world
21:20to take measures
21:21to regulate these companies,
21:23just as we regulate cars, trains and the ovens
21:26we cook with every day.
21:28We need to regulate the internet,
21:29and we need to get governments
21:31to take action.
21:42We live in a modern,
21:45ever-changing world.
21:47And we want more young women
21:49in politics.
21:50It is incredibly important
21:52that our government represents the diversity
21:53within our society.
21:56If you are a young woman,
21:57many people believe
21:59that you lack sufficient experience
22:00or the knowledge relevant to this job.
22:02But I think young voices
22:04should definitely be included.
22:08Hello.
22:10Are we all present and accounted for?
22:11It's great to be here and see you all again.
22:17I know that you are strongly committed to ensuring
22:19that young people are also heard by politicians.
22:22That's why I wanted to talk today about the topics
22:24that are important to us.
22:28I think that women are underrepresented in governments.
22:32This leads to many laws
22:33targeting the needs of men.
22:35The laws are written by men for men.
22:38We need more equality.
22:42We have come a long way,
22:43But there is still much to be done.
22:45That's why this youth group is so important,
22:47because it advocates for greater diversity.
22:50The diversity that we see in our society
22:52should also be reflected here.
22:56After the last elections,
22:58we want to focus specifically on the topic
23:00of misogyny within society
23:02and online.
23:04If a young woman wants to go into politics,
23:07everyone immediately asks her:
23:08are you really ready for this?
23:10When we talk about misogyny,
23:12perhaps people will also start
23:13to talk about it.
23:15And hopefully that will encourage more women
23:16to open up.
23:18Education is so important.
23:19Social media can also be used to raise awareness,
23:21and we can spread our message
23:23in this way.
23:26Considering what happened to me,
23:29I find the fact
23:30that there has been no change in the law so far
23:32extremely disappointing.
23:34If I had the power to change the law,
23:36I would do it immediately.
23:38The government must fulfill its responsibilities
23:41and take action.
23:43I think that the women affected
23:44need to talk about
23:46how their lives are affected by it.
23:48Thank you so much for taking the time.
23:50It was the year 2020.
24:12We were able to achieve a great victory.
24:14Pornhub deleted 80% of its content overnight.
24:1710 million videos.
24:18Simply gone.
24:25Unfortunately, as with everything,
24:26there was a small group of people,
24:28especially on Twitter,
24:29who were not very happy
24:31that their porn videos had been deleted.
24:32And so I became public enemy number one.
24:35I became their target.
24:43In their words, they wanted to teach me a lesson.
24:47They found my home and work addresses
24:49and published them on Twitter.
24:52They said they would follow me home,
24:55rape me, film it and upload it to Pornhub.
24:58That was frightening.
25:01And then they took one of my interviews
25:03that I had given to a British news channel
25:07and made a deepfake porn video out of it.
25:10They then claimed that the reason
25:12I wanted to make Pornhub
25:14delete porn
25:15was that I had made a porn film myself
25:17and just wanted to get it off the internet.
25:20So they simply reversed the story.
25:22It was their attempt to silence me.
25:25I saw the video on Twitter.
25:31So many thoughts raced through my head at that moment.
25:35One of them was, oh my God, who is this man?
25:38Who is recording this video?
25:39And then, oh my god, it's on the internet.
25:42How the hell do I get that off?
25:44When I had calmed down
25:46and the questions stopped swirling around in my head,
25:49I realized that the video couldn't be real.
25:51When I realized it was a deepfake,
25:55I was somewhat relieved.
25:57However, not for very long.
25:59Because then I immediately thought,
26:00Okay, but it looks really real.
26:03Will people think that's really me?
26:06Because that causes the same damage.
26:09I knew the police wouldn't do anything.
26:13I already had enough experience with this topic.
26:14and knew that they would do nothing.
26:16There was nothing I could say or do.
26:18I could have said over and over again, "That's not me."
26:22But it didn't matter whether it was me or not.
26:24Because people believe what they want to believe.
26:29I'm so damn angry.
26:34Day in, day out, I listen to victims and support them.
26:37even though we have so few resources.
26:40Day in, day out, I have to tell them,
26:42The system is letting you down.
26:43We will do our best to help you.
26:46But the legal options are limited.
26:50It scares me
26:52that deepfakes can be used as weapons
26:54to silence someone.
26:57This is one of the main reasons
26:58why I am so tired
26:59and now want to step down from my role.
27:02Actually, it was never our job
27:04to take care of such issues.
27:06You shouldn't have to worry
27:08about being deepfaked
27:10just because you're an activist.
27:12There should be things
27:14we don't have to fight for.
27:16That's not our job.
27:17That has never been our job.
27:21To be honest,
27:25every time I give an interview like this,
27:30I get much more scared again.
27:35Because they might see you?
27:37Because it means I'm putting myself in the public eye again.
27:45And that is now
27:47more frightening,
27:50than walking home alone.
27:57Sorry, I didn't think
27:59I would have to cry.
28:01But you're still talking to me.
28:03Why?
28:05Because it's important.
28:09I have spent the last four years
28:11encouraging women
28:12to share their stories.
28:14Because people listen to stories.
28:16Stories bring about change.
28:18If I didn't share my own now,
28:20how hypocritical would that be?
28:21Deepfake pornography is being misused
28:30to push women out of politics
28:31and off the internet.
28:34This is a massive impairment
28:36of our public life
28:37and our democracy.
28:43I'm so angry.
28:46There are no words
28:47to describe the anger
28:48that I feel.
28:49Women and girls are the target.
28:57They are silenced,
28:59made submissive.
29:00Something that men have been doing for generations.
29:06There is no law
29:07that protects women from this act.
29:09An act
29:10that is not even considered a crime.
29:13You can ruin
29:14someone's life
29:15and get away with it.
29:16I am not the first woman,
29:19to whom that happened.
29:20And I won't be the last.