How dangerous are TikTok and co. for democracy? shift

SHOCKING: How TikTok & social media are DESTROYING DEMOCRACY! | shift – The danger of the algorithms 📝

Democracy is under pressure, and today the biggest threats no longer come only from traditional political camps, but from the depths of social media. In this brand-new, investigative episode of "shift", we put TikTok and similar platforms under the microscope: how dangerous are they really for our democracy? We reveal how the finely tuned algorithms of TikTok and similar services such as YouTube and Instagram not only capture our attention but can also manipulate our political opinions.
Learn how fake news, disinformation, and radical content spread at lightning speed and erode the foundations of our society. We talk to experts in cybersecurity and media psychology about the dangers of the so-called "filter bubble" and how social media giants distort political truth through opaque recommendations. From targeted influence campaigns to the viral spread of hate messages, the methods are varied and the consequences severe. Watch time and engagement are crucial to increasing the reach of this important topic. Help us expose this danger by liking and sharing the video.

This episode of "shift" is an urgent wake-up call! We show you concrete examples from around the world where social media has destabilized elections and deepened social divisions. This is not just about funny dances or short clips; it is about the integrity of our political processes and the future of our free democratic order. What must politicians do to regulate these tech giants? And more importantly: what can you yourself do to protect yourself from this subtle manipulation? Tune in, subscribe to "shift" for more investigative content, and join the discussion in the comments about the power of TikTok and the future of democracy.



How dangerous are TikTok and co. for democracy shift shd20251113


#TikTokGefahr,
#Demokratie,
#SocialMedia,
#shift,
#Algorithmus,
#FakeNews,
#Desinformation,
#Cybersicherheit,
#Filterblase,
#TikTok,
#YouTube,
#Instagram,
#Gesellschaft,
#Politik,
#Gefahr,
#Wahrheit,
#Manipulation,
#Medienkompetenz,
#Technologie,
#Zukunft,
#Dailymotion,
#Investigativ,
#Playtime,
#Likes,
#Teilen,
#Nachrichten,
#PolitischeBildung,


Category

📚
Learning
Transcript
00:00Does social media endanger our democracy?
00:02One thing is certain: radicalization online is dividing societies worldwide.
00:06Many blame social media for this.
00:09People primarily interact with like-minded individuals, and algorithms ensure
00:13that we see what matches our own opinions.
00:16What's behind it? Find out more at Shift.
00:21Texting with friends, checking out new dance moves,
00:24gazing longingly at a crush, and catching up on some news at the same time.
00:28Many people have been using TikTok, Instagram, or Facebook for a very long time.
00:32But that's only one side of social media,
00:34says this Indian software developer,
00:36who studies online radicalization intensively.
00:39People need to understand
00:41that social media is not what it was ten years ago.
00:44Today it is a minefield, a battlefield in an information war.
00:47An information war, also waged with AI, as you will see later.
00:52Everyone is fighting for your attention and your time,
00:55because that brings in advertising revenue.
00:56But only things that are loud and entertaining get reach.
01:00Many people had to get used to that.
01:02Politicians often find it difficult to communicate content in a way
01:05that lands on TikTok and similar platforms.
01:07Many therefore employ experts like Luna Möbius.
01:10She is a social media officer for the German Green Party
01:13and shows her colleagues what is possible and what is not.
01:16This dry style, standing in front of a lectern and delivering a speech,
01:20this purely factual approach, simply doesn't work.
01:22Anger is incredibly effective, fear is incredibly effective, but comedy is also incredibly effective.
01:28And I think that democrats also need to become more populist in their content.
01:33I don't find populism inherently wrong at all,
01:35as long as the facts aren't wrong.
01:38Because online, it's not the best arguments that count, but the strongest reactions.
01:43The algorithm rewards content that we like, dislike, or share.
01:47These are primarily simple messages that evoke strong emotions.
01:50Such posts are specifically amplified by the algorithms of many platforms
01:59and generate even more comments.
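The amplification loop described above can be sketched as a toy ranking function. This is a hypothetical illustration, not any platform's actual system: the weights, posts, and engagement numbers are invented, and real ranking models are far more complex.

```python
# Toy sketch of engagement-based ranking: every reaction counts as
# engagement, regardless of sentiment, so divisive posts can outrank
# sober ones. All weights and numbers below are invented for illustration.

def engagement_score(likes: int, dislikes: int, shares: int, comments: int) -> float:
    # Shares and comments weigh more here because they spread a post further.
    return 1.0 * likes + 1.0 * dislikes + 3.0 * shares + 2.0 * comments

posts = [
    {"text": "sober policy explainer", "likes": 40, "dislikes": 2, "shares": 1, "comments": 5},
    {"text": "outrage-bait clip", "likes": 30, "dislikes": 60, "shares": 25, "comments": 80},
]

# Rank the feed: the divisive post wins despite having far more dislikes.
ranked = sorted(
    posts,
    key=lambda p: engagement_score(p["likes"], p["dislikes"], p["shares"], p["comments"]),
    reverse=True,
)
print(ranked[0]["text"])  # → outrage-bait clip
```

The point of the sketch is that dislikes and angry comments raise the score just like likes do, which is exactly the dynamic the transcript describes.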
02:03Politicians are trying to exploit this digital outrage.
02:08This has become part of the media strategy in the political business.
02:14Luna Möbius considers this fatal.
02:17Anger isn't the only emotion that drives content,
02:21it's anger, fear, joy, and a kind of practical value.
02:24That is, the feeling that I've learned something.
02:26But what we are currently experiencing is very divisive content
02:29that deliberately plays on anger and fear,
02:31and less of: I make people laugh,
02:34or I give people something to take away, something they've learned.
02:37And that's where we need to do more.
02:38But the operators of the major social media platforms
02:41are not interested in changing their behavior.
02:43For them, the business of creating excitement works too well.
02:48In 2025, revenues from digital advertising
02:51are expected to exceed one trillion US dollars.
02:54And the big three share more than half of that:
02:58Alphabet, ByteDance and Meta.
03:00And that means the only solution for me is to actually tackle it politically
03:03and to write truly firm laws
03:05under which breaking the rules has severe consequences.
03:07Consequences would only arise if laws for the platforms were uniform worldwide.
03:12But we are still a long way from that.
03:15Instead of uniting us, social media continues to polarize.
03:18Politically as well.
03:19Entire platforms are politically biased as a result.
03:21Elon Musk's X, for example, or TikTok.
03:24Right-wing opinions are demonstrably spread more widely there by the algorithm.
03:27Right-wing and conservative content also dominates on Donald Trump's Truth Social.
03:33Furthermore, the platforms' AI algorithms often show us topics
03:36that we have liked in the past,
03:38so that we keep scrolling for as long as possible.
03:41But this can also amplify stereotypes in our view of the world.
03:44At the same time, people with extreme or fringe opinions
03:47connect with each other more quickly on social media.
03:50As a group, they appear louder there than in the pre-social-media era.
03:53And then there are also trolls and AI bots,
03:56which spread political fake news and disinformation by the thousands.
03:59For example, about the war in Ukraine.
04:02They post conspiracy theories and hate messages.
04:05But what constitutes hate speech?
04:07What constitutes freedom of expression?
04:09There is no consensus on this worldwide.
04:11What you end up seeing is often not neutral,
04:14but filtered, polarized, and sometimes manipulated.
04:17It's no wonder, then, that the tone on social media is becoming harsher,
04:20and that people shut themselves off from opinions that disturb them.
04:23Studies also show this.
04:24But what happens in some chat groups on Telegram, Signal or WhatsApp is particularly disturbing.
04:30There, extremists even approach children in order to radicalize them politically.
04:35We are seeing a strong trend of young men becoming digitally radicalized.
04:38There is a widespread perception that adults approach these children and bring them into the scene.
04:44In reality, however, we tend to see that it is slightly older people who radicalize the younger ones.
04:48So the 13-year-old radicalizes the 11-year-old.
04:50We see that content is being created on social media platforms that appeals to them.
04:54After initial contact on social media, recruiters usually invite the children into private chat groups.
05:02For example, on WhatsApp, Telegram and Signal.
05:05This way, the conversation remains hidden from prying eyes.
05:08Miro and his colleagues provided us with screenshots from such groups, whose names we are deliberately not mentioning here.
05:15For example, a group that encourages its members to resort to violence.
05:20One day, the group discusses completely normal school topics;
05:23the next, documents such as a "Killguide" or "Terrorguide" are shared.
05:27Some of the members in the chat are said to be only 12 years old.
05:31Groups like these are being uncovered worldwide.
05:33There, a sense of community is formed, and these children are given a certain purpose,
05:39a certain motivation for why one actually lives, a purpose in life.
05:43And that's what makes these groups so dangerous.
05:45Such groups are linked to terrorist attacks and hate crimes against minorities.
05:51In the summer of 2025, police in the Czech Republic and Austria arrested five teenagers.
05:56They had tried to set fire to a synagogue.
05:59They had become radicalized on social media.
06:01But social media is not the only place where extremists try to recruit children.
06:05For very young boys, the central place where this happens is probably gaming worlds.
06:09One platform that we should definitely consider is Roblox.
06:12Roblox is a very popular game among children.
06:15But it also has a dangerous side.
06:17Many of the rooms there look cute.
06:19But some also recreate terrorist attacks or mass shootings.
06:23That male teenagers become radicalized relatively often
06:27is also related to other problems in society.
06:29We are in the post-pandemic era.
06:32There are many crises affecting this young generation.
06:35We see a very strong problem with loneliness.
06:37This means that people are looking for new communities, because the old ones simply no longer exist in the same way.
06:40We also have a crisis of masculinity, which unfortunately leads these people into these far-right worlds.
06:45He believes that frustration over these crises makes young men more susceptible to hypermasculine rhetoric
06:50that exploits their insecurities. Child's play for radical groups.
06:57But classic fake news and propaganda can also radicalize social media users.
07:02How do we get away from this?
07:04An experimental AI tool by Sairaj Balaji could help.
07:08You know, the Indian student and software developer from the beginning.
07:11Online radicalization is a problem that affects all sides of the political spectrum.
07:16We don't want people who kill other people.
07:21Sairaj Balaji is 20 and studies computer science in Chennai, India.
07:26He developed PrismX.
07:28The tool is designed to identify radicalized users online.
07:31Once this has happened, an AI chatbot is supposed to contact such users and deradicalize them.
07:36This is the PrismX dashboard. I'll simply enter the keyword here, FGC9, a code word used to describe 3D-printed weapons.
07:45Then I click on Run and Scan.
07:48Using AI language models, PrismX can search entire social media networks for radical content.
07:55It can assess the radical nature of accounts, posts, and comments and generate a detailed report.
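The scan step described above can be illustrated with a minimal sketch. This is not PrismX's actual code: the account names, posts, and flagging logic are invented, and the real tool uses AI language models to assess context rather than the naive keyword match shown here.

```python
# Hypothetical sketch of a keyword scan over social media posts, loosely
# following the workflow described in the transcript (keyword in, scan,
# per-account report out). All data and logic are invented for illustration.

def scan_posts(posts, keyword):
    """Return the posts that mention the keyword (case-insensitive)."""
    keyword = keyword.lower()
    return [p for p in posts if keyword in p["text"].lower()]

def report(flagged):
    """Aggregate flagged posts per account, like a simple scan report."""
    counts = {}
    for p in flagged:
        counts[p["account"]] = counts.get(p["account"], 0) + 1
    return counts

posts = [
    {"account": "user_a", "text": "How to print an FGC9 at home"},
    {"account": "user_b", "text": "Look at my cat"},
    {"account": "user_a", "text": "FGC9 parts list"},
]

flagged = scan_posts(posts, "FGC9")
print(report(flagged))  # → {'user_a': 2}
```

A real system would have to weigh context (news reporting, research, satire) before flagging anyone, which is exactly why the transcript later calls such tools a double-edged sword.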
08:02Currently, PrismX is only running as a test version.
08:06We currently have twelve marked users. The bot can then communicate with them.
08:11Currently, we are only communicating with dummy accounts. There are no real people behind them.
08:18The bot attempts to gain users' trust in order to dissuade them from their radical beliefs.
08:25To appear more convincing, it does not reveal itself as a bot.
08:31Nowadays, AI bots communicate so sophisticatedly that they are almost indistinguishable from real people.
08:36The technology is now very easy to use.
08:41What I want to show with PrismX is that someone like me can do this without training in radicalization or deradicalization.
08:48If I can do it, governments and businesses can too.
08:52And they won't tell you whether they're doing it or not.
08:55And you, as a user, no longer know whether your chat partner is real or not.
08:58Since his tool received media attention, Sairaj has also received many negative comments.
09:06Because a tool like PrismX could also cause a lot of damage and, in the wrong hands, radicalize people.
09:11Whatever you say, whatever you post, can be picked up by such a tool.
09:18It can create a profile of you and target you.
09:21A more complex version of a tool like PrismX could spark unrest.
09:26It could sow discord.
09:27It could change many people's opinions on a particular topic.
09:30It could also identify people who are susceptible to radicalization and turn them into terrorists.
09:36And that's automated and on a large scale.
09:38Imagine how frightening that is.
09:42So, a double-edged sword.
09:44Perhaps it's better to show people directly how fake news, disinformation, and deepfakes work.
09:50And how to recognize them.
09:52Of course, that will be done too.
09:53For example, at the Youth Media Days in Berlin.
09:58Quiz questions.
10:00Virtual Reality Escape Room Games.
10:04And a deepfake photo challenge.
10:07There is a lot to do here at the Youth Media Democracy Days in Berlin.
10:14At this station, you have seven minutes to expose fake news.
10:18Only those who get stuck receive occasional hints from game master Jörg Friedl.
10:24My point is that we should be hunting down fake news, so to speak.
10:27This means that visitors take on the role of fact-checkers in the fictional Ministry of Information Literacy.
10:36To expose fakes, the students check whether at least two reliable sources report the same facts.
10:42It is important to pay attention to the dates and the names.
10:48And if you're unsure, just look it up.
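The workshop's two-source rule can be sketched as a small check: a claim counts as corroborated only if at least two distinct reliable sources report the same facts. The source names, claims, and reliability list below are invented examples, not part of the workshop.

```python
# Toy sketch of the "at least two reliable sources" fact-checking rule
# described above. Sources and claims are invented for illustration.

RELIABLE_SOURCES = {"public-broadcaster", "news-agency", "newspaper"}

def corroborated(reports, min_sources=2):
    """reports: list of (source, claim) pairs. Returns the claims backed
    by at least min_sources distinct reliable sources."""
    backing = {}
    for source, claim in reports:
        if source in RELIABLE_SOURCES:
            backing.setdefault(claim, set()).add(source)
    return {claim for claim, sources in backing.items() if len(sources) >= min_sources}

reports = [
    ("public-broadcaster", "election date moved"),
    ("news-agency", "election date moved"),
    ("anonymous-blog", "aliens voted"),
]
print(corroborated(reports))  # → {'election date moved'}
```

In practice, "the same facts" also means checking that dates and names match across the sources, as the trainer points out.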
10:51Some people spend six hours every day on social media.
10:55Even though they know that there is a lot of fake news on platforms like Telegram, X or TikTok.
11:00TikTok is covered a lot when it comes to news like Trump or Kik.
11:07The young people here now know that many images in fake news are generated by AI.
11:13At this station, they have to try creating deepfake images themselves using an AI tool.
11:18I think it's important to create these yourself, so you can see what they look like,
11:29and so you also learn about the tools, especially how people make these things.
11:35The trainers, all from local non-governmental organizations, are worried that fake news and conspiracy theories are influencing the students.
11:43Some teachers here are also worried.
11:46I believe it has the same effects on students and young people as on adults.
11:53It divides society, leading to arguments and the formation of cliques, whose hatred is then unleashed both offline and online.
12:06Social media has drastically changed political communication.
12:10Despite the problems, there is also an advantage.
12:12Today, we can immediately post what we think and join the discussion.
12:15And maybe even go viral.
12:18What are your thoughts on political freedom of speech on social media?
12:21Does it go too far for you, or not far enough?
12:23Write to us. See you soon.
