New whistleblowers and insiders from social media companies speak out to reveal how algorithms designed to connect people have been helping to tear them apart. With new testimony and documents, they expose a machine thriving on outrage and division as part of a business model, with radicalisation, real-world violence and fractured societies some of the consequences of a system built to shape how users think, feel and see the world.
Transcript
00:03I know the whole house will be very concerned at the extremely serious
00:08incident that has taken place in Southport. A 17-year-old male from Banks
00:15in Lancashire who is originally from Cardiff has been arrested on suspicion
00:20of murder. As soon as I saw those tweets spreading misinformation about the identity of the attacker
00:34I knew exactly what was going to happen.
00:47Violence started after false rumors about the suspect.
00:52Social media platforms are a way that people express their shock and anger.
00:58An undocumented migrant decided to go into a Taylor Swift dance class.
01:03When toxic opinions go viral it can have real world harm.
01:12But Southport was just the latest in a long chain of events.
01:22We have seen social media have so much influence over and over again.
01:31And that is down to the algorithm. An unstoppable flow of content.
01:38And those of us who understand how this works need to stop that happening.
01:46Our world is being actively remade.
01:51Falsehood spreads faster than fact, dividing communities and distorting reality.
01:58And as trust collapses so does our shared sense of truth.
02:02The defining challenge of the 21st century is not simply who wields the most powerful technologies,
02:10but who guides them with the greatest wisdom.
02:13You've created these platforms and now they are being misused.
02:18It's for profits, for money.
02:20Influence and power.
02:21People don't understand the beast that they're feeding.
02:24It becomes our duty to speak out.
02:38You want me to look at the camera by the way?
02:39Yeah, yeah.
02:40Ready? Should I clap?
02:43Ever since I can remember I've loved social media.
02:47I grew up on Facebook.
02:51It was an important part of connecting with my friends.
02:55The worst thing that would happen is you'd find out you hadn't been invited to a party or a sleepover.
03:01It doesn't feel like that now. It feels quite different.
03:07My feed is full of this toxic, angry stuff.
03:10People screaming in my face.
03:12And I can't not look, because it's my job.
03:16As the BBC's social media investigations correspondent,
03:20part of what I do is try to understand why there's so much of this stuff on these platforms.
03:25But it's incredibly hard to get straight answers from the companies.
03:30And the few people who do come forward do so at great risk.
03:42If you come out as a whistleblower and you are named, you have a scarlet letter on your head.
03:49So my goal is to not get sued.
03:54By coming public, I know that my career prospects in big tech
04:00are probably next to zero now.
04:04How does it feel showing me this?
04:08Sort of terrifying.
04:10Sort of terrifying.
04:14Oh wow, you've got so much. This must have taken you ages to document.
04:21Yeah, yeah, a little bit.
04:24I spent four years at Meta, so I've seen the other side of the curtain and I know some of
04:29the problems
04:31more than I wish I did.
04:34I have these kind of higher level research documents showing like all sorts of harms
04:40to users on these platforms.
04:43If these companies are not going to protect people, it becomes my duty and people like me,
04:50as people that actually understand how these systems work, to speak out.
04:59So a lot of the screenshots that I have here are how we ranked content.
05:05The algorithm is built to prioritize the stuff that gets the most engagement.
05:10And the political content that gets the most engagement is typically misinformation.
05:16It's typically very toxic, so people yelling and being really mean to each other.
05:20How worried should we be about these algorithms in terms of the role they play in pushing this
05:25kind of stuff?
05:26People don't want to see that content, but people don't know that the companies are choosing to
05:33push this content to you even though you say you don't want it.
05:36And the consequences are really, really bad.
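Feed ranking of the kind described here scores each post by its predicted engagement and sorts the feed by that score. The sketch below is a minimal illustration of the idea, not Meta's actual system; the weights and field names are hypothetical, chosen only to show how weighting strong reactions more heavily than passive ones pushes inflammatory content to the top.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    p_like: float     # predicted probability of a like
    p_comment: float  # predicted probability of a comment
    p_reshare: float  # predicted probability of a reshare
    p_angry: float    # predicted probability of an "angry" reaction

# Hypothetical weights: comments and reshares count far more than a
# passive like, so posts that provoke arguments rank highest.
WEIGHTS = {"p_like": 1.0, "p_comment": 15.0, "p_reshare": 30.0, "p_angry": 5.0}

def engagement_score(post: Post) -> float:
    """Score a post purely by predicted engagement, with no quality term."""
    return sum(w * getattr(post, name) for name, w in WEIGHTS.items())

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order candidate posts by engagement score, highest first."""
    return sorted(posts, key=engagement_score, reverse=True)
```

Under a scheme like this, a post likely to draw angry comments outranks a quietly liked one by construction, which is the incentive problem the internal study quoted later in the film describes.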
05:42Meta's products are used by north of 3 billion people and the more time they can keep you on there,
05:48the more ads they sell and the more money they make.
05:52But it's very important that they get this stuff right,
05:56because when they don't, really bad things happen.
06:22In 2018, Facebook took Sri Lanka by storm.
06:26This whole road is full of smartphone shops on either side and it's just shop after shop after
06:32shop. And people come here from all over the city actually to get a good deal.
06:39Facebook had a program where with every phone that was sold you got Facebook for free.
06:45It didn't use up your data. So it kind of became the default platform.
06:49You felt a real sense of community, a sense of belonging.
06:57Scratch beneath that veneer and Facebook became the predominant way that the worst
07:03elements in society used it to inflame hate.
07:10Sri Lanka is marked by a majority Sinhala Buddhist population.
07:15We've had Muslims for centuries, but there is a small group of Buddhist monks
07:24who used Facebook to directly target the Muslim community in Sri Lanka.
07:31I was really tuned into how Facebook was being weaponized.
07:36And there was this video that I saw of this Muslim man.
07:43The people recording the footage are asking him whether he put infertility pills into the food
07:50that they consume.
07:59And he's confused. He doesn't clearly understand Sinhala.
08:03In those parts of the country, Tamil is spoken, not Sinhala.
08:11But the viewers took this as the evidence, right? I mean, here was a man
08:16who was admitting to the fact that the Muslim community was making Sinhala Buddhist women infertile.
08:25Within 48 hours, this was shared in the thousands.
08:29It was like a...
08:32a frothing hate.
08:56Buddhist mobs set fire to homes and businesses, killing a young Muslim man.
09:01It was quite simply some of the worst anti-Muslim violence that Sri Lanka had ever seen.
09:20Facebook was the wind to the seeds of racism.
09:24What was on the platform resulted in people believing it and taking action as a consequence.
09:31And I didn't have any way to reach out, there was no one to reach out to.
09:37We were utterly helpless.
09:43I find it bonkers when you look at the stuff that Matt shared from inside Facebook.
09:49I think that this document is one of the most revealing ones, as it's an internal study
09:56led by a data scientist.
10:00It essentially talks about how Facebook rewards outrage, and that if you get more negative
10:06comments you're going to get more clicks.
10:08And there's this bit where they talk about the incentives that the company are creating,
10:13essentially saying, outrage gets attention, and then laying out and saying, the current
10:17set of financial incentives our algorithms create does not appear to be aligned with
10:22our mission.
10:23E.g., their mission is to do good in the world, and yet those recommendation systems, those
10:28algorithms, are pushing the kind of content that feeds off negative reaction, and therefore
10:34you can see how it's impacting users.
10:37And they essentially lay it out in black and white, it's not like it's much of a secret,
10:41really.
10:42We kind of all know it.
10:43But the thing is, they know it too.
10:46And Sri Lanka was not a one-off thing.
10:50Almost as far as the eye could see, a tide of humanity.
10:54They're carrying with them whatever they could salvage from their villages, their homes, that
10:58they say they were burnt out of by the Myanmar military.
11:00Look at that little baby there in a basket.
11:05Take, for example, what happened in Myanmar.
11:08Facebook was actively pushing hate.
11:11And it's not just these places.
11:15It was part of a pattern that was happening all over the world.
11:19People were acting on the basis of what they were seeing on Facebook.
11:22There wasn't any other information they were responding to.
11:25And I think it's for that reason that Facebook almost had to come out and apologize, because
11:29there was no way of saying, well, hang on, this wasn't to do with us.
11:32It was entirely connected to Facebook.
11:34We've got to keep moving, folks.
11:37Watch your steps.
11:46Recently, UN investigators blamed Facebook for playing a role in inciting possible genocide
11:53in Myanmar.
11:54What's happening in Myanmar is a terrible tragedy, and we need to do more.
11:58We all agree with that.
12:01I was recruited to Meta because the company got a lot of bad PR.
12:06I mean, there was Cambridge Analytica.
12:07There was Russia's involvement in the 2016 election.
12:12And in places like Myanmar or Sri Lanka, they just weren't investing enough, and a lot of
12:19the hate speech was just missed.
12:21It wasn't being caught.
12:22We didn't take a broad enough view of our responsibility, and that was a big mistake.
12:28And it was my mistake.
12:29And I'm sorry.
12:33Mark spent in the billions to hire a bunch of experts and create specialized integrity
12:42teams.
12:43Our job description was reducing harms to users at a global scale.
12:49I said, oh, that's my dream.
12:52That is my life's work.
12:56I had the opportunity to join Facebook, to actually see it for myself from the inside.
13:06And so I think it was really a once-in-a-lifetime opportunity.
13:11There were so many dedicated people that wanted to do the right thing.
13:20You know, I believe deeply in transparency, and I think one of the solutions out there
13:24is to share more data with the outside world.
13:27And so I created CrowdTangle, an online dashboard you could log into, that made it
13:34really easy to see what stories and what content were going viral on Facebook that day.
13:42If it weren't for that CrowdTangle dashboard, you couldn't get any of that.
13:47Mark was so invested in it and bought it to make sure Facebook has a positive role in
13:53the world.
13:55What is the new mission of Facebook?
13:58So our new mission is to bring the world closer together.
14:02When Mark announced a major algorithm change called meaningful social interactions, my sense
14:09actually was that there was a real genuine interest in trying to build a better algorithm
14:14during those years.
14:15There was a real overhaul to the fundamental way that content was ranked inside the feed,
14:21ranking friends and family over something called civic content, news and political content.
14:26They said, yeah, it turns out people click on the politics, but actually they end up like
14:31feeling worse after a while.
14:32And so we're going to reduce the amount of politics in the feed.
14:36My job entailed running large scale experiments on sometimes as many as hundreds of millions
14:43of people where we would change how content was ranked in feed.
14:47And these people often had no idea that they were in experiments.
14:50You just might have a different feel to your feed one day.
14:54So you log in and much of the civic content from your feed might be missing or the protections
15:00against harmful content might be turned off.
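One way to picture the large-scale experiments described here: users are deterministically bucketed into arms, and each arm gets a different ranking configuration. This is a generic sketch of such bucketing under my own assumptions, not Meta's code; the experiment and arm names are hypothetical.

```python
import hashlib

def experiment_arm(user_id: str, experiment: str, arms: list[str]) -> str:
    """Deterministically assign a user to an experiment arm.

    Hashing the experiment name together with the user id means the same
    user always lands in the same arm, with no assignment table to store.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return arms[int(digest, 16) % len(arms)]

# Hypothetical arms echoing the changes described above: a control feed,
# a feed with civic content demoted, and a feed with protections off.
arm = experiment_arm("user-12345", "civic_content_holdout",
                     ["control", "demote_civic", "protections_off"])
```

Scaled to hundreds of millions of user ids, an assignment function like this is how a ranking change can be tried on a vast population without anyone being told they are in an experiment.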
15:04And what I found was that you remove the reliable, trustworthy news sources and you end up with
15:11my uncle who likes to post conspiracy theories.
15:16This change that was meant to bring people closer together and help them connect actually
15:22had the opposite effect, where it was promoting content that was dividing people and sowing
15:28more division and creating more conflict between people.
15:31So while it sounded great, we've more than doubled views on misinformation by demoting civic content.
15:38Is that the goal?
15:40No, it's not the goal.
15:41It's the opposite of the goal.
15:45So we warned against this in a document that we delivered to the team.
15:53And that was read to Zuck in advance of launching these broad scale experiments, and that was all ignored.
16:05We warned very sternly, this is not going to go well, and then it didn't go well.
16:17Major cities across the country are looking more like ghost towns.
16:21Shops locked up, normally bustling streets, now oddly quiet.
16:26At first, it seemed like just another event.
16:30And then suddenly, we had this massive surge in terms of volume.
16:36People were spending a large amount of time on online platforms, where all the information was brand new,
16:42and we weren't able to clearly identify exactly what the truth was.
16:46New cases and deaths associated with the coronavirus continue to climb right around the world.
16:52What the pandemic created was this perfect storm, because you've got people stuck at home spending more time online.
16:59And the algorithms, the recommendation systems, they were becoming ever more influential.
17:05They'll kill millions with their vaccines.
17:08There is no vaccine currently on the schedule for any RNA virus that works.
17:16I found it very frustrating.
17:18A lot of my time at Meta was spent focused on how to quickly identify conspiracy theories, conflicting information.
17:25Right now, you are being lied to.
17:28It was always difficult to identify which of those would escalate to rioting or violence.
17:34And that's a hard thing to deal with.
17:38None of you have reported on the truth since day one.
17:40You're complicit in crimes against humanity.
17:44Just knowing that there's the possibility that something bad might have happened,
17:48and maybe there might have been an opportunity to prevent it.
17:52That's a really jarring thing for me.
17:56The nightmare that kind of sits in the back of my brain.
18:02It's definitely a space which is stressful for me.
18:06Disinformation is running rampant.
18:08You are being lied to.
18:11They're killing millions.
18:27What did it feel like when you were at the riots that night?
18:38I was like...
18:39During the lockdown, there was no light in the city.
18:41Everything was closed.
18:44And as a DJ artist, it was devastating.
18:48I felt really lonely.
18:49Sometimes I even wanted to cry.
18:52I got a little bit more frustrated.
18:57We wanted our freedom back and go back to normal.
19:01So, when I saw a post of a Facebook group that was anti-Dutch government, it intrigued
19:08me.
19:09They did it for the same cause as I did.
19:14It was my first time in the group.
19:18It felt like I found a new family.
19:21We stayed in contact every day, sometimes even at night.
19:26The whole pandemic, it divided people into so many different groups.
19:34But then you always have a few people who have a lot of influence on other people.
19:39They have power, mental power.
19:44It was almost a daily routine for them to spread hate and to be negative.
19:50And the tension was growing.
19:52I mean, I could feel that something was happening.
19:58Conversations got angrier, more out of control.
20:01And people really wanted to do something about it.
20:03So, to me, fighting for our freedom was more important than even my own life.
20:12Leading up to the protest, I think they were all hyped up.
20:15They were full of adrenaline and people already started to incite each other.
20:21In a certain moment, I was really scared because the way some people looked, with the hate
20:27and aggression, like they were ready to even kill another man.
20:33I saw very young boys, maybe age 13 or 14, throwing rocks at police cars.
20:40The whole city changed and people just turned into animals.
20:49It was a complete war zone in 20 minutes.
20:54They make each other crazy using social media, I'm sure about that.
21:00That day was the worst day of my life.
21:05This is the place where I used to have my restaurant.
21:10And they burned it with the fireworks.
21:15There was a shooting as well.
21:18It must have been so scary.
21:20I mean, even when you left, it must have been scary.
21:22Yes, of course.
21:23I cried.
21:24And I thought, tomorrow, there's nothing left to work for.
21:31The pandemic was such a stark example that Facebook had got so much bigger than Mark Zuckerberg.
21:39He'd sort of lost control.
21:42Social media was really dividing people, particularly in Facebook groups.
21:47People were suddenly talking about hanging doctors and nurses and attacking people.
21:53They're very passionate.
21:56I've just been struck in the back of the head by a can.
22:00That really hurt.
22:02Wanting retribution, wanting justice in cities across Europe.
22:10The violence connected to what was happening on social media,
22:14which started in places like Sri Lanka, was coming closer and closer to home.
22:25Are you willing to acknowledge that Facebook is contributing to society's woes, polarization,
22:31and all the rest?
22:32It's pouring gasoline on the burning fire in front of us?
22:35Well, look, obviously, for a platform which has, what, a third of the world's population on it,
22:39of course, you see the good, the bad, and the ugly of humanity show up on our platform as well.
22:44Our job is to mitigate the bad, reduce it, and amplify the good.
22:53These user insight documents from inside Facebook come from the period when the company was using
22:59this new argument.
23:01We're not responsible for everything.
23:03We are a mirror reflecting back society.
23:06These documents basically talk about the way that users are having their psychological
23:12processes activated by certain types of content.
23:18In particular, they say this happens when users view sensitive content,
23:22which includes content that has an elevated risk of causing harm to people and inciting violence.
23:28So you basically are admitting that the way that you are building the platforms
23:32and the way that you are using the algorithms, the recommendation systems,
23:36ultimately results in these harms. And that can have real-world consequences for us.
23:44Generally, we felt like leadership wasn't sure what the answer was,
23:46but it began to calcify into a sort of defensiveness.
23:50We're not responsible for all of polarization in society.
23:54Nobody is saying you're responsible for all polarization.
23:57We're just saying you contribute to it, and probably in ways where, like, you don't have to.
24:03And if you just made a few changes, you might not contribute to it as much.
24:07And yeah, I think that was dispiriting, because it felt like there was a window
24:11in which it was, like, genuinely introspective.
24:18If you haven't heard of the TikTok app, you will soon.
24:22It is among the fastest social networking apps to be able to go from zero
24:25to over 200 million in registered users, 60 million in daily actives around the world
24:30in the span of three years. So that growth was part of...
24:36I was very lucky I joined TikTok.
24:39My job was to improve the algorithm. So if you're able to watch 50 videos
24:45in a row, maybe in five or 10 minutes, the TikTok algorithm is able to collect
24:51a lot more information from you.
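The point being made is that short-form video yields dense implicit feedback: every swipe reports watch time, completion, and rewatches. A minimal sketch of how such signals could update an inferred interest profile follows; the signal formula, learning rate, and topic labels are my assumptions, not TikTok's.

```python
from collections import defaultdict

def watch_signal(watch_seconds: float, video_seconds: float,
                 rewatched: bool) -> float:
    """Turn one viewing into a feedback signal: finishing a video is a
    strong positive, an early skip a weak one, a rewatch a bonus."""
    completion = min(watch_seconds / video_seconds, 1.0)
    return completion + (0.5 if rewatched else 0.0)

interests: dict[str, float] = defaultdict(float)

def update_interest(topic: str, signal: float, lr: float = 0.1) -> None:
    """Nudge the inferred interest in a topic toward the latest signal."""
    interests[topic] += lr * (signal - interests[topic])

# Fifty videos in ten minutes means fifty labelled examples per session,
# far more feedback than a feed the user has to click through.
update_interest("dance", watch_signal(28.0, 30.0, rewatched=True))
update_interest("politics", watch_signal(2.0, 45.0, rewatched=False))
```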
24:53Is anyone else, like, a little weirded out about how specific TikTok's algorithm gets for the For You page?
25:02Since Facebook became Meta, what, in mid-October, it's really struggled to find its footing or gain
25:08some traction here. User growth is stalled or shrinking.
25:12Meta was down because TikTok is literally eating its lunch, right? Like, there is no doubt in my mind.
25:18Mark Zuckerberg, for the first time ever, looked vulnerable last night.
25:24I actually remember a lot of fear about TikTok. Mark seemed like he was actually concerned about
25:32them overtaking Meta. The ethos is very much move fast and break things. And one way to try to
25:39compete with TikTok was to just move fast.
25:45It seemed to me at that time that it was almost like this engagement arms race.
25:52Mark is very paranoid about competition. And so when he feels like there are potential
25:57competitive forces, there is no amount of money that is too much.
26:02One way to try to compete with another company that's offering a unique product is to try to mimic it.
26:09And I remember a moment during the regular headcount planning process when some integrity
26:15teams and safety teams were going to ask for two headcount to work on kids' stuff and 10 to work on
26:20elections. There was another team that went, oh, we just got 700 for Instagram Reels. I was like, okay.
26:34Anytime you're introducing a new type of product, say reels, there's an elevated risk because the
26:41infrastructure that existed before, it's either completely absent or it's very immature. So it's hard to
26:49prepare sufficiently in advance of that launch. I was frustrated because there's a common trade-off
26:57between protecting people from harmful content and engagement.
27:07There's another set of documents that speak specifically about the difference between TikToks
27:11and reels. Reels are similar in how they work to TikToks. So you're getting pushed stuff that you
27:15will show an interest in, not necessarily from people you know.
27:21And this is Meta's own study with stats shared internally, where they actively acknowledge that
27:29they are struggling to prevent harm when it comes to reels. So stuff like violence and incitement.
27:36So you can see here, they admit that comments on reels posts have a higher violating hostile speech
27:42prevalence. So 75% higher for bullying and harassment, 19% higher for hate speech,
27:47and 7% higher for violence and incitement. And so essentially what this document suggests to me
27:54is that they're building these new interfaces, they're introducing these new products to their
28:00sites to compete with their rivals like TikTok, and yet they're not seemingly putting the right
28:05safeguards in place in time. So their rush to evolve, to grow, to keep the user engaged is ultimately
28:12leaving users at risk, which they're admitting. Why is that happening?
28:18When you're working on these integrity teams, they are disadvantaged because in order to launch
28:24something that's going to protect people from some kind of harm in reels or in feed, you have to
28:30convince the team that owns feed or that owns reels to sign off on the product change that you want.
28:38But there's this power imbalance. They have incentives to not let those products launch because toxic stuff
28:47gets more engagement than non-toxic. TikTok was really trying to take over the market, shipping out a new
28:57version, improving the algorithm maybe every week. But I started noticing more and more issues of people
29:06using TikTok, especially more borderline content. For example, something subtle like conspiracy theories or
29:15problematic content that will only be seen after you browse for more than a certain time. As the model
29:22became more advanced, I think the borderline content issue became more noticeable.
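"Borderline" here means content that scores close to, but under, the removal line of a violation classifier. Below is a minimal sketch of one plausible policy, assuming my own thresholds and the session-time gating the engineer describes; none of it is TikTok's or Meta's actual logic.

```python
# Hypothetical thresholds; real systems use per-policy, tuned values.
REMOVE_THRESHOLD = 0.90      # confident violation: take it down
BORDERLINE_THRESHOLD = 0.60  # near the line: demote, don't remove

def treatment(violation_score: float, session_minutes: float) -> str:
    """Decide what happens to a piece of content given its classifier
    score and how long the viewer has been browsing."""
    if violation_score >= REMOVE_THRESHOLD:
        return "remove"
    if violation_score >= BORDERLINE_THRESHOLD:
        # Demoted content is suppressed early on but, as described
        # above, can surface deep into a long session.
        return "demote" if session_minutes < 30 else "eligible"
    return "eligible"
```

Loosening either threshold, or the session gate, is exactly the kind of quiet product decision the next interview says was made when engagement fell behind TikTok.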
29:30In 2022, I remember how social media feeds were flooded with borderline content, the gray area stuff.
29:38Some of it clearly totally unacceptable. Maybe that's misinformation or racist hate, antisemitic abuse.
29:48And I was a bit confused, actually, because it felt like during the pandemic, there'd been
29:53efforts from the companies, the social media companies to tackle this borderline stuff.
29:58And it was back again and back again with a vengeance, really. Why was that happening?
30:04I'm so happy that you wanted to speak to me and to do this interview.
30:18So tell me what you actually did as an engineer.
30:22I worked at Meta building algorithms or programmes to reduce the amount of borderline content.
30:29And over time, the business positioning of our team changed towards like allowing more borderline content.
30:37Do you think the decision was connected to this engagement race with TikTok?
30:43Yes, it was definitely connected with that. You're losing to TikTok and therefore your stock price must suffer.
30:49And then, you know, that's when people started becoming like sort of very paranoid and, you know, sort of reactive.
30:55And they were like, let's just do whatever we can to catch up. Where can we get like two,
31:01three percent revenue for the next quarter? Maybe if this stuff is really bad, then users will stop
31:07using the app. If they're continuing to use the app, then by definition, it's not bad.
31:12So if someone likes harmful content, they'll get more harmful content?
31:16Yeah, you don't care as much about it because they'll continue to use it in the long term.
31:21Who ultimately was making those decisions?
31:24Senior VP, and I think he reported directly to the CEO. So they sort of told us that it's because
31:32the stock price is down, so we might actually come back to this later, or we might come back to it at a later
31:36date. It hadn't been the most reassuring thing to hear.
31:42Yeah, because there's no legal risk, you know. No one actually knows what Facebook is doing.
31:49So there's actually no incentive for any of these companies.
31:56It started with immigration for me. I was sort of the perfect candidate.
32:03When I was 13 and 14, I was kind of struggling with a lot of insecurities
32:08around my ability to have friends and relationships.
32:13It was very easy for me to believe that the world was against me. It made me believe in this
32:19idea
32:19of decline. I started at first, you know, the kind of entry level, fact-based video essay type.
32:29It is time to beef up our borders.
32:31Like, we just need to reduce immigration. Then, slowly but surely, it starts to recommend you
32:37more and more channels about that. And then you kind of just go into the whole ecosystem at large.
32:44And then it goes to our culture being changed.
32:48The sacrificing of our children on the altar of mass migration.
32:53The West, as we all knew it, was being replaced. And then you start getting recommended to other
32:59fields in that space where it then talks about feminism.
33:04And then talks about LGBT rights. And then all of a sudden, you hear from a perspective that sounds
33:11reasonable at first. And then a couple months later, you believe literally everything in that space.
33:24Breaking news, Twitter has just accepted Elon Musk's offer.
33:29The social media platform is now in the hands of the world's richest man.
33:35I learned it on Twitter that Elon Musk was buying Twitter. It felt like a bizarre move. Everybody was
33:40trying to understand why he was doing that. In a tweet, Musk wrote, entering Twitter HQ. Let that
33:46sink in. I had concerns because of the amount of resources Twitter was putting at that time
33:54into making it a safer platform. Twitter was never perfect, but it was the top company priority. It's,
34:01you know, a rage-based business model in a sense. If the algorithm is left unchecked,
34:06the algorithm could prioritise the worst possible, most inflammatory opinions. Our job was to make
34:13sure that when you search for something, what you saw was reliable, true. We all believed in what
34:20we were doing so much. Never dull with Elon Musk. Expect lots of ideas, lots of changes, and lots of
34:30uncertainty. Whenever he spoke to us, he wanted Twitter to be more like TikTok.
34:35TikTok has taken America by storm. Engagement at TikTok at that time was just growing phenomenally.
34:41He was focused on making Twitter what he called hardcore.
34:50I found out on November 15th that my access to my work laptop and my email account were cut in the
34:58middle of the night. How did that feel? I, again, was shocked because I thought at least the company
35:06would have some decency to give an official notice, not something that felt so casual.
35:13What percentage of your staff did you fire at Twitter? I think we're about 20% of the original size.
35:19So 80% left? Yes. We had about 250 designers design-wide, and there were probably eight of us
35:28left. The entire Trust and Safety Org, almost all, was gone at that point. That wasn't a priority.
35:35Turns out you don't need all that many people to run Twitter. But 80%? That's a lot.
35:41Yes. If you're not trying to run some sort of glorified activist organization and you don't care
35:49that much about censorship, then you can really let go of a lot of people, turns out.
35:57I don't think any of us saw it as getting as bad as it got as quickly as it got.
36:01Come on, get back, get back. It's good to be back on Twitter. Don't believe what you read.
36:05Don't believe the mainstream. Elon reinstated my Twitter account after he took over.
36:12Some of the people who'd been banned, all these names that caused so much division,
36:17allowed back on. Because he has sole control of that platform, he can tweak and nudge the algorithm
36:31to boost certain kinds of opinions and push down others. It is complete gaming of freedom of speech.
36:49In the internet referral unit, we see thousands and thousands of pieces of content every year
36:53being referred to us. These are kind of examples that we see on a daily basis that show a normalization
37:02of the graphic content on social media. So people spot stuff and then they send it to you guys?
37:09Yeah, so posts were shared on X, describing Jews as rodents.
37:18And it's not just that, really. I mean, the language is really extreme.
37:20No, exactly. It's extreme anti-Semitism, kind of encouraging further attacks on Jews,
37:25viewed by thousands upon thousands of people.
37:30And yeah, we're certainly seeing extreme right-wing rhetoric and the imagery is so graphic and so
37:37overtly racist. And it's kind of increased to such a level now of this gratuitous violence, really.
37:46If a user is seeing more and more of this kind of stuff, you can see how they start to find even more
37:53extreme versions of that imagery. That, to me, seems akin to essentially how radicalization works.
37:59People are more desensitized to real-world violence and they are not afraid to share their views.
38:07Four times before us, the Muslims tried to wipe out Europe with invasion and they were...
38:13They just made me feel perpetually... They energized me, but not really in a good way.
38:20They just made me very kind of angry.
38:23Imam said to me that they were going to replace the UK government and that they would be Sharia law.
38:29It very much reflected the way I felt internally, that I was angry at the people around me.
38:36It was very easy for me to believe that the world was against me, which I think is completely central
38:42to a lot of kind of hard-right content on YouTube, TikTok.
38:50Senator Marco Rubio has announced bipartisan legislation to ban TikTok from operating in the United States.
38:59Your platform should be banned. TikTok surveils us all. And the Chinese Communist Party is able to use
39:06this as a tool to manipulate America as a whole.
39:10There were threats being made against TikTok by the US government, because it's owned by ByteDance,
39:16a Chinese company, amid claims that China was going to misuse sensitive data. ByteDance has always very strongly
39:21denied that was the case. But actually, the information that TikTok has about us, which is
39:26the most powerful, is the way that we linger on a video, the way we scroll.
39:32As an engineer, how can you build recommendation systems that are inherently safe?
39:40I don't actually know the answer to that question about how to build a completely safe one.
39:49We have no control over the deep learning algorithm.
39:54To us, it's still like a very black box how internally it works.
40:00Which also is, in and of itself, slightly terrifying because you clearly understand
40:04algorithms probably better than anyone we've spoken to.
40:08Yeah. To be honest, we don't actually pay too much attention to specific content.
40:14To us, all the content is just a different number.
40:17We're the ones that are responsible for the recommendation, and the content safety team,
40:21they're responsible for eliminating the bad content.
40:26Like a car manufacturer, right? There's a team that's responsible for the
40:30acceleration, the engine, right? So we expect the team working on the braking system is doing a good job.
40:45I am looking back through my messages with someone who currently works at TikTok.
40:50And they got back to me earlier this year, actually.
40:53They sent me a message and we've been speaking since then, trying to work out a time when
40:58we can meet each other. They only wanted to meet me in person.
41:03We've been messaging back and forth on encrypted channels.
41:07They are working in trust and safety and there's stuff they're concerned about.
41:12They want people to know what's happening on the inside.
41:16It's so hard to get someone to want to show you stuff.
41:20It's a pretty high risk thing to do when you still work there.
41:27Hello. Hi.
41:30So is this your laptop? Yeah.
41:33Why have you decided that you want to speak out?
41:36That you want to show me some of these documents, conversations?
41:39If you're feeling guilty on a daily basis because of what you're instructed to do,
41:44at some point you can decide, should I say something?
41:47What are the high risk issues? What are the things that most concern you?
41:51I'll show you.
41:54On the dashboard, the first thing you can see is volume.
41:58So that is a lot of cases.
42:00Yeah. The platform is being used by children, so it's on that platform to be held to a high
42:07level of accountability. And there's content out there linked to terrorism, sexual violence,
42:16physical violence, abuse, trafficking. It feels like it's increasing.
42:22And it's quite damaging because children are much more easily influenced.
42:26So I guess the risk of being activated in a negative way.
42:31Yeah, especially on a generation that's literally hooked to this app.
42:38Look at this for example.
42:41Yeah. So these P2, P3, P1, those all show the level of priority essentially.
42:47Exactly. If you look at the country where this report comes from,
42:51it's very high risk because it involves a minor and, you know, sexual blackmail.
42:55Yeah.
42:57And then you can see the priority here. The urgency is not high at all.
43:00Priority two.
43:03You can see all these are reported.
43:05Oh, yeah. Yeah, yeah, yeah, yeah.
43:07So very high risk.
43:08Yeah.
43:09Encouraging people to commit crimes, terrorism, or join terrorist organisations.
43:15And that's...
43:16What? P2?
43:20And then you have another case that's classed as P1.
43:23A high priority case.
43:26It's a case that relates to a politician essentially.
43:30Just to maintain a strong relationship, not because there's a high risk for the user.
43:39A high priority case.
43:40Looking through all of these documents from the TikTok whistleblower,
43:42I'm just struck by how many of them point to the same thing,
43:46which is that politics seems to matter more than protecting kids or teenagers.
43:51And the case we discussed involving the 16-year-old girl,
43:54who reported that a TikTok account was posting her photo alongside explicit images.
44:00And this is priority number two.
44:02But this is about an under 18 who is essentially having a concern relating to explicit images.
44:08And then there's a 17-year-old who is reporting being the victim of cyberbullying by two accounts.
44:15This is priority number two.
44:18So take, for example, priority number one.
44:21In it, there is a picture of a chicken.
44:22And the person is a candidate in the Iraqi parliamentary election of 2025.
44:26So this is in the same country.
44:28You've got a teenager.
44:30And then you've got a politician who's being affected by a picture of a chicken.
44:33And the politician is P1.
44:35The whistleblower says that these kinds of cases happen all the time.
44:38And they are really uncomfortable with this.
44:39That when kids are at risk, when under 18s are at risk,
44:42it's not the first thing that they are tasked with dealing with.
44:45Instead, they're tasked with dealing with stuff about politicians or about political figures
44:49or in places where it's a priority to deal with it.
44:52But this, for example, this is the next wave of stuff they've got to come to.
44:55And they've already got loads of examples of cases and harms that they're looking at.
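What the dashboard walkthrough describes is a triage queue ordered by a priority label (P1 ahead of P2) rather than by severity of harm. Here is a minimal sketch of that failure mode, with entirely hypothetical ticket data echoing the cases above; the real tooling is not public.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Ticket:
    priority: int                             # P1 = 1 sorts ahead of P2 = 2
    description: str = field(compare=False)   # never considered in ordering

queue: list[Ticket] = []
# Hypothetical tickets mirroring the reports shown on screen.
heapq.heappush(queue, Ticket(2, "minor reporting sexual blackmail"))
heapq.heappush(queue, Ticket(2, "17-year-old reporting cyberbullying"))
heapq.heappush(queue, Ticket(1, "politician mocked with a chicken photo"))

# Because ordering looks only at the label, the politician case is
# worked first, however grave the child-safety reports are.
first = heapq.heappop(queue)
print(first.description)  # -> "politician mocked with a chicken photo"
```

The code is trivially "correct" as a queue; the harm sits in how the priority field is assigned upstream, which is exactly the whistleblower's complaint.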
45:03What would you say to a parent who has a kid or a teenager that is using TikTok?
45:08Delete it.
45:10Keep them as far away as possible from the app for as long as possible.
46:01Do you find yourself, like, does it tend to be more what we call short-form videos,
46:11so TikTok reels?
46:12Do you think it's that kind of thing?
46:13Yeah, I just scroll on there for hours.
46:15I also do that too.
46:18Are there things that pop up on your feed and you think,
46:21I don't really want to see more of that?
46:23Probably, like, the same, like, violence.
46:24It's like the bullying, the fighting, really, because I don't like seeing them, like,
46:29it just makes people think it's right.
46:31And then you report it, it doesn't come up for, like, a day or two,
46:34and then it'll just pop back up.
46:36And then I keep doing it and it just keeps coming back.
46:38It kind of just makes me feel like, oh, they're not listening to me,
46:41I'm telling them I don't want to see this kind of stuff.
46:44How do you feel when you see that kind of content?
46:46Just really frustrated, like, I've told them,
46:48I don't want to see this, I don't want to see this over and over again,
46:50and they just keep it there again.
46:52Because it's not necessarily something that people agree with,
46:55but it has so many views and so many likes that they keep putting the videos
46:58back out there and back out there because people react to them.
47:01If I watch too much and then there's just something that comes up,
47:06some person online just trying to get popularity,
47:09I shouldn't get too annoyed about it, but it's because I'm in that sort of state.
47:13I have a very strong opinion about it,
47:14though it could make me feel quite annoyed and angry.
47:20What I found at the time was just how often we were seeing the same content again and again
47:25that we didn't want to see from people we do not know.
47:29Way outside of the remit of even what you would have ever searched.
47:33And I think when that comes to kids and teenagers in particular,
47:36that was a serious problem because you're still trying to work out what's alright and not okay to do.
47:41And your feed is beginning to dictate the way that you look at the world.
47:50I've been wanting to talk about this publicly for over a year now,
47:54and now I've finally worked up the courage to talk about things I've kept quite close to myself.
48:01I started to abandon my old racist and misogynistic views as they no longer made any sense to me in
48:07my head.
48:08I think people need to be made aware that the amount of anger in a lot of hard rights circles
48:17was extreme.
48:19If the only thing you're hearing is about asylum seekers committing really bad offences,
48:25if you're constantly seeing that on your feed, then it is going to radicalise you.
48:31Like, a hundred percent it is.
48:39I know the whole house will be very concerned at the extremely serious incident that has taken place in Southport.
48:48Two children have been killed in what police called a ferocious stabbing at a dance class in Southport.
48:55And the mother that I was just consoling, just the look on her face,
48:59because she knew something had happened to a child.
49:00It's like, oh my gosh, how do you even comprehend? There's nothing you can do.
49:12At Twitter, when we'd have a really serious news story like that happen,
49:18our job was to make sure that anyone sharing completely unverified information would be pushed down in the algorithm.
49:30The safeguards were taken away.
49:35Because that is the first thing that Musk did.
49:39There are unsubstantiated opinions speculating about the identity of the attacker in Southport.
49:45And people are listening to them.
49:51And that is a spider web of misinformation that grows and grows and grows.
49:57So an undocumented migrant decided to go into a Taylor Swift dance class today and stab six little girls.
50:04We knew that people with an agenda could game the algorithm, could harness this huge amount of power.
50:11I knew exactly what was going to happen.
50:24I remember seeing that on video.
50:28Like within hours, a whole massive crowd descended on the town.
50:33A 17 year old male from Banks in Lancashire, who is originally from Cardiff, has been arrested on suspicion of
50:41murder.
50:41The news is stepping in and saying, we know who did this.
50:45But by then it's too late.
50:55The mosque there is the first thing that gets desecrated.
51:09And all of this is self-enforcing, completely circular, because when these things were happening,
51:16they're then being reported on social media platforms.
51:19And it spins faster, and faster, and people get angrier, and angrier.
51:30Fucking hell.
51:52And I remember, at that time, wanting to get to the bottom of who was behind these false rumors, allegations,
51:59suggesting that the attacker is an asylum seeker.
52:01That they arrived by boat to the UK, that they're Muslim.
52:05And I think now, if I look...
52:08One of the first places that published some of the speculation was called Channel 3 Now.
52:15It was just sort of pumping out stories.
52:17And I tracked down someone, a man called Farhan, in Pakistan, and I finally got this email back.
52:24We're only just trying to, you know, put news out.
52:27Essentially, we're kind of trying to get clicks, and we didn't mean to get it wrong.
52:32This is clearly a product of the algorithms, the social media systems,
52:35because Channel 3 Now wouldn't have wanted to post this content had it not known that
52:41it was going to get views and clicks and likes if it posted it as quickly as it possibly could,
52:45even if it was wrong.
52:46And in fact, it being wrong meant it got more likes, more views, more clicks,
52:50because it got engagement, it sparked a reaction, it was emotive.
53:02Southport felt like a real shift, like the violence connected to what was happening on social media.
53:07It's no longer far away.
53:15We are the canary in the coal mine.
53:17A lot of what goes on, and goes wrong, on social media was trialled and tested in countries like Sri Lanka,
53:22because nobody cares about us.
53:26All of our markets are active experiments.
53:30And so what is now much more evident in the UK, in other parts of the world,
53:37was very evident in countries like Sri Lanka years prior.
53:43So what I have warned against, what I have cried about, suggests that this is not a Sri Lankan problem,
53:50it is a social media problem, it is a Silicon Valley problem, it is a governance problem,
53:55it is a democracy problem, it is a problem for all of us.
54:02Algorithms now impact all our lives.
54:12Immigration centers targeted nationwide.
54:16Police overwhelmed.
54:17There have already been hundreds of arrests.
54:21The criminal law applies online as well as offline.
54:28Around this time at Meta, there was a shift.
54:31It's time to get back to our roots around free expression on Facebook and Instagram.
54:36They announced that they were killing the fact-checking program and they cut back very severely on content moderation.
54:41We're going to simplify our content policies and get rid of a bunch of restrictions on topics like immigration and
54:47gender
54:47that are just out of touch with mainstream discourse.
54:51It felt like they lost the commitment to combating a lot of these harms
54:57and seeming to just not really care about actually listening to the data and actually trying to solve problems.
55:04My biggest worry is the politicization of these platforms.
55:08Mark was generally trying to be neutral.
55:12But now Elon, I think, specifically treats X as a political outlet for his views.
55:18And I don't think any single company should be allowed to have this amount of influence and power over 4
55:25billion people.
55:28I want Trump to be president.
55:32It now feels like we're in a world increasingly inhabited by a new type of individual.
55:38Would you like to have a conversation?
55:40Sure, absolutely.
55:41Yeah, yeah.
55:43I don't like you.
55:44I think you spread hate.
55:45I think you spread bigotry.
55:46I think you piss a lot of people off.
55:49We record all of it so that we can put it on the internet so people can see these ideas collide.
55:54When people stop talking, that's when you get violence.
55:59Influencers whose capacity for generating engagement online is a huge part of their fame.
56:14Oh my God.
56:16Oh my God.
56:17He was just shot.
56:18He was just shot.
56:20Charlie Kirk was shot dead while addressing students on a campus in Utah.
56:25Videos of his assassination reached millions across the globe.
56:28The killer's reasons were initially unclear.
56:32Right now federal investigators say they are talking to this man, Tyler Robinson, about why he might want to kill
56:38Charlie Kirk.
56:40Conspiracies about who he is and what he believed are flourishing.
56:44But he'd left some clues at the scene.
56:47There were inscriptions carved onto the bullet casings that were believed to belong to the alleged shooter.
56:54You'd only understand them if you've spent a lot of time on social media.
56:58They contain a number of seemingly sarcastic, irreverent phrases that are common in gaming and online communities.
57:06A seemingly ordinary student wrapped up in a dark, nihilistic meme culture.
57:13Did the killer have one eye on the online reaction provoked by his violence?
57:17Are we witnessing a new ideology of engagement?
57:25Algorithmic systems shaping politics, shaping society, shaping how events unfold.
57:37If I was parenting young children right now, I would encourage them to spend as little time online as possible.
57:45We have lost lives to this. We know where this goes.
57:50Social media platforms have an essential responsibility to the truth.
57:55We want this sort of accountability and transparency. We have to make it legally required.
58:00The problem's going to get worse.
58:02I can't continue existing without trying to fight.
59:05For more insights from me and the experts
59:08at the Open University about how to stay alert to algorithms,
59:12scan the QR code on screen
59:14or visit connect.open.ac.uk forward slash rage machine.