Viral Voices, Virtual Walls: What is Social Media’s Role in Shaping Democracy?

Category: Technology

Transcript
00:30Thank you, Alan. As the founder of Lead Stories, and having been a professional journalist for several years, including your time
00:39at CNN, your current work involves a lot of debunking misinformation across platforms. Could you walk us through how social
00:49media has fundamentally changed over the last few years?
00:53Yeah, to put my perspective in context: when she says several years of journalism — I actually started in the business more
01:00than 50 years ago.
01:01When I was writing my stories on a typewriter, reading them to an AM radio audience, this was back when
01:09people would get their news from the radio or the local newspaper delivered to your doorstep, and everybody
01:15was pretty much literally on the same page.
01:20But that's not the way it is anymore. People get their news now, 50 years later, through social platforms.
01:26And it's really distorted what's happening, because instead of just a few sources whom you knew, and whom you could hold
01:35to account if they made a mistake,
01:37we now have millions of creators of content. And most of them, by far, are anonymous. And there's no accountability.
01:50And if I wanted to create misinformation — not that I do, but if I wanted to — I could take a
01:57political speech, and with about $2 and 10 minutes, I could make it seem like that politician was saying
02:07the complete opposite, and put it out without any significant accountability or consequences.
02:14It might get a community note attached to it, or might not, and that would come days later, after millions
02:21of people have already seen it.
02:23So there's really no accountability. People don't know what to trust. And it's very easy to create misinformation.
02:32When Mark Zuckerberg threw the fact-checkers under the bus a few months ago in the U.S., and I
02:38was one of them, we were one of the third-party fact-checkers.
02:42He said that fact-checking is censorship, but it's not, and he knew better. We add information to the conversation.
02:50We put facts in where facts are needed and missing.
02:55So fact-checkers are demonized. Journalists are called the enemy.
03:00What happens then is people will turn to something like ChatGPT or Grok to ask, is this real?
03:07And on social media, believe me, AI tools are liars.
03:13We can give you some very real examples every day of where that happens.
03:17And so it's really a dangerous, perverted situation, and it kills people.
03:22Misinformation kills people on social media.
03:26I had a good friend who died this week from breast cancer.
03:29Diagnosed six years ago, she didn't want to go for treatment.
03:32She was looking for alternatives online, and for the first two years she was following a path of alternative
03:39medicine that she learned about largely through social media.
03:42Two years into it, she realized it wasn't working, but it was too late.
03:46So now her 14-year-old son doesn't have a mother.
03:50Social media misinformation kills, and you have to have fact-checking, you have to have accountability.
03:56And in the U.S., we're really looking to Europe to hold the platforms accountable because it's not happening in
04:03the United States.
04:04And that's kind of where we are today.
04:08Looking at you, Craig: you're an investigative journalist, you co-founded Indicator Media, and your work constantly exposes you to
04:17disinformation and manipulation.
04:20How do you think algorithms have evolved in that respect?
04:24Are they keeping up with the sheer amount of disinformation online?
04:29Yeah, you know, social platforms are really systems for capturing people's attention and then distributing it.
04:36They want you to post, they want you to be engaged, they want you to spend time.
04:39And so they come up with algorithms, basically recipes, to figure out what content they want to show to what
04:45people at what time.
04:46And when it comes to false information — but also just anything that's a little more extreme, maybe a
04:53little more violent, a little more sexualized — the way these systems are built and tuned tends to
04:59reward the extreme: the outrageous claim, the remarkable image.
05:03So it could be something that's completely false, or something that's just far beyond what these more mainstream
05:08platforms are supposed to be showing people.
05:10And, you know, that kind of bias towards whatever is going to get people to click and to spend time
05:15is really still baked into these algorithms today.
05:19So whether it's TikTok or whether you're on Facebook, at the end of the day, it's them choosing what's going
05:24to be shown to you.
05:25And sure, it might be the accounts that you chose to follow, it might be your friends, but they're also
05:29inserting other content in there.
05:31And, you know, when we look back at the history, say over the last decade or so, these algorithms,
05:36even knowing all this false and misleading content is there, still reward the stuff that tends
05:42to be the most extreme, the most titillating, to get people to spend time on it.
05:47And I think, you know, Alan mentioned Europe. The reality is that the EU is different from anywhere
05:52else in the world right now when it comes to algorithms, because the Digital Services Act has mandated a certain
05:58level of transparency from these companies.
06:01And that's the exception.
06:03And we still don't know a lot about how the algorithms choose what to show you.
06:08There have been lots of efforts to kind of reverse engineer TikTok or what have you.
06:12And at the end of the day, we don't know, they're black boxes, but because they tend to reward the
06:17stuff that's going to grab people's attention, it tends to align with things like false and misleading information or sometimes
06:23stuff a little more violent or what have you.
06:27Agatha, picking up on the fact that social media can do actual harm to people: you're the executive director of
06:36Agora Research, a research center based in Uganda.
06:39And you've recently had, well, let's say, a lot going on.
06:44Can you walk us through what you're doing and what has been going on recently?
06:50Thank you very much.
06:51It's a lot.
06:54I hope that I've cried enough over it and I will not today.
06:58But I'll start with a different side of social media: for us in Uganda and East Africa, it has
07:09helped us to counter the narrative of the authoritarian governments that we are dealing with.
07:15I used to be a journalist in mainstream media and there were two disadvantages there.
07:25The first was that you are dealing with 70% of Uganda's population being under 30 with a very, very
07:34short lifespan.
07:35They don't read newspapers.
07:37They are not interested in that.
07:38But you want them to get to know what's happening, because a lot is happening: a lot of corruption, a
07:44lot of what comes with a government that has been in power for 40 years
07:49uninterrupted and doesn't want to leave.
07:52A lot of dictatorship, patronage, human rights violations.
07:56So we saw that social media is where you find people in one place — they were already there
08:03initially — and we brought the serious, important issues to them in a way that they would grasp.
08:11And the other reason was that mainstream media in Uganda is mainly owned by the government, or by business people who
08:20do not want to antagonize the government,
08:22or by politicians who are MPs, members of parliament from the ruling party.
08:28So there was no way you would criticize the government independently in the mainstream media.
08:36What happens now is that the government has come for us where we are publishing, and we are targets.
08:44So when you ask what happened recently: I was arrested in Tanzania, and I was missing. No one
08:55knew where I and my colleague Boniface from Kenya were for four days.
09:01And in those four days, we were tortured.
09:04So even now, my feet can't carry me.
09:09I have a picture of how they looked yesterday after that flight and they were so swollen and scary.
09:15So they tied us up because we had tweeted about the ongoing trial of the opposition leader in Tanzania over
09:26treason — trumped-up charges, of course.
09:28But we just went for a trial, and I was tweeting what was happening: facts. There was nothing I
09:34added myself.
09:36So we were beaten, we were sexually assaulted with objects being shoved into our rear.
09:45We left with wounds and broken bodies.
09:49And we had been threatened never to say anything about what happened, never to step into Tanzania again — never mind
09:57that Tanzania is part of the East African Community.
10:00We all have one East African passport.
10:03So we should be free to travel to either country.
10:08But we have learned that our ordeal, what we went through, was probably ours to share, because we have a platform.
10:19Boniface Mwangi and I have a platform that the Tanzanians with no name and no voice — who are going through, who
10:28have been going through, what we went through — do not have.
10:31So we have talked about it, difficult as it is, because dictators who use sexual violence use it to
10:41achieve two things.
10:45It inflicts maximum pain, but it also dehumanizes.
10:50So they use it to shame you, and they have used it so that the victims don't go and speak
11:01about it.
11:02And when they were threatening us not to say anything about it — because they filmed us as they did that,
11:11and they said that they have our videos,
11:15and if we talk about it, they would publish them — in my heart, of course, I couldn't say this to
11:22them.
11:22I was with six killers, six criminals, those four or five days, blindfolded.
11:29You don't know what more they were capable of.
11:32But they were telling me this, and I was thinking: you must be stupid people — with so much power,
11:38but also stupid.
11:40Why do you think I should be the one ashamed of a video that proves your criminality, you know?
11:47I doubt they would put it out, but I am sure they have used that to silence many people,
11:53because it's so dehumanizing and erodes your dignity that that happened to you, that they did that.
12:04Yeah, so let me just stop there for now and catch a breath.
12:08Thank you.
12:09Thank you.
12:19No, thank you for being here and sharing your testimony with us. It's really powerful to be able to
12:27learn that first-hand from you.
12:31I'm going to move to Jen now, because you're a lecturer at Columbia Law School, Columbia University.
12:37But before that, you were working at Meta on trust and safety issues,
12:43and you are quite familiar with the internal structures within these platforms.
12:50What insights can you offer us relating to how these platforms work on user safety issues and security?
12:59Yeah, thank you.
13:01So I think there's a number of different angles to explore here,
13:05and I'll kind of go at the highest level, the most strategic, down to more granular.
13:11So one is what's been directly spoken about here already,
13:17which is how corporate priorities shape civic discourse, right?
13:22When we talk about how platforms impact civic discourse,
13:25it's really important to understand what's at the root of what drives these companies,
13:29what's their business model.
13:31And the company's business models aren't just a way of making money.
13:36They are a gravitational force that then determines how decisions get made,
13:42what behaviors get incentivized, how teams are rewarded as part of their performance reviews.
13:48And all of these things do not necessarily align with values of democratic resilience,
13:55supporting vulnerable users, or improving the health of civic discourse.
14:03So I think understanding the core of the business model is key here.
14:09And thus, civic discourse isn't necessarily shaped by what people are saying online.
14:14It's shaped by what the system is rewarding and amplifying.
14:20I think at the highest level, there's this sort of business model.
14:24Then there's the incentives about the teams that go into these decision-making rooms.
14:29And they're incentivized by what their teams are supposed to do, right?
14:35And often, those key performance indicators are not aligned with healthy, civic discourse.
14:42They're usually defined by growth, engagement, and what keeps people active,
14:48assuming it's an ads model.
14:49There's different business models that incentivize different behaviors.
14:53So internally, you have teams that are jockeying for influence and power there.
14:58And then you have external drivers, so regulatory environment, press, political influence.
15:07And so you have this sort of mix of internal incentives and disagreements and structural challenges,
15:14and then external pressures.
15:16And the impact of all of this messiness becomes acute when you're dealing with decisions
15:24that affect countries that might have weak institutions or that have ongoing violence.
15:31So these decisions become a lot more important.
15:34Also, I worked on the trust and safety side of the house.
15:38Those organizations are not typically the ones that have the most power, right?
15:45They're often brought in after the fact, or handed an issue to deal with downstream.
15:54And at many companies, it's really the product and the growth teams that have the most power.
16:01And their success metrics are, again, not usually aligned with healthy societies.
16:07So really, global civic discourse is being influenced by decision-making structures that are built for scale
16:15and for incentivizing specific things, and are not designed to take into account truth or pluralism or justice.
16:24I think also – so we have incentive structures.
16:28We have external drivers.
16:30And inside tech companies, the governance doesn't look like anything constitutional —
16:36it depends on the size and the maturity of the company, right?
16:43The governance and sausage-making of decisions often happen in cross-functional meetings with a lot of different teams
16:49that, again, have an uneven power distribution, right?
16:54Often, the teams with the most power and the loudest voices have an outsized influence on how decisions get made.
17:01Often in crises, there are unclear escalation paths, or well-meaning experts who chime in,
17:07or executives who have a pet interest in something.
17:12So all of this is a sort of constitutionalism without a constitution.
17:16It's a bunch of messy inputs whose tensions get acutely tested in times of
17:26crisis.
17:28And there's often no transparency into how these decisions get made.
17:32So if we think of these internal governance processes as a mirror of democratic institutions,
17:39there's a lot that's lacking in that regard.
17:45A couple of other quick notes: internal culture plays a really big role in governance.
17:50Does your internal culture reward people who raise risks?
17:55Does it reward people who are waving the flag?
17:59Or is that something that's suppressed,
18:02or something that hurts your performance review, right?
18:06So there are all these other things that go into how decisions get made inside companies,
18:11and they are not necessarily transparent.
18:14And, yeah, the last thing I'll say, also on the lack of transparency —
18:19and Craig knows this all too well — is that researchers and investigators are no longer getting access
18:25to the type of data that they used to.
18:28And this, again, cuts off really crucial external scrutiny and oversight, not only into how decisions get made,
18:39but into what's proliferating on the platforms.
18:41So it's a long-winded way of saying there's a lot of things that go into governance,
18:46not a lot of transparency, and it's a complicated situation.
18:52I'm looking at you, Alan, I think – yeah.
18:54Yeah.
18:55Six years as a third-party fact-checker with Meta.
18:58I suppose there are some NDAs that prevent me from sharing a lot of the details,
19:02but I will confirm what she said, that there are tensions between the different legs of the stool.
19:09The trust and safety people are really good,
19:13and they became people that we respected and we worked with.
19:17But we would also see people on, let's say, the government relations side
19:22who were not exactly happy about independent fact-checkers
19:29fact-checking content from their friends, or from people they wanted to influence.
19:33And then there's also the growth side, the product side of a platform.
19:39If they really thought about it, fact-checking, content moderation,
19:44and a good experience for the user are positive for growth.
19:48But, short-sightedly, they didn't want us to fact-check some ads that they had,
19:55or some content that drove a lot of people to spend time on the platform.
20:04So there's a lot of tension, and that's why, ultimately, I think fact-checking
20:08is a very difficult thing on a social media platform.
20:11But we were doing it for years, and I do have to commend Meta, Facebook at the time,
20:16for creating the third-party fact-checking system,
20:18because it is still driving an awful lot of good work in the world today,
20:24but unfortunately, they're retreating on it.
20:27And also, Jen, picking up a bit on this:
20:30having a good trust and safety team is maybe half the battle.
20:35So what are some approaches and safety-by-design principles
20:40that companies or policymakers can proactively adopt in this regard?
20:46Yeah. So one is, you know, making safety and the quality of the experience
20:54not just the responsibility of the trust and safety team, right?
20:58This should be something that's included and conceptualized within a platform's metrics of health.
21:06So first is sort of including teams early on in the design process,
21:12whose expertise, you know, if they're reflective of different communities,
21:17getting different perspectives in the room early on
21:20as to sort of what potential harms or issues could come up.
21:25The earlier, the better.
21:27The gaming industry, actually, I think, has something to teach here.
21:30Certainly, no industry has solved this challenge of human nature,
21:35but a lot of gaming companies intentionally designed systems
21:41to get players to behave in certain ways, right?
21:44And some of the ways that they do that is through setting norms
21:47in a way that people can learn and engage with
21:52rather than sort of reading a punitive list of community guidelines, right?
21:56But sort of having a more organic and embedded way of letting communities
22:03set their own norms and communicate it accordingly.
22:08I think, again — if we're talking about the bigger incumbent platforms —
22:13the more we can provide things like tools for admins and moderators,
22:18not to control things, but to work with their community
22:24to design a vibe that works for them
22:27and hold people accountable if they don't behave
22:29or act in certain ways,
22:32the more we put power into the hands of communities
22:35rather than taking a top-down approach.
22:38So I think early involvement of different voices,
22:43empowering those people to be able to provide input,
22:47designing intentionally depending on what the product is,
22:51and, yeah, more power tools and autonomy to communities
22:56for them to decide what's best
22:59rather than a group of folks in Silicon Valley
23:02who, you know, may not have the same incentives.
23:08Alan, you're dealing with content moderation
23:11across different countries: 21 languages, 4 continents.
23:16What significant differences do you see
23:20in how misinformation spreads?
23:23Well, every region and every language has its differences.
23:27We're in Asia.
23:29You know, Japan and South Korea are certainly different to work in
23:36as opposed to Eastern Europe or the U.S.
23:41And so it's definitely different.
23:44But you see global trends.
23:46And this is the perspective that we have at our company.
23:49We have about 70 or 80 people who are looking for these trends,
23:53as you said, in 21 regions, 21 languages.
23:56And we can connect the dots and see the trends.
23:59And I can tell you it is a global problem.
24:03I was just thinking about this question that we're asked,
24:07what is social media's role in shaping democracy?
24:09My question is, what is democracy's role in shaping social media?
24:13And I think we ought to flip it a little bit
24:15because we don't need social media to shape us.
24:20We need to shape social media.
24:25Applause?
24:30For a fact checker?
24:34Agatha, looking at you,
24:35because you're also dealing with misinformation,
24:38specifically in a country with low literacy rates.
24:43So what are the biggest challenges in educating the public
24:48about digital literacy and the risks of misinformation?
24:55So, like Alan said, people are applauding a fact-checker.
24:59People don't like facts.
25:02They are happy that everyone can write whatever they want to write.
25:06No gatekeeping.
25:08No one asks them anything.
25:09So it's hard to get people like that to pay attention, you know.
25:14Because when you look at Uganda's case,
25:18the people who have the biggest following on social media
25:21are these bloggers who never investigate anything.
25:25They just, you know, put it out there,
25:27and people are happy to follow that.
25:29But if you just decide to become one voice of reason,
25:33there are going to be some people who pay attention to that,
25:38because it turns out they follow those bloggers for fun.
25:44But when they want something, you know, factual,
25:48you can tell they know where to go.
25:52I have seen it on X in Uganda,
25:54where people just write breaking news:
25:58someone is dead, you know.
26:00And they say, no, I'm not going to believe it
26:03until I see it on the handle of this news media house.
26:07And I cover the parliament of Uganda a lot.
26:11So when they hear something about the parliament,
26:13they say, we won't believe it until Agatha writes it.
26:16Because that means that they have seen that you write facts,
26:21you tell the truth about something.
26:23So you'll have a few that care about that.
26:25But the majority do not.
26:27And it's just those of us that understand.
26:32Because many times they do not understand the implications,
26:37the consequences.
26:39And Alan here talked about how social media —
26:42I mean, misinformation — kills.
26:43But he was talking about a person it killed
26:47through a disease.
26:49It kills people who are not even ill.
26:51Because people have died by suicide
26:54because of cyberbullying.
26:58People have had mental illnesses
27:01that ended up killing them
27:03because, you know,
27:05someone just gets up and, for fun,
27:07writes things about you,
27:08without quite getting the gravity of it.
27:13So one other thing we face a lot,
27:17as people who insist on the truth
27:21in the country I come from —
27:23I don't know if it happens to other people;
27:25we tend to think that more developed countries
27:28have fewer problems, or different problems from ours —
27:31is that the first way the government dealt with us,
27:37and it still is,
27:40was with bot accounts, propagandists
27:43who spread the most malicious falsehoods against you.
27:49If you are a woman,
27:51they will start by slut-shaming you.
27:53Every day they list
27:55a number of men I sleep with —
27:57some of them I've never met,
27:58I've never even seen.
28:02They have given me more children than I have,
28:04and they say every child has a different father,
28:08or something like that.
28:10And it takes a certain level of strength
28:16to deal with that every day,
28:18to know that your children tomorrow
28:20will find stuff like that.
28:22It makes many people leave that space,
28:26and that's why you don't have many women in that space.
28:29So, in my case I decided to use it
28:33to sort of sensitize other women
28:36so that it doesn't affect them.
28:38So, every time they've said that,
28:40I say, okay, so what's the problem?
28:42It's not a crime to sleep with whoever I want to sleep with.
28:45How is that a problem?
28:46How do you counter my saying
28:48that you are torturing people
28:50with "she sleeps around"?
28:52How do you counter my saying that you are stealing money
28:55that should be saving children
28:58and women giving birth
29:00with "she has four children with four men"?
29:02So, what I said is:
29:04until you bring a young boy I defiled,
29:08this is nonsense.
29:10And I tell other women:
29:11don't let them use your sexual
29:16life,
29:17real or perceived, against you.
29:18It's not a crime.
29:20But that's not just me.
29:23Others just go away.
29:25They get into depression,
29:27or something like that.
29:29And that has been a problem.
29:31When they saw that people like me,
29:33who have become their biggest headache,
29:35are not bothered by that,
29:37they started another line:
29:39that I am an agent of imperialists.
29:41If someone took a photo of me
29:43seated with you guys here
29:46and this audience —
29:47you see,
29:48she's an agent of imperialists.
29:49She's always on a plane,
29:51always on something like that.
29:53And the narrative
29:54feeds that disinformation.
29:57Because many people believe it,
30:01and they believe that
30:03whatever you're doing,
30:04you're doing it because
30:05someone has paid you to do it.
30:07It doesn't matter that
30:08what you're saying is right there
30:09and they can see it.
30:11So it's just
30:14keeping at it.
30:15That belief,
30:16that conviction
30:17that the truth is important
30:20is what drives you
30:21and makes you say,
30:22I will do it at whatever cost.
30:25And you just hope
30:28that you keep shifting things a bit
30:30and get some sort of
30:32incremental change.
30:33Yes.
30:35Yeah.
30:36Picking up on the aspect
30:37of accountability
30:38and power structures:
30:41Alan said, you know,
30:43we should be shaping social media
30:45and not the other way around.
30:46And Agatha said
30:47there needs to be accountability.
30:50Looking at you, Craig:
30:52how do you see,
30:53how do you answer,
30:54the regulatory gaps
30:56that currently exist
30:57and that fail to keep
30:59platforms accountable?
31:02Yeah,
31:03it's a huge challenge,
31:04because, as you've heard,
31:06these are
31:07for-profit companies,
31:09pretty much
31:10typically
31:10publicly traded.
31:12They have shareholders,
31:14they have goals
31:14they have to meet
31:15and those things
31:16play a really big role
31:17and for them,
31:18regulation
31:19restrains their ability
31:20to meet their business objectives.
31:22And so that's one of the reasons
31:23that we have the gaps
31:24and the problems
31:25is that for a long time
31:26the view was,
31:27well,
31:27social media is a great thing.
31:29There's more voices,
31:30there's more people participating,
31:32people finding community.
31:34There are lots of benefits from it.
31:35But the government's approach was,
31:37let's let the companies
31:39do what they want.
31:40And now they're trying to make up for
31:42years and years
31:43of not having any regulation
31:44in this area,
31:45in many of the major places
31:47where the companies are based,
31:49largely in the U.S.
31:50and other regions.
31:51And so there is
31:52a kind of catch-up
31:53that's trying to happen.
31:54There's the EU,
31:55but also many other countries
31:56around the world.
31:56I'm based in Canada.
31:58The Canadian government
31:58has tried
31:59to bring in legislation
32:00to get the platforms
32:01to contribute money
32:02to help support
32:03the news, film,
32:04and content industries
32:05in Canada
32:06so that, you know,
32:07Canadian content
32:08still gets represented.
32:10And Meta's response to that
32:11was to basically block
32:12the sharing of all news links
32:14in the country.
32:14If you try to share
32:15a news article
32:16from a Canadian publication
32:17in Canada,
32:17you are not able to do that
32:19on Facebook or Instagram.
32:20And so the gaps are significant
32:22because of the hands-off approach,
32:24because they have
32:25the funds and resources
32:26to do lobbying,
32:27and then there's also
32:28kind of the will.
32:29And, you know,
32:31for the people here,
32:31you might be feeling
32:32very depressed,
32:33like,
32:33well, wait,
32:34what can we do?
32:35We're overwhelmed.
32:36This is where the platforms are at.
32:37And the truth is that
32:38I think the average person
32:39still does have power.
32:40You have power starting
32:41with where you choose
32:43to put your attention.
32:44What are you looking at?
32:45Are you allowing yourself
32:47to sort of be influenced
32:48by what is there
32:49and what is in front of you?
32:50Or are you keeping in mind
32:51that, yes,
32:52these platforms
32:52have certain incentives.
32:54People get very good
32:55at playing to these incentives
32:56and getting their content
32:57in front of more people,
32:59manipulating them emotionally
33:00to get those views
33:01to earn money.
33:02And so guard your attention
33:04and, at the very least,
33:05be aware of
33:06how much these environments
33:07are being manipulated.
33:08It is a professionalized industry:
33:10bots,
33:11trolling,
33:12content generation.
33:13You can hire people
33:15to do this for you.
33:16And so you need to be aware
33:17of that in the environment.
33:18And you also need to think
33:19about what you value
33:20and what you might want
33:21to communicate
33:22to your elected officials.
33:23Because that is the way
33:25that it potentially starts
33:26to change
33:27and those gaps get filled
33:28is that the regulators
33:30feel that there is
33:31the support to do this
33:32because there isn't always.
33:34So they do need to hear that.
33:35And hopefully they will also
33:37push on
33:37some of the core things —
33:38for example,
33:40the one that I think
33:40the EU has been leading on,
33:41which is a good one
33:43to go after:
33:43transparency.
33:45So it is really tricky
33:46to regulate
33:47these massive platforms.
33:48These companies
33:48don't even know
33:49most of what is on there.
33:50There is so much content
33:51being uploaded
33:51at any given minute.
33:52They're trying to find it.
33:54They're trying to see it
33:55and use automated systems.
33:56They don't see things
33:57because there's too much of it.
33:58So to actually
33:59tell them,
34:00well, you have to do this
34:01and you have to do that —
34:02at the scale they operate,
34:03it's really hard
34:04to enforce that.
34:06You could try to get them
34:07to spend a lot more money on it,
34:08which would be one option.
34:09But I think transparency
34:10is one of the key pieces,
34:11which is can they make data
34:13available to independent
34:14researchers and others
34:15to look at it
34:16and study and identify
34:17and point out gaps
34:18and problems?
34:19Can people have access
34:20to their own data
34:21on these platforms
34:22about what kind of
34:23targeting information
34:24they have about you,
34:25what kind of preferences
34:26they think you have,
34:27so you can actually see it
34:29and choose to leave
34:30that platform
34:30if you don't like
34:31what they seem to be
34:32collecting about you?
34:33So I think that piece,
34:34pushing more on transparency,
34:36getting more about
34:37what is going on
34:38on these platforms,
34:38getting more data
34:39for you as an individual,
34:41those are probably
34:41good places,
34:42but realistically,
34:43it is going to be
34:44a long, hard fight
34:45to fill in those
34:46regulatory gaps.
34:48And we should also be aware
34:49that it's very hard
34:50to legislate well
34:51in this area.
34:52You could have
34:52very well-meaning legislation
34:54that does genuinely end up
34:55in the censorship
34:56of completely valid,
34:59helpful information
35:00around the world
35:01because they're trying
35:02to rein things in so much.
35:03So it is tricky.
35:05There are some things
35:06where there are valid points
35:07about not moving too fast
35:09on regulation,
35:10but obviously there is
35:10a lot of ground
35:11to be made up.
35:14Yeah, I write a lot
35:15about the Digital Services Act,
35:17and of course it's facing
35:18a lot of criticism recently,
35:19especially from
35:21the Trump administration,
35:22over creating censorship.
35:25And interestingly,
35:28the European Commissioner,
35:30Henna Virkkunen,
35:31who is in charge
35:31of the DSA, said
35:33that the platforms'
35:34own terms of service
35:36block more content
35:40than the DSA does on its own.
35:43So that's also
35:44an interesting perspective.
35:46And I think transparency
35:47is a key issue
35:48and I'm looking at Jen now
35:50to see what can we do
35:53collectively
35:54from a civil society perspective,
35:55from governments,
35:56from regulators
35:57on creating
35:59healthier digital spaces.
36:02Yeah, I think
36:03we've sort of discussed
36:05some of that
36:06in terms of giving
36:07researchers information
36:09because policymakers
36:11and regulators
36:12can't create
36:13good policies
36:13or regulations
36:14without data
36:17and without information, right?
36:19So, you know,
36:20a lot of poor policy
36:21or poor regulation
36:22gets churned out
36:23in the absence
36:24of data.
36:26I think investing
36:27in global civil society
36:28networks
36:30speaks to our earlier discussion of
36:33community-specific harms
36:35and nuances, right?
36:36A lot of these things
36:37are not monolithic issues
36:39across the globe.
36:41They're, you know,
36:41very context-dependent.
36:43I also, you know,
36:45ran for public office myself,
36:47because I really wanted
36:48to get involved
36:49on the local level,
36:51and also really wanted
36:53to move things
36:55more analog,
36:55because when I was running,
36:57our town Facebook groups
37:00were a dumpster fire
37:02of misinformation.
37:04I, you know,
37:05was bullied and harassed,
37:06and those types of things
37:08were not conducive
37:10to having decent civic discussions —
37:12in this case,
37:13for school board.
37:14So I'm also thinking
37:15about our children.
37:16How are we raising our children
37:18around these technologies?
37:19I, you know,
37:20I'm not someone
37:21who believes that
37:22cutting off all access
37:23to all children
37:24is the answer.
37:25I think it's very,
37:26again, dependent
37:26on the use case.
37:28But I, you know,
37:30I have a child
37:30and as our schools
37:33around the world
37:34are thinking through
37:34incorporation of AI,
37:36what does that mean?
37:38Is that appropriate at all?
37:40And one last thing
37:42on kids
37:43and education and AI,
37:44with regard to long-term
37:47democratic resilience:
37:48if we have kids
37:50who are using chatbots,
37:51or who are engaging a lot
37:54with them
37:55as personal assistants —
37:57and the chatbots
37:58are sycophantic —
37:59those chatbots are not teaching
38:00these kids
38:02the unbelievably critical skills
38:03of productive struggle,
38:05of learning how to disagree,
38:07of learning how to engage
38:09with people
38:10who disagree with you.
38:11I mean, these are core skills
38:13for functioning democracies.
38:15And if we're exposing
38:17our children to technologies
38:19that are undermining
38:20their ability to interact
38:21with a complex world,
38:23I am concerned
38:25about what that means
38:26for our societies
38:27moving forward.
38:29So those are my thoughts
38:30on that.
38:33Alan, the same question
38:34to you.
38:35Yeah.
38:36It seems like it would be
38:37a pessimistic situation,
38:39but I would point you
38:42to recent years
38:43when we were
38:44in a very dark time,
38:45the COVID pandemic,
38:47where we had an awful lot
38:49of life-threatening
38:51disinformation
38:52about medicines
38:53and the spread
38:54of the disease.
38:56And fact-checkers
38:57around the world
38:57stood up to that challenge,
38:59and I believe
39:00that we saved
39:01a lot of lives.
39:02Because of fact-checkers' work,
39:04you knew about
39:05various cures
39:06that weren't cures.
39:08You knew about
39:09the importance
39:09of vaccinations
39:11and the importance
39:11of social distancing,
39:13that sort of thing.
39:14And we did it.
39:17Facts are nothing
39:18to be afraid of.
39:19Do not demonize
39:20fact-checkers.
39:21That's what I ask.
39:22I'm not asking
39:23for any platform
39:24necessarily to cover up
39:26or take anything down,
39:28but if you would add
39:29fact-checking
39:30alongside,
39:31give us access.
39:32You don't even really
39:34have to pay us,
39:36actually,
39:37if the Community Notes program,
39:39which is free,
39:39would at least
39:40give fact-checkers
39:41a part in it,
39:43a lane in it,
39:44then we can contribute
39:46to that.
39:47It doesn't necessarily
39:48take governmental rules
39:50to do it,
39:51but just good business
39:52sense by the platform.
39:54So it's not as pessimistic
39:56as it may sound,
39:58but just give fact-checkers
40:00a chance.
40:02Well, I heard.
40:03Agatha,
40:05what role do you think
40:06platforms,
40:07government,
40:07and civil society
40:08can play
40:10in creating
40:10a healthier digital
40:12environment?
40:14I think
40:16research,
40:18Jen emphasized that,
40:20which really
40:22is a function
40:23of resources.
40:24I think we need
40:25to put
40:26a lot of resources
40:28in research.
40:29We don't have
40:31a lot of that
40:32in civil society
40:34and journalism
40:34where I come from.
40:35There is no money.
40:38Many journalists
40:39in Uganda
40:40are paid
40:41per story.
40:42Now,
40:43if you're paid
40:44per story,
40:45you're not going
40:45to spend
40:45a week
40:46pursuing a story,
40:48looking for facts,
40:51verifying,
40:53because that means
40:54in one week
40:54you will do
40:55one story
40:57and nothing more.
40:58So
40:59the story count
41:01is very important.
41:01That can only
41:03be cured
41:06by getting
41:07independent funding,
41:09funding independent
41:11journalism.
41:12I think we need
41:13to look at that
41:14more seriously
41:15than we do.
41:16Civil society
41:17organizations
41:19are a lot
41:22like politicians
41:23or political parties
41:24where I come from.
41:26They are political,
41:27or they just
41:28ride on people's
41:29sentiments,
41:29because they see
41:30that sentiments
41:31work better —
41:34and the politicians do
41:35as well.
41:36There's nothing
41:36that is driven
41:37by research.
41:39In Uganda,
41:40they passed
41:41the Anti-Homosexuality
41:42Act
41:43for the second time
41:44in 2023.
41:48And the narrative
41:49was that
41:50homosexuals
41:51are recruiting
41:52children,
41:54you know,
41:54young children.
41:56And there was
41:58no report
41:59that showed
42:00anything like that.
42:01And no one
42:02asked for one.
42:03They were talking
42:04about "our children,
42:05our children."
42:06And when you look
42:07at police crime
42:08reports,
42:09the children
42:10that are at risk
42:12are young girls
42:13being defiled
42:14by heterosexuals.
42:16No one cared
42:18about that.
42:18So can we do
42:19enough in
42:20research —
42:21but also
42:22on consequences,
42:24on accountability?
42:25Is there a way
42:27that we can
42:27have consequences
42:29for people
42:29that,
42:30you know,
42:31peddle
42:31falsehoods,
42:33these narratives?
42:34But who would
42:35ensure
42:37there are those
42:37consequences?
42:38It's not the
42:39government
42:39because they are
42:41benefiting
42:41from falsehoods.
42:44It's not our
42:44governments.
42:46I think it's also
42:48not the audiences
42:49of these platforms.
42:51It would be
42:52the owners,
42:53the administrators
42:55of these platforms.
42:56But how?
42:57Because even for them,
42:59it doesn't suit them.
43:00So I think
43:01we need to go back
43:03to the drawing board,
43:04as people who care
43:05about truth,
43:06about facts,
43:07and say:
43:08is there a way?
43:09Do we have any leverage
43:10against Facebook,
43:12against X,
43:13against TikTok,
43:15to flag everything
43:17that is false,
43:19and, you know,
43:20to have
43:21penalties for
43:22these people?
43:24Well,
43:24I am not sure,
43:25but those are my
43:26views.
43:27That's what I try
43:29to think about
43:30when I think about
43:31what is happening
43:32in these spaces.
43:36Thank you very much.
43:37So going back
43:37to the drawing board,
43:39indeed,
43:40and with that,
43:41we'll end the session.
43:42Thanks a lot,
43:43everyone,
43:44and let's give
43:45a warm round of applause
43:46for all the panelists.
43:47Thanks.
43:48Thank you.