What Does Democratic Tech Look Like
Category: Technology

Transcription
00:00Great. Thank you all for being here. We've got a great panel coming up.
00:06My name is Jeremy Kahn. I'm the AI editor at Fortune, the business publication.
00:12I'm also the author of a forthcoming book on AI called Mastering AI, a survival guide to our super-powered
00:18future that will be out in July.
00:20We've got, as I said, a great group of panelists here to talk about tech and democracy.
00:24To my left is Sonia Solomon, the deputy director of the Center for Media, Technology, and Democracy at McGill University.
00:33Next to her is Andy Yen, the founder and CEO of Proton.
00:38Next to him is Mitchell Baker, the executive chair of Mozilla.
00:42And then at the end is Audrey Tang, who just this week stepped down, ending her term
00:48as digital affairs minister of Taiwan.
00:53So there's obviously a lot of concern about technology and democracy and what technology is doing to democracy,
01:00particularly, and I'm sure we'll get into this later with regard to AI and deep fakes and the spread of
01:06disinformation.
01:08But I want to start out by talking about maybe some ways in which technology can enhance democracy.
01:13And in particular, how can technology increase the ability of citizens to influence their governments and potentially make their views
01:25known?
01:25And I want to start with Sonia to talk a little bit about some ideas that they've tried at McGill
01:33and in Canada on this very idea.
01:36Sure. Thanks, Jeremy.
01:37So for me, I would define technology as a possibility, right?
01:42It either inhibits or enables a possible outcome, including its impact on democracy, as you've noted.
01:47So I think in order for us to tap into those democratic potentials,
01:51we really need to be intentional about those values from the design and the governance of technology.
01:58I think it's a very different thing to make sure bad things don't happen and to make sure good things
02:04do happen.
02:05So I think we need to be intentional about those goals and direct AI development towards socially beneficial outcomes that
02:13keep our public interest in mind,
02:14that keep the environment in mind.
02:16And ultimately, if we want democratic tech, I think we're going to need democratic governance of tech.
02:23And we're going to need industry incentives that are really intentional about empowering social impact.
02:30Some of the things that we've done at McGill is we've hosted a number of citizens' assemblies in order...
02:40Can you tell people what a citizens' assembly is?
02:40Because I think that's...
02:41People may not be familiar with how that works.
02:43It's a very Canadian concept.
02:45So a citizens' assembly is basically a randomly selected and representative policy jury
02:51that, over the course of a number of days or weeks, gets together in a room to consider and be
02:59informed on a range of tech-related issues
03:02and ultimately reaches consensus in the form of direct recommendations that feed into government or public bodies.
03:10So we've hosted a number of in-person citizens' assemblies.
03:14And what's remarkable about them is it's quite a different flavor and tone than what you see on social media.
03:23As polarized as people seem and are, it's very different when you get people together in a room to debate
03:30these issues.
03:30And they reach consensus.
03:32And not only do they reach consensus, but remarkably quickly.
03:35Like, we're talking in a matter of days.
03:37That's great.
03:38I want to get Audrey in here because, Audrey, I know you have experience in Taiwan of also running these
03:42citizen assemblies.
03:43Can you talk a little bit about that experience?
03:45And then how has technology helped to facilitate those?
03:49Sure.
03:49My current organization as of today, the Collective Intelligence Project, or CIP,
03:54worked with my previous organization, the Taiwan Digital Ministry, on such an alignment assembly.
04:01It's a citizens' assembly around one particular issue:
04:04how should we respond to information integrity harms?
04:08That is, the deepfake fraud and whatever else generative AI is causing in the online world.
04:13Now, in the traditional poll or a traditional focus group,
04:17the solutions are already predetermined by think tanks, by the, you know, state or whatever.
04:22But in Taiwan, we tried a very different method.
04:25We sent SMS messages to 200,000 random phone numbers.
04:31Each contained a single survey link, saying the government would like to hear what you propose
04:36to do about information integrity online.
04:40And because it comes from the 111, it's a trusted government number.
04:43People know it comes from the state.
04:44And for the people who fill in the survey, we ask some demographic questions
04:49so that we know when we randomly select the 450 people, it's a microcosm.
04:55It represents the composition of the Taiwanese population.
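The selection step Tang describes, drawing a panel whose demographics mirror the survey pool, is essentially proportional stratified sampling. A minimal Python sketch, with an invented one-field demographic (`region`) purely for illustration:

```python
import random
from collections import defaultdict

def draw_microcosm(respondents, strata_key, k, seed=0):
    """Draw k people whose strata proportions mirror the full pool.

    respondents: list of dicts with demographic fields
    strata_key:  function mapping a respondent to a stratum label
    Note: per-stratum rounding can make the total drift slightly from k.
    """
    rng = random.Random(seed)
    strata = defaultdict(list)
    for r in respondents:
        strata[strata_key(r)].append(r)
    picked = []
    for label, members in strata.items():
        # Allocate seats proportionally to the stratum's share of the pool.
        seats = round(k * len(members) / len(respondents))
        picked.extend(rng.sample(members, min(seats, len(members))))
    return picked

# Hypothetical pool: 60% north, 40% south.
pool = [{"region": "north"}] * 600 + [{"region": "south"}] * 400
panel = draw_microcosm(pool, lambda r: r["region"], k=50)
```

A panel drawn this way contains 30 "north" and 20 "south" members, matching the 60/40 split of the pool.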
04:58The main difference is that instead of face-to-face meeting a human facilitator,
05:02we use video meetings and robot facilitators.
05:05So those 450 people, 10 at a time, meet online.
05:09And the robotic facilitation, built with Stanford, nudged them to think about novel solutions
05:17and share their feelings and so on, to make sure that people are there not to listen to a broadcast
05:22but instead to have a broad conversation about such matters.
05:25And in just a few hours, really, the most polarized sides start to find bridges between them.
05:31And one of the bridges that's reached early on is that people really want digital signatures
05:36to go with all the online advertisements so people cannot just deepfake Audrey Tang
05:42or any celebrities anymore.
05:43And if a platform, like Facebook or YouTube, doesn't implement that,
05:47then if people get scammed for one million,
05:50they want the platform to be liable for that one million.
05:53And this was such a strong consensus, across party lines, across different regions,
05:58more than 85% approval,
06:00so that we immediately did a multi-stakeholder meeting
06:03to check with OpenAI and Meta and Microsoft,
06:07you know, the usual suspect, Google and so on,
06:09to say, you know, people think that this ought to happen.
06:11We're going to write it into law.
06:13Is this technically feasible?
06:15And when they said it's technically feasible,
06:17all three top consensus items went to the parliament, where they are now.
06:20So this whole process took only three months.
06:23And it's a much faster turnaround
06:25than the traditional waiting for four years to vote for your next president.
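The assembly's digital-signature requirement is, at its core, sign-then-verify: an advertiser signs the ad payload, and the platform (or anyone) checks the signature before display, so a tampered or impersonated ad fails verification. A toy Python sketch; a real deployment would use public-key signatures (e.g. Ed25519) so that anyone can verify without a shared secret, but an HMAC stands in here to keep the example within the standard library:

```python
import hmac, hashlib, json

# Hypothetical advertiser key; in a real public-key scheme, only the
# advertiser holds the signing key and everyone holds the verify key.
ADVERTISER_KEY = b"demo-secret"

def sign_ad(ad: dict) -> dict:
    """Attach an authenticity tag computed over a canonical serialization."""
    payload = json.dumps(ad, sort_keys=True).encode()
    tag = hmac.new(ADVERTISER_KEY, payload, hashlib.sha256).hexdigest()
    return {**ad, "signature": tag}

def verify_ad(signed: dict) -> bool:
    """Recompute the tag over everything except the signature field."""
    ad = {k: v for k, v in signed.items() if k != "signature"}
    payload = json.dumps(ad, sort_keys=True).encode()
    expected = hmac.new(ADVERTISER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signed["signature"])

ad = sign_ad({"sponsor": "ExampleCorp", "text": "Buy widgets"})
tampered = {**ad, "text": "Deepfaked endorsement"}
```

Here `verify_ad(ad)` passes while `verify_ad(tampered)` fails, which is the property the assembly asked platforms to enforce.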
06:29Yeah, and just to be clear,
06:30the way this works is actually trying to change people's opinion
06:36through the course of this dialogue.
06:38It's not simply trying to take the pulse of people's opinion.
06:41Is that correct?
06:42Yeah, it's deliberative in the sense that people are there
06:45not to answer some standardized questions,
06:47but to fill in surveys that were also co-created by other citizens.
06:52So there's multiple rounds.
06:54We call it the wiki survey.
06:55So the survey topics, the agenda, are set by your fellow citizens.
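The wiki-survey mechanism described here, where citizens both vote on statements and contribute new ones across rounds, can be sketched minimally in Python (the statements and tallies below are invented for illustration):

```python
class WikiSurvey:
    """Toy multi-round wiki survey: participants both vote on and
    contribute the statements, so the agenda is set by the citizens."""

    def __init__(self):
        self.votes = {}  # statement -> [agree_count, total_count]

    def add_statement(self, text):
        # Any participant can put a new statement on the agenda.
        self.votes.setdefault(text, [0, 0])

    def vote(self, text, agree: bool):
        tally = self.votes[text]
        tally[0] += int(agree)
        tally[1] += 1

    def top(self, n=3):
        # Rank by agreement rate; top statements seed the next round.
        scored = [(a / t, s) for s, (a, t) in self.votes.items() if t]
        return [s for _, s in sorted(scored, reverse=True)[:n]]

survey = WikiSurvey()
survey.add_statement("Ads should carry digital signatures")
survey.add_statement("Platforms should be liable for scam losses")
for _ in range(9):
    survey.vote("Ads should carry digital signatures", agree=True)
survey.vote("Platforms should be liable for scam losses", agree=False)
top_statement = survey.top(1)
```

Running further rounds on the top-ranked statements is what lets consensus items, like the signature proposal above, surface from the participants themselves.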
07:00Interesting.
07:01I want to get on to some other ways technology
07:03can potentially enhance democracy in a minute.
07:06But I'm just curious.
07:07I see you nodding, Mitchell.
07:09How do you react to that idea of using technology in that way,
07:13as a way of facilitating dialogue among people?
07:16It's sort of the dream people had about social media
07:19that turned out never to kind of quite come to pass.
07:23Well, what's interesting is that those are actual human-to-human engagements
07:27without the like, process, rebroadcast, emotional, quick reaction piece.
07:36So I think what we are starting to see from these activities
07:39is what the core flaws of social media are.
07:43And so today there's the interest in AI and what's it going to do.
07:47But the reality is we're already seeing this.
07:49Like the big change is the ease and volume
07:54with which content can be created, making it less expensive.
07:59But we're already experiencing it.
08:01And so on these questions of how to address social media,
08:03I think we shouldn't forget there's been, especially in Europe,
08:07a lot of focus on what to do about the current state,
08:09and we shouldn't lose that
08:12just because it says AI now instead of social media.
08:16Right. Yeah, it's a good point.
08:18And we're moving, you know, increasingly to a world where,
08:21yeah, all the content's generated not by a person to begin with
08:24and being sent to non-people to read.
08:26And we were talking a little bit about that backstage.
08:29Andy, I want to bring you in here.
08:31I want to talk about how simply having access to technology
08:34and particularly internet access in a lot of parts of the world
08:36is a serious enhancer of democracy
08:40and can actually serve as a kind of check on authoritarianism.
08:44Yeah, I think for democracy to work, access is very important.
08:47Access to information.
08:48So this is being able to access data, the truth,
08:53and getting different objective opinions.
08:54And to give maybe a very concrete example,
08:57we've seen situations like, let's take the war in Ukraine, for example.
09:01Right.
09:01When the Russian troops moved in,
09:03the first thing they did was change internet access.
09:06And we saw this actually in our data.
09:08So, you know, right when the war started,
09:10we saw a very big increase in VPN usage in Ukraine.
09:14And we were trying to figure out what was going on.
09:16And what was happening was the citizens
09:18that were now behind the front lines in occupied territories
09:22were now using VPN to try to get online and get real information.
09:26And then immediately after that,
09:27there was also a surge of usage in Russia.
09:30So actually in Russia, we saw, you know,
09:32tens of millions of people sign up to use VPNs
09:34within a very short period of time.
09:35And this was, again, because if you didn't have a VPN in Russia,
09:39you actually couldn't access the truth.
09:41And this might seem like one example,
09:43but what we did at Proton in our VPN observatory
09:46is we tracked the VPN demand and sign-up data
09:49across the entire globe.
09:51And what we saw was, you know, every single year,
09:54dozens of situations where democracy was under attack,
09:58whether it was a military coup in Africa,
10:00whether it was an election that was being rigged,
10:02whether it was some sort of suppression event
10:04happening somewhere in the world,
10:05it was correlated directly with an increase
10:08in usage of Proton services.
10:09And I think that is the data that proves definitively
10:13that actually democracy and access to information
10:15and access to, you know, free internet
10:17are entirely correlated.
10:19And you cannot have one without the other.
10:22And you've put in place some efforts
10:24to try to bring VPNs to places
10:26where they may have been hard to access previously.
10:28Is that correct?
10:29Yeah, that's something that we spend a lot of our time doing.
10:32It's very difficult to do, of course.
10:34And the reason is there actually
10:36is no financial incentive to do it.
10:38Let's say you have the best-operating VPN
10:41in a place like, you know, Russia or even Iran.
10:44There's actually no way for those customers to pay you.
10:47Even in Ukraine, they can't pay you
10:49because there's a lack of funds.
10:50So actually our services have to be available for free.
10:53You know, in a sanctioned country like Russia and Iran,
10:56they can't use credit cards.
10:57They can't use bank transactions.
10:58There is no PayPal.
10:59So it's a business where you can acquire
11:03potentially hundreds of millions of users
11:05that all have negative value
11:06because you can never get revenue from that.
11:08So that's why the typical economic models here will fail
11:12because there's no incentive to do it.
11:14You have to do it because you believe in the mission.
11:16You believe in the importance of open information.
11:18And these are probably multi-million euro investments
11:21every single year to make it possible.
11:23That's interesting.
11:23Mitchell, do you have a view on that
11:25about sort of how business models kind of work
11:29against deploying technology in a way
11:30that might actually enhance democracy?
11:32And is there a way to kind of square that circle
11:34and try to create business models
11:36that do actually promote, you know,
11:38the use of technology in ways
11:40that would enhance democratic processes?
11:42Yes, I do have an opinion.
11:45Actually, I thought Andy's comments
11:49were really quite interesting
11:50because, you know, Mozilla runs a business
11:53through our products.
11:58And Mozilla was necessary
12:00because of a market failure, right?
12:02Mozilla comes out of the past monopoly era
12:05where today, you know, we have a handful of companies
12:07but at the beginning of the internet,
12:09there was only one.
12:10It was Microsoft and it controlled everything.
12:13And so it was totally known
12:15that Microsoft through the browser
12:16would control the internet.
12:18Just accepted.
12:19And there was no business rationale
12:22to build a browser.
12:23There was no business model when we started.
12:25We started for the reason that Andy described.
12:28It had to be done.
12:29And we found a business model
12:31but it's not replicable for many people
12:35and it's not perfect either.
12:37And so I have this interest,
12:39we've seen it in sort of recent French activities
12:43and even the president's comments
12:45in the last few days,
12:47and you're a living example,
12:49a greater interest in the idea that
12:52if you want public benefit,
12:54the business world is not set up
12:57or incented or rewarded
12:58to provide it directly.
13:00There's a lot of indirect public benefit.
13:02If it's profitable, you'll get public benefit.
13:05And so I think the realization
13:07that for society to build in public benefit
13:10this phrase that I used to dislike so much
13:14public-private partnerships
13:16public investment
13:17public resources are necessary.
13:20We view what we do at Mozilla
13:22as creating infrastructure
13:24that's a public resource.
13:27But it's hard.
13:29There aren't many Mozillas
13:30and it's, you know,
13:31sure we're great
13:32but, you know, it's not all us.
13:34It's just the system is not set up well
13:36to help people who want to do the right thing.
13:39Yeah, Audrey, I see you nodding.
13:41I mean, what's your experience been in Taiwan about this?
13:44I mean, do you feel like government has to step in
13:47and actually build these resources
13:49or is there a way
13:50to sort of incentivize the private sector to do this?
13:53Yeah, one of the wins I had
13:57when serving as the digital minister
13:59was to convince the accounting department
14:01to consider what we call public digital infrastructure,
14:05to use the pocket of money
14:07that's for building highways and roads and airports,
14:10instead of just R&D.
14:13Because R&D are usually project-based.
14:15Once you hit that R&D target
14:18then, of course, commercialization
14:20is the private sector's business.
14:21But if you're building
14:23a public digital infrastructure
14:24in the same budget as highways and so on
14:26then obviously the conversation changes.
14:29If you have a good team
14:31that has been building bridges
14:33or roads in a local district
14:35or a municipality
14:36in a particularly good way,
14:38you can pay for success,
14:40which is another word for impact investing
14:42or retroactive funding.
14:44And this kind of funding
14:45we do through the Presidential Hackathon.
14:48It is the highest level of hackathon
14:50that elevates five teams every year domestically
14:54to showcase their work
14:55doing public digital infrastructure
14:57on a smaller municipal district level
14:59and there is no prize money.
15:02We don't give out money
15:03but the money that is going
15:05to this public digital infrastructure
15:07is going to the next fiscal year's
15:09you know, public budget.
15:11And so it solves
15:12the initial bootstrapping problem
15:14for impact investing.
15:16Because for people
15:16who impact-invest at that stage,
15:18there's almost no way
15:19for them to gain a return,
15:20because all the benefit
15:22is externality,
15:23you know, to the entire nation
15:24or internationally.
15:26But when the president promises
15:27whatever you did in this city
15:29will become nationwide infrastructure
15:31to be shared with the world
15:33also with public code
15:34the next year
15:35then that risk is reduced
15:37and then the impact investor
15:38can just focus on
15:39the individual components
15:40for success
15:41without worrying
15:42about the initial bootstrapping.
15:44So these kinds of novel funding
15:46approaches,
15:46with quadratic funding
15:48and retroactive funding
15:49and so on,
15:50were actually pioneered
15:51in the decentralized
15:52or Web3 world,
15:53but we're also using them a lot
15:55in the Taiwanese public sector.
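Quadratic funding, one of the mechanisms mentioned, has a simple core formula: a project's matched total is the square of the sum of the square roots of individual contributions, which rewards broad support over deep pockets. A sketch with invented contribution numbers:

```python
from math import sqrt

def quadratic_match(contributions):
    """Quadratic funding: matched total = (sum of sqrt of each
    contribution)^2, so many small donors attract more matching
    than one large donor giving the same total amount."""
    return sum(sqrt(c) for c in contributions) ** 2

# 100 citizens giving 1 each vs. one funder giving 100:
broad = quadratic_match([1] * 100)   # (100 * sqrt(1))^2 = 10000
narrow = quadratic_match([100])      # (sqrt(100))^2     = 100
```

Both projects raised 100, but the broadly supported one ends up with 100x the matched total, which is the property that makes the mechanism interesting for funding public goods.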
15:57That's fascinating.
15:58A little later in the session
15:59I am going to ask for questions
16:00from the audience
16:01so please think of
16:02think of your questions.
16:03While we're on the sort of subject
16:05of access
16:06as itself being
16:09enhancing of democracy
16:10I wanted to ask the panel
16:12what they think of
16:12efforts such as
16:13and I know he's speaking
16:14later here today
16:16at VivaTech
16:16Elon Musk's Starlink
16:18I mean he's gone around the world
16:20and sort of said
16:20I'm going to offer this
16:21in places where
16:22I'm worried that the internet
16:23is going to be denied
16:24and he's essentially
16:25sort of given the same message
16:26I think access to this
16:28is going to increase democracy
16:31but I think some people
16:33are skeptical
16:33of his motives
16:34and skeptical
16:35of how this is actually
16:36going to turn out
16:36Andy I know you have
16:38some thoughts on this
16:38what's your view
16:39of sort of his program?
16:42Yeah I think
16:42with any technology
16:43there is the question
16:44of the tech
16:45and there's also
16:45the question of governance
16:47right
16:47and from a tech standpoint
16:48the ability to beam down
16:50internet from satellites
16:52and have it access
16:53anywhere in the world
16:54without building infrastructure
16:55on the ground
16:56this as we see
16:58for example
16:58in the Ukrainian example
16:59is very good
17:01right
17:01in a contested space
17:02where infrastructure
17:03is being destroyed
17:04the ability to have
17:05internet access
17:05anywhere instantly
17:08that's a marvelous technology
17:10the problem
17:11I think people have
17:12with Starlink
17:13is not the technology
17:14but actually the governance
17:15because they're a bit
17:17uncomfortable with the fact
17:18that the accessibility
17:20of this technology
17:20depends entirely
17:21on the whims
17:22of one person
17:23that is not
17:24under any checks
17:25or balances
17:25or regulations
17:26who could change
17:28his mind
17:28based off of what
17:29he reads on Twitter
17:30that morning
17:30when he gets up
17:32and that may not
17:34be what you want
17:34the future of democratic
17:35access to the internet
17:36to be based upon
17:37so I would say
17:38it's not a technological
17:38failure
17:39actual technology
17:40is very sound
17:40but it's potentially
17:41a governance problem
17:42that is quite concerning
17:43yeah absolutely
17:45Audrey
17:46yeah Taiwan uses
17:47OneWeb
17:48and also SES
17:49which performed remarkably
17:51during the Hualien earthquake
17:52so I think
17:54the key here
17:54is plurality
17:55is more than one vendor
17:57just like when we implement
17:59the local resilience
18:01for the you know
18:02Azure and GCP
18:04that is the Google Cloud
18:05and Amazon and so on
18:06we insist on
18:07not putting all eggs
18:08in one basket,
18:09but ensuring that
18:10our critical infrastructure
18:11can run
18:12with all three cloud providers,
18:14and even if two of them
18:15are gone,
18:15the other one
18:16can still sustain
18:17our conversations.
18:18so the same must be true
18:20also for satellite connection
18:21interesting
18:22redundancy
18:23and kind of being key
18:25we talked a little bit
18:26about citizens assemblies
18:28and the idea
18:29of using technology
18:30to help with persuasion
18:32but what about the idea
18:33Sonia
18:33of using AI
18:35and other technology
18:36to try to
18:37sort of monitor
18:38social media
18:39to
18:40so government
18:40gets a better sense
18:41of what do citizens
18:42actually think
18:43about topic X or Y
18:44I mean is
18:45is that a useful avenue
18:46at all
18:46or is the information
18:47ecosystem now
18:48so polluted
18:49that you can't actually
18:51get good signal
18:52anymore from social media
18:53yeah that's a really
18:55good and difficult question
18:56I think
18:57I'm a bit skeptical
18:58about AI being used
19:01for tools
19:02to like fact check
19:03disinformation
19:04for instance
19:05particularly because
19:07you know
19:07disinformation
19:08is not just
19:09a tech problem
19:10right
19:10it's deeply embedded
19:11in politics
19:12and in culture
19:13and often
19:16it attaches itself
19:17to elements of truth
19:18that are then
19:20kind of attached
19:21to myths
19:21about inequality
19:22in our society
19:23in North America
19:24our research echoes
19:26a lot of findings
19:27from others
19:28around the world
19:29that
19:29you know
19:31disinformation campaigns
19:32are successful
19:32when they tap
19:34into kind of
19:34a culture wars framework
19:37often very misogynist
19:39classist
19:39racist
19:40values
19:41in our society
19:41right
19:42so it's also
19:42about power
19:43and you know
19:45you can't
19:45fact check racism
19:46so I think
19:48we need
19:49a more comprehensive
19:52remit of tools
19:53in terms of
19:54surfacing
19:56opinions
19:56to government
19:58officials
19:58I think it's
19:59an interesting idea
20:00and Audrey
20:01may have some
20:02concrete
20:03ways to do this
20:04I think
20:05if we are to
20:06build in those
20:07potentials
20:07we would need
20:08to do that
20:08in a really
20:09privacy preserving way
20:10we would need
20:11to be very intentional
20:12and careful
20:12about data collection
20:13I would need
20:14to see a lot
20:15of safeguards
20:15and a lot
20:16of governance
20:17before we
20:19kind of unleash
20:21into the wild
20:22some new
20:23democratic potentials
20:24and potentially
20:26repeat some
20:27of the same problems
20:27we've seen
20:28so
20:28yeah
20:29Audrey do you feel
20:30like there's
20:31potential there
20:31or is that something
20:32Taiwan was doing
20:33when you were
20:33yeah
20:34I mean using
20:35the more
20:36antisocial form
20:37of social media
20:37as a town hall
20:38it's like using
20:40the nightclub
20:40as a town hall
20:41it's very loud
20:43intoxicating
20:44bouncers throwing
20:45you out
20:45I mean
20:46you can get
20:47people there
20:48but you won't
20:49get a lot
20:49of signal
20:49but in Taiwan
20:51as I mentioned
20:52when we sent
20:53those SMS
20:53to 200,000 people
20:55only people
20:55receiving those SMS
20:57can actually
20:57participate in the
20:58survey
20:59and so
20:59at least
21:00that's one problem
21:01the so-called
21:02Sybil problem,
21:03solved
21:04right
21:04so we need
21:05to have
21:06a pro-social
21:07social media
21:08and there's
21:09some really
21:09good uses
21:10of language
21:10models there
21:11like from
21:12Google Jigsaw
21:12that can just
21:13promote
21:14the feed
21:15mechanism
21:16so that
21:17instead of
21:17the most
21:18hateful
21:18or the most
21:19polarizing
21:19floating to the
21:20top
21:21it's like
21:22community notes
21:22but with
21:23AI moderation
21:25that gets
21:25things with
21:26more nuance
21:27with more
21:28personal stories
21:29with more
21:29bridging potentials
21:30so it's not
21:31people changing
21:32their minds
21:32but it's
21:33kind of reaching
21:33out across
21:34distances
21:35and floating
21:36those to the
21:36top
21:37and there's
21:37research that
21:38shows
21:38it's as
21:39engaging
21:39people spend
21:40as much
21:41time
21:41as the
21:42old
21:42polarizing
21:43one
21:44but people
21:44feel much
21:45better about
21:45themselves
21:45after using
21:46this kind
21:47of new
21:47sorting
21:47mechanism
21:48so I think
21:49the point
21:49is not
21:49to do
21:50away
21:50with
21:51anti-social
21:51social media
21:52altogether
21:52or banning
21:53them
21:53but to
21:54actively
21:54build
21:55pro-social
21:55social media
21:56and say
21:56only these
21:57are town halls
21:58only these
21:58are public
21:59parks
21:59we're not
22:00going there
22:00for our
22:01town hall
22:01discussions
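The "bridging" feed Tang describes, surfacing content endorsed across otherwise-opposed groups instead of the most engaging content, can be sketched as a toy ranking (the posts and the two cluster approval rates below are invented for illustration):

```python
def bridging_rank(items):
    """Toy bridging-based ranking: each item carries approval rates
    from two hypothetical opposing clusters; instead of ranking by
    total engagement, rank by the *minimum* approval across clusters,
    so only content endorsed by both sides floats to the top."""
    return sorted(items,
                  key=lambda it: min(it["left"], it["right"]),
                  reverse=True)

posts = [
    {"text": "outrage bait",   "left": 0.9, "right": 0.1},
    {"text": "personal story", "left": 0.7, "right": 0.6},
    {"text": "shared concern", "left": 0.8, "right": 0.8},
]
ranked = bridging_rank(posts)
```

Note how the outrage post, despite its high total approval, sinks to the bottom because one cluster rejects it; this is the same intuition behind systems like Community Notes, which require agreement across raters who usually disagree.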
22:02that's
22:02fascinating
22:03pro-social
22:05social media
22:05I wanted
22:07to ask
22:07as we move
22:08to a world
22:08where it seems
22:09like we're all
22:10going to have
22:10some kind of
22:11personal assistant
22:12maybe we're going
22:13to have these
22:13AI agents
22:14that are going
22:14to be doing
22:15our bidding
22:15across the
22:16internet
22:16are you
22:17concerned
22:18and I'll put
22:18this to the
22:19whole panel
22:19and see who
22:20wants to take it
22:21that this will
22:22simply reinforce
22:22the kind of
22:23filter bubbles
22:23that we already
22:24have
22:24that you'll have
22:25your own AI
22:26that's only going
22:26to send you
22:27information
22:27that it already
22:28knows you want
22:29to hear
22:29and it's going
22:30to be
22:30you know
22:31we're going to
22:31be even more
22:32siloed
22:33and even less
22:34connected
22:34than we are
22:35in the current
22:36media environment
22:37I don't know
22:38if anyone has
22:38thoughts on this
22:39but I mean
22:40it's a concern
22:40of mine
22:41I think
22:41we're sort of
22:41maybe heading
22:42in that way
22:43Andy I don't know
22:44you're sort of
22:44nodding there
22:45yeah sure
22:46I think when people
22:47talk about AI
22:49it seems to the
22:50public like
22:51something that's
22:51brand new
22:52right
22:53but you know
22:53as a scientist
22:55I can tell you
22:56the notion of AI
22:57isn't new
22:58you know
22:59the precursor of AI
23:00machine learning
23:00has been around
23:01for a long time
23:01and what AI
23:03actually is
23:03is it's simply
23:05a continuation
23:05of a cycle
23:08that's been taking
23:08place in the last
23:09three decades
23:09which is making
23:10it more efficient
23:11for humans
23:12to communicate
23:13with computers
23:13that's actually
23:15all it really is
23:16so is it going
23:18to you know
23:18radically change
23:20things as you say
23:21I think it's maybe
23:22an acceleration
23:22but it will also
23:24more or less
23:26drive the existing
23:27trends
23:27so people also
23:29often ask
23:29who is going to be
23:30the winner of AI
23:31revolution
23:32well
23:33what do you need
23:34for AI
23:34you need a large
23:35amount of data
23:36huge amounts of
23:36users to train on
23:37and tons of money
23:38to buy GPUs
23:39and those three
23:40assets all belong
23:41today to big tech
23:42companies
23:42so it's probably
23:43less of a revolution
23:44you know
23:45than we think
23:46and this is why
23:47when we think about
23:48you know
23:48personal assistants
23:50yes they're coming
23:51but unless there is
23:52some business
23:53or policy change
23:54it's simply going
23:55to accelerate
23:55the existing trends
23:57which is taking
23:57the internet down
23:58a path
23:58that you know
23:59has higher
24:00concentration
24:00and more monopoly
24:02power
24:02in a small number
24:03of firms
24:03yeah
24:04Mitchell
24:04what do you feel
24:05about that
24:05and also
24:06I guess
24:07maybe the difference
24:07Andy though
24:08is that
24:08at least today
24:09if you do a Google
24:10search for instance
24:10you get a set of links
24:12and there's a little bit
24:13of friction
24:13where you have to think
24:14like oh
24:15where am I getting
24:16this information from
24:17you actually have to
24:17select one of those links
24:19and then
24:19you at least
24:20there's an opportunity
24:21to think
24:21maybe for a half a second
24:23about where am I
24:24getting this information from
24:24what am I reading
24:25whereas if it's a single
24:27capsule response
24:28that's fed to you
24:29maybe there's less
24:30of that friction
24:31and people think less
24:32I don't know
24:33does anyone have a view
24:34on that
24:34Mitchell
24:34well I think the research
24:36is that if you're not
24:37in the top
24:38like five links
24:39on Google
24:39it's irrelevant
24:40right
24:40the long tail is gone
24:42so
24:44I'm not sure
24:44that the change
24:45is that different
24:46to personal agents
24:48I do think they're coming
24:49I'm not sure
24:50if it gets worse
24:51as a filter bubble
24:53you know
24:53I'll just use my own country
24:55like if you've got Fox News
24:56on 24 hours
24:57like a personal agent
24:58I don't know
24:59that it's any different
25:00or
25:00I don't know
25:01whatever the other
25:02you know
25:02on the left equivalent
25:03of it is
25:04but I do think
25:05there's a real possibility
25:07with personal agents
25:08of actually
25:09so
25:09you know
25:10like right now
25:11you use some giant system
25:13and I'm number
25:14one billion
25:14and whatever
25:15and you're number
25:16three billion
25:17but there is actually
25:19very little
25:20with scale
25:21and power
25:21that can represent
25:22me acting
25:23in the world
25:24and I think
25:25personal agents
25:26can do that
25:26I agree with Andy
25:27that the default
25:28right now
25:29is a continuation
25:30of those
25:32with massive resources
25:33and so that's why
25:35I think the open source
25:38interest
25:38and movement
25:39and development
25:40is
25:41at the moment
25:42probably one of the
25:44biggest possibilities
25:45for changing that
25:47both for the development
25:48of things
25:49that are more
25:49me focused
25:50like if you
25:51if you have
25:51billions and billions
25:52of dollars
25:53and you have
25:53data centers
25:54and you've already
25:55scraped the web
25:56and you have
25:56all this garbage
25:57information
25:58in your mighty model
25:59and you've spent
26:00hundreds of thousands
26:01of person hours
26:02checking the guardrails
26:03that's your architecture
26:04right
26:05but already
26:06in a year
26:06of open source
26:07we have these models
26:08running on ever
26:09smaller machines
26:10soon on phones
26:11all sorts of things
26:12that don't fit
26:13the incentives
26:13if you have
26:14those resources
26:16and that's not
26:17a dig
26:17at the people
26:18or the companies
26:18but if you have
26:19those resources
26:20you want to use them
26:21and so I think
26:23open source
26:23with a different
26:24amount of resources,
26:25asks how might
26:26you get
26:27a shared resource
26:28that has compute
26:29and training sets
26:31that can be shared
26:32to build things
26:33that are both
26:34pluralistic
26:35but also
26:36you can get
26:37that experimentation
26:38of how does it
26:39represent me
26:40out in the world
26:40which I think
26:41we need
26:42because otherwise
26:43everything
26:44is these big systems
26:45aimed at getting us
26:46to buy things
26:47or be angry
26:48Yeah, that's a good point. I wanted to ask a little bit about the role of education here, and how technology itself can help with education.
26:58What do you think citizens need to know in this sort of digital world we're moving into, in order to be informed and active participants in democracy?
27:10I don't know if you have thoughts, Sonia. I know you guys have done some research at McGill around how you educate citizens to be digitally aware and savvy in the current environment.
27:20What are some of the learnings from that?
27:22Yeah, so it's really difficult. We actually prefer "public literacy" instead of "digital literacy," because it goes beyond just a tech problem.
27:35And we do need to be aware, especially when we talk about disinformation, that there's a playbook, and it's quite banal how consistent it is.
27:48And so we found that we need a lot more than just digital literacy. Of course, digital literacy is key, but we need more deliberative democracy initiatives.
27:58That's one of the findings that keeps coming back over and over again.
28:02We need to be able to deliberate on these issues, like we're doing here today, and we need meaningful engagement from people who are traditionally kept out of those rooms,
28:14especially because marginalized communities and underrepresented groups are both at the forefront of research, providing solutions for how to practically deal with online harms,
28:25and also the ones most adversely impacted, right?
28:29So we need to find ways to call people in, and some of the proposals we've heard today are a huge way forward to do that.
28:41Mitchell, I know Mozilla's been doing a lot on education also, or has been trying to back educational initiatives.
28:47Can you talk a little bit about what you've been doing there?
28:52I am both a yes and a no on education, so I'll say a little bit about both.
28:57There is a public literacy (I think that's a beautiful phrase), an understanding that each generation is different in its approach to information, and what's real, and where you get it, and how you get it.
29:13And so that sharing of the knowledge that actually exists, and developing more, that I think is a fundamental piece.
29:22The no piece, though, is that citizens have X amount of time in a day to attend to this, and in between life and convenience and whatever else you're doing,
29:33the answer is not that we are individual citizens and we alone are going to sort this problem out.
29:39Governments and societies and the business industry together have to build better tools, so that the education responsibility at the individual level is the right depth.
29:52Right. I want to get questions from the audience, so if you have a question, please raise your hand and we'll get a mic to you.
29:57The first hand I saw is the gentleman in the second row here; I saw his hand go up first, so we'll go to him first. If you can, please state who you are.
30:15We need to get you a better mic, or at least get that one on. Let's see.
30:19I'm a student in web development. My question is a bit broad and a bit long, so kindly bear with me.
30:30Try to keep it relatively not long.
30:32I'll try. Okay, my question is the following.
30:36In the tech industry as it is today, be it the proponents and actors of an open internet, the big five, or AI creators,
30:50are there, at the moment, places or actors where an epistemological reflection is being led, I mean in collaboration with, for example, academics and philosophers,
31:07in order to develop the tech of the future with intent in that regard?
31:11Or is that reflection being made more reactively, as it were, and in effect separated from the development and implementation of new technology?
31:25So if I understand the question correctly, it's essentially around the market power of the largest tech companies to shape the future,
31:33and whether there is an effort outside of that that's intentionally trying to create something different, or whether it's simply that people are reacting against the existing structure.
31:43Is that correct? Is that the idea?
31:46My question is more: tech has, and will keep, a very important place in the notion of knowledge, human knowledge, and the way it shapes it as well. It's an actor in that regard.
32:02Now, is that importance, that place, being taken into account in the tradition of reflection on knowledge and the place it occupies in society (and I'm speaking philosophically here), or not, basically?
32:18Yeah. Audrey looked like he had an answer, or wanted to say something.
32:23I mean, just like telescopes define how we know about astronomical bodies (tele-epistemology, as you call it),
32:32I think the designers of technologies need to know that this is not neutral; this is changing the epistemic norms, right? Is that what you mean?
32:42Right. Yes.
32:43So I happen to have just published a book about this, called Plurality, and it stems from a different tradition.
32:50In Taiwan, in traditional Mandarin, "shuwei" means both digital and plural, as in more than one.
32:59And so pluralism, as an epistemological tradition, means that what matters is not one viewpoint dominating the other viewpoint,
33:09but how, like this eyeglass, it can help us see each other better. It is not this eyeglass replacing my eyes, or things like that,
33:16but rather everyone attuned to, I wouldn't say convincing each other, but just bridging across short distances, and then slightly longer and longer distances,
33:27so that we can have a quality conversation or deliberation with slightly more people than before,
33:33or have the same conversation with the same number of people, but with slightly more quality of conversation.
33:39So if we zigzag like this in our technical development, instead of going the other way, which is condensing everybody to a one-dimensional epistemological like-and-unlike point,
33:51then we are on a good course of plurality, because it then increases both the bandwidth and also the reach of the deliberation and communication.
34:00I hope that answers your question.
34:01Great, thanks. I want to get to another question, and there are some others.
34:04How about down here in front, this woman here on the right?
34:09Thank you. My name is... I'm working as international innovation lead in a consulting company, and my question is the following.
34:18In a world where data security, private data security, confidentiality, is quite important, and where we also have, at the same time, a lot of threats linked to national security,
34:31what, in your mind, is the right balance between those two needs and values? Thank you.
34:39Balancing security and privacy with the need to do this. Andy?
34:42Yeah, I'm happy to take that question. I think it's a very tricky one.
34:48And, you know, I'm originally from Taiwan, so let me give you a perspective from Asia, right?
34:54There is a country where there is actually very low crime, where there is almost complete surveillance, and the police sees everything. It's called North Korea.
35:08And the people there actually don't feel very safe living in that type of society.
35:14Now, that's of course maybe kind of an extreme answer, and the answer to your question is that there actually is no, let's say, correct answer. It's not black and white; it's really a grey spectrum, right?
35:26If you go to the other extreme, and you respect and provide individual privacy for everybody, and take that to the extreme, then actually you can have tons of crime that happens and goes completely unpunished.
35:39So how do we draw the balance there? I think we need to look at the overall social good of systems, right?
35:46So when we defend privacy as a fundamental human right, we understand we're making a compromise,
35:51and the compromise that we're making is that we are going to accept some level of abuse of privacy, because the alternative, which is a world of no privacy, is simply much worse.
36:01But it can never be a black-and-white answer, and that's why we need to have democratic systems that allow us to find that right balance.
36:09Interesting. I want to get some other questions in here.
36:12How about we go to the other woman right here in front? I saw her hand up.
36:16Thank you so much. I'm Katya. I work on human rights, social justice, and tech.
36:21Very often, I think, these are two different worlds that don't speak to each other, so I'm super happy to have this session.
36:30So I feel like the situation where we are today is that we have islands of hope, and we definitely know where we want to go, and what the possibilities of tech are, right?
36:37But the reality of the world is that we have immense needs, and especially this year, the year of elections, democracy is under threat, right?
36:46And if I think about people and needs, it's usually the most marginalized, right?
36:52So to me, it is about not only what we do, but how we get there, and it is about how we build bridges.
36:58So I see islands, but I also see the ocean of unmet needs.
37:03So what is your view on how we actually build those bridges to those opportunities? Is it about investment?
37:10I heard your excellent point about the hackathon and those countries that are open to it, but basically, how do we build those bridges? Your views on how we do that?
37:18Sonia, what do you think?
37:19Yeah, this is a really tremendous question, one that I spend most of my days preoccupied with.
37:26One of the things I think is key is supporting and funding public programs, and making sure that people who should have a voice are able to enter the rooms where we are deliberating and devising policies,
37:44because this is what the aim of deliberative democracy is, right?
37:47And I think over the past few years, we've been talking a lot about how we clean up, how we can build those bridges, if we don't even share a common version of reality, right?
37:57I hear this all the time, and I'm not sure it's the right question to ask.
38:02Democracy, to me, is about deliberation. It's about differences. It's about conflict, even.
38:10In fact, one shared version of reality might actually be the very opposite of democracy.
38:14So I think it's about building bridges, but plurality is also key, and I'll just underscore Audrey's sentiments on this.
38:25So, in order to shore up as much plurality as possible, I think we need those unlikely connections. I think we need panels.
38:32You know, you started by saying you're part of this two-body problem, like tech and social justice, right? This is what we need more of.
38:39We need unlikely connections. We need more people who wouldn't speak to each other together in rooms.
38:45And, I don't know, if you catch me on a bad day, I'm just as pessimistic as you, and maybe by the weekend I'm a little more optimistic.
38:54Very good. Other questions? This gentleman in front has a question.
39:02Hi, my name is Julian. I have a question for you, Audrey.
39:06First of all, I want to thank you for all your sharing. It's very, very inspiring.
39:10I am from Switzerland, and so we vote a lot, and I wanted to ask you: have you ever considered, or thought about, digitalizing the votation process?
39:22yeah
39:23yes
39:24for the
39:25past
39:25seven and
39:26a half
39:26years
39:28in Taiwan
39:29for voting
39:30on people
39:31we use
39:32paper only
39:33ballots
39:33and we
39:34use a lot
39:35of digital
39:35but only
39:36on the
39:36counting
39:37part
39:37so
39:38for each
39:39and every
39:39count
39:39it's taken
39:40out of
39:41the box
39:41it's shown
39:42to three
39:42different angles
39:43three different
39:44parties
39:44people
39:45use
39:45camcorder
39:46or phone
39:47to record
39:47everything
39:48and that
39:48is because
39:49we know
39:50that there
39:50will be
39:51deepfakes
39:52and cheapfakes
39:53accusing
39:54of election
39:54rigging
39:55and so on
39:55that actually
39:56appeared
39:57like two
39:58hours after
39:58the January
39:59election
39:59and the
40:00only way
40:01to debunk
40:01that
40:02is to
40:02pre-bunk
40:03it is to
40:04invite all
40:05three parties
40:05into the
40:06counting process
40:07because otherwise
40:08nowadays the
40:09deepfakes are so
40:09good you cannot
40:10tell from the
40:11content
40:12the only way
40:13is to go back
40:13and say
40:14oh from this
40:15counting station
40:15there are
40:16your parties
40:16people recording
40:17that and that
40:18record doesn't
40:19show it
40:20right so
40:20all three
40:21party leaders
40:22very quickly
40:23went out and
40:23say nothing
40:24like that
40:24happens and so
40:25that particular
40:26information
40:27manipulation
40:27attack did not
40:29succeed to
40:30disrupt Taiwan's
40:31January election
40:31but if you use
40:32electronic tallying
40:34if you use
40:34electronic voting
40:35and so on
40:36then only
40:36cryptographers
40:37can attest
40:38to it
40:39and the
40:40participation
40:40rate will
40:41be low
40:42so we do
40:42use it
40:43for things
40:44that does
40:44not require
40:45anonymity
40:46for example
40:47we do have
40:47a system
40:48for collecting
40:49petition online
40:49that is fine
40:50to signing
40:51for referendum
40:52online
40:52that's fine
40:53participatory
40:54budgeting
40:54that is fine
40:55presidential
40:56hackathon
40:57quadratic voting
40:58that is fine
40:58but anything
40:59that requires
41:00secrecy of voting
41:01and the
41:02participation
41:02nature
41:03of counting
41:04we need
41:05a system
41:06that has
41:06higher
41:07legitimacy
41:07and accountability
41:08than the paper
41:09system
41:10which at this
41:11point doesn't
41:11exist
41:12That's great. If there's one more question from the audience, I can try to take it quickly.
41:16Just here, there's a woman with her hand up.
41:18You can try to be quick; hopefully we can get the answer in.
41:22So you underlined the importance of plurality and of access to information for democracy.
41:29So what I wanted to ask was: what are some ways that social media and tech can promote diversity of opinions and plurality, instead of the echo chambers that are actually in place?
41:43And how can we use the algorithm to promote diversity of opinions and educated speech, instead of the ones we mostly see today?
41:53Do you have a view, Mitchell, on how we can do that? Or is it a lost cause, given today's social media?
42:00Well, that might be a different question.
42:06And I think Audrey's already spoken to some of it. So: what's "the algorithm," or what algorithm, or what are the incentives?
42:12And we do have some small-scale evidence (I think you described one earlier: community notes and everything, right?) that people will engage, and sometimes actually feel better.
42:28The issue is that those things currently have to be connected to what's the most profitable, or what's the most engagement, and unwinding those systems is kind of tricky.
42:38My own guess is that these things start small, you get some examples, and they grow over time.
42:43I think the addictive factor is pretty high, so we need some de-addiction piece. I don't think it's a problem with the algorithm, though; it's, as you said, a social problem.
42:52Yeah, and I might say it's also a competition problem, right?
42:55If there were more than two social media companies (there's basically TikTok and Meta these days), you might have more plurality of algorithms, and more plurality of opinions as well.
43:06Interesting. Well, we're out of time, and I want to thank our panelists.
43:10This has been a fascinating discussion, and it's given everyone, I think, some reason to hope, perhaps, but also lots of problems to overcome, and at least some ideas about how we might start to do so.
43:23So please give a round of applause to my panelists. They've been great. Thank you very much.