Elections & Algorithms: Trust or Trepidation
Transcript
00:00 Hello, thanks for joining me here. I hope everyone's 2024 is going okay. I'm really interested in this intersection between society, algorithms, and technology. So just think about what's happened in the last year so far. Number one, we've had a British princess share a digitally altered photo with the news media.

00:23 We had a platform company appropriate a celebrity's voice for their technology, maybe without permission. And if you haven't seen it, I highly recommend Googling the viral AI video going around of a fake pizza store, Pepperoni Hug Spot, which is amazing. You should all check it out, maybe after this talk.
00:44 But there's one thing that I'm really looking ahead to as I think about this collision of society and algorithms, and it's in the next year, in the next few days: I'm worried about November of 2024.

01:04 I'm American, as you can tell from my accent, and we have 166 days until the American elections. Now, I know I'm in Europe; there are 14 days, give or take, until the European parliamentary elections, and these all face the same issues. Why do I care? What's so important about what's coming up? Well, I'm going to say ChatGPT, but I really mean, more generally, all these generative AI technologies that have been thrust upon us and the things we now have to worry about.
01:30 So before I get into that, let me tell you a little bit about me and why I care about this work. I started my career as a vice president of engineering at Twitter, where I built a lot of the social media platform that we see today, about a decade ago.

01:47 I then moved on to be the head of the self-driving team at Uber, and I helped deploy the first self-driving car fleet onto the roads of Pittsburgh, Pennsylvania. I literally got to ride in a self-driving car to work every single morning, which is both awesome and terrifying, if you know anything about self-driving cars.

02:06 And then I was lucky to be the first CTO of the US Democratic Party, the oldest continuously operating political party in the world, and I spent a lot of time thinking about how all these technologies impact our elections and what we need to do about them.

02:24 Today, I'm the CTO at the Emerson Collective, and I run a podcast called Technically Optimistic, where we talk about this intersection, about how AI, algorithms, and our society are colliding: how do we get the benefits of these technologies, but also mitigate their risks, moving forward?
02:43 So what I really want to do is talk about why I'm worried coming into 2024, and what we need to do as we look forward to it. These risks to our elections are not hypothetical. If anyone's been watching the news, we've already seen some of these things happen in the US elections.
03:07 I'll play it. [Audio clip] "What a bunch of malarkey. We know the value of voting Democratic when our votes count. It's important that you save your vote for the November election."

03:16 It's hard to hear over all this, but in our elections already, we saw President Biden's voice being used as part of a robocall campaign in one of the states in the United States, to try to convince people not to go vote, to save their vote for later when it would be more useful. We've actually seen this happen.
03:37 And I don't want to play partisan politics; we're seeing this on the other side too, where we're seeing deepfake images of former President Trump hanging out with constituencies that he wants to get closer to. Just to be clear, neither of these pictures is real; neither of these situations was real. These are actual deepfakes circulating on social media in the US right now.

04:01 And so this concern is very real, and as a person who's worked in this space, I have to think about it all the time. The question is: how do we think about this, and are there things we can learn from other places?
04:15 So, at some point by the end of 2024, half the world will have been through some kind of democratic election. Literally half of the world is going to have elections this year. So can we look at what has happened in other countries as a way to understand how this is impacting us broadly?

04:36 Let me just take a quick tour of the world for you all. We can start in India; let me play a quick video of what's happening there.
04:49 [Video clip] "Jai Shri Ram, greetings to all. Friends, I'm here just to ask about the work accounted for."

04:56 That's not a real video, but it's been circulating around India's elections right now. Or rather, it is a real video in the sense that it really circulated, but it's not what it appears to be. Let me show you what's actually going on.

05:10 The BJP, one of the largest political parties in India and the current ruling party, deepfaked their own president in order to put him onto all the social media channels, talking in languages that he didn't know, in order to reach more voters throughout India. So we're already seeing people use deepfakes in the election process as a way to amplify their message. That's one example.
05:37 We'll go to another part of the world and look at Indonesia for a second. Indonesia was effectively a Disneyland when it came to AI-generated content in its elections. Literally, you could hardly see a social media post or a piece of news content that didn't have some form of generated content in it, this one probably being the most popular.

06:04 [Video clip] "Bismillahirrahmanirrahim. On February 14, 2024..."

06:16 I could let this play for a bit, but this is a video of Suharto addressing the people of Indonesia about the elections and what his position on them is. The problem is that Suharto is dead. He's a former dictator, and he's been dead since 2008. People have literally been resurrecting him to speak in social media campaigns supporting different candidates throughout the election. So we're seeing insanity like that in Indonesia.
06:45 Let's bring it a little closer to home, or a little closer to where we are now. In the British system, and the Brits just announced they're going to hold a snap general election in the next few months, we saw another audio deepfake, this time of the mayor of London.

07:01 [Audio clip] "I don't give a flying shit about the Remembrance Weekend, and even more so don't care about next Saturday. What's important and paramount is that the one million man Palestinian march takes place on Saturday. I control the Met Police. They will do as the mayor of London tells them and obey orders."

07:17 So what's going on here is that this is deepfake audio, circulating in the British system, of the mayor of London basically disparaging Remembrance Day and instead calling for pro-Palestinian protests to happen that day. As you can imagine, this caused a whole bunch of confusion in London, especially on that day, about exactly what was going to go on.
07:40 And the problem with these audio deepfakes, and these might actually be the real things we need to be concerned about, is that they're quite hard to detect, and it's quite hard for people like you and me to tell what's real and what's not. They usually come in over your cell phone, or you hear them on the radio. They're purposely made to sound not great, so that you, the listener, get confused. And so people are now using this to put voices out there saying things that were never actually said, in order to confuse voters.
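As a rough illustration of why phone-delivered audio is so hard to vet, here is a minimal Python sketch of the degradation a telephone channel applies to a recording. This is a toy, not a forensic tool, and the input file name and sample rates are assumptions for the example.

```python
# Toy sketch: simulate telephone-channel degradation of a voice recording.
# "speech.wav" is a hypothetical mono input file; rates are illustrative.
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, sosfilt, resample_poly

rate, audio = wavfile.read("speech.wav")
audio = audio.astype(np.float64)
audio /= np.max(np.abs(audio)) or 1.0  # normalize to [-1, 1]

# 1. Downsample to 8 kHz, the classic telephone sampling rate.
phone = resample_poly(audio, up=8000, down=rate)

# 2. Band-limit to roughly 300-3400 Hz, the usable telephone voice band.
sos = butter(4, [300, 3400], btype="bandpass", fs=8000, output="sos")
phone = sosfilt(sos, phone)

# 3. Add a little line noise, a cheap stand-in for a bad connection.
phone += np.random.default_rng(0).normal(0.0, 0.01, phone.shape)

# Everything above ~3.4 kHz is now gone, and with it many of the spectral
# artifacts that audio-deepfake detectors and careful listeners rely on.
wavfile.write("speech_phone.wav", 8000,
              (np.clip(phone, -1, 1) * 32767).astype(np.int16))
```

The point of the sketch is that the channel itself erases the evidence, so "it sounds a bit off" stops being a usable signal.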
08:10 I'll talk about one more example, this one from Ireland, and it's probably the saddest case, actually. In the case of one politician, Cara Hunter, people have been making deepfake porn of her as a way to try to get her to step out of the election. So this is deepfake content being used, effectively, as political violence to force someone out of the election system.
08:36 So we're starting to see all these kinds of things show up inside elections. Let me just briefly recap for you. In India, we're seeing deepfake videos used to have politicians speaking in languages they don't actually know. In Indonesia, we're seeing people resurrected from the dead in order to convince voters to act one way or another, and honestly still confusing a whole bunch of internet users about what's really going on. In England, we're seeing deepfake audio spreading pretty quickly. And in Ireland, we're seeing deepfake sexual violence, which puts pressure on people to step out of the election system; thankfully, the politician there is not stepping out.

09:21 And these, sadly, are only the tip of the spear. This is not the end of the game. We're seeing people refine their tactics over the past couple of months, and we have a few more elections coming up.
09:35 So what do I do in this space? Well, one, I worry about what's called the liar's dividend, which is basically this: if we set up an environment where everything can be claimed to be fake, any politician now has the ability to say, "I didn't say that. That's actually a deepfake video of me, and you can't prove that it's not." So we're walking into an environment where, because of these deepfakes, you can no longer trust what you see with your own eyes on video or social media, because a politician can simply deny it ever happened.
10:14 So, again, as an engineer, as someone who used to work inside a political party, what do I do? I think about whether I can use these lessons to work out how I would attack the U.S. election system, or someone else's. So why don't we just do that for a bit?

10:32 What if we did a deepfake of Taylor Swift and President Biden, having Taylor Swift endorse President Biden? This is a pretty easy thing to put together; we did this in Photoshop with a few generative AI tools. It's actually a fairly plausible thing: Taylor Swift endorsed the president last cycle. So if you saw this on TikTok or on Instagram today, you might believe it to be true.
10:54 And again, I don't want to play political favorites. What if we did the other side too: Taylor Swift basically endorsing what happened in our 2020 elections, saying that Trump won, and throwing her support that way? Again, not entirely implausible. Maybe people like you in the audience, people more educated on this, would think there's something wrong here. But, again, it's completely plausible.
11:18 Let's throw Taylor Swift out for a moment. What if instead we just manipulated photos that showed people where their polling place was? Polling places in the US are in churches, in schools, in town halls. What if we just spread images that basically said, "My polling place is flooded today"? Just get people not to even show up to the election, because they think, "I don't want to figure out where I need to go vote today; my polling place is flooded." Not actually true, obviously, but we could confuse enough people heading into the elections about how they should think about it. Elections are won on the margins. All you need to do is get 10,000 people not to vote, and you might swing an election completely.
11:59 So I worry about those types of things, because those are things someone could actually build and deploy into an electoral system right now.
12:10 So what do we do about this? On one hand, a lot of people see this and say we need legislation, we need regulation, in order to fix this. The problem is that regulation and legislation move really, really slowly. Can we get them in time for an election? Who knows? And they may not solve all the problems; we can argue about whether we have the right technologists in the right places. So then I get people asking, can we use technology to fix this problem? And I'm like, maybe.
12:38 So we ran all three of these images through some of the online deepfake detectors, and the detectors are, frankly, confused. They can't tell you definitively whether that first image is real or not. They can only tell us definitively that the middle image is fake, and they can't tell us anything about the last one. So if we're relying on technology to catch this, technology is failing us right now. That's not going to work either; we can't fight technology with technology right now.
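To make that concrete, here is a minimal sketch of what "running an image through a deepfake detector" typically looks like. The endpoint URL, the response field, and the file names are all hypothetical stand-ins, but the shape of the problem is real: you get back a score, not an answer.

```python
# Minimal sketch of querying an image deepfake detector over HTTP.
# DETECT_URL and the "fake_probability" field are hypothetical stand-ins
# for whatever a real service exposes.
import requests

DETECT_URL = "https://detector.example.com/v1/analyze"

def classify(path: str) -> str:
    with open(path, "rb") as f:
        resp = requests.post(DETECT_URL, files={"image": f}, timeout=30)
    resp.raise_for_status()
    score = resp.json()["fake_probability"]  # assumed to be 0.0-1.0

    # Only the extremes are actionable; most real-world outputs land
    # in the uncomfortable middle, which is the "confused" zone above.
    if score >= 0.9:
        return f"likely fake ({score:.2f})"
    if score <= 0.1:
        return f"likely real ({score:.2f})"
    return f"inconclusive ({score:.2f})"

# Hypothetical file names standing in for the three images from the talk.
for img in ["swift_endorsement.jpg", "trump_crowd.jpg", "flooded_polls.jpg"]:
    print(img, "->", classify(img))
```

The design point is the thresholds: a detector that can only commit at the extremes leaves exactly the contested middle, and that is where election content lives.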
13:09 So where does that really leave us? What can we really do here? I just said we can't count on legislation, we can't count on technology, we can't do all these things. Our natural tendency would be to panic, and I'm going to say: don't panic. There are actually a few things that we can probably do. Or maybe, you know, panic just a little bit, because there are some things we can do here, and I'm pretty optimistic that there are ways to fix this.
13:38 First off, thankfully, there's actually no good evidence that any one of these deepfakes, while threatening, while scary, and so on, has actually swayed any of these elections. So maybe we can have faith in people, that people can actually detect this and decide for themselves what's true or not. Maybe. That's one solution.
13:59 I know I ragged on regulation for a bit, but I actually think some regulation could be helpful here. The problem is that we as a global society can't get our act together right now to actually do it. So we need to figure out which parts of regulation we can actually move.
14:14 There's some short-term stuff we should do right now. Like, for the politician in Ireland: we need to ban non-consensual, sexualized, synthetic imagery right now, across the world. That's something I think we can all agree we should do. So that's a short-term fix.
14:30 As a medium-term fix, we can start having conversations around watermarking and data provenance, so that when an image shows up in our social media feeds or in the media, we can tell that this piece of content was actually taken by a camera and was lightly edited in Photoshop before it went live.
14:49 The common concern with watermarking is that people say, well, we can't watermark everything. But the absence of a watermark is itself a pretty strong signal that we need to go check what's going on. You know, Leica just started implementing watermarking and content provenance in their cameras, and I feel that's a technology that will move forward in a lot of different ways.
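To give a sense of the mechanics, here is a deliberately simplified Python sketch of the provenance idea. Real standards like C2PA, the approach Leica's cameras implement, use certificate-chain signatures rather than a shared key, and every name below is illustrative.

```python
# Drastically simplified sketch of content provenance: a capture device
# signs a manifest describing the image; anyone can later verify it.
# Real systems (e.g. C2PA) use public-key certificates, not a shared
# secret; HMAC just keeps this example self-contained.
import hashlib
import hmac
import json

CAMERA_KEY = b"stand-in-for-a-real-signing-key"

def sign_capture(image_bytes: bytes, edits: list[str]) -> dict:
    """What a camera (or a cooperating editor) attaches to the file."""
    manifest = {
        "image_sha256": hashlib.sha256(image_bytes).hexdigest(),
        "edits": edits,  # e.g. ["crop", "exposure"] recorded by an editor
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = hmac.new(CAMERA_KEY, payload, "sha256").hexdigest()
    return manifest

def verify(image_bytes: bytes, manifest: dict) -> bool:
    """What a platform could check before labeling content in a feed."""
    claimed = dict(manifest)
    sig = claimed.pop("signature", "")
    payload = json.dumps(claimed, sort_keys=True).encode()
    expected = hmac.new(CAMERA_KEY, payload, "sha256").hexdigest()
    return (hmac.compare_digest(sig, expected)
            and claimed["image_sha256"] == hashlib.sha256(image_bytes).hexdigest())

photo = b"...raw image bytes..."
m = sign_capture(photo, edits=["minor retouch"])
print(verify(photo, m))         # True: provenance chain intact
print(verify(photo + b"x", m))  # False: pixels changed after signing
```

And this is why the absence of a manifest is itself a signal: verification can fail loudly, but content with no provenance at all simply can't make any claim.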
15:07 And then finally, we also need to put a lot of pressure on tech platforms to step up their game. I think we can all agree that the tech platforms are probably not doing enough right now to police what's going on in their systems. So that's technology and regulation.
15:24 I sort of look at this as a three-legged stool. Right now, tech companies rule the world, and regulation is trying to catch up really quickly. But then there's all of us, civil society, and I believe a lot in the power of people to fix this problem too.
15:40 I think, first, we have to acknowledge that most of these issues land in an inequitable way. There are certain population groups that we're really concerned about, groups that are maybe not the most digitally literate or digitally savvy, and we need to spend a lot more time with those groups, not just on things like digital education but also on public and civic education.
16:04 We need to create what are called trusted messenger programs, to have real, direct contact, especially on critical information like: where's your polling place? When is election day? Things like that. We need to set up those types of programs. And then, again, all of us in this room need to advocate for change.
16:23 I strongly feel, and this is a thing I say a lot on my podcast, that change actually starts with collective action from all of you. We need to be sending the right market signals to the platform companies. We need to be advocating to all our legislators that we need better controls in place, because that kind of legislation is a reflection of the values that we care about.
16:48 So, like I said, this is going to be hard. We have to not panic, but prepare instead. I'm going to put this slide back up for one second, because this all just starts with all of us in this room, working as a community, figuring out what society we actually want to live in.
17:07 Do we want to live in that Disneyland that I showed you Indonesia is experiencing? Do we want our election systems to look like the ones India is experiencing right now? Do we want deepfake audio spreading like wildfire, like it's doing in the UK right now? And we're about to see, in 166 days, what's going to happen in the American elections too.
17:28 I contend we don't want any of that, and I feel like all of us need to start pushing back in every way we can: getting better educated on how these technologies can actually impact us, making the right buying decisions, making the right advocacy decisions, so that we can live in the world that we want to live in. So with that, I'll just say thank you.