The New York Times Debate: Is AI the New Plastic?
Transcription
00:02Thank you. Hello and welcome. I'm Sophie Lombard. I'm the founder and CEO of Kite Insight. I will be the
00:12curator of the New York Times Debate today.
00:15And before introducing my very distinguished co-moderator Ravi Mathieu from the New York Times, I just would like to
00:25take a few minutes to talk you through the format of this session.
00:30So at the risk of stating the obvious, this is not your usual panel. It's a debate, which means that
00:38there is a for side and an against side.
00:42And with a tricky motion like this one, the debate will call for an impressive mix of courage, intellectual agility,
00:53deep expertise, eloquence,
00:56and I would even go as far as suggest it will require a lot of empathy.
01:02As I hope you will see today, this timeless democratic format has never felt more urgent and more vital.
01:12Debate encourages respectful confrontation, nuanced disagreement, in an era of increased polarization and entrenched opinion,
01:24where, as anxiety-inducing as it might feel to say this,
01:34deep thinking and deep listening are increasingly under threat.
01:39So we are extremely grateful to our speakers who have willingly, mostly, agreed to show up today to lend their
01:48minds to this important question.
01:50And of course, thank you to you, the audience, as your part tonight will be extremely important, in particular your
01:58engagement.
02:00Now, as we delve into the world of AI, to try to unearth some of its most critical tensions,
02:09and as we draw on this really interesting comparison with the arrival of plastic in our lives,
02:15I would just like to remind you that the formality, or the intent behind the formality of this debate,
02:22is really to create a safe space for our debaters,
02:25who have bravely agreed to lend their voice and their expertise to this exercise,
02:31which means that in some cases, they might very well argue beyond what might be their own personal conviction and
02:38opinion.
02:38All this for the sake of the debate.
02:41So they will need all your encouragement,
02:44and we would really ask you to not hesitate during the debating moment,
02:52to feel free to clap, or to whoop, if they make a good point,
02:56to encourage them into the debates,
03:00and really to lean into the theatrics of the moment.
03:02We have asked the same of the debaters,
03:05so this is an invitation that is shared with you as well.
03:08Now, we are all very serious people,
03:11but I think on a Friday afternoon, we all like a little bit of drama.
03:15At least I do.
03:16And I suspect Ravi does too,
03:19which prompts me to introduce him.
03:23Ravi is the managing editor of DealBook for the New York Times,
03:27and he will come in a minute and set the scene for us.
03:31So please, everyone, just get ready.
03:34Join me in welcoming Ravi and his pumped-up debaters to the stage.
03:39Thank you.
03:58Four sides.
04:03Already a bad start.
04:04We didn't know the right side to go on.
04:05Okay, it's a well-oiled machine.
04:07Thank you, Sophie.
04:08Thank you, everyone.
04:09Good evening.
04:09Welcome to Paris.
04:11As Sophie said, my name is Ravi Mathieu,
04:13and I'm an editor at the New York Times.
04:15And I'm very pleased to be here for this debate
04:18that I'm told is going to be very energetic,
04:21very pumped up,
04:22both from those on stage,
04:24but more crucially from you,
04:26all you techies that love all this AI stuff.
04:28Now, the resolution is that AI is the new plastic.
04:33Now, you might be thinking this is a bit of a contentious issue or idea,
04:37but it's not really a yes or no question.
04:40I mean, to be clear,
04:41we're trying to not really get to the bottom of whether AI is all bad or all good,
04:48but using this analogy around how we treat plastic
04:52to figure out some of the big questions around AI.
04:55Of course, on the one side,
04:57its proponents talk about the great gain
05:00that it's going to bring to society,
05:02productivity,
05:04reducing mundane tasks for all of us in our daily lives,
05:08massive problems being solved from drug discovery
05:11to figuring out weather forecasts.
05:14But,
05:15and there's always a but when you're a journalist,
05:17because that's our job,
05:18but
05:19there are lots of big questions,
05:21and it's not just us asking them.
05:23I know all of you who are in this event at VivaTech
05:26are probably drinking the Kool-Aid of how great AI is,
05:30but there's some major questions,
05:32and some of those pioneers in AI are asking them.
05:37One I want to relate to you is Geoffrey Hinton,
05:39who was a really pioneering researcher at the University of Toronto
05:43around this concept of neural networks.
05:45He told one of my colleagues last year,
05:48it's a bit hard to see
05:49how you can prevent bad actors
05:52from using it to do bad things.
05:56Now,
05:56this opens up
05:58us to this debate.
06:00The question really is,
06:02is AI a game-changing tool
06:05and a new source of momentum
06:07in our businesses,
06:09in our daily lives,
06:11just as plastic was?
06:13Or,
06:14should we really be worrying about the trajectory
06:16we're on?
06:17Will it trigger various unintended impacts
06:21that could,
06:23like plastic,
06:24have a much longer and more negative tail
06:26for the way we live?
06:29Things like energy consumption,
06:31things like the relationship that we have as humans
06:33with technology.
06:37Now,
06:38it's only the beginning,
06:39both of the AI revolution,
06:41but also,
06:42more interestingly,
06:43for this debate
06:44on a Friday evening.
06:45Now,
06:46this all sounds very dramatic.
06:48We don't want to give up
06:49too much of the drama that's to come.
06:52There's a thing on there
06:53which might hint at some of the props
06:56that are going to be used.
06:57But I'm very excited,
06:58as I'm sure all of you are,
06:59to find out
07:02what points
07:02our debaters
07:03are going to be offering us
07:04both for and against this debate.
07:07But before we get there,
07:08a bit more housekeeping
07:08from my colleague Sophie.
07:10Sophie.
07:11Thank you, Ravi.
07:12Thank you.
07:13Thank you for that great setup
07:14and hopefully you left enough
07:16for the debaters to debate.
07:19So,
07:19the stakes are very high indeed.
07:21And the most important thing
07:23we need from you
07:24is in many ways
07:25to keep an open mind
07:26as the point of this debate
07:28is to challenge
07:29our own bias and assumptions
07:30and to see whether this debate
07:32might shift your opinion
07:33or you might learn something from it.
07:37So,
07:37let's get into the rules
07:39and listen carefully
07:40because it's a little bit involved.
07:41So,
07:42right there,
07:43in front of us,
07:44we have three very serious
07:45and intimidating jury members.
07:47Maybe you can wave.
07:48Nanette Lafond-Dufour,
07:51Stephen Jemison
07:52and Marcus Wagner.
07:54Yes.
07:55I'm liking the clapping.
07:56Even when unintended,
07:57it's great clapping.
07:58It's enthusiasm.
07:59You like that, right?
08:00Sophie?
08:00So,
08:01the jury is really here
08:02to listen carefully
08:03to what the debaters
08:05are going to say.
08:06They're going to come on stage
08:07and reflect on what they heard.
08:09They will hopefully pick up
08:11on the points
08:12that they found
08:13the most salient and interesting
08:13and they might provide
08:15the occasional challenge
08:16to our debaters
08:17so that they have something
08:18to consider
08:19when they come
08:20for their final retort.
08:21Now,
08:22debaters,
08:23you will each have
08:24three minutes precisely
08:25to make your arguments
08:27on the for side
08:28and the against side.
08:29You'll have a clock
08:30there that indicates
08:32how far you are
08:33from the end.
08:34You also have a gong
08:35that will tell you
08:36that you are at the end.
08:37So,
08:38no hugging of the mic,
08:39please.
08:39Ravi and I
08:40will be watching the clock
08:41very carefully.
08:42We will let you
08:44finish your sentence,
08:45however.
08:46After you have spoken,
08:48the jury will come on stage,
08:50they'll stand here,
08:51and they'll reflect
08:52on what you've said.
08:54They'll give you
08:55some food for thought,
08:56take good notes
08:57during that moment.
08:59Then you'll have
08:59two minutes
09:00to reflect among yourselves
09:01on what you heard,
09:03what the opposite side
09:04has said,
09:04what the jury member
09:05has said.
09:06Audience,
09:07at that point,
09:07you will have also
09:08the opportunity
09:08to reflect.
09:11Then you will have
09:13the opportunity
09:14to come for the
09:15final rebuttal.
09:16You come one
09:17or two
09:18or three of you
09:19for your final
09:21moment.
09:22This moment
09:22is usually
09:23very dramatic.
09:24The gloves
09:25come off,
09:26emotion runs high,
09:27and the debaters
09:28fight tooth and nail
09:30to clinch the victory.
09:31You'll have
09:32one and a half minutes
09:33to do so,
09:34to drive your point
09:36home.
09:37Then the audience,
09:38that's you.
09:39You're coming
09:39at that point.
09:41With a very
09:42technologically
09:43sophisticated,
09:44possibly AI-enabled
09:45clapometer,
09:46you will declare
09:47the winner.
09:49This is important.
09:50Jury,
09:51you're supposed
09:51to laugh at that point,
09:52but that's okay.
09:53That fell flat.
09:54Thank you.
09:57Audience,
09:57very important.
09:58We will ask you
09:59to not judge
10:00the debaters
10:01based on your
10:02own personal opinion.
10:03We'll ask you
10:04to judge the debaters
10:05based on the team
10:06that you felt
10:07has been the most
10:08compelling in arguing
10:09their side.
10:10What we want
10:11to celebrate here
10:11is the quality
10:12of the argumentation,
10:14the spirit
10:15that our debaters
10:15will put behind it.
10:17And yeah,
10:18I think that's
10:20going to be
10:21the run of show.
10:23So, Ravi,
10:24Is it time?
10:25Okay.
10:26So as Sophie
10:26has outlined,
10:27audience participation
10:28is crucial
10:29to this entire exercise.
10:31Okay?
10:31This is the new world
10:32of live journalism.
10:34Now,
10:35the clapometer
10:36she referred to
10:37is going to be
10:37crucial to deciding
10:38the future
10:39of these debaters
10:40and their bonuses
10:41at the end
10:42of this session.
10:44But crucially,
10:45you need to make sure
10:47that alongside clapping,
10:49some cheering
10:49would be great.
10:50Anything to give a sense
10:51how enthusiastic
10:52you are
10:53about both
10:54their telling
10:55of their stories
10:56and the motion
10:57at hand.
10:57Now,
10:58just to get a sense,
11:00warm us up,
11:01I'm not the warm-up act
11:02here,
11:02it's kind of a bit
11:03disconcerting.
11:04But to get a sense
11:06of the temperature,
11:07I'm going to ask you
11:07how you're feeling
11:08about the motion
11:09right now,
11:10and then we'll do
11:11a little bit
11:11of a sense check
11:12at the end.
11:12So,
11:14for those
11:16who agree
11:17with the motion
11:18that AI
11:19is the new plastic,
11:21please clap,
11:23make some noise now.
11:29Wow,
11:30that's a lot.
11:32At a tech event,
11:32that's really a lot.
11:34Okay,
11:34you've all been
11:34listening to Dario
11:36at Anthropic,
11:37I guess.
11:37Okay,
11:38now,
11:39for those of you
11:40who disagree
11:40and think actually
11:42AI is not
11:43the new plastic,
11:45see if you can
11:46drown those out
11:47now.
11:50Whoa.
11:52Somebody's got
11:52some hard work.
11:54Finally,
11:54any undecideds,
11:55make a lot of noise.
12:00I'm pretty sure
12:00you clapped three times,
12:02but that's all right.
12:03Okay,
12:04so,
12:04without further ado,
12:05we're going to get
12:06the first speakers
12:08from each side
12:09to come to the podium,
12:10please.
12:11And if you could
12:11please give them
12:12a warm round of applause.
12:15Thank you.
12:16Now,
12:17I'm going to introduce them.
12:19I think,
12:19oh,
12:19it's not there yet,
12:20but I'm going to introduce
12:21each of them to you.
12:22Now,
12:22in good
12:24journalistic
12:25New York Times style,
12:26alongside their
12:27official titles,
12:28we've looked
12:29and done some deep
12:29research to find
12:30some sort of
12:31hidden human talents
12:34that aren't
12:34on their biography.
12:35So,
12:35on this side,
12:37for the for side,
12:38we have Chirag Shah,
12:39who's a professor
12:40at the Information School
12:42at the University
12:42of Washington.
12:44Alongside
12:45his academic credentials,
12:47it turns out
12:48that during
12:48his graduate studies,
12:50Chirag wanted
12:51to become
12:51a professional
12:52ballroom dancer.
12:54But once
12:55he decided
12:55to propose
12:56to his girlfriend,
12:57he realized
12:57he needed
12:58a more steady job,
12:59hence academia
13:00and AI expertise.
13:04On the
13:05against side,
13:06we have
13:06Philippe Rambach,
13:08who is the
13:08chief AI officer
13:10at Schneider Electric.
13:12Philippe is the
13:13author of a book
13:13titled
13:14Maple 5
13:15in Classe Prépa,
13:17a French language
13:18textbook that is
13:19designed to support
13:20students in
13:22Classe Préparatoire.
13:23That's really hard
13:24for me to say,
13:24okay?
13:25So, that's like
13:26school class.
13:27I'll do this school.
13:27Anyways,
13:28without further ado,
13:29crack on, boys.
13:32Sorry?
13:32Okay.
13:35Is AI the new plastic?
13:37Hell yeah.
13:39AI is the new plastic
13:40because our relationship
13:42to it is proving
13:43to be just as
13:44short-termist.
13:45There isn't enough
13:47evidence to suggest
13:48that we are planning
13:49as a planet
13:50for AI's
13:51long-term risk.
13:53We are governed
13:54mostly, in fact,
13:56by what AI is doing now.
13:58Let me give you
13:58three stark examples
14:00of this dangerous
14:01parallel.
14:03First,
14:04the deployment
14:06rush mirrors
14:08plastic's
14:09early adoption.
14:11Just as plastic
14:12was rapidly
14:13integrated
14:14into everything
14:15from food packaging
14:16to furniture
14:17in the 1950s
14:19without understanding
14:19microplastics
14:20or ocean pollution,
14:22we are embedding
14:23AI into critical systems.
14:25Healthcare,
14:26finance,
14:27criminal justice systems,
14:29without fully
14:30grasping
14:31how these algorithms
14:32have long-term
14:34societal consequences.
14:36Companies are
14:37racing to deploy
14:38AI solutions
14:38because they work today.
14:40Not because we
14:41understand what happens
14:42when these systems
14:43scale globally
14:45over decades.
14:46Second,
14:47the waste problem
14:48is already emerging.
14:50Plastic gave us
14:51convenience,
14:52but created islands
14:53of garbage
14:54in our oceans.
14:56AI is creating
14:57mountains of
14:58digital waste,
14:59energy consumption
15:00that rival
15:01entire countries,
15:02discarded data sets
15:04filled with bias,
15:05and obsolete models
15:06that leave behind
15:07discriminatory patterns
15:08that remain embedded
15:09in our institutions.
15:11The environmental
15:12costs of training
15:13a large language model
15:15equals the lifetime
15:16emissions
15:16of several cars,
15:18and yet,
15:19we are churning out
15:20new models monthly.
15:22And third,
15:23the regulatory lag
15:24is identical.
15:26It took 70 years
15:27to start
15:28seriously regulating
15:30plastic.
15:31And even now,
15:32microplastics
15:33contaminate
15:33our bloodstreams
15:35while governments
15:36still debate
15:37solutions.
15:38With AI,
15:39we are seeing
15:40the same patterns.
15:41Algorithmic bias
15:42is already reshaping
15:43our hiring
15:43and lending.
15:45Deep fakes
15:46are destabilizing
15:47democracies,
15:48and automated systems
15:49are making life-altering
15:50decisions
15:51with little oversight.
15:53Yet,
15:53our regulatory frameworks
15:54are years behind
15:55the technology.
15:57The fundamental problem
15:58is that both plastic
15:59and AI offer
16:00immediate and tangible
16:02benefits that blind us
16:03to systemic risk.
16:05Plastic revolutionized
16:06food safety
16:07and manufacturing.
16:09AI is revolutionizing
16:10productivity
16:11and decision making,
16:12but in both cases,
16:13we are prioritizing
16:14short-term gains
16:15over long-term
16:16consequences.
16:18AI is the new plastic
16:20because we are making
16:22the same catastrophic
16:23mistakes,
16:25prioritizing immediate
16:26utility
16:26over generational
16:28consequences.
16:29And just like
16:30with plastic,
16:31by the time
16:31we fully understand
16:32what we have unleashed,
16:33it may be too late
16:34to contain it.
16:41Before the gong,
16:42well done,
16:42well done.
16:43Extra marks.
16:44Philippe.
16:44So,
16:46you spent two days
16:47in Vivatech,
16:48you have seen
16:49hundreds of beautiful
16:50AI applications,
16:52and you still wonder
16:52if AI and plastic
16:54compare?
16:55I can tell you
16:57it does not.
16:59So,
17:00AI,
17:01sorry,
17:01plastic,
17:02can be substituted
17:03with many much better,
17:04much more planet-friendly
17:06materials.
17:07But I can tell you
17:08that nothing
17:09can replace AI
17:11to fight climate change
17:12and to enable
17:13energy transition.
17:15Nothing on Earth.
17:17Let's start
17:18to make sure
17:18you are still all awake
17:19in your right
17:20state of mind
17:21with a show of hands.
17:22Who has stopped
17:23drinking sodas,
17:25that's good,
17:25but drinking sodas
17:26with plastic straws?
17:27If that's you,
17:28raise your left hand,
17:29left, left,
17:30left hand for the straws.
17:31Okay,
17:32very good,
17:32thank you.
17:33Keep your left up,
17:34keep your hand up.
17:35Who is using,
17:38who has started
17:38to use AI
17:39in your daily life
17:40to optimize your energy
17:41at home
17:42or at work?
17:43Okay,
17:44not many.
17:45So,
17:46why?
17:46Because probably
17:46you don't know
17:47how much AI
17:48can help
17:48with energy transition,
17:50how much AI
17:51can help
17:51with energy savings.
17:53Let's take a few
17:54basic examples
17:55at home.
17:56I'm sure
17:56that many of you,
17:57the ones which are
17:58really planet-friendly,
18:00planet-conscious,
18:01have installed
18:01solar panels
18:02on your roof,
18:03are driving
18:03an electrical vehicle.
18:05That's great
18:06and that's wonderful
18:07for the planet.
18:08But you can do
18:09much better.
18:10Just put a bit
18:11of AI on top of it.
18:13Not generative AI,
18:14simple AI,
18:15not lots of consumption.
18:16And then you can
18:17forecast your production,
18:18forecast your consumption,
18:19and optimize
18:20the way you use energy
18:22and avoid buying
18:23electricity from the grid
18:25when it is heavily
18:26carbonated
18:27and very expensive.
18:29And by doing that,
18:30you will truly help
18:31decarbonize the world.
18:33Another thing,
18:35what is the energy
18:35that truly
18:36does not impact
18:37the planet?
18:38The one you don't use.
18:40The one you don't need.
18:42And here,
18:43we can use AI again
18:44to optimize
18:45our processes
18:46to use less energy.
18:47This process can be
18:49heating and cooling
18:50a building,
18:51manufacturing stuff,
18:52driving a car,
18:53moving goods.
18:55With AI
18:55and with machine learning,
18:56we can build models
18:57and use those models
18:58to optimize.
19:00Let's take a room
19:00like this one.
19:02We have millions
19:03of rooms
19:04in hospitals,
19:05hotels,
19:06offices
19:07that use dumb
19:08room controllers,
19:10dumb thermostats.
19:11Just put a bit
19:12of AI in it.
19:13The thermostats
19:14will learn
19:14the thermal behavior
19:15of the room
19:16and then we'll be able
19:17to optimize the room
19:18and spend 20%,
19:1930% less energy.
19:22A dream:
19:22one day,
19:23the VivaTech temperature
19:24will be comfortable
19:25if they put some AI
19:27in their room controller.
19:29So all of that
19:30to say that
19:31if we use AI
19:32for the right thing,
19:33if we use the right AI,
19:34which is not always
19:35generative AI,
19:36AI will help us
19:37remove waste.
19:39AI will help us
19:40remove carbon
19:41where so much
19:42was brought into the world
19:43by plastic.
19:44Thank you.
19:47So,
19:49take your seats please.
19:50A very robust
19:51opening set of arguments.
19:53To summarize,
19:53Chirag,
19:54on the foreside,
19:55three things.
19:56One,
19:57we're prioritizing
19:58short term
19:58over the long term.
20:00We're not thinking
20:01about the long tail
20:02of what AI can cause.
20:03Two,
20:05the negative impacts
20:07are already happening
20:08in the short term,
20:08and you're seeing that
20:09in terms of energy.
20:11And three,
20:12regulators,
20:12just as they did
20:13with plastic,
20:14aren't keeping up
20:15with developments in AI.
20:16But a very forceful
20:17and direct response
20:18from Philippe
20:20was that all of that
20:21was nonsense
20:21and that ultimately
20:23the biggest issue
20:24we face
20:24is climate change.
20:25AI can allow us
20:27to address this
20:28in a way
20:28that nothing else can
20:29by optimizing processes.
20:31And he went
20:32for the easy win,
20:33the easy clap,
20:35mentioning that
20:35it's a little bit
20:36warm in here
20:36and some AI
20:37in this lovely expo
20:40might make things
20:40a bit more comfortable.
20:41So a very worthy
20:42opening set of debates.
20:44Well done, boys.
20:45Right.
20:46On to our next
20:47set of speakers.
20:49Please come to the stage
20:51for and against.
20:53Amr, that's you.
20:55Right.
20:56Now,
20:56carrying on with our
20:57weird facts
20:58that we found out
20:59about these two.
21:00Amr Awadala
21:01is the CEO
21:02and co-founder
21:03of Vectara.
21:04Now,
21:05what's his weird thing?
21:06Oh yes,
21:06this is what it is.
21:07He is a keen player
21:08of Halo.
21:10Not just because
21:11the game is great,
21:13but because
21:13his alter ego
21:15is great for his confidence
21:16as well as humility.
21:18A curious kind of
21:19combination of two qualities
21:20in one person.
21:22For the against side,
21:24we have
21:24Sidi Pai,
21:25who is the senior
21:26policy researcher
21:27on AI and labor markets
21:29at Interface,
21:30a Brussels-based think tank.
21:32Her special skill
21:33is the following.
21:35When she was nine years old,
21:37she published an article
21:38in the Times of India
21:39and was convinced
21:41that every newspaper reader
21:43in the country
21:44would know her personally
21:45and recognize her
21:46on the streets.
21:48Now,
21:48having written a few articles
21:49myself,
21:50I know this isn't true.
21:51So,
21:52unfortunately,
21:52that's not what happens.
21:53But,
21:54I'm sure your debate
21:55is going to be robust.
21:56Crack on.
21:57Thank you.
22:04Just a very small addition.
22:07The alter ego
22:08is Master Chief.
22:09Master Chief
22:09is the key, key hero
22:11in Halo.
22:12So,
22:13if any Master Chief fans
22:14out here,
22:15yes,
22:15thank you for the applause.
22:18I have another question
22:19I'm going to ask,
22:20which I also would like
22:21the answer to come
22:22in the form of applause.
22:24I'm going to ask you
22:25if you like
22:27a certain character
22:28and if you do like
22:29that character,
22:30please give me
22:30the highest,
22:31strongest form
22:33of applause
22:33you can give us.
22:35Who here likes
22:37Spider-Man?
22:43So,
22:44one of the most
22:44important lines
22:46and maybe
22:47the most popular lines
22:48that all of us
22:49are aware of,
22:50thanks to the genius
22:51of Stan Lee,
22:53is with great power
22:54comes
22:56great responsibility.
22:57And that's exactly
22:59what we are arguing for.
23:00Plastic
23:00was a great power
23:03and without capturing
23:04the responsibility
23:05that's needed
23:06to leverage that power,
23:07we humans
23:08screw it up.
23:09And history
23:10has proven
23:11that point
23:12over and over
23:14and over again.
23:16Plastic is not
23:17the first thing
23:19where we encounter that.
23:20Fire might be
23:22the first thing
23:22where we encounter that.
23:23And fire
23:24has so many benefits,
23:26yet we burned
23:27witches
23:28using fire.
23:31Dynamite
23:31was amazing
23:32for doing mining,
23:34yet we used it
23:35to kill
23:3510 million people
23:36in World War I.
23:38And the industrial revolution
23:40was amazing
23:41for creating machinery,
23:43yet we used it
23:44to kill
23:4580 million people
23:46in World War II,
23:48not to mention
23:48the nuclear bomb,
23:50which is amazing
23:50in terms of the energy
23:51it produces,
23:53but killed
23:53200,000 people
23:54like this,
23:56just like Thanos
23:57snapped his fingers
23:58in Avengers.
24:01So that's the key lesson
24:02that we need
24:03to learn here
24:04and that we all
24:04need to live with:
24:05yes,
24:06of course AI
24:07is super useful.
24:08I work,
24:08I'm the founder
24:09and CEO
24:09of an AI company,
24:10I believe in it 100%,
24:12in the same way
24:13that all of us
24:14believed in plastic
24:15being super useful
24:16when it came out.
24:18But we cannot
24:20put aside
24:21the responsibility
24:22of making sure
24:23that we don't repeat
24:25the mistakes
24:25of the past,
24:26that we don't let
24:27our governments,
24:28for example,
24:29leverage
24:29AI to do
24:30World War III
24:31and kill
24:32a billion people.
24:34And that's the main
24:35point of this
24:36entire debate
24:37is while AI
24:38has amazing,
24:40amazing benefits
24:41that we are all after
24:42and we all believe in
24:43so strongly,
24:45if we don't ask
24:46our governments
24:47to make sure
24:48they put the proper
24:49rules
24:51and agree with each other
24:52on these proper rules,
24:54then we might end up
24:55repeating the mistakes
24:56of the past again
24:57at a much larger scale,
24:59much bigger than
25:00climate change.
25:01This may be the end of us.
25:02Thank you.
25:10I'm going to start
25:11by thanking Amr
25:12for arguing on our side,
25:13because he got this right:
25:15AI and plastic
25:16both offer
25:17transformative benefits.
25:19But here's the crucial difference:
25:21the question of who decides.
25:24Plastics scaled globally
25:25from the 1915 onwards
25:27with zero civil society input,
25:29no environmental oversight
25:31and no democratic participation
25:32in its development.
25:34AI is developing
25:35in a fundamentally different world,
25:38one where civil society,
25:39academia
25:40and diverse stakeholders
25:41are actively shaping technology
25:43before irreversible damage
25:45is done.
25:46A strong example
25:47of this: when
25:48over 4,000 Google employees
25:50protested Project Maven,
25:51Google's Pentagon drone AI contract,
25:54Google canceled
25:55the contract entirely.
25:57While plastic faced
25:59scattered environmental concerns
26:01decades after mass deployment,
26:04AI is encountering
26:05systematic legal frameworks
26:07during its development phase.
26:09The EU AI Act,
26:10which began enforcement
26:11in February this year,
26:12has banned eight categories
26:14of harmful AI practices
26:16with fines up to 35 million euros
26:18or 7% of global revenue.
26:20AI systems
26:22that scrape facial images
26:23from the internet,
26:24like those used by Clearview AI,
26:26are now prohibited in the EU.
26:28The difference isn't that
26:30plastic had zero opposition,
26:32it's that AI faces
26:33mature governance initiatives
26:35in its critical developmental phase.
26:38With plastic,
26:39we allowed 50 plus years
26:41of expansion of single-use plastic
26:43and now we are trying
26:44to marry Kondo
26:459 billion tons of mess
26:46after the damage is done.
26:48With AI,
26:49we're still in the writer's room.
26:51As has been emphasized by Philippe,
26:54yes, current AI training
26:55uses energy,
26:56but we can cut
26:58data center emissions
27:00by 10 to 20%
27:01immediately
27:02through optimization.
27:04The very fact
27:05that Microsoft and Google's
27:07emissions increases
27:08are now making front-page news
27:10proves that democratic engagement
27:12is working
27:13and people are
27:14demanding accountability
27:16now more than ever.
27:18Plastic succeeded
27:19brilliantly at its job,
27:21it just did too much damage
27:23along the way
27:23because no one was watching.
27:25AI is succeeding
27:26at its job too,
27:28but this time,
27:29we are watching,
27:30we are participating,
27:32steering,
27:32and where necessary,
27:34we are pushing back.
27:35The choice isn't
27:37whether AI will transform society,
27:38it already is.
27:40The choice is
27:41whether we'll be smart enough
27:42to guide that transformation.
27:44We cannot
27:45and we will not
27:46make the same mistake twice.
27:48We have learned from the past.
27:49We need proactive assessment,
27:52diverse stakeholder engagement
27:53from day one,
27:54and international frameworks
27:56with actual enforcement power.
27:58We need ethical development
28:00that includes
28:01everyone at the table,
28:02not just tech companies,
28:04not just governments,
28:05but civil society,
28:06communities,
28:06and the people
28:07who will be affected
28:08by these technologies.
28:10This is our moment
28:11to prove
28:11we can do
28:12transformative technology right.
28:14AI isn't plastic's sequel,
28:16it's our redemption arc,
28:18and unlike plastic's creators,
28:20we should be building this together.
28:21Let's not fumble this time.
28:28Wow, that was powerful stuff
28:31right on the gong,
28:32and even the opponent
28:33was clapping that delivery.
28:34Now,
28:35so what do we learn
28:36from these two debating points,
28:40these talking points?
28:42Amr,
28:43the Master Chief,
28:44has quoted that great prophet-philosopher,
28:48Stan Lee,
28:49also behind Spider-Man,
28:51with great power comes great responsibility.
28:54The question is,
28:55are we making a mistake?
28:56Are we doing to AI what we've done
28:58with other harmful things
29:01like plastics,
29:03not dealing with the consequences
29:03seriously enough?
29:04Because it's existential,
29:06the significance
29:07of what it could cause
29:08and the damage it could cause
29:10needs to be addressed now.
29:12Sidi, however,
29:13says that is a false reading
29:16of the situation.
29:17Unlike with plastics,
29:19we're actually proceeding very quickly
29:21with regulation of AI,
29:23quoting the EU,
29:24taking some measures
29:24to do just that.
29:26The challenge is less
29:28about comparing it to plastics,
29:30but making sure we create regulation
29:32that incorporates multiple stakeholders
29:34and doesn't retard
29:37or constrict development
29:39of a transformative technology
29:41that will solve
29:42some of the biggest problems we face.
29:45Thanks, both.
29:47All right.
29:49Now to our final debaters
29:51of the evening.
29:53For the for side,
29:55we have Sasha Rubel.
29:56Please come to the podium, gang.
30:00Sasha Rubel is the head
30:01of Generative AI Public Policy EMEA
30:04at Amazon Web Services.
30:06That's one heck of a job title.
30:10She's French,
30:11but she sounds American,
30:12she tells me.
30:13Sasha also, though,
30:15takes part in motorcycle races
30:16in the Sahara,
30:17and she sings in a band
30:19that mixes Afrobeat
30:20and country music.
30:22So maybe there'll be
30:24some weird song thing
30:25in the middle of her allocution.
30:27On the against side, though,
30:29we have Shingai Manjengwa,
30:32who is the Senior Director,
30:33Education and Development,
30:35Talent and Ecosystem
30:36at Mila.
30:38Now, that definitely wins the title
30:40of the longest job title
30:41of this entire panel discussion.
30:43Maybe you could shorten that.
30:44An editor could do that.
30:48Shingai's special talent,
30:49she bungee jumped
30:50from Bloukrans Bridge.
30:53Have I pronounced that correctly?
30:54Not great, but that's all right.
30:56Which is the highest commercial jump
30:58in the world
30:58at 216 meters,
31:00which sounds mad.
31:01All right.
31:02Go for it.
31:07So first, I'd like to thank Sidi
31:09for arguing, in fact,
31:10for our side of this debate,
31:13because we are arguing
31:14that AI is indeed
31:16the new plastic,
31:17because it frames the debate
31:18not just about progress,
31:20but about the responsibility
31:21and the global cooperation
31:23that's needed
31:24in order to harness
31:25the opportunities,
31:26but also address the risks.
31:28The real question,
31:29as we've heard already,
31:30isn't whether AI,
31:31like plastic,
31:32will change the world.
31:33It's about how we are engaged
31:35and how we are able to actively
31:38and proactively shape our response
31:40to this opportunity.
31:42In the same way
31:43that plastic represented
31:44an unbelievable opportunity
31:46to make advances
31:48in the circular economy,
31:49there's enough investment
31:51and regulatory attention
31:52and democratic participation,
31:54as Citi underlined,
31:56to make sure that we learn
31:57from history
31:58and in this plastic moment
32:00of AI,
32:00make sure that we take forward
32:02the opportunities
32:03that this moment represents.
32:05This analogy also helps us understand
32:08not only the urgency,
32:10but the scale,
32:11the promise,
32:12and the risks
32:13that this technological revolution
32:14we're living through represents,
32:16but also the need
32:17to act quickly,
32:19to think locally,
32:20and to team up globally
32:21so that we can indeed benefit
32:23from this moment.
32:25The analogy sharpens
32:26with plastic's historical trajectory,
32:29as many already mentioned.
32:31Plastic once was seen
32:32as a miracle material.
32:34Like AI,
32:35plastic was not just one thing,
32:37and like AI,
32:38plastic was everywhere.
32:40But society reacted
32:42in a very similar way
32:43at the beginning of plastic,
32:45which was excitement.
32:46It allowed for many people,
32:47including women,
32:48to focus on other,
32:49more interesting things
32:51than taking care
32:52of domestic challenges.
32:53Plastic liberated
32:54that free time
32:55so that they could focus
32:56on different aspirations,
32:58similarly to the ways
32:59in which AI is increasing
33:01productivity and efficiency,
33:04but people also reacted
33:05to plastic in the same way
33:07that they're reacting to AI,
33:09which is caution and questions
33:11about what this represents
33:12for the future of humanity.
33:15Like plastic,
33:16concerns around AI
33:17are mounting around energy,
33:19but also risks related
33:20to misinformation
33:21and the future of work
33:23and the future of manufacturing.
33:26This isn't just a metaphor,
33:28it's a call to action
33:29for global cooperation,
33:30and this is already happening.
33:32Governments are working
33:33on regulatory frameworks,
33:35civil society are engaging
33:36with researchers
33:37to think through the future
33:39of transparency
33:40and explainability,
33:41and innovators are already working
33:43with standards organizations
33:44on questions related
33:46to sustainable AI practices.
33:48But, similarly to plastic,
33:50there's a lot at stake.
33:52We need to make sure
33:53that we understand
33:54that this is not just
33:55a technical or economic decision,
33:57but a defining moment
33:59and a moral and political one
34:01to come together
34:02as a global community
34:03and think through the safe,
34:05sustainable,
34:05and human-centered future
34:07that we can all co-design.
34:09Thank you.
34:22All right.
34:23I just want to remind you
34:25before I start,
34:27this is the topic of the debate.
34:30AI is the new plastic,
34:32and our team fundamentally rejects
34:35the premise.
34:36AI is not the new plastic,
34:38it's the old electricity.
34:40There's actually a better analogy
34:42to explain what we're seeing,
34:44and there's a strong
34:45temporal component
34:47to how technologies evolve.
34:49So first, think about it.
34:52There was a time
34:53where in the early days
34:55of electricity,
34:56it was exposed copper wires,
34:59installers were not trained,
35:01we didn't control the voltages
35:02coming through plugs,
35:03and yes,
35:05many people lost their lives
35:06because of a dangerous technology
35:08like electricity.
35:10But today,
35:11voltage is controlled.
35:13Electricians are trained,
35:15and it's a regulated profession.
35:17Appliances are labeled for safety,
35:20and little plug holes
35:22are made so small
35:23that curious fingers
35:25don't get stuck in them.
35:27So we have seen this movie before:
35:29faced with a powerful new technology
35:31that was potentially very dangerous,
35:33we were able to come together
35:36on the journey
35:36to coordinate and collaborate
35:38and come up with helpful regulations
35:40so that we could all use it.
35:42We are in the exposed copper wire phase
35:45of artificial intelligence right now,
35:48and we will get there
35:49because like electricity
35:51and unlike plastic,
35:54we have established good standards
35:57for safety together.
35:59Plastic was and is useful,
36:01but I want to assure you,
36:03President Macron
36:03and the new Minister of AI from Canada
36:06would not have come all the way here
36:08for the Viva Polyester Conference.
36:10They are here
36:11because AI is special.
36:14It is a special new technology
36:15that offers a lot.
36:17So,
36:18a transformative technology
36:20that makes us like gods,
36:22that brings light
36:23where there is darkness,
36:25a transformative technology
36:27that if not handled correctly
36:29can kill us,
36:30like electricity,
36:31a tech that threatened
36:33the jobs of candle makers
36:35who must have been very concerned
36:37about this new technology
36:38taking their livelihoods,
36:40a tool that our children use,
36:43like electrical plugs,
36:45we can send them to go switch on
36:46and switch off the light.
36:47We have seen this movie before
36:50and it is on us
36:51to write the next story
36:53about artificial intelligence.
36:55We did it once
36:56and we can do it again.
36:57The opposing side here
36:59is arguing
37:00that you are unable to change.
37:03You are unable
37:04to be part of the agency
37:06that is required
37:07to harness the power of electricity,
37:09sorry, the power of AI.
37:10They are betting against you.
37:13We are betting
37:15that you are here
37:16and you will stand with us
37:18to try and make AI safer.
37:20There is light
37:22at the end of the tunnel.
37:30Well, well, well.
37:33Rousing stuff again.
37:34Now, these two
37:35clearly have been
37:37in Oxford Union-style
37:39parliamentary debates before
37:40because both of them
37:41did a clever tactic
37:42that debaters always do.
37:44Sasha, what did she do?
37:45She took the words
37:46of one of her opponents
37:47as proof of her argument,
37:49saying that Sintai's suggestion
37:51that regulation
37:53is already happening,
37:54maybe at a too early stage,
37:56was evidence
37:56that actually
37:58the dangers
37:59that AI could be
38:00the new plastic
38:01are very much real.
38:05Shingai used
38:06a different debating tactic
38:08by suggesting
38:09the topic we're debating
38:10is a nonsensical non-starter
38:12and totally missing
38:13the real point,
38:14which is the following,
38:15that just like the Luddites,
38:17just like those who warn
38:19that every technological advancement
38:21is going to destroy the world,
38:22once again,
38:23we're all wrong.
38:25And echoing
38:27an emotive tactic
38:28of one of her colleagues,
38:29she appealed directly to you.
38:32First of all,
38:33she came up with
38:34the word of the day,
38:36which is
38:36Viva Polyester.
38:37I think that's great.
38:39But secondly,
38:40she said
38:41they're making a bet on you,
38:43that you guys are smart enough
38:44to make the right decision
38:46to harness the power of AI for good
38:48rather than let it do evil.
38:50So,
38:51some very impressive tactics
38:53and savvy strategy
38:54on both sides
38:54to end the debate.
38:56Now,
38:57this is the point
38:58at which we bring up
38:59our jury members.
39:00Now,
39:01just to remind you,
39:03each one of them
39:03will have about
39:04two and a half minutes
39:05to offer their views
39:06on the debate,
39:07how they view
39:07the arguments for and against.
39:10And,
39:10I would encourage you
39:12to make sure
39:12you watch them closely
39:13and encourage the judges
39:15to watch the clock
39:16because Sophie and I will be.
39:17So,
39:17please come up,
39:18Stephen,
39:19Nanette,
39:19and Marcus.
39:28Actually,
39:29this is Stephen Jamison,
39:30head of product marketing at SAP
39:31because you probably don't know
39:32who each of them are.
39:34Good afternoon.
39:35Hello.
39:35And,
39:35thank you
39:36for a very vigorous
39:37and inspiring debate
39:39so far.
39:40So,
39:41I'm Stephen.
39:41I lead
39:42sustainability products marketing
39:44at SAP.
39:45I have a background
39:46in plastics
39:46and how we solve
39:48the plastics challenge
39:49and also AI
39:51where I lead the topic
39:52for not just SAP
39:53but for the global business community
39:55that we serve.
39:56So,
39:56a few thoughts
39:58or thinking elements
40:00that kind of come out
40:01of that debate.
40:02Really solid arguments
40:04and evidence
40:07from both sides
40:08on the opportunity
40:09that is ahead of us
40:11because regulation
40:12is happening,
40:13we're in this environment
40:14where we're able
40:15to start to build
40:17this regulatory landscape.
40:19Have we learned
40:20from plastic though?
40:22I've been participating
40:23personally,
40:24representing the business
40:25coalition,
40:25in the global plastics
40:26treaty negotiations
40:28and after being in Ottawa,
40:30after being in Korea,
40:32I can tell you
40:32that we have not learned.
40:34There is a huge divisive
40:37spread of opinion
40:38in the global community
40:39about how we solve
40:40plastic right now.
40:41We might reach
40:43a conclusion
40:44and an agreement
40:44in Geneva
40:46in August
40:47and hopefully we do
40:49but there are so many challenges
40:50about the problem
40:51of plastics
40:52in terms of unintended consequences,
40:54in terms of the proliferation
40:55of innovation,
40:55in terms of the difficulties
40:56of recyclability,
40:58in terms of disenfranchised communities
41:00and the people
41:01that are cut out.
41:02All of these challenges
41:03we see in the system
41:05of AI
41:06and I'm not sure
41:07I heard all of those topics
41:08coming out
41:09in terms of how
41:10you address it.
41:11Then we look at
41:12the opportunity
41:13that we have
41:14in front of us.
41:15Let's look at plastics.
41:17Huge opportunity
41:18in the AI space
41:19to change that agenda.
41:20I work across
41:21the business ecosystem.
41:22Huge opportunities
41:23to inject AI
41:25into how we design,
41:26how we design
41:27for recyclability
41:28in terms of how
41:29we discover
41:29new material innovations,
41:31in terms of how
41:31we eventually use AI
41:33to deal with things
41:34like nanotechnology
41:35and innovations
41:36around material sciences
41:37to get out
41:37of the plastic problem.
41:38We're not there yet
41:40but there's a huge application
41:42in terms of how
41:42we close the loop,
41:43how we move
41:45to a regenerative society
41:46and a regenerative economy
41:48and so huge opportunity
41:50areas there
41:50I think in the argumentation
41:52as to how we can go
41:53on that journey
41:54and help to transition
41:55not just business
41:57in how they do it
41:58but also using
41:59that technology
41:59to convince
42:00the policy makers
42:01who only move
42:03at the pace
42:03of the slowest actor
42:05in how we can speed them up.
42:07So I think that's
42:08the opportunity
42:08in this debate.
42:10Thank you, Stephen.
42:12Right.
42:13Good timing as well.
42:14So next,
42:15Nanette Lafond-Dufour
42:16who's the chief
42:17impact officer
42:18of Publicis.
42:18Thank you
42:19and delighted to be here
42:21and thank you.
42:21It was incredibly stimulating
42:23and thought-provoking
42:24and a couple of the things
42:26that I really liked
42:27were the focus
42:28on the responsibility
42:29and the governance
42:32and I also very much appreciated
42:35the analysis
42:37or the way that it was compared
42:41to electricity.
42:43I think the question
42:45for me is
42:46there was a lot
42:47of discussion
42:47around possibility
42:49but what is the probability
42:50and I think that's
42:52what we need to think about
42:53because there are
42:54all kinds of possibilities
42:56but the probability
42:57is where we need
42:59to consider
43:00what's going to happen
43:02to human beings
43:03in the future
43:04because for me,
43:06AI is a tool
43:07for humans.
43:08The difference
43:09with plastic
43:09is plastic
43:10was a convenience
43:11it was served up
43:13to people
43:14there was nothing
43:14they had to do.
43:16This is a tool
43:17there is a responsibility
43:19of people
43:20to use AI
43:23in a way
43:24that is responsible
43:26and not a burden
43:27and that means
43:27that there needs
43:28to be education
43:29there needs
43:30to be accessibility
43:31for all parties
43:33and historically
43:34we have seen
43:35that technology
43:36has created
43:37the great divide
43:38and how do we ensure
43:40that this does not
43:42perpetuate that
43:43so thank you.
43:45Great.
43:47Thank you.
43:48Fascinating.
43:49Finally
43:49Marcus Wagner
43:50founder of
43:52i5invest.
43:53Marcus.
43:54So I'm a tech
43:55entrepreneur
43:56and an investor
43:58so I'm slightly biased
44:01that to be said
44:02I spent many
44:04years in the
44:05Silicon Valley
44:05this comes with
44:06a very positive
44:07attitude
44:08for the world
44:09so we can fix
44:09everything
44:10but I'm also
44:11European
44:12which comes
44:13with a more
44:14concerned
44:15attitude
44:16I really like
44:17the way
44:17you sketched
44:19out our
44:19Terminator future
44:21versus our
44:22Star Trek future
44:23where
44:24on the one hand
44:26AI is going
44:27to kill us
44:28or wants to
44:29kill us
44:29and on the other
44:29hand AI is going
44:30to help us
44:31and help us
44:32to become
44:32the better humans
44:34I was impressed
44:35by the arguments
44:36around comparing
44:38it to electricity
44:40and how that changed
44:41I'm an electrical
44:42engineer by education,
44:44so I like that,
44:46and how we can
44:47save energy
44:49and maybe
44:50take the other
44:50like big challenges
44:51we face as humanity
44:53and that humans
44:54have the capacity
44:56to learn from the past
44:57as well
44:58What really
44:59struck me
45:00on this side
45:00is that you all
45:02have an extensive
45:03background in AI,
45:04or knowledge of it,
45:07and many of you
45:09are in the
45:09control room of AI,
45:12influencing
45:13or steering it,
45:14and if you
45:16the ones
45:17who control it
45:18are scared
45:21how can we not
45:22be scared
45:23right
45:26it's an academic
45:27discussion in a way
45:28but also like
45:29I think time is too short
45:31to not think about
45:31like what are the solutions
45:33or what can be done
45:34in real life
45:36might there already be
45:37solutions
45:38might it be nuclear
45:40for energy
45:41might it be a kill switch
45:42for the AI
45:44that goes out
45:45of control
45:47and do we have
45:48even time to discuss
45:49or is it actually
45:50time to act now
45:51already
45:52so I'm curious
45:54to see
45:56and learn
45:56and conclude
45:57if you're going
45:58to die
45:58of AI
45:59or if you're going
46:01to die
46:01of diseases
46:02climate
46:02and wars
46:04and what the implications
46:05of AI
46:06might be to
46:07democracies
46:08and also maybe
46:09wars
46:10and defense
46:12and what do you
46:14think about it
46:15thank you
46:16fantastic
46:17thank you Marcus
46:18so we've got
46:20death
46:22opportunity
46:23big crises
46:24big solutions
46:25okay
46:26thanks so much folks
46:27now
46:29I think
46:30this is probably
46:31the moment
46:31when Sophie
46:32comes back
46:32on stage
46:33with me
46:33yes
46:34seamless
46:34brilliant
46:36okay
46:36so now
46:37we're in a
46:38fun stage
46:39where
46:40the teams
46:42will have
46:42two minutes
46:42each to confer
46:43and come up
46:44with their
46:45rebuttal
46:46correct Sophie
46:47that's right
46:47And you,
46:49while they're
46:50thinking about
46:50the rebuttal,
46:51you, the audience,
46:52are invited
46:52to confer
46:53among yourselves.
46:54Imagine that
46:55you had the
46:56unenviable task
46:57of being
46:58on this stage,
46:59having to deliver
47:00your final rebuttal.
47:01what might you say
47:02for your final
47:03one and a half
47:04minutes
47:04so we'll have
47:05a few minutes
47:06of music
47:06where you can
47:08confer
47:08and the audience
47:09is invited
47:10to do the same
47:12so we'll have
47:13go on
47:13again
47:19so we'll be on
47:20yeah
51:29technologies that we build. And we've seen evidence in my colleague Citi's talk of how we've been able
51:35to do that currently. This is a timing discussion. With the advent of plastic in the 50s, people were
51:42not talking about regulating plastic. But now in the developmental phase of AI, we are talking about
51:48regulating AI. That's a fundamental difference. We believe in people. We believe in ourselves to
51:53be part of the co-authors and agents and have agency in the AI story. And we believe fundamentally
51:59that if we could all change our behaviors, starting to use straws less, starting to drive electric
52:05vehicles, starting to optimize hot buildings so that we regulate the temperature and use less energy,
52:11if we have all been able to do that and enact laws that make it possible for different technologies
52:16to be safer, then yes, absolutely, the AI story is not written. We are writing it. And no, AI is
52:24not the new plastic. All right. For side. There you go. So our opponents think that you're not
52:35thinking very clearly, because they're trying to turn this into AI versus not AI. We are not against AI. We are very
52:44much pro AI. We love AI. We think that it can solve all kinds of problems, including environmental
52:49problems. We never argued against it. The question here, the debate, is whether AI is the new plastic. And the
52:56reason we believe and we strongly support and we've shown through our arguments that yes, it is
53:01because like plastic, it offers opportunities, but also problems that we need to address.
53:10And the electricity analogy doesn't hold because electricity is controlled. Electricity comes from a
53:16power station that governments can control versus AI. A kid at their dorm room could be running AI.
53:23Second, if you visit Egypt, my home country, we actually have wires hanging everywhere and people
53:27die from electricity all the time. So that problem is not solved. Actually, that's not true in many,
53:31many countries across the world. Last but not least, you have no say in how governments are using AI for
53:38war.
53:39And that's the real danger. None of the governments, the third world countries would agree
53:43on how to regulate the use of AI in war. And that can be the end of us. It's not
53:47the AI itself. It's how the
53:50governments will abuse it.
53:51Your opponents are trying to make you believe that we are arguing that you as citizens are unable to have
53:56a voice
53:57in shaping the future of technology, whether that be plastic or AI. Our very clear argument is that AI can
54:04actually be
54:04used as a tool to empower public participation and democratic participation around the world in global
54:11cooperation to shape a future that remains tech-enabled, but is actually human-centered. And that this
54:18technology represents an opportunity based on responsibility that will drive trust, that is the blocker, that will
54:24drive adoption, that will then drive innovation so that this bottle that my colleague is holding up will not be
54:30something wasted, but
54:32something reusable and sustainable and inclusive. Time, time, time, time. Right. So, just to summarize very quickly, the
54:41against: technology must not be the enemy, especially at this developmental stage of AI. The fors: they don't think
54:48technology is the enemy. They do trust you. But not all the actors around AI can be trusted like you
54:54can.
54:56Great. So, moment of truth. I told you the gloves would come off at this stage. This is now you,
55:01the audience, that is
55:02going to have to make the vote, the final vote. Remember, keep your own opinion aside for the sake of
55:09this debate. And we
55:11really are going to be judging which team put the best fight for their side. So, is AI the new
55:17plastic? This is the for team. This is the
55:20against team. Ready? Now, if you think that the for team won the debate, clap now.
55:48Well, I hope you have a good support team in the room. Ready? If you think the against team won
55:54the debate, clap now.
56:13That's clapping with more noise. Yes. The whistling was a genius intervention. I got to say, well done. Well done,
56:20guys.
56:22Yeah. So, what's the call, Sophie? It's a bit tight, isn't it? Yeah. Do you think there's a rematch in
56:27there? I think it might be a rematch.
56:29Okay. Do you think we need to kind of... Shall we say no double counting, no double clapping, just to
56:34eat the... Yes. But do we keep the noise?
56:36Do we just go to hands? I mean, we need noise. Okay. Sorry. Sorry. Okay. I think the noise is
56:42good. Okay. The noise is good. Just more noise.
56:45Okay. Right. So, imagine you're at a Beyonce concert. Okay. So, we want some noise. Okay. So, you've got to
56:50really raise up. Raise it up.
56:52You want to do the round two? Maybe you'd be more decisive. Really? Gosh. Okay. All right. All of those
57:00for the for side.
57:14Wow. I don't think AI can do that. Not yet, anyways. All right. Okay. You call the other side, Sophie.
57:25Thank you. Ready? For the against.
57:29Yeah.
57:38Okay. Very quickly. We just need to confer. All right. I think we have a winner. Sophie, you can announce
57:49the winner.
57:49Well, I think the for side has it. But... But... But I think the against side won the heart of
57:57this room because there's many more people clapping for the against side than they did at the beginning. So...
58:04Well done. Well done, everybody.
58:06Yeah. So, yeah. You can hug now. A big round of applause for our debaters.
58:19A big round of applause for our jury members.
58:24And to you, Ravi, for your masterful moderation. I love the summarizing at the end.
58:29To the New York Times and Viva Tech for hosting this amazing debate.
58:33And to you, the audience, on a Friday afternoon to bring so much spirit and engagement to this debate. So,
58:39thank you.
58:40Absolutely. Thank you so much. I just wanted to say thank you on behalf of the Times. This kind of
58:45challenging of ideas and perspectives is what good journalism is about.
58:49So, support things like the New York Times and other serious journalism.
58:53Thank you. Thank you. Thank you.