A conversation with Elon Musk
00:00:02I'm sorry, but Elon Musk could not join us.
00:00:07So, instead of Elon Musk, who was held up by other meetings,
00:00:15I have his clone.
00:00:17Elon Musk !
00:00:31Elon, thank you.
00:00:33Thank you.
00:00:36Hey, guys.
00:00:47Elon, you have to know that no one in Paris believed that you would be in person.
00:00:53They all said, no, he's not going to come.
00:00:57He was already at Versailles for Choose France, so he's not going to come, and it will be a video.
00:01:04So, is it real?
00:01:06Are you real?
00:01:07I'm a hologram.
00:01:09Ah, yes.
00:01:11It is a hologram.
00:01:13Or an android.
00:01:16Elon, it is a pleasure and an honor to have you.
00:01:21So, thank you for making the trip.
00:01:23I know that this has not been very easy.
00:01:25And I know also that, for the first time in my life, I will interview somebody who is an introvert.
00:01:35Yes.
00:01:35Yes, you told me.
00:01:37I started out that way.
00:01:38Yeah, shy and introverted.
00:01:41So, I'm asking the audience to be a little bit gentle with Elon,
00:01:46because he's so shy that he needs your support.
00:01:50And on top of that, he's speaking in front of his mother.
00:02:04So, you can imagine how difficult it is.
00:02:08Maye, stand up.
00:02:12Yeah.
00:02:21So, as you have seen, since we have announced your participation,
00:02:27it has been quite something, because everyone wanted to be here.
00:02:34So, we had to change, we had a very modest room with only 1,500 people.
00:02:41Okay.
00:02:42And we had to move here.
00:02:45And you have to know that the Dôme de Paris is the venue where the musicals are staged.
00:02:51Yeah.
00:02:51So, we are not expecting you to sing or to dance, but if you can do it, we would be
00:02:58very happy.
00:03:00Yeah.
00:03:03But you also have to know that this is the room where Steve Jobs came every time he was
00:03:13doing a keynote
00:03:15or presenting a product; it was here.
00:03:18Okay.
00:03:18So, I hope that you like the symbol.
00:03:20Sure.
00:03:21And you don't need any introduction.
00:03:31Your name is a brand.
00:03:34It's already a brand.
00:03:36It's a brand for innovation, for ambition, for...
00:03:42A perfume?
00:03:45Yes, yes.
00:03:47This is the agreement that you made with the Arnault family a few hours ago.
00:03:53I mean, really, it brands itself.
00:03:56Yeah.
00:03:57And you are the origin of PayPal, Tesla, SpaceX, to name a few, and even OpenAI.
00:04:04So, you love taking risks, and you always go against the tide and popular wisdom.
00:04:20You have always been proven right.
00:04:24Now, there is the 44-billion-U.S.-dollar question, which is: will you still be right
00:04:40with Twitter?
00:04:41Sure.
00:04:44It was expensive.
00:04:46Yeah.
00:04:47You agree?
00:04:48I don't know if, you know, listen, if I'm so smart, why did I pay so much for Twitter then?
00:05:03So, as I'm not a journalist, I'm not trying to get headlines, to provoke responses, or to get
00:05:12a scoop.
00:05:13But, nevertheless, if you wish to do that, it is authorized.
00:05:19Okay, great.
00:05:19So, we are expecting that you will really make the show, because everyone comes here to see you, to listen
00:05:28to you, and to get some of the magic that you have.
00:05:35All right.
00:05:36Well, I'm honored that you want to listen to what I have to say.
00:05:40It's great to see the crowd, and, I don't know, you guys seem awesome, so.
00:05:45But you told me that you would like, you said that you would like to speak in French.
00:05:53Oh, my goodness.
00:05:56Zut alors.
00:06:00Bonjour, Paris.
00:06:02Bonjour, Paris.
00:06:06So, there are some people who believe that you are a genius, and there are some who believe that you
00:06:13are evil.
00:06:14I mean, you can be both.
00:06:17I mean, you could be an evil genius that's not, you know.
00:06:20So, you will tell us what you are, or you will let the people draw their own conclusion?
00:06:27I am definitely not evil.
00:06:31What's that?
00:06:33Tell me.
00:06:38What is that thing that you have above your...
00:06:41Oh, yes.
00:06:41If you look carefully, you can see an angel's halo on my head, and the wings.
00:06:45It's subtle, but...
00:06:47Where are the wings?
00:06:48Yeah, they're difficult to see, but if you look carefully, you know, they're right there.
00:06:51Yeah, yeah.
00:06:51Small wings.
00:06:52Yes.
00:06:55So, yeah, hopefully not evil.
00:06:57Aspirationally not evil.
00:07:01So, yeah.
00:07:04Okay.
00:07:06So...
00:07:06That wasn't convincing.
00:07:07You know, you have launched a lot of ventures.
00:07:11You have created a lot of companies.
00:07:13The most important question for everyone is, what the hell is driving you?
00:07:20Why are you so obsessed with new ventures, new creations, new things to do?
00:07:30Yeah, crystal meth is the answer.
00:07:34If you think Red Bull gives you wings.
00:07:39So...
00:07:40Man, that quote's going to probably sting.
00:07:46So, yeah, just kidding for the record.
00:07:52So, well, I think there's...
00:07:55The companies still have a lot to do for their core mission.
00:08:00The...
00:08:00You know, for electric vehicles, sustainable energy,
00:08:03still less than 1% of the global fleet is electric.
00:08:08So, you've got about 2 billion cars and trucks on the road,
00:08:12but still less than 20 million are electric at this point.
00:08:16So, there's still a long way to go for sustainable energy,
00:08:20for sustainable energy generation.
00:08:23So, this...
00:08:23You know, the Tesla mission, I think we've made a lot of progress,
00:08:27but there's still a lot more ahead.
00:08:31Then SpaceX, the goal is...
00:08:35It's a big goal, but it's...
00:08:37We want to try to make life multi-planetary,
00:08:41to extend life beyond Earth.
00:08:43And I think this is important for a number of reasons,
00:08:49but there's the sort of defensive reason
00:08:51of ensuring that the light of consciousness does not go out.
00:08:55And if I may send these questions...
00:08:57If I'm going on too long, you feel free to interrupt me, but...
00:09:00No, no, you can.
00:09:02Okay.
00:09:03You can't be long.
00:09:03Okay, so...
00:09:07You know, people do ask me, you know,
00:09:09have I seen UFOs and aliens and that kind of thing?
00:09:13And I haven't.
00:09:15And I think I would have seen them by now.
00:09:19So, it appears that we might...
00:09:22We might be the only consciousness,
00:09:26at least in this galaxy,
00:09:27and if so, that's kind of a scary prospect
00:09:31because it means that the light of consciousness
00:09:34is like a tiny candle in a vast darkness.
00:09:39And we should do everything we can
00:09:40to prevent that candle from going out.
00:09:44So, yeah.
00:09:50And so, some of the things...
00:09:53So, that means, obviously,
00:09:54taking the actions to ensure that Earth is good,
00:09:57that Earth is safe and secure for civilization.
00:10:01And I think it also means extending life beyond Earth
00:10:06to other planets in the solar system
00:10:08and ultimately to other star systems.
00:10:12And I think that's both a sort of defense
00:10:14of the light of consciousness
00:10:16and also, I think, a point of inspiration
00:10:20because life cannot just be about solving
00:10:25one problem after another.
00:10:27We need things that inspire us.
00:10:30I mean, we need things that move our hearts
00:10:35and that when you wake up in the morning,
00:10:38you're excited to be alive.
00:10:39and being a space-faring civilization
00:10:43and making true the things that we see
00:10:46in the good science fiction movies,
00:10:49this is one of the things
00:10:50that I think can inspire all of humanity.
00:10:52Just like when the astronauts went to the moon in 69,
00:10:59it was something that...
00:11:01I mean, they said, for all mankind, you know,
00:11:03and it really was something you could say
00:11:04to any human on Earth,
00:11:06what's like the most amazing thing
00:11:09that humanity has ever done.
00:11:11At least one of those things would be
00:11:12we went to the moon, you know.
00:11:15And so you want to have these inspiring things
00:11:17that make you excited to be alive
00:11:19and excited about the future.
00:11:23Yeah.
00:11:24And you had those thoughts and dreams
00:11:29when you were a kid
00:11:30or this came much later on?
00:11:35Well, I didn't think I would be doing these things as a kid.
00:11:39That's for sure.
00:11:40I was interested in technology.
00:11:42I read a lot of books.
00:11:46So I was obviously interested in...
00:11:49I mean, this is hardly going to be surprising.
00:11:50I was interested in science fiction and technology.
00:11:53You have to tell the truth
00:11:54because there is someone who is listening to you, huh?
00:11:56Yeah, my mom's right there.
00:11:57She can call me out on this if it's not accurate.
00:12:01But so I guess the thing that was maybe most significant
00:12:11from a philosophical standpoint
00:12:12was that when I was about maybe 12 or 13,
00:12:16I had somewhat of an existential crisis
00:12:18where I was like, what is the meaning of life?
00:12:22Is life just meaningless?
00:12:24Why are we here?
00:12:25What does it all mean?
00:12:28And I read a lot of books on religion and philosophy.
00:12:32And then ultimately, I read this book,
00:12:37Hitchhiker's Guide to the Galaxy, which is great.
00:12:40And in that book, that book is really a philosophy book
00:12:43that's disguised as humor.
00:12:45And the point that Douglas Adams makes
00:12:48is that the real difficulty is understanding
00:12:52what questions to ask about the answer that is the universe.
00:12:56And that really we want to have...
00:13:03It's essentially...
00:13:04It's like a philosophy of curiosity.
00:13:08of saying, well, what can we do to find out more
00:13:11about the nature of the universe
00:13:13and the meaning of life?
00:13:17And so that's the sort of foundational element.
00:13:21And then from there, you say, okay, well,
00:13:22if we want to find out the meaning of life,
00:13:24we have to expand the scope and scale of consciousness.
00:13:27We have to go out there and explore the stars
00:13:30to know what questions to ask about the universe
00:13:34and understand the universe.
00:13:36And that's my core philosophy.
00:13:40And so from that, it was like, well,
00:13:43we have to make sure that Earth is good,
00:13:46so we have to have sustainable energy.
00:13:50We want to build technology to travel beyond Earth.
00:13:56And it's from that sort of core philosophy
00:13:59that these companies arise in most cases.
00:14:04and you can say, how does Twitter help with that?
00:14:08So I would like to go back to Earth.
00:14:11Sure.
00:14:12And to the various enterprises
00:14:17that you contributed to create
00:14:19or co-created or created.
00:14:21Let's start with PayPal.
00:14:24Very impressive company.
00:14:27Do you regret having sold it?
00:14:30I think, in retrospect,
00:14:33I think it was good that the company was acquired by eBay
00:14:38because there was so much talent at PayPal,
00:14:41and that talent subsequently went on
00:14:43to create many other companies.
00:14:46So YouTube, for example,
00:14:48was created by two people that worked at PayPal.
00:14:54LinkedIn was also created out of PayPal.
00:14:57There was Yelp.
00:14:59There were many other companies.
00:15:02Impressive.
00:15:03Yeah.
00:15:03Yeah, very impressive.
00:15:05Yeah, so, and if I'd been still working on PayPal,
00:15:09then there wouldn't be,
00:15:11you know, Tesla would not be in its current form
00:15:13and SpaceX wouldn't exist.
00:15:17So, yes, I guess a short answer, yes.
00:15:21The short answer is that, yes.
00:15:25Yeah.
00:15:27But now, wouldn't you have loved to keep it?
00:15:32Well, there is, I think,
00:15:35the potential to do something
00:15:37that is bigger than PayPal.
00:15:44This is sort of like the X,
00:15:46the sort of everything app kind of thing.
00:15:48So, I think it's somewhat poetic.
00:15:53We're trying to get,
00:15:55finish a task that was started about 24 years ago.
00:16:00So, I think it's going to be useful.
00:16:07X slash Twitter is going to be
00:16:09just a very useful thing
00:16:12and hopefully something that is
00:16:14a positive force for civilization.
00:16:18Moving to Tesla,
00:16:20I have, through one of our operations,
00:16:25done the first advertising campaign
00:16:28for GM's electric car.
00:16:31That was at the end of the 90s.
00:16:34So, can you explain why
00:16:38GM and other car manufacturers
00:16:41have not created Tesla
00:16:43and why Tesla is successful?
00:16:46What is the difference?
00:16:53You're talking about the EV1, basically.
00:16:55The EV1 car that...
00:16:57So, General Motors actually did come out
00:17:01with this electric vehicle one, EV1,
00:17:05and it's...
00:17:05Yes, you remember, EV1.
00:17:07Yes.
00:17:07Yeah.
00:17:10And, actually, so I thought when that came out...
00:17:12It was in 87.
00:17:13Yeah.
00:17:14That sounds about right.
00:17:15Yeah.
00:17:16I would have...
00:17:17I expected there would be an EV2, EV3,
00:17:19and so forth.
00:17:20And if they had done that,
00:17:21actually, there would be no need for Tesla.
00:17:26But, for reasons that aren't clear,
00:17:30GM recalled all of the EV1s,
00:17:32even from customers that really wanted to keep the cars.
00:17:35They recalled the cars,
00:17:37and they crushed them in the junkyard.
00:17:43and it still blows my mind that they did this,
00:17:46because the people who had the EV1s,
00:17:50they loved the car so much,
00:17:51they held a candlelit vigil at the junkyard
00:17:55where the cars were crushed.
00:17:57Okay, like it was...
00:17:59Like someone was getting killed.
00:18:01You know, like...
00:18:02And it's like,
00:18:04if somebody is holding a candlelit vigil
00:18:07for the...
00:18:08Because they love your product so much,
00:18:10maybe you should make more of it.
00:18:13You know?
00:18:14I mean, it's, like, pretty rare
00:18:17for candlelit vigils to be held for products.
00:18:20So, I don't understand why they didn't do more.
00:18:22They should have,
00:18:23and they would be the leader in electric vehicles today.
00:18:26But they didn't.
00:18:28And so, there was a need for Tesla,
00:18:29because the, you know,
00:18:31at the time of starting Tesla,
00:18:33there were no electric vehicles being made.
00:18:36And they were...
00:18:37So, the big car companies
00:18:38were not making electric vehicles.
00:18:40There were no startups
00:18:41that we were aware of making electric vehicles.
00:18:44So, it's like, well, we should try.
00:18:47And, I mean, in the case of both Tesla and SpaceX,
00:18:50I thought the chance of success was maybe 10%.
00:18:52So, it's not like I thought it would be successful.
00:18:54I thought it would fail.
00:18:58Good lesson.
00:19:00Now, moving to the kid's dream,
00:19:04which is to become an astronaut
00:19:06and not to build rockets.
00:19:09How do you move from the idea
00:19:11that every child has,
00:19:13I will be an astronaut,
00:19:14to I will do reusable rockets?
00:19:26You forgot.
00:19:28No, I'm trying to compress the story
00:19:30so that they're not too long.
00:19:33Because the story is actually quite long.
00:19:35Because I didn't start out wanting to do the rockets.
00:19:37At first, I was going to do this philanthropic mission to Mars
00:19:42called Mars Oasis.
00:19:43And then, as I started investigating
00:19:47what it would take to launch this mission to Mars,
00:19:49just a little greenhouse, basically,
00:19:53it was intended to inspire the public.
00:19:55And I started understanding more about what rockets could be used for.
00:20:01I actually went to Russia a few times
00:20:03to try to buy some of their nuclear missiles.
00:20:09Minus the nuclear, minus the nuke.
00:20:11That's extra.
00:20:14So that was pretty wild,
00:20:16being in Russia in 2001,
00:20:19negotiating to buy two of their biggest missiles.
00:20:24But it became clear that the...
00:20:30Unless there was something new with rockets,
00:20:34that was the fundamental issue.
00:20:36The cost of access to space was the fundamental issue.
00:20:39So it wasn't a question of trying to increase
00:20:42the public's desire.
00:20:44The public's desire for space and exploration is very high.
00:20:47But there needs to be a means.
00:20:49There needs to be a way.
00:20:52And there needs to be a radical improvement
00:20:55in the cost of access to orbit.
00:21:00So I was like,
00:21:01OK, well, I'm going to try starting a rocket company
00:21:02and see if it's successful.
00:21:04But I, like I said, I told people at the time,
00:21:07because people would say to me,
00:21:09tell me this joke,
00:21:10of like, what's the fast...
00:21:11You know, what's the...
00:21:16You know, how do you go from...
00:21:19Sorry.
00:21:21Sorry, I'm getting a little hot into the collar here.
00:21:25I think a lot of people are.
00:21:28So they would say,
00:21:29what's the fastest way
00:21:30to make a small fortune in the rocket industry?
00:21:33And the punchline is,
00:21:35you start with a large one.
00:21:38So, you know,
00:21:39it was a tough going for a while.
00:21:42Our first three launches failed.
00:21:44Fortunately, the fourth one succeeded.
00:21:46If the fourth one had not succeeded,
00:21:47SpaceX would not exist.
00:21:49So it was a very close call.
00:21:50Fast forward.
00:21:53You created,
00:21:55or you co-created OpenAI.
00:21:59Yes.
00:21:59ChatGPT has been incredibly successful.
00:22:05It's the fastest-growing success story ever.
00:22:10And after having created OpenAI,
00:22:17suddenly you say,
00:22:19oh, oh, we should have a pause.
00:22:21Some people say,
00:22:22oh, it's because he has not done it.
00:22:27Well, I mean,
00:22:28I didn't think anyone
00:22:29would actually agree to the pause,
00:22:30but I thought just for the record,
00:22:33I just want to say,
00:22:34I think we should pause.
00:22:36I didn't think that...
00:22:39Why do you want to push pause?
00:22:42Well, I think there's a real danger
00:22:45for digital superintelligence
00:22:47having negative consequences.
00:22:50And so if we are not careful
00:22:52with creating artificial general intelligence,
00:22:57we could have potentially
00:22:59a catastrophic outcome.
00:23:01So, now I think there's
00:23:03a range of possibilities.
00:23:04I think the most likely outcome
00:23:05is positive for AI.
00:23:08But that's not every possible outcome.
00:23:11So we need to minimize the probability
00:23:13that something will go wrong
00:23:14with digital superintelligence.
00:23:19Yes.
00:23:20So I'm in favor of AI regulation
00:23:23because I think advanced AI
00:23:26is a risk to the public.
00:23:28And anything that's a risk to the public,
00:23:30there needs to be some kind of referee.
00:23:31That referee is the regulator.
00:23:33and so I think that's my strong recommendation
00:23:36is to have some regulation for AI.
00:23:40Some regulation for AI.
00:23:42Yes.
00:23:42Which is what you want also for Twitter.
00:23:46I'm not sure what regulation, I guess.
00:23:49There's plenty of regulators.
00:23:51Sure.
00:23:52Okay.
00:23:53So, speaking about Twitter,
00:23:56you have made a big bet on Twitter.
00:24:03You said it a few minutes ago
00:24:06that you paid too much.
00:24:08Yes.
00:24:08Yes.
00:24:10And you are now going on
00:24:14to Twitter 2.0
00:24:15or 3.0
00:24:18which I understand
00:24:20it's a full-scale reinvention
00:24:22of the company.
00:24:23Yeah.
00:24:24We're evolving the company very rapidly.
00:24:26The company is changing quite dramatically.
00:24:30There is a lot of controversy
00:24:33about Twitter.
00:24:35So, I have, in fact,
00:24:38three very quick questions.
00:24:40The first one is
00:24:41why have you decided to acquire it?
00:24:44The second is
00:24:46what was wrong at Twitter
00:24:48to make you act?
00:24:50And the last one is
00:24:53not the last of the three
00:24:55because there are many
00:24:56other questions
00:24:57is why do you believe
00:24:59that you will be successful?
00:25:02And you will be.
00:25:03Well, thanks.
00:25:06I can imagine,
00:25:07I can't imagine
00:25:08that you will not be.
00:25:09Well, thanks.
00:25:12So, well,
00:25:14obviously I was on Twitter
00:25:17as a major user
00:25:19and even before
00:25:21the acquisition closed,
00:25:23my Twitter account
00:25:25was the most interacted-with
00:25:26account in the world.
00:25:27So, I guess I'm pretty
00:25:30closely attuned
00:25:32to what's going on
00:25:33with Twitter.
00:25:35I get a feel for
00:25:36how is it shifting
00:25:37one way or the other.
00:25:39And generally,
00:25:40I was concerned
00:25:41that Twitter
00:25:42was having a negative effect
00:25:44on civilization,
00:25:46that it was having
00:25:46a corrosive effect
00:25:47on civil society.
00:25:49and so anything
00:25:53that undermines
00:25:54civilization,
00:25:55I think,
00:25:56is not good.
00:25:58And go back
00:25:58to my point
00:25:59of we need
00:26:00to do everything
00:26:00possible
00:26:01to support civilization
00:26:02and move it
00:26:03in a positive direction.
00:26:05And I felt
00:26:06that Twitter
00:26:07kept moving
00:26:08more and more
00:26:08in a negative direction.
00:26:10And my hope
00:26:11and aspiration
00:26:12was to change that
00:26:13and have it be
00:26:15a positive force
00:26:16for civilization.
00:26:18It is not perceived
00:26:20like this.
00:26:24People are very happy
00:26:26to listen
00:26:26to that approach.
00:26:29But the perception
00:26:31is very different.
00:26:33Well,
00:26:33I think it depends on,
00:26:34and I think
00:26:35if somebody
00:26:35is a regular Twitter user,
00:26:37I think most people
00:26:39would say
00:26:39that their experience
00:26:41has improved.
00:26:43We've gotten rid
00:26:44of 90%
00:26:47of the bots
00:26:48and the scams
00:26:49and the various
00:26:50bad things
00:26:51that were happening.
00:26:54We've gotten rid
00:26:55of now
00:26:56at this point
00:26:57I think 95%
00:26:58of the child exploitation
00:26:59material that was
00:27:00on Twitter,
00:27:01which was a shock
00:27:01to see the amount
00:27:03of that.
00:27:03That was really terrible.
00:27:05Some of that
00:27:06had been going on
00:27:07for 10 years
00:27:08and no action.
00:27:10so I think
00:27:11we've done a lot
00:27:11of good
00:27:12in that respect.
00:27:16And then I think
00:27:18we've also done things
00:27:19like we have
00:27:20open sourced
00:27:21the algorithm,
00:27:22so we're trying
00:27:22to be as transparent
00:27:23as possible.
00:27:24So Twitter is the only
00:27:25social media company
00:27:27where you can see
00:27:28the actual code
00:27:29of the algorithm.
00:27:30So it's not like
00:27:31some secret black box.
00:27:39I mean the way
00:27:41to build trust
00:27:41is not take my word
00:27:44for it.
00:27:45It's let's show you
00:27:46exactly how it works
00:27:47and full transparency.
00:27:51and we're also
00:27:52going to be showing
00:27:53like if your account
00:27:55is in any way
00:27:56affected by the
00:27:56Twitter system
00:27:57you can see it clearly.
00:28:00And just
00:28:01moving in
00:28:02I think
00:28:03a good direction
00:28:06I think the
00:28:07recommendation algorithm
00:28:08I don't want to go
00:28:09on too long
00:28:10for Twitter
00:28:10but I think
00:28:10it's actually
00:28:13quite good
00:28:13and that those
00:28:14who are on
00:28:15the Twitter system
00:28:16I think generally
00:28:16think it's good
00:28:19we are seeing
00:28:20all time highs
00:28:21in usage
00:28:22so at least
00:28:23for the
00:28:25public out there
00:28:27they are using
00:28:27the system more.
00:28:30So we're seeing
00:28:31a pretty significant
00:28:32week over week
00:28:33growth in usage.
00:28:35So
00:28:37the public
00:28:38is speaking
00:28:38with their time
00:28:39and if they're
00:28:40putting their time
00:28:41on Twitter
00:28:42that's a very good signal.
00:28:45So that's
00:28:46that's very positive.
00:28:49What would you say
00:28:50to advertisers
00:28:52who left Twitter
00:28:56to convince them
00:28:58to come back?
00:29:00Yeah
00:29:00actually I should say
00:29:01that
00:29:03maybe with a few
00:29:04exceptions
00:29:04almost all the
00:29:06advertisers
00:29:07have said
00:29:07they've either
00:29:08come back
00:29:09or they said
00:29:09they will come back.
00:29:10So actually
00:29:12I feel pretty
00:29:13optimistic
00:29:14about the future.
00:29:16and
00:29:18yeah
00:29:18so
00:29:19you know
00:29:20we're really
00:29:21at this point
00:29:21I believe
00:29:23actually I'm not aware
00:29:24of any advertiser
00:29:25that is
00:29:27they've either
00:29:27come back
00:29:28or they said
00:29:28they will come back.
00:29:29I'm not aware
00:29:29of any exceptions.
00:29:30There are probably
00:29:31a few exceptions
00:29:31but overall
00:29:33I think
00:29:33it's very positive.
00:29:34good
00:29:35so we will have
00:29:37now
00:29:37a broader
00:29:39conversation
00:29:39we will ask
00:29:42Christel Heydemann
00:29:43from Orange
00:29:45the CEO
00:29:46of Orange
00:29:47to join us
00:29:48as well as
00:29:49Antoine Arnault
00:29:50from LVMH
00:29:52and Asmita Dubey
00:29:55from L'Oréal
00:29:57we have to move
00:29:58in order that
00:29:58they add
00:29:59chairs
00:30:00so come here
00:30:06while they are
00:30:09arranging
00:30:09there is
00:30:11yes
00:30:12yes
00:30:13yes
00:30:14yes
00:30:14yes
00:30:14yes
00:30:15yes
00:30:15yes
00:30:16there is
00:30:18one
00:30:18question
00:30:19that I would
00:30:20like
00:30:21to
00:30:26take care
00:30:29so finally
00:30:30we are
00:30:30all together
00:30:31again
00:30:33so
00:30:36I had
00:30:37another question
00:30:38but I will
00:30:38save it
00:30:39for later
00:30:40and I will
00:30:41give the floor
00:30:42to Christelle
00:30:43Christel Heydemann
00:30:44who is the
00:30:45CEO
00:30:46of Orange
00:30:47and
00:30:47she is
00:30:49very much
00:30:50excited
00:30:51about putting
00:30:52some
00:30:52harsh
00:30:53difficult
00:30:53questions
00:30:54to you
00:30:54let's go
00:30:57not sure
00:30:58that's so
00:30:59difficult
00:31:01so I want
00:31:02to talk
00:31:02about
00:31:02social
00:31:03networks
00:31:04and
00:31:04I was
00:31:06I'm not
00:31:07sure
00:31:08if I heard
00:31:08what you just
00:31:09said
00:31:09with Maurice
00:31:11but
00:31:11obviously
00:31:12social networks
00:31:13and Twitter
00:31:13have been
00:31:14a fantastic
00:31:15tool
00:31:15to develop
00:31:16free speech
00:31:17and open
00:31:18debates
00:31:19that maybe
00:31:19were not
00:31:20possible
00:31:20before that
00:31:21now
00:31:22there's also
00:31:23another side
00:31:24of it
00:31:24which is
00:31:25that
00:31:25there can
00:31:26be
00:31:27some
00:31:28disinformation
00:31:29fake news
00:31:30fake news
00:31:31what
00:31:32that's crazy
00:31:34how do I
00:31:36know this
00:31:37is real
00:31:37now the
00:31:39question is
00:31:40and actually
00:31:41I think
00:31:42there's a
00:31:43flip side
00:31:43to social
00:31:44networks
00:31:45and
00:31:45there's a
00:31:47code of
00:31:47conduct
00:31:48that the
00:31:49EU has
00:31:49proposed
00:31:50about disinformation
00:31:51but recently
00:31:52Twitter
00:31:52decided
00:31:53to walk
00:31:54away
00:31:54from it
00:31:54so is
00:31:56this
00:31:56because
00:31:57Twitter
00:31:58doesn't
00:31:59respect
00:32:00the fact
00:32:01that
00:32:02information
00:32:02needs to
00:32:03be
00:32:04moderated
00:32:04and actually
00:32:05you've been
00:32:05pretty vocal
00:32:06on content
00:32:07moderation
00:32:07so just
00:32:08what are
00:32:09your thoughts
00:32:09on that
00:32:10well
00:32:11I'm
00:32:11generally
00:32:11a fan
00:32:12of
00:32:13that we
00:32:14should
00:32:14have
00:32:14free
00:32:15speech
00:32:15as much
00:32:17as possible
00:32:17as much
00:32:18as is
00:32:18allowed
00:32:18by the
00:32:19laws
00:32:19of any
00:32:19country
00:32:19so
00:32:22you know
00:32:23I think
00:32:23that
00:32:24you know
00:32:24say
00:32:25for France
00:32:26we should
00:32:28allow things
00:32:29that are
00:32:30allowed
00:32:30by law
00:32:31and if
00:32:32the people
00:32:33want
00:32:34the laws
00:32:34to be
00:32:34different
00:32:35then
00:32:35pass a
00:32:36different
00:32:36law
00:32:36and we'll
00:32:37adhere
00:32:37to that
00:32:37law
00:32:38but
00:32:38for
00:32:39Twitter
00:32:40to go
00:32:40beyond
00:32:41the law
00:32:41that
00:32:42you know
00:32:43that
00:32:43doesn't seem
00:32:44quite right
00:32:45to me
00:32:45I think
00:32:46we want
00:32:46to allow
00:32:47the people
00:32:47to express
00:32:48themselves
00:32:50and really
00:32:51if you have
00:32:51to say
00:32:52when does
00:32:52free speech
00:32:53matter
00:32:54free speech
00:32:55matters
00:32:55and it's only
00:32:56relevant
00:32:57if people
00:32:58are allowed
00:32:58to say
00:32:59things that
00:32:59you don't
00:33:00like
00:33:00because otherwise
00:33:01it's not
00:33:01free speech
00:33:03and I would
00:33:04take that
00:33:05if somebody
00:33:06says something
00:33:07potentially
00:33:08offensive
00:33:10that's
00:33:11actually
00:33:11okay
00:33:12now
00:33:12we're not
00:33:13going to
00:33:13promote
00:33:13those
00:33:14offensive
00:33:15tweets
00:33:16but I think
00:33:18people should
00:33:18be able
00:33:18to say
00:33:19things
00:33:20because the
00:33:21alternative
00:33:21is censorship
00:33:22and then
00:33:23and frankly
00:33:24I think
00:33:24if you go
00:33:25down the
00:33:25censorship
00:33:25route
00:33:26it's only
00:33:27a matter
00:33:27of time
00:33:27before
00:33:28censorship
00:33:28is turned
00:33:29upon you
00:33:32so that's
00:33:33why it's
00:33:34important
00:33:36in the US
00:33:37you've got
00:33:38the first
00:33:38amendment
00:33:39freedom of
00:33:40speech amendment
00:33:40and you say
00:33:41why did they
00:33:42do that
00:33:42why did they
00:33:43pass that
00:33:44amendment
00:33:44it was because
00:33:45they were not
00:33:46able to say
00:33:47what they wanted
00:33:48to say
00:33:48in the countries
00:33:49that they came
00:33:49from
00:33:50and they wanted
00:33:51to make sure
00:33:51that they could
00:33:52say what
00:33:53they wanted
00:33:53to say
00:33:54so I believe
00:33:55in freedom
00:33:56for the people
00:33:57to say things
00:33:58and that
00:33:59even if somebody
00:34:00said
00:34:01that it's actually
00:34:02in some ways
00:34:03a sign of health
00:34:03if people are able
00:34:04to if someone
00:34:05you don't like
00:34:06is able to say
00:34:06something you don't like
00:34:08rather than try
00:34:09to suppress that
00:34:10you say like
00:34:10you know what
00:34:11that's a good sign
00:34:12because
00:34:14that means
00:34:14I can say things
00:34:15and that person
00:34:16will not like
00:34:17what I say
00:34:17but I can still
00:34:18say it
00:34:18and that's a
00:34:19really big deal
00:34:26especially
00:34:26in this country
00:34:28I mean freedom
00:34:28of speech
00:34:29that's something
00:34:29that's very
00:34:30I mean at the core
00:34:31of our values
00:34:32but now
00:34:32if we look at
00:34:33young people
00:34:34and being
00:34:35in the digital
00:34:36field
00:34:36and we provide
00:34:37digital services
00:34:38that people use
00:34:39there's also
00:34:40a move towards
00:34:41cyber bullying
00:34:42and harassment
00:34:43which I think
00:34:44it's also our role
00:34:45to educate young people
00:34:47on how to use
00:34:48the technology
00:34:48and make sure
00:34:49that because
00:34:50some behaviours
00:34:52I mean on Twitter
00:34:53or other social networks
00:34:54can actually have
00:34:56devastating effects
00:34:57on people
00:34:59is Twitter doing
00:35:00something about it
00:35:01or would you be
00:35:02willing to engage
00:35:03with other players
00:35:04actually at Orange
00:35:05we do a lot
00:35:05in that space
00:35:06sure
00:35:08I mean
00:35:10that's true
00:35:13I think
00:35:14Twitter is for sure
00:35:16willing to engage
00:35:16with others
00:35:17and as I said
00:35:19the overarching goal
00:35:20is to have Twitter
00:35:21be a force
00:35:22a positive force
00:35:23for civilisation
00:35:24and
00:35:27you know
00:35:27so
00:35:27and if you're
00:35:30on the platform
00:35:30and you're being harassed
00:35:31or bullied
00:35:32or whatever
00:35:32obviously that's
00:35:33a negative experience
00:35:38what we're doing
00:35:39is we call it
00:35:41freedom of speech
00:35:42but not freedom of reach
00:35:43which is that
00:35:45yes you can say
00:35:46offensive things
00:35:47but then your content
00:35:48is going to get downrated
00:35:49so if you're a jerk
00:35:53your reach will drop
00:35:55so
00:35:56yeah
00:35:57I think that's the right thing
00:36:03Antoine I don't know
00:36:04if you will be
00:36:05competing with L'Oreal
00:36:06to launch the future
00:36:08fragrance
00:36:09for Elon Musk
00:36:11and the brand
00:36:13the Musk
00:36:14so you have to compete
00:36:15very hard
00:36:15but maybe
00:36:18I know that you had
00:36:19already the lunch
00:36:21but there is now
00:36:22the dessert
00:36:23so you can put
00:36:24your question
00:36:24all right
00:36:26first of all
00:36:27in the name of
00:36:27the Les Échos-Le Parisien group
00:36:30thank you very much
00:36:30for being here
00:36:31this group is a co-host
00:36:33of Viva Tech
00:36:33and it's great
00:36:34to have you
00:36:35and welcome
00:36:36all right
00:36:37so change of subject
00:36:40from Twitter
00:36:42at LVMH
00:36:43our oldest maison
00:36:44is called
00:36:45Clos des Lambrays
00:36:46I saw you
00:36:46enjoyed good wine
00:36:48yeah
00:36:48it's 650 years old
00:36:51wow
00:36:51Louis Vuitton
00:36:52was founded in 1854
00:36:54our most ancient
00:36:57American maison
00:36:58Tiffany
00:36:58was founded
00:36:59in 1837
00:37:00the sum
00:37:02of the years
00:37:03of existence
00:37:04of all
00:37:05our maison
00:37:06at LVMH
00:37:07is 8,393 years old
00:37:10wow
00:37:11Tesla is a teenager
00:37:12right
00:37:13yes
00:37:1319 years old
00:37:15yeah
00:37:15and it's market cap
00:37:16is already higher
00:37:17than LVMH
00:37:18so it's just a question
00:37:20of age
00:37:22yeah
00:37:24so
00:37:24first question
00:37:25first question
00:37:27how much longer
00:37:27are you going to make us
00:37:28look so bad
00:37:30second question
00:37:31more serious
00:37:32do you feel
00:37:32the creation of value
00:37:33is more challenging
00:37:35in traditional
00:37:36or innovative business
00:37:41well first of all
00:37:42it's an honor to be here
00:37:43and speak with you
00:37:44so thanks for having me
00:37:49you know
00:37:50valuations are
00:37:51a strange thing
00:37:51because
00:37:53you know
00:37:53sometimes I've said
00:37:54hey I think the stock price
00:37:55is too high
00:37:56at Tesla
00:37:56and then the stock price
00:37:57goes up
00:37:58I'm like
00:37:58okay
00:38:00so if you tweet
00:38:02is it going up
00:38:04or down
00:38:04no the crazy thing
00:38:05I mean
00:38:07when I've tweeted
00:38:08I think the stock price
00:38:09is too high
00:38:10almost always it goes up
00:38:11so
00:38:13I don't know
00:38:13it's a strange thing
00:38:14so
00:38:14I guess in the case
00:38:16of Tesla
00:38:16the value
00:38:19the value
00:38:20of the company
00:38:20is primarily
00:38:22on the basis
00:38:23of autonomy
00:38:24so
00:38:26in my opinion
00:38:28because if you look
00:38:29at our total vehicle
00:38:29output
00:38:30it's
00:38:31it'll be almost
00:38:32two million vehicles
00:38:33this year
00:38:34or something like that
00:38:36but that's
00:38:37still only two percent
00:38:38of total vehicle production
00:38:40so then why
00:38:40is our market cap
00:38:41so high
00:38:42and it's because
00:38:43the potential
00:38:44for autonomy
00:38:46is
00:38:48the value
00:38:49of autonomy
00:38:49is so high
00:38:51that even
00:38:52even if you have
00:38:54a discounted
00:38:54percentage probability
00:38:55of autonomy
00:38:57happening
00:38:57that is still
00:38:59incredibly valuable
00:39:00so the average
00:39:01passenger car
00:39:02is used
00:39:03only about
00:39:04ten hours a week
00:39:04so an average
00:39:05of about
00:39:05one and a half
00:39:06hours a day
00:39:06but if you have
00:39:08an autonomous
00:39:08robo taxi
00:39:09the utility
00:39:10might be
00:39:1150 or 60
00:39:12hours a week
00:39:13out of 168 hours
00:39:15so now you've got
00:39:16a vehicle
00:39:17that costs the same
00:39:18but has five times
00:39:19the utility
00:39:20so
00:39:21it's so
00:39:22gigantic a change
00:39:23that that's really
00:39:24I think
00:39:26the main driver
00:39:27of our value
00:39:29and
00:39:30although I've said
00:39:31this before
00:39:31I think we will
00:39:32solve autonomy soon
00:39:34did you
00:39:35expect
00:39:36that
00:39:36Tesla
00:39:37will be
00:39:38at this level
00:39:39of market cap
00:39:40no I did not
00:39:41expect Tesla
00:39:42would be
00:39:42at this level
00:39:43of market
00:39:43because it's
00:39:44just extraordinary
00:39:47unfair
00:39:47by the way
00:39:49I mean I don't
00:39:50set the price
00:39:51so
00:39:51you don't
00:39:52set
00:39:52maybe just
00:39:53another quick
00:39:55question
00:39:55I tried
00:39:58Midjourney
00:39:58the other day
00:39:59Midjourney
00:39:59is amazing
00:40:00and I asked
00:40:01the software
00:40:02to make
00:40:03a Louis Vuitton
00:40:04advertising campaign
00:40:05with only
00:40:05two words
00:40:06so here it's a bad
00:40:07question for you
00:40:08Maurice
00:40:09if you want
00:40:12to put me
00:40:12out of a job
00:40:13that was my
00:40:15question
00:40:15do you feel
00:40:16this advertising
00:40:18production
00:40:19industry
00:40:20is going to be
00:40:21threatened
00:40:22by AI
00:40:24be careful
00:40:27be careful
00:40:28not at all
00:40:29it's totally
00:40:30safe
00:40:32I mean
00:40:33AI is definitely
00:40:34going to be a
00:40:35massive disruptive
00:40:36force
00:40:37AI is probably
00:40:38the most
00:40:39disruptive
00:40:39technology ever
00:40:41I mean the
00:40:42crazy thing
00:40:42is that
00:40:45the advantage
00:40:45that humans
00:40:46have is that
00:40:47we're smarter
00:40:47than other
00:40:48creatures
00:40:48like if we
00:40:49got into a
00:40:50fight with a
00:40:50gorilla
00:40:50the gorilla
00:40:51would definitely
00:40:52win
00:40:53but we're
00:40:53smart
00:40:55but now
00:40:56for the first
00:40:56time there's
00:40:56going to be
00:40:57something that
00:40:58is smarter
00:40:58than the
00:40:59smartest human
00:41:00like way
00:41:00smarter than
00:41:01the smartest
00:41:02human
00:41:03and as you
00:41:04can see from
00:41:05Midjourney,
00:41:05the art that
00:41:06AI can create
00:41:06is incredible
00:41:07it's so
00:41:08beautiful
00:41:08and it does
00:41:09it you know
00:41:10within seconds
00:41:12so
00:41:14we're at
00:41:15I mean
00:41:16I think
00:41:18you know
00:41:19there's that
00:41:19sort of
00:41:20saying
00:41:21may you live
00:41:21in interesting
00:41:22times
00:41:22which I think
00:41:23is like
00:41:23not exactly
00:41:24a good thing
00:41:25sometimes
00:41:25but would we
00:41:27actually live
00:41:27I think we
00:41:28live in the
00:41:28most interesting
00:41:29of times
00:41:31the advent
00:41:32of AI
00:41:33and I
00:41:33actually
00:41:34thought to
00:41:34myself
00:41:34at one
00:41:35point
00:41:35like
00:41:36should
00:41:37you know
00:41:38would I really
00:41:39want to be
00:41:39alive at this
00:41:40point
00:41:40like let's
00:41:40say that
00:41:41there is
00:41:41some AI
00:41:42Armageddon
00:41:44that happens
00:41:44some sort
00:41:45of AI
00:41:45apocalypse
00:41:47I think
00:41:47I would
00:41:48still want
00:41:48to be alive
00:41:48at this
00:41:49time
00:41:49to see
00:41:49it
00:41:52and hopefully
00:41:53not
00:41:54cause it
00:41:58but
00:41:59it's just
00:42:00I think
00:42:00we live
00:42:01in an
00:42:02extremely
00:42:02interesting
00:42:03time
00:42:03you know
00:42:05because the
00:42:06things that
00:42:06you see
00:42:06AI being
00:42:07able to do
00:42:07now
00:42:08it's going
00:42:08to do
00:42:08much more
00:42:09with each
00:42:10passing year
00:42:12cars will
00:42:12absolutely
00:42:13drive themselves
00:42:13better than
00:42:14any person
00:42:15could drive
00:42:17we'll have
00:42:18humanoid
00:42:19robots
00:42:22Tesla is
00:42:23developing
00:42:23a humanoid
00:42:24robot
00:42:25we call
00:42:26it the
00:42:26T800
00:42:28some people
00:42:29will get
00:42:29that joke
00:42:34it's a
00:42:35Terminator
00:42:37we can
00:42:38tweet that
00:42:40yeah
00:42:41but if
00:42:42you like
00:42:43to say
00:42:43what is
00:42:44an economy
00:42:45an economy
00:42:45is GDP
00:42:46per capita
00:42:46times capita
00:42:47now what
00:42:48happens if
00:42:49you don't
00:42:49actually have
00:42:49a limit
00:42:50on capita
00:42:51if you have
00:42:51an unlimited
00:42:52number of
00:42:55sort of people
00:42:56or robots
00:42:59it's not clear
00:43:00what meaning
00:43:00an economy
00:43:01has at that
00:43:01point
00:43:02because you
00:43:03have an
00:43:03unlimited
00:43:04economy
00:43:04effectively
00:43:06so on the
00:43:08good side
00:43:08the plus
00:43:09side of
00:43:09AI
00:43:09is that
00:43:10I think
00:43:10we are
00:43:11heading
00:43:11for an
00:43:12age of
00:43:12abundance
00:43:14where any
00:43:14goods and
00:43:15services that
00:43:16you want
00:43:16you can just
00:43:17have
00:43:20so that's
00:43:21the positive
00:43:23side of
00:43:24AI future
00:43:25is an
00:43:26age of
00:43:26abundance
00:43:27from the
00:43:28advertising
00:43:28side
00:43:29I must
00:43:29say that
00:43:30we have been
00:43:31using AI
00:43:32for many
00:43:32years
00:43:32and it
00:43:33is helping
00:43:33us a
00:43:34great deal
00:43:34and this
00:43:36tool that
00:43:37we are
00:43:38already
00:43:38using
00:43:38and I
00:43:39think it
00:43:39will be
00:43:40helping us
00:43:41to do
00:43:42even faster
00:43:43some very
00:43:44good ads
00:43:44it will
00:43:45be
00:43:46probably
00:43:47long time
00:43:48before they
00:43:50replace
00:43:50the
00:43:51creative
00:43:51minds
00:43:53Asmita
00:43:54maybe you
00:43:54have an
00:43:54opinion on
00:43:55that
00:43:55and maybe
00:43:56you can
00:43:56Asmita
00:43:57is the
00:43:57CDO
00:43:58of
00:43:58L'Oreal
00:43:59and
00:44:00she
00:44:01knows
00:44:02a lot
00:44:02about
00:44:03digital
00:44:03so maybe
00:44:04you can
00:44:04tell us
00:44:05a little
00:44:05bit about
00:44:06advertising
00:44:07and AI
00:44:07and
00:44:08put your
00:44:09question
00:44:10to
00:44:10Elon
00:44:11so I'll
00:44:12pick up
00:44:12from
00:44:12advertising
00:44:13and from
00:44:14what you
00:44:14said before
00:44:15about
00:44:15Twitter
00:44:15so now
00:44:16we know
00:44:16that Twitter
00:44:17is expensive
00:44:18and we
00:44:19know that
00:44:19it aims
00:44:20to have
00:44:20free speech
00:44:21the question
00:44:23I have
00:44:23is about
00:44:24winning
00:44:25the advertisers
00:44:26trust
00:44:26to be
00:44:27a preferred
00:44:28social media
00:44:29platform
00:44:30in the
00:44:30current
00:44:30context
00:44:31where
00:44:31the expected
00:44:33revenue
00:44:34you know
00:44:34in 2023
00:44:35is lower
00:44:36than 2022
00:44:37you have
00:44:38brought in
00:44:38new leadership
00:44:39Linda
00:44:39so I
00:44:40wanted to
00:44:41know
00:44:41that
00:44:41how will
00:44:43we win
00:44:43that trust
00:44:44and will
00:44:44Linda
00:44:44have the
00:44:45time
00:44:46the support
00:44:47the freedom
00:44:47because she's
00:44:48an advertising
00:44:49expert
00:44:50so will she
00:44:51be able to
00:44:52manage the
00:44:52situation
00:44:53and how
00:44:53yeah
00:44:55I think
00:44:56Linda's
00:44:57great
00:44:57so I think
00:45:00Linda's gonna
00:45:00do amazing
00:45:02things for
00:45:03Twitter
00:45:03and obviously
00:45:04understands
00:45:06where advertisers
00:45:07are coming
00:45:07from
00:45:07very deeply
00:45:08understands
00:45:10the concerns
00:45:11that advertisers
00:45:11have
00:45:12and I
00:45:12think
00:45:12we'll
00:45:13do a
00:45:13great
00:45:13job
00:45:13in
00:45:14addressing
00:45:14those
00:45:14concerns
00:45:16you know
00:45:17a key
00:45:18part
00:45:19is
00:45:20you know
00:45:21say like
00:45:22if you're
00:45:22an advertiser
00:45:23what content
00:45:24do you want
00:45:24to appear
00:45:24next to
00:45:25and depending
00:45:26upon what
00:45:27advertiser
00:45:27you know
00:45:28the sensitivity
00:45:29of the brand
00:45:30if you're
00:45:31for example
00:45:31say Disney
00:45:32and you're
00:45:33advertising
00:45:34a children's
00:45:34movie
00:45:35then you
00:45:35want to
00:45:36have
00:45:36you know
00:45:36all ages
00:45:37content
00:45:38you know
00:45:39and by the way
00:45:40Disney is one
00:45:41of our biggest
00:45:41advertisers
00:45:42so
00:45:44so
00:45:45it's really
00:45:46just making sure
00:45:47that the content
00:45:48adjacency
00:45:49matches
00:45:50what a brand
00:45:51is comfortable
00:45:52with
00:45:53and then
00:45:53there's some
00:45:54cases where
00:45:54the content
00:45:56is like
00:45:56you know
00:45:57there's not
00:45:57going to be
00:45:58any advertising
00:45:58because nobody
00:45:59wants to
00:46:00advertise next
00:46:00to it
00:46:01and that's
00:46:01going to be
00:46:01some of
00:46:02the more
00:46:02controversial
00:46:02stuff
00:46:03yeah
00:46:04I have a
00:46:05follow up
00:46:06on that
00:46:07you know
00:46:07because we
00:46:08were talking
00:46:08about content
00:46:09you have
00:46:10just made
00:46:10the announcement
00:46:11that there
00:46:12will be
00:46:12ad revenue
00:46:13sharing
00:46:14for creators
00:46:15yes
00:46:15yeah
00:46:16and that
00:46:16has a
00:46:17condition
00:46:17it will
00:46:17be done
00:46:18when they
00:46:18are verified
00:46:19blue tick
00:46:19creators
00:46:20and the
00:46:21advertising
00:46:21is to
00:46:22verify
00:46:23blue tick
00:46:24users
00:46:24yes
00:46:25now
00:46:25with that
00:46:26how does
00:46:28that impact
00:46:28your focus
00:46:29on subscription
00:46:30revenues
00:46:31because to
00:46:31be blue
00:46:32tick
00:46:32you know
00:46:33there's a
00:46:33subscription
00:46:34versus
00:46:35advertising
00:46:36revenues
00:46:36focus
00:46:39yeah
00:46:40so
00:46:40a big part
00:46:41of
00:46:42like
00:46:42when you
00:46:42say
00:46:43how many
00:46:44impressions
00:46:44does something
00:46:45get
00:46:45you say
00:46:46were those
00:46:47impressions
00:46:47real or not
00:46:48real
00:46:48was it
00:46:49a computer
00:46:51just running
00:46:52100,000
00:46:54fake accounts
00:46:55because that
00:46:55obviously
00:46:56doesn't count
00:46:56because the
00:46:57computer's not
00:46:58going to buy
00:46:58anything
00:46:59so that's
00:47:01why our
00:47:02focus is on
00:47:02verified users
00:47:04because
00:47:06we are
00:47:06admittedly
00:47:07conflating
00:47:07verification
00:47:08and subscription
00:47:09at twitter
00:47:10so you could
00:47:10say verified
00:47:11subscriber
00:47:12or something
00:47:12like that
00:47:13it's not
00:47:14possible to
00:47:15game that
00:47:15so you
00:47:16know it's
00:47:17real
00:47:17you know
00:47:18it's solid
00:47:18and you
00:47:18know it's
00:47:19not a
00:47:19computer
00:47:20so that's
00:47:22that's why
00:47:22that's why
00:47:23we're focused
00:47:24on that
00:47:24is to
00:47:26ensure the
00:47:26authenticity
00:47:27of the
00:47:27views
00:47:27and that
00:47:28real people
00:47:30are seeing
00:47:30what's going
00:47:31on
00:47:32I mean
00:47:32the sheer
00:47:33amount of
00:47:34bot and
00:47:35scam and
00:47:35spam activity
00:47:36in social
00:47:37media is
00:47:37insane
00:47:38and we're
00:47:39talking about
00:47:40AI
00:47:40it's very
00:47:41obvious that
00:47:41especially with
00:47:42today's AI
00:47:44the computers
00:47:45can pass
00:47:46every
00:47:46like are you
00:47:47a human
00:47:48test
00:47:48in fact I
00:47:49think they can
00:47:49pass are you
00:47:50a human
00:47:50test better
00:47:51than a human
00:47:52you know
00:47:53sort of you
00:47:53say like
00:47:54identify a
00:47:55traffic light
00:47:56or something
00:47:56like that
00:47:56okay let me
00:47:57tell you
00:47:57Tesla can
00:47:58identify a
00:47:59traffic light
00:47:59so if
00:48:00we're you
00:48:01know and
00:48:01but even
00:48:02like open
00:48:02source
00:48:03AI stuff
00:48:04right now
00:48:04can pass
00:48:06all of the
00:48:06human tests
00:48:07so you have
00:48:08to have
00:48:08something that
00:48:10there's better
00:48:11authentication
00:48:11than that
00:48:12yeah
00:48:14I think
00:48:15that
00:48:17confidence
00:48:18and trust
00:48:20is something
00:48:20which we
00:48:21lose fast
00:48:21and that
00:48:22we regain
00:48:23slowly
00:48:24I have
00:48:25no doubt
00:48:26personally
00:48:27that Twitter
00:48:28will gain
00:48:29back the
00:48:29trust
00:48:30provided that
00:48:31you do
00:48:32the right
00:48:33thing
00:48:33and I'm
00:48:33sure that
00:48:34you will
00:48:34do the
00:48:34right
00:48:35thing
00:48:35so
00:48:35this
00:48:37something
00:48:38which
00:48:38is
00:48:39probably
00:48:40just a
00:48:41hiccup
00:48:41in the
00:48:42time
00:48:42but you
00:48:42need
00:48:43to do
00:48:43the
00:48:43right
00:48:43thing
00:48:44and I'm
00:48:44sure
00:48:44that
00:48:46you
00:48:46will
00:48:47do
00:48:47it
00:48:47I
00:48:47have
00:48:48two
00:48:49small
00:48:50questions
00:48:50one
00:48:51which
00:48:51is
00:48:54regarding
00:48:55Ukraine
00:48:55you
00:48:56you
00:48:59helped
00:49:00Ukraine
00:49:01enormously
00:49:02at the
00:49:03beginning
00:49:03with
00:49:04Starlink
00:49:04and I
00:49:05think we
00:49:05owe you
00:49:06a lot
00:49:06because
00:49:07without
00:49:13access to
00:49:14internet
00:49:15and without
00:49:16access to
00:49:17communication
00:49:18the war
00:49:19would have
00:49:20been
00:49:20finished
00:49:22what is
00:49:23your take
00:49:23on
00:49:24that
00:49:25experience
00:49:27yeah
00:49:28that
00:49:28was
00:49:29I
00:49:32mean
00:49:32that whole
00:49:33situation
00:49:33is very
00:49:33complex
00:49:35no kidding
00:49:37yeah
00:49:37it's really
00:49:38really
00:49:38complex
00:49:40as you
00:49:41point out
00:49:42Starlink
00:49:42did play
00:49:43a pivotal
00:49:44role
00:49:45because
00:49:46Russia
00:49:46had actually
00:49:47taken out
00:49:48all of the
00:49:48satellite
00:49:48communications
00:49:49and all of
00:49:49the ground
00:49:50communications
00:49:50except for
00:49:51Starlink
00:49:51who was
00:49:51the only
00:49:52one that
00:49:52was still
00:49:52operating
00:49:54and even
00:49:55today it
00:49:55is still
00:49:55the only
00:49:55one that
00:49:56is effective
00:49:56at the
00:49:57front lines
00:49:57and Starlink
00:49:59today is
00:49:59the backbone
00:50:00of the
00:50:01Ukrainian
00:50:02military
00:50:02communications
00:50:04so
00:50:06you know
00:50:06I thought
00:50:06it was
00:50:07important
00:50:08to help
00:50:08out
00:50:10and
00:50:14but I
00:50:15do
00:50:15hope for
00:50:16some kind
00:50:17of resolution
00:50:18soon
00:50:19because
00:50:20I
00:50:20think
00:50:20it's
00:50:21terribly
00:50:22sad
00:50:22that
00:50:23that's
00:50:24the
00:50:24flower
00:50:25of
00:50:25the
00:50:25youth
00:50:26of
00:50:26Ukraine
00:50:26and
00:50:27Russia
00:50:28who don't want to be there
00:50:30that they're dying in trenches right now
00:50:32and I sure hope we can figure out some means to peace soon
00:50:44the last question is going back to this crowd you have a lot of startuppers
00:50:52you have a lot of young people who want to be successful what are the two or three pieces of
00:51:02advice that you would give them
00:51:03well you have a question over there
00:51:05yes
00:51:10do we have a microphone because Elon has accepted to take questions from the floor
00:51:18Charlotte
00:51:25go
00:51:29go
00:51:30go
00:51:35go
00:51:37go
00:51:37go
00:51:37go
00:51:38go
00:51:38Thank you.
00:51:39Yes.
00:51:39OK.
00:51:41Hello.
00:51:42I don't know why you've got the mic, but please go.
00:51:45All right.
00:51:46So I had a question about all of your different companies and projects in the past 10 years.
00:51:54There's a pattern that you seem to have identified.
00:51:57Short question.
00:51:59Every single one of your companies would work a lot better on Mars than on Earth.
00:52:04Sure.
00:52:06It's really hard to make a big vacuum on Earth, but on Mars it's a lot easier.
00:52:13Same for Tesla.
00:52:15Electric cars obviously are the only one in the world.
00:52:18Yeah.
00:52:19And, I mean, SpaceX is helpless.
00:52:23OK.
00:52:24Yeah.
00:52:24Give the mic.
00:52:25Mic.
00:52:26Yeah.
00:52:28I mean, me and Mars should get a room, basically.
00:52:32I love Mars.
00:52:36OK.
00:52:41No, no, no, no.
00:52:42Now I want that you give the mic.
00:52:45Sorry, guys.
00:52:45I have a question.
00:52:47I was there.
00:52:49OK, sure.
00:52:50What's the question?
00:52:50What is the question?
00:52:51My name is Nayul.
00:52:53Co-founder and CEO at ClickUp.
00:52:55We make AI tiny.
00:52:57So Tesla is one of our targets.
00:52:59I would love to work with you, Elon,
00:53:01and my husband is actually ex-Mobiline; he basically carried the MLOps infrastructure.
00:53:08OK.
00:53:09OK.
00:53:22One last question.
00:53:24Oh, more.
00:53:25Guys, whatever.
00:53:26Yeah.
00:53:26OK.
00:53:27Go.
00:53:27However long you want to do it.
00:53:29No, no.
00:53:30Totally crazy.
00:53:31Ali-Z.
00:53:32Ali-Z.
00:53:33Yeah.
00:53:34Hi, Elon.
00:53:35Thank you for being here.
00:53:37Dr. Alain Jabez, I'm the head of research for Neuroscience Space Neuropsychology Lab.
00:53:45We are working on the mental health needed for a human to go to Mars, actually.
00:53:51Sure.
00:53:52That's great.
00:53:52OK.
00:53:53So the question is, how relevant do you think mental health is for a human being?
00:54:00Well, I think you'd want to have very good mental health on a trip to Mars.
00:54:05Make sure everyone's sane because you don't want someone opening the airlock in the middle
00:54:10of the night.
00:54:13So I think sanity is very important if you're going to Mars.
00:54:16Give it to the front row.
00:54:18Front row.
00:54:20No, wait, please.
00:54:21Don't pass the mic like that.
00:54:22Don't pass the mic.
00:54:24Wait until it gets to you.
00:54:25There.
00:54:26Please.
00:54:27We're just going to descend into chaos.
00:54:31Just a second.
00:54:36OK.
00:54:37You will get it.
00:54:37OK.
00:54:38Go ahead.
00:54:39OK.
00:54:40Thank you.
00:54:41So I'm Nathaniel Ackerman from...
00:54:43OK.
00:54:44I think we're going to just...
00:54:45It's going to be chaos.
00:54:47Can you hear me?
00:54:51Wait.
00:54:51If you want this to work, everyone needs to calm down.
00:54:55Go ahead.
00:54:56Hey, great T-shirt.
00:54:57Go ahead.
00:55:10Go ahead.
00:55:27OK.
00:55:27So...
00:55:27Thank you.
00:55:29Get the mic back over there.
00:55:30You're asking about safe...
00:55:30Safe LLM use?
00:55:32OK.
00:55:33Just a second.
00:55:36No.
00:55:37Please.
00:55:37It will answer the question.
00:55:39It was about safety in LLMs or...
00:55:43Sorry, I didn't hear the whole question.
00:55:45It's like safe application of LLMs or what is it?
00:55:48The regulation at the European level was conceived when we did not talk about LLMs.
00:55:59So what would you integrate into the current negotiation at the European level
00:56:04to cover LLMs and protect against their misuse?
00:56:14OK.
00:56:16It's...
00:56:17Sure.
00:56:17OK.
00:56:18When the regulation was created, LLMs didn't exist.
00:56:21So what would be your suggestion regarding regulating LLMs?
00:56:26Well, I think more broadly, there should be regulatory insight into LLMs and really any
00:56:36other form of AI.
00:56:37I mean, there's...
00:56:38I'm not sure LL...
00:56:40I don't think LLMs are the ultimate form of AI.
00:56:45I mean, there's sort of an inside joke on AI of like, who do you think will be the American
00:56:49president in 2032?
00:56:52Diffusion or Transformers?
00:56:56That's an inside joke, but...
00:56:58Yeah.
00:56:59It's like, what does that mean?
00:57:03But that, like, might be a real...
00:57:05It might be real.
00:57:07So, yeah.
00:57:09So we will have the...
00:57:10Just a second.
00:57:11The two latest questions.
00:57:13One here and one in that region.
00:57:16So, go ahead.
00:57:19Silence.
00:57:23Silence.
00:57:24Silence.
00:57:24I mean...
00:57:25No, if we don't listen to the question, we will not be able to continue.
00:57:30Well?
00:57:30Okay, thank you.
00:57:31I'm sorry.
00:57:33And every year we have a palmarès, which is 13 under 30.
00:57:39My question is, what business should young people focus on?
00:57:45Sure.
00:57:52Well, I think generally, I think it's important to focus on something...
00:57:55Wait.
00:57:56Go ahead.
00:57:57Sorry.
00:57:58I think you want to focus on something that you are personally passionate about,
00:58:04that you personally care about.
00:58:06It's very hard to be motivated for a product that you don't really feel strongly about.
00:58:12And it doesn't have to be high-tech.
00:58:14It could be in any field.
00:58:16It's got to be a product that you feel really needs to be there
00:58:22and something that you personally love.
00:58:25And I would listen to your instincts on, you know,
00:58:28do you love this product or service?
00:58:31Because it's kind of impossible to know what other people love.
00:58:34But if you love it, that's a good sign.
00:58:39And that could be small to large, any kind of field.
00:58:44It doesn't have to be high-tech.
00:58:46But if you don't really love the product that you're making,
00:58:51if you can't...
00:58:52Like, a good test would be that you can't wait for this product to be on the market.
00:58:57And if that's the case, you're going in the right direction.
00:59:07The last question, we have...
00:59:10Wow.
00:59:11Oh, oh, oh, oh.
00:59:12We have a French...
00:59:14Oh, no, no, I'm not going to be able to speak.
00:59:18We have a French saying, which is...
00:59:21La vérité sort de la bouche des enfants — truth comes out of the mouths of children.
00:59:24There is one here.
00:59:27No, no, the kid.
00:59:28Yes, yes.
00:59:32No, no, you, you.
00:59:34No, the kid.
00:59:37Ah no, it's not you, it's the kid.
00:59:41In French.
00:59:43He asks his question in French.
00:59:45Ask your question in French, we'll translate.
01:00:00No, no, you ask the question yourself.
01:00:03Don't be afraid.
01:00:04He is asking about the human test.
01:00:07And he is quite astonished
01:00:09when he hears that we are going to start testing
01:00:13in humans.
01:00:18Sure.
01:00:19Well, so Neuralink is...
01:00:22First of all, I want to assure everyone,
01:00:25if you may be worried about Neuralink,
01:00:26that you will see...
01:00:28Neuralink is going to be a fairly slow process
01:00:31because anything that's done in humans, it's very slow.
01:00:34So sometimes people think that suddenly
01:00:36we're going to be chipping over one's head
01:00:38and then before they know it,
01:00:39everyone's connected to the internet
01:00:41and then we're in trouble.
01:00:41with your brain.
01:00:44So it's going to happen very slowly.
01:00:46Hopefully later this year,
01:00:47we'll do our first human device implantation
01:00:51and this will be for someone
01:00:53that has sort of tetraplegic, quadriplegic,
01:00:56has lost the connection
01:00:58from their brain to their body
01:01:01and we think that person will be able to communicate
01:01:06as fast as someone who has a fully functional body.
01:01:10So that's going to be a big deal
01:01:12and we see a path beyond that
01:01:14to actually transfer the signals
01:01:16from the motor cortex of the brain
01:01:18to pass the injury in the spinal cord
01:01:21and actually enable someone's body
01:01:24to be used again.
01:01:25So essentially shunting the signals past the broken point
01:01:31and restore potentially full body use
01:01:36to someone that has completely lost the connection.
01:01:40And I mean, you can imagine, like,
01:01:41if, say, Stephen Hawking were alive today,
01:01:43what a profound change that would be.
01:01:47And so that's our first application
01:01:49and if it's looking like that,
01:01:53the first case will be later this year.
01:01:58Fantastic.
01:02:02Now, I would like a very, very warm round of applause
01:02:12for Elon Musk.
01:02:35I mean, it's so inspiring to see so much energy
01:02:40and so much positive energy in the room.
01:02:43So, this is very inspiring for the future.
01:02:48Thank you all.
01:02:50Thank you.
01:02:52Great thank you.
01:02:53No, no, no.
01:02:54Okay.
01:02:54Thank you.
01:02:56We'll move behind.
01:02:57Guys, if I go down there, it's going to be crazy.
01:02:59Yes, sir.
01:03:00That's right.
01:03:13We'll be right back.